The Guardian - UK
Social media algorithms ‘amplifying misogynistic content’

Sally Weale, Education correspondent

Researchers said they detected a four-fold increase in the level of misogynistic content suggested by TikTok over a five-day period of monitoring. Photograph: Peter Byrne/PA

Algorithms used by social media platforms are rapidly amplifying extreme misogynistic content, which is spreading from teenagers’ screens into school playgrounds, where it has become normalised, according to a new report.

Researchers said they detected a four-fold increase in the level of misogynistic content suggested by TikTok over a five-day period of monitoring, as the algorithm served more extreme videos, often focused on anger and blame directed at women.

While this particular study looked at TikTok, the researchers said their findings were likely to apply to other social media platforms, and called for a “healthy digital diet” approach to tackling the problem, rather than outright bans on phones or social media, which “are likely to be ineffective”.

The study, by teams at University College London and the University of Kent, comes at a time of renewed concern about the impact of social media on young people. Research last week found young men from generation Z – many of whom revere social media influencer Andrew Tate – are more likely than baby boomers to believe that feminism has done more harm than good.

Meanwhile, the mother of the murdered teenager Brianna Ghey called for social media apps to be banned on smartphones for under-16s after hearing evidence about the online activities of her daughter’s killers.

The UCL/Kent study, called Safer Scrolling, argues that harmful content is presented as entertainment through the algorithmic processes of social media. Toxic, hateful or misogynistic material is “pushed” to young people, with boys who are suffering from anxiety and poor mental health at increased risk, it said.

“Harmful views and tropes are now becoming normalised among young people,” said principal investigator Dr Kaitlyn Regehr (UCL Information Studies). “Online consumption is impacting young people’s offline behaviours, as we see these ideologies moving off screens and into schoolyards.”

Researchers interviewed young people engaging with and producing radical online content to help create a number of archetypes of teenage boys who might be vulnerable to becoming radicalised. Accounts were set up on TikTok for each archetype, each with specific interests – they might be seeking content on masculinity or loneliness – and researchers then watched more than 1,000 videos that TikTok suggested on its “For You” page over seven days.

The initial suggested content was in line with the stated interests of each archetype, but after five days the researchers said the TikTok algorithm was presenting four times as many videos with misogynistic content, such as objectification, sexual harassment or the discrediting of women; such videos rose from 13% of recommended videos to 56%.

“Algorithmic processes on TikTok and other social media sites target people’s vulnerabilities – such as loneliness or feelings of loss of control – and gamify harmful content,” said Regehr. “As young people microdose on topics like self-harm or extremism, to them it feels like entertainment.”

Researchers also interviewed young people and school leaders about the impact of social media and found that hateful ideologies and misogynistic tropes have moved off screens and into schools, and have become embedded in mainstream youth cultures.

Geoff Barton, general secretary of the Association of School and College Leaders, which collaborated on the research, said: “UCL’s findings show that algorithms – which most of us know little about – have a snowball effect in which they serve up ever-more extreme content in the form of entertainment.

“This is deeply worrying in general but particularly so in respect of the amplification of messages around toxic masculinity and its impact on young people who need to be able to grow up and develop their understanding of the world without being influenced by such appalling material.

“We call upon TikTok in particular and social media platforms in general to review their algorithms as a matter of urgency and to strengthen safeguards to prevent this type of content, and on the government and Ofcom to consider the implications of this issue under the auspices of the new Online Safety Act.”

Andy Burrows, adviser to the Molly Rose Foundation, which was set up in memory of Molly Russell, who killed herself after falling into a vortex of despair on social media, said: “This research reinforces how TikTok’s algorithms ruthlessly target and bombard young people with harmful content, and within days can serve teens a near constant barrage of unhealthy and sometimes dangerous videos.

“It couldn’t be clearer that the regulator Ofcom needs to take bold and decisive action to tackle high-risk algorithms that prioritise the revenue of social media companies over the safety and wellbeing of teens.”

Speaking on a trip to Northern Ireland, prime minister Rishi Sunak said: “As a parent, I am always worried about social media and what my young girls are exposed to. That’s why I’m pleased we have passed the Online Safety Act over the last year and that means the regulator now has tough new powers to control what is exposed to children online.

“And if the big social media companies do not comply with that, the regulator is able to levy very significant fines on them and the priority now is making sure that act is up and running.”

A TikTok spokesperson said: “Misogyny has long been prohibited on TikTok and we proactively detect 93% of content we remove for breaking our rules on hate. The methodology used in this report does not reflect how real people experience TikTok.”

An Ofcom spokesperson said: “Tackling violence against women and girls online is a priority for us. Our research shows that women are less confident about their personal online safety, and are more affected by harmful content like trolling.

“Under the Online Safety Act, online services such as social media and search services will have duties to protect users’ safety and their rights – understanding and addressing content which disproportionately affects women and girls online is central to this.”
