The Independent UK
Nicole Wootton-Cane

Half of girls exposed to harmful content online with teens twice as likely to see it on TikTok and X

Girls were twice as likely as boys to encounter harmful content, the Molly Rose Foundation said (PA) - (PA Wire)

Half of girls were fed harmful online content, including posts about self-harm, suicide and eating disorders, on social media apps during a single week, a new study has found.

Teenagers were twice as likely to encounter “high risk” content on TikTok and X as on other major platforms, with girls encountering significantly more harmful posts than boys, analysis of data from nearly 2,000 youngsters found.

Suicide prevention charity the Molly Rose Foundation, which conducted the research weeks before the implementation of the Online Safety Act, said its findings suggest teenagers were being algorithmically recommended harmful content at an “incredibly disturbing scale”.

Children were being served high-risk posts without searching for them, the study said, with more than 50 per cent of teenagers surveyed reporting exposure to potentially high-risk content through platforms’ recommender feeds, such as TikTok’s “for you” page.

The charity accused algorithms of pushing potentially dangerous content to vulnerable teens and “targeting those at greatest risk of its effects”. It said 68 per cent of children categorised as having low wellbeing saw high-risk suicide, self-harm, depression or eating disorder content over the course of a week.

Those experiencing low wellbeing or with special educational needs and disabilities (SEND) were also more likely to encounter high-risk content, the charity said, with two in five reporting that it appeared in their feeds.

TikTok defended its safety features, claiming the study “generalises health and wellness content as inherently risky”.

Named after 14-year-old Molly Russell, who died from an act of self-harm while suffering from depression and “the negative effects of online content” in 2017, the Molly Rose Foundation said the data showed exposure to the highest risk types of suicide and self-harm content before the Act was “much greater than previously understood”.

Molly Russell chose to end her life aged 14 after viewing harmful content online (family handout/PA) (PA Media)

Introduced in 2023, the Online Safety Act aims to regulate and curb harmful online content, requiring major platforms either to prevent these high-risk types of content from appearing in children’s feeds or to reduce how often they appear.

But the charity said its findings should act as a “wake-up call” about the “urgent need” to strengthen the legislation.

Andy Burrows, chief executive of the Molly Rose Foundation, said: “This groundbreaking study shows that teenagers were being exposed to high-risk suicide, self-harm and depression content at an incredibly disturbing scale just weeks before the Online Safety Act took effect, with girls and vulnerable children facing markedly increased risk of harm.

“The extent to which girls were being bombarded with harmful content is far greater than we previously understood and heightens our concerns that Ofcom’s current approach to regulation fails to match the urgency and ambition needed to ultimately save lives.

“The Technology Secretary Liz Kendall must now seize the opportunity to act decisively to build on and strengthen the Online Safety Act and put children and families before the Big Tech status quo.”

An Ofcom spokesperson said that under new measures designed to protect children in the Online Safety Act, any sites that allow suicide, self-harm and eating disorder content must have highly effective age checks in place to stop children seeing it. It added that tech firms must restrict other harmful content from appearing in children’s feeds.

“Later this year, we’ll also publish new guidance on the steps sites and apps should take to help women and girls live safer lives online – recognising the harms that disproportionately affect them,” it said.

X declined to comment but pointed The Independent towards its policies, which forbid promoting or encouraging self-harm.

A TikTok spokesperson said: “TikTok has 50+ safety features and settings specifically designed to help teens safely express themselves, discover, and learn, including from experts like teachers, scientists, and doctors.

“We proactively provide access to reliable well-being content while removing 99% of violating mental and behavioural health content before it's ever reported to us. This study draws a misleading conclusion about our platform by taking a simplistic view of a complex topic, generalising health and wellness content as inherently risky.”

A Department for Science, Innovation and Technology (DSIT) spokesperson said: “While this research pre-dates the enforcement of the Child Safety Duties on 25 July, we expect young people to now be protected from damaging content, including material promoting self-harm or suicide, as platforms comply with the legal requirements of the Act. That means safer algorithms and less toxic feeds.

“Services that fail to comply can expect tough enforcement from Ofcom. We are determined to hold tech companies to account and keep children safe.”
