
Social media platforms continue to push teenagers toward content about suicide, self-harm, and “intense depression,” a new report has found.
The UK-based Molly Rose Foundation created TikTok and Instagram accounts posing as a 15-year-old girl who had previously engaged with this kind of content. Nearly every video that came up on the two platforms was related to suicide, depression, or self-harm, the group said.
TikTok’s For You Page, for example, regularly recommended videos that “explicitly promoted and glorified suicide” and suggested specific suicide methods, the report said.
On Instagram, the fake users were most likely to see this kind of content on Reels, the platform’s short-form video feature.
“Harmful algorithms continue to bombard teenagers with shocking levels of harmful content, and on the most popular platforms for young people this can happen at an industrial scale,” said Andy Burrows, the Molly Rose Foundation’s chief executive.
The tests were run in the weeks before the UK Online Safety Act’s child safety rules came into effect in late July. Among other measures, the law requires social media sites to “rapidly remove illegal suicide and self-harm content” and “proactively protect users” from illegal content on these topics.
But the foundation said the latest findings indicate little has changed since 2017, when 14-year-old Molly Russell died by suicide in the UK. A coroner ruled that exposure to harmful content online contributed in a “more than minimal way” to her death.
The group called on the UK communications regulator Ofcom to take additional steps to protect young people from harmful content online, and for the government to strengthen the Online Safety Act.
A TikTok spokesperson disputed the findings, telling Euronews Next they “don’t reflect the real experience of people on our platform, which the report admits”. The spokesperson said TikTok proactively removes 99 per cent of content that violates its standards.
A spokesperson from Meta, the parent company of Instagram and Facebook, also disagreed with the report’s conclusions, saying the methodology was “limited”.
They added that “tens of thousands” of teenagers are now in Instagram’s “Teen Accounts”, which the company rolled out last year. These accounts have built-in safety features such as restrictions on teens’ access to sensitive content.
“We developed Teen Accounts to help protect teens online and continue to work tirelessly to do just that,” the spokesperson said.
If you are contemplating suicide and need to talk, please reach out to Befrienders Worldwide, an international organisation with helplines in 32 countries. Visit befrienders.org to find the telephone number for your location.
Updated August 21: This article has been updated to include comment from a Meta spokesperson.