The Guardian - UK
Minister attacks ‘despicable’ failure of social media firms on harmful content

Dan Milmo, global technology editor

Molly Russell took her own life after viewing content related to suicide, self-harm and depression on Instagram and Pinterest. Photograph: Family handout/PA

Social media firms have been accused of a “despicable” failure to act on harmful content after new research showed that material related to suicide, self-harm and depression is still prevalent on Instagram, TikTok and Pinterest six years after Molly Russell’s death.

Michelle Donelan, the technology secretary, spoke after the Molly Rose Foundation, a suicide prevention charity established by Molly’s family, published a report showing a “clear and persistent problem” with readily available and discoverable harmful content on the platforms.

“It is despicable and indefensible that social media companies are still turning a blind eye to the scale of horrendous suicide and self-harm content on their platforms,” said Donelan.

The technology secretary added that she would be raising the issue at a meeting with tech executives soon.

Molly’s father, Ian Russell, said the report showed “little has changed” on social media platforms since his daughter’s death.

He said that social media companies must share “internal worries” over harms their platforms cause to teenagers, as court documents in the US revealed that Instagram’s owner, Meta, was concerned “similar incidents” to Molly’s death would occur because the platform’s algorithms were “[l]eading users to distressing content”.

Russell said: “What I would like to see is these companies being honest about their internal worries about the negative effects of their platforms.

“This is not to say that the platforms are wholly bad … But they have to focus on problems related to this harmful content and take measures that really separate vulnerable children from harmful material that might be on their platforms.”

Molly Russell, from Harrow, north-west London, took her own life at the age of 14 in November 2017 after viewing harmful content related to suicide, self-harm, depression and anxiety on Instagram and Pinterest. In a landmark ruling last year, an inquest into her death found that Molly “died from an act of self-harm while suffering from depression and the negative effects of online content”.

The MRF study analysed 1,181 of the most engaged-with posts on TikTok and Instagram that used well-known hashtags relating to suicide, self-harm and depression, and found that a significant proportion of the material was harmful.

The research, produced in collaboration with the non-profit organisation Bright Data, also searched for content under hashtags that Molly had used. It found that 48% of the most engaged-with posts on Instagram were harmful – defined as promoting or glorifying suicide or self-harm, or containing “relentless” themes of depression – as were 49% of the posts examined on TikTok.

The report also found that Pinterest “actively recommended” suicide and self-harm content including a video with the caption “one day I will leave and never come back any more”.

The Online Safety Act became law last month, and protecting children from harmful content is a key focus of the legislation. Its provisions include requiring social media companies to prevent children from encountering harmful content such as suicide-related material.

A company spokesperson for Meta said the company wanted teenagers to have “safe” experiences on Instagram and works with experts to develop guidelines on suicide and self-harm content in a bid to strike a balance “between preventing people seeing sensitive content while giving people space to talk about their own experiences and find support.”

A spokesperson for TikTok, which launched in the UK in 2018, said: “Content that promotes self-harm or suicide is prohibited on TikTok and, as the report highlights, we strictly enforce these rules by removing 98% of suicide content before it is reported to us.”

A spokesperson for Pinterest said the company was “constantly updating our policies and enforcement practices around self-harm content.”
