
Teens can be shown self-harm and suicide content on TikTok within minutes of joining the app, Amnesty International has warned, as families in France pursue legal action and MPs examine the platform’s impact on young users.
A French teenager called Emma (not her real name) told RFI she was quickly drawn into toxic content.
Two years ago, the now 18-year-old installed the app on her phone for the first time. At first, the videos matched the interests she had selected. But after a few minutes, a music video caught her attention.
“The song talked about the struggles the singer faced with mental distress. Since I stayed on the video for quite a while, I was shown more like it. That is when I started falling into that spiral. And it kept getting worse and worse,” Emma remembers.
Within a week – and without having actively looked for it – she was being exposed to content that “normalised death, that encouraged self-harm, all kinds of dangerous and harmful behaviours”.
Emma’s mental health deteriorated. Visits to a psychologist were no longer enough. She became depressed and was hospitalised six times.
Self-harm and suicide
Her testimony echoes an Amnesty International report released this week. The NGO spent months looking into how TikTok affects the mental health of young users.
It created three fake accounts posing as 13-year-olds and found that dangerous content appeared very quickly, even before the accounts had expressed any interest in it.
“When we created the three fake teenage accounts, we did not like anything, share anything, comment or even search,” says Katia Roux, advocacy officer at Amnesty International France.
“We only watched two videos related to mental health. And yet, that was enough to see the feeds of those accounts filled, almost flooded, with this kind of content. And after 45 minutes, we had the first content related to self-harm and suicide on two of these three accounts.”
Legal action
Amnesty says TikTok’s moderation policies remain inadequate. To better protect young users, it wants the platform to rethink a business model built to keep them on the app for as long as possible.
That is also the view of Stéphanie, whose daughter died by suicide five years ago after watching toxic content on TikTok.
“They could have shown her friendlier content or sports programmes, or told her: go for a walk,” she says.
“But the problem is, if you show her that, she will not stay on the platform. And in fact, TikTok’s model is to maximise time spent on the network. They do not care about children.”
TikTok says it has moderation systems, parental controls and mental health resources in place.
Together with 10 other families, Stéphanie has filed a complaint in France accusing TikTok of incitement to suicide. The case is still under investigation.
As for Emma, progress has been slow.
“I saw those videos, and some of them remain burned into my retina. Those images will stay with me for a very long time,” she says.
Now, however, she reports all toxic content she comes across on social media.
Political scrutiny
In March 2025, a French parliamentary inquiry opened to examine TikTok’s impact on young people. It is not investigating ongoing court cases, but it has examined whether the app pushes more dangerous content to vulnerable users.
On a broader European level, the Digital Services Act now requires stronger oversight of online platforms.
This article was adapted from the original version in French and edited for clarity.