The Guardian - UK
Technology
Dan Milmo Global technology editor

TikTok ‘directs child accounts to pornographic content within a few clicks’

Researchers set up TikTok accounts with the birth date of a 13-year-old, with ‘restricted mode’ also activated. Photograph: Matt Cardy/Getty Images

TikTok has directed children’s accounts to pornographic content within a small number of clicks, according to a report by a campaign group.

Global Witness set up fake accounts using a 13-year-old’s birth date and turned on the video app’s “restricted mode”, which limits exposure to “sexually suggestive” content.

Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.

The terms suggested under the “you may like” feature included “very very rude skimpy outfits” and “very rude babes” – and then escalated to terms such as “hardcore pawn [sic] clips”. For three of the accounts the sexualised searches were suggested immediately.

After a “small number of clicks” the researchers encountered pornographic content ranging from women flashing to penetrative sex. Global Witness said the content attempted to evade moderation, usually by showing the clip within an innocuous picture or video. For one account the process took two clicks after logging on: one click on the search bar and then one on the suggested search.

Global Witness, a climate organisation whose remit includes investigating big tech’s impact on human rights, said it conducted two batches of tests, with one set before the implementation of child protection rules under the UK’s Online Safety Act (OSA) on 25 July and another after.

It added that two of the videos, which featured someone who appeared to be under 16 years old, had been sent to the Internet Watch Foundation, which monitors online child sexual abuse material.

Global Witness claimed TikTok was in breach of the OSA, which requires tech companies to prevent children from encountering harmful content such as pornography.

A spokesperson for Ofcom, the UK communications regulator charged with overseeing the act, said: “We appreciate the work behind this research and will review its findings.”

Ofcom’s codes for adhering to the act state that tech companies that pose a medium or high risk of showing harmful content must “configure their algorithms to filter out harmful content from children’s feeds”. TikTok’s content guidelines ban pornographic content.

TikTok said that, after being contacted by Global Witness, it had removed the offending videos and made changes to its search recommendations.

“As soon as we were made aware of these claims, we took immediate action to investigate them, remove content that violated our policies, and launch improvements to our search suggestion feature,” said a spokesperson.
