The Guardian - US
Technology
Dan Milmo, Global technology editor

Meta sets up taskforce to combat trade of child sexual abuse material after damning report

Previously, users could search for terms related to child sexual abuse materials and Instagram would serve images with the warning ‘these results may contain images of child sexual abuse’. Photograph: Peter Byrne/PA

Mark Zuckerberg’s Meta has set up a taskforce to investigate claims that Instagram is hosting the distribution and sale of self-generated child sexual abuse material, with the platform’s algorithms helping advertise illicit content.

The move by the Facebook parent comes after a report from the Stanford Internet Observatory (SIO) that found a web of social media accounts, apparently operated by minors, advertising self-generated child sexual abuse material (SG-CSAM).

The SIO said that Instagram is “currently the most important platform for these networks”, with features such as recommendation algorithms and direct messaging helping to connect buyers and sellers of SG-CSAM.

The SIO said it acted on a tip from the Wall Street Journal, which detailed Instagram’s SG-CSAM problems, along with the SIO’s findings, in an investigation published on Wednesday.

The SIO reported that Instagram has allowed users to search for terms that its own algorithms know could be linked to SG-CSAM, with a pop-up screen for users warning that “these results may contain images of child sexual abuse”. The screen gives users the option to “see results anyway”. Instagram has removed the option for users to view the content after being contacted by the Journal.

In a statement, a Meta spokesperson said the company had set up an internal taskforce to deal with the claims in the reports.

“We’re continuously exploring ways to actively defend against this behaviour, and we set up an internal task force to investigate these claims and immediately address them,” said the spokesperson.

The SIO report follows a Guardian investigation in April that revealed how Meta is failing to report or detect the use of Facebook and Instagram for child trafficking. In response to the Guardian’s allegations at the time, a Meta spokesperson said: “The exploitation of children is a horrific crime – we don’t allow it and we work aggressively to fight it on and off our platforms.”

The SIO said its investigation found that large networks of social media accounts are openly advertising self-generated child sexual abuse material. It said Instagram’s popularity and “user-friendly interface” made it a preferred option among platforms.

“The platform’s recommendation algorithms effectively advertise SG-CSAM: these algorithms analyze user behaviours and content consumption to suggest related content and accounts to follow,” said the SIO.

The report said SG-CSAM can sometimes be shared voluntarily at first but then circulate widely in public. It can also overlap with non-consensual intimate imagery, often referred to as “revenge porn”, and minors can be coerced into producing sexual content. The SIO added that in recent years SG-CSAM has increasingly become a commercial venture, including the posting of content “menus” online.

Researchers said they examined one network in particular, in which 405 accounts were advertising the sale of SG-CSAM on Instagram, as well as 128 seller accounts on Twitter. They said 58 accounts within the Instagram follower network appeared to be content buyers. The accounts were referred to the National Center for Missing and Exploited Children (NCMEC), which processes reports of online child sexual exploitation from US tech platforms. The SIO report said that one month after they were reported to the NCMEC, 31 of the Instagram seller accounts were still active, along with 28 of the likely buyer accounts. On Twitter, 22 of the 128 accounts identified in the report were still active. Twitter has been contacted for comment.

Meta said it had already addressed some of the investigation’s findings, saying in a statement it had fixed a technical issue that prevented reports of SG-CSAM from reaching content reviewers and had updated its guidance to reviewers on identifying and removing predatory accounts. The Journal reported that an anti-paedophile activist was told by Instagram that one image of a scantily clad girl with a graphically sexual caption “does not go against our Community Guidelines”, and was advised to hide the account in order to avoid seeing its content.

Meta said in its statement it had also removed “thousands” of SG-CSAM-related search terms and hashtags on Instagram after researchers at the SIO found that paedophiles were searching under terms such as #pedobait and variations on #mnsfw (“minor not safe for work”).

Meta added that between 2020 and 2022 it had dismantled 27 abusive networks, while in January this year it disabled more than 490,000 accounts for violating its child safety policies.

The SIO report said industry-wide action is needed to tackle the problem.
