The Guardian - UK
Technology
Alex Hern, UK technology editor

Child abuse image offences in UK have soared, NSPCC report shows

The head of the NSPCC called for tougher provisions in the online safety bill when it returns to parliament. Photograph: Jozef Polc/Alamy

Police have recorded a surge in child abuse image offences in the UK, with more than 30,000 reported in the most recent year, according to a report from the NSPCC.

That is an increase of more than 66% on figures from five years ago, when police forces across the country recorded 18,574 such offences.

The charity warned that the increase was in part due to the “pervasive” issue of young people being groomed into sharing images of their own abuse, with tech companies failing to stop their sites being used by offenders to “organise, commit and share child sexual abuse”.

But better police recording, greater awareness of abuse and survivors feeling more confident in coming forward can also contribute to higher numbers of recorded offences, the NSPCC added.

“These new figures are incredibly alarming but reflect just the tip of the iceberg of what children are experiencing online,” said Sir Peter Wanless, the chief executive of the NSPCC.

“We hear from young people who feel powerless and let down as online sexual abuse risks becoming normalised for a generation of children.”

In those instances where a social media or gaming site was recorded alongside the offence, just two companies were responsible for more than three-quarters of the reports: Snapchat, with more than 4,000 incidents, and Meta, whose three flagship apps – Facebook, Instagram and WhatsApp – were mentioned in more than 3,000 incidents. The company’s Oculus “metaverse” brand was mentioned in one report, with virtual reality more generally being mentioned seven times.

Teenager Roxy Longworth’s experience shows how fighting the problem can require coordination between corporate rivals. She was 13 when she was contacted on Facebook by a boy four years older than her, who coerced her into sending images via Snapchat. He passed the pictures on to his friends, and a pattern of blackmail and manipulation coerced Roxy into sending even more photos to another boy, which were then shared publicly on social media.

“I sat on the floor and cried,” Roxy said. “I’d lost all control and there was no one to talk to about it. I blocked him on everything and prayed he wouldn’t show anyone the pictures, because of how young I was.

“After that, I was just waiting to see what would happen. Eventually someone in my year sent me some of the pictures and that’s when I knew they were out.”

In a statement, a Meta spokesperson said: “This horrific content is banned on our apps, and we report instances of child sexual exploitation to [the National Center for Missing and Exploited Children].

“We lead the industry in the development and use of technology to prevent and remove this content, and we work with the police, child safety experts and industry partners to tackle this societal issue. Our work in this area is never done, and we’ll continue to do everything we can to keep this content off our apps.”

Jacqueline Beauchere, the global head of platform safety at Snapchat, said: “Any sexual abuse of children is abhorrent and illegal. We have dedicated teams around the world working closely with the police, experts and industry partners to combat it. When we proactively detect or are made aware of any sexual content exploiting minors, we immediately remove it, delete the account and report the offender to authorities. Snapchat has extra protections in place that make it more difficult for younger users to be discovered and contacted by strangers.”

The NSPCC, which compiled the figures from freedom of information requests sent to police forces across the UK, says the data demonstrates the need for a “child safety advocate” to be included in the next iteration of the online safety bill when it returns to parliament.

The proposal would give the advocate the power to intervene directly with Ofcom, the internet regulator, on behalf of children online, “to ensure appropriate counterbalance against well-resourced industry interventions”, the NSPCC says.

“By creating a child safety advocate that stands up for children and families, the government can ensure the online safety bill systemically prevents abuse,” Wanless added. “It would be inexcusable if in five years’ time we are still playing catch-up to pervasive abuse that has been allowed to proliferate on social media.”
