The Conversation
Deborah Fry, Professor of International Child Protection Research and Director of Data at the Childlight Global Child Safety Institute, The University of Edinburgh

Our meta-study found that over 300 million young people have experienced online sexual abuse and exploitation

Shutterstock/Namning

It takes a lot to shock Kelvin Lay. My friend and colleague was responsible for setting up Africa’s first dedicated child exploitation and human trafficking units, and for many years he was a senior investigating officer for the Child Exploitation and Online Protection Centre at the UK’s National Crime Agency, specialising in extraterritorial prosecutions of child exploitation across the globe.

But what happened when he recently volunteered for a demonstration of cutting-edge identification software left him speechless. Within seconds of being fed an image of Lay as he looks today, the AI app sourced a dizzying array of online photos of him that he had never seen before – including in the background of someone else’s photographs from a British Lions rugby match in Auckland eight years earlier.

“It was mind-blowing,” Lay told me. “And then the demonstrator scrolled down to two more pictures, taken on two separate beaches – one in Turkey and another in Spain – probably harvested from social media. They were of another family but with me, my wife and two kids in the background. The kids would have been six or seven; they’re now 20 and 22.”

Portrait photo of a middle-aged man.
Investigator Kelvin Lay, Childlight’s director of engagement and risk. University of Edinburgh

The AI in question was one of an arsenal of new tools deployed in Quito, Ecuador, in March when Lay worked with a ten-country taskforce to rapidly identify and locate perpetrators and victims of online child sexual exploitation and abuse – a hidden pandemic with over 300 million victims around the world every year.

That is where the work of the Childlight Global Child Safety Institute, based at the University of Edinburgh, comes in. Launched a little over a year ago in March 2023 with the financial support of the Human Dignity Foundation, Childlight’s vision is to use the illuminating power of data and insight to better understand the nature and extent of child sexual exploitation and abuse.


This article is part of Conversation Insights
The Insights team generates long-form journalism derived from interdisciplinary research. The team is working with academics from different backgrounds who have been engaged in projects aimed at tackling societal and scientific challenges.


I am a professor of international child protection research and Childlight’s director of data, and for nearly 20 years I have been researching sexual abuse and child maltreatment, including with the New York City Alliance Against Sexual Assault and Unicef.

The fight to keep our young people safe and secure from harm has been hampered by a data disconnect – data differs in quality and consistency around the world, definitions differ and, frankly, transparency isn’t what it should be. Our aim is to work in partnership with many others to help join up the system, close the data gaps and shine a light on some of the world’s darkest crimes.

302 million victims in one year

Our new report, Into The Light, has produced the world’s first estimates of the scale of the problem in terms of victims and perpetrators.

Our estimates are based on a meta-analysis of 125 representative studies published between 2011 and 2023, and highlight that one in eight children – 302 million young people – experienced online sexual abuse and exploitation in the year preceding the national surveys.

Additionally, we analysed tens of millions of reports to the five main global watchdog and policing organisations – the Internet Watch Foundation (IWF), the National Center for Missing and Exploited Children (NCMEC), the Canadian Centre for Child Protection (C3P), the International Association of Internet Hotlines (INHOPE) and Interpol’s International Child Sexual Exploitation database (ICSE). This helped us better understand the nature of child sexual abuse images and videos online.

While huge data gaps mean this is only a starting point, and far from a definitive figure, the numbers we have uncovered are shocking.

We found that nearly 13% of the world’s children have been victims of the non-consensual taking and sharing of sexual images and videos, or of unwanted exposure to such material.

In addition, just over 12% of children globally are estimated to have been subject to online solicitation, such as unwanted sexual talk – which can include non-consensual sexting, unwanted sexual questions and unwanted requests for sexual acts – by adults or other youths.

Cases have soared since COVID changed the online habits of the world. For example, the Internet Watch Foundation (IWF) reported in 2023 that child sexual abuse material featuring primary school children aged seven to ten being coached to perform sexual acts online had risen by more than 1,000% since the UK went into lockdown.

The charity pointed out that during the pandemic thousands of children became more reliant on the internet to learn, socialise and play – something internet predators exploited to coerce more children into sexual activities over webcams and smartphones, sometimes even involving friends or siblings.

There has also been a sharp rise in reports of “financial sextortion”, with children blackmailed over sexual imagery that abusers have tricked them into providing – often with tragic results, including a spate of suicides across the world.

This abuse can also utilise AI deepfake technology – notoriously used recently to generate false sexual images of the singer Taylor Swift.

Our estimates indicate that just over 3% of children globally experienced sexual extortion in the past year.

A child sexual exploitation pandemic

This child sexual exploitation and abuse pandemic affects pupils in every classroom, in every school, in every country, and it needs to be tackled urgently as a public health emergency. As with all pandemics, such as COVID and AIDS, the world must come together and provide an immediate and comprehensive public health response.

Our report also highlights a survey which examines a representative sample of 4,918 men aged over 18 living in Australia, the UK and the US. It has produced some startling findings. In terms of perpetrators:

  • One in nine men in the US (equating to almost 14 million men) admitted online sexual offending against children at some point in their lives – enough offenders to form a line stretching from California on the west coast to North Carolina in the east or to fill a Super Bowl stadium more than 200 times over.

  • The surveys found that 7% of men in the UK admitted the same – equating to 1.8 million offenders, or enough to fill the O2 arena 90 times over – as did 7.5% of men in Australia (nearly 700,000).

  • Meanwhile, millions across all three countries said they would also seek to commit contact sexual offences against children if they knew no one would find out, a finding that should be considered in tandem with other research indicating that those who watch child sexual abuse material are at high risk of going on to contact or abuse a child physically.

The internet has enabled communities of sex offenders to share child abuse and exploitation images easily, rapidly and on a staggering scale. This, in turn, increases demand for such content among new users and increases rates of abuse of children, shattering countless lives.

In fact, more than 36 million reports of online sexual images of children who fell victim to all forms of sexual exploitation and abuse were filed in 2023 to watchdogs, by companies such as X, Facebook, Instagram, Google and WhatsApp and by members of the public. That equates to one report every single second.
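
That rate is simple arithmetic rather than a separate finding: a year contains roughly 31.5 million seconds, so 36 million reports averages out at just over one per second. A quick back-of-the-envelope check (illustrative only, not a calculation from the report):

```python
# Rough check of the "one report every second" rate.
reports_2023 = 36_000_000
seconds_per_year = 365 * 24 * 60 * 60   # 31,536,000 seconds
print(reports_2023 / seconds_per_year)  # ~1.14 reports per second
```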

Quito operation

Like everywhere in the world, Ecuador is in the grip of this modern, transnational problem: the rapid spread of child sexual exploitation and abuse online. It can see an abuser in, say, London pay another abuser somewhere like the Philippines to produce images of atrocities against a child, which are in turn hosted by a data centre in the Netherlands and dispersed instantly across multiple other countries.

When Lay – who is also Childlight’s director of engagement and risk – was in Quito in 2024, martial law meant that a large hotel, normally busy with tourists flocking to the delights of the Galápagos Islands, was eerily quiet, save for a group of 40 law enforcement analysts, researchers and prosecutors who had more than 15,000 child sexual abuse images and videos to analyse.

The cache of files included material logged with authorities annually, content from seized devices, and records from Interpol’s International Child Sexual Exploitation (ICSE) database. The files were potentially linked to perpetrators in ten Latin American and Caribbean countries: Argentina, Chile, Colombia, Costa Rica, Ecuador, El Salvador, Honduras, Guatemala, Peru and the Dominican Republic.

A shot of a group of law enforcement officials
The ‘digital guardians’ taskforce that Kelvin Lay was part of, in support of global efforts against child exploitation in Latin America and the Caribbean. Lay is third from the right (back row). Edinburgh University

Child exploitation exists in every part of the world but, based on intelligence from multiple partners in the field, we estimate that a majority of Interpol member countries lack the training and resources to properly respond to evidence of child sexual abuse material shared with them by organisations like NCMEC – the body created by the US Congress to log and process evidence of child sexual abuse material uploaded around the world and spotted, largely, by tech giants. We believe this lack of capacity means that millions of reports alerting law enforcement to abuse material are not even opened.

The Ecuador operation, in conjunction with the International Center for Missing and Exploited Children (ICMEC) and US Homeland Security, aimed to help change that by supporting authorities to develop further skills and confidence to identify and locate sex offenders and rescue child victims.

Central to the Quito operation was Interpol’s ICSE database, which contains around five million images and videos that specialised investigators from more than 68 countries use to share data and co-operate on cases.

Using image and video comparison software – essentially photo ID work that instantly recognises the digital fingerprint of images – investigators can quickly compare images they have uncovered with images contained in the database. The software can instantly make connections between victims, abusers and places. It also avoids duplication of effort and saves precious time by letting investigators know whether images have already been discovered or identified in another country. So far, it has helped identify more than 37,900 victims worldwide.
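
To illustrate the underlying idea – a simplified sketch, not the specific software used in Quito – a “digital fingerprint” is typically a perceptual hash: a short code that barely changes when an image is resized or recompressed, so two files whose hashes differ in only a few bits are almost certainly the same picture. Here is a minimal sketch using the open-source imagehash library (the hash type and matching threshold are illustrative assumptions):

```python
# Minimal sketch of fingerprint-based image matching.
# Assumes: pip install pillow imagehash
# Operational systems use more robust hashes (e.g. Microsoft's PhotoDNA)
# and compare against reference databases of millions of known images.
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    # A 64-bit perceptual hash that survives resizing and recompression.
    return imagehash.phash(Image.open(path))

def likely_same_image(path_a: str, path_b: str, max_bits: int = 8) -> bool:
    # Subtracting two hashes gives their Hamming distance (differing bits);
    # a small distance means the images are very probably the same picture.
    return (fingerprint(path_a) - fingerprint(path_b)) <= max_bits
```

Comparing short hashes rather than raw pixels is what makes checking a newly seized file against millions of known images near-instant, and it lets investigators learn whether material has already been identified without anyone having to view it again.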

Lay has significant field experience using these resources to help Childlight turn data into action – recently providing technical advice to law enforcement in Kenya where successes included using data to arrest paedophile Thomas Scheller. In 2023, Scheller, 74, was given an 81-year jail sentence. The German national was found guilty by a Nairobi court of three counts of trafficking, indecent acts with minors and possession of child sexual abuse material.

Officials at work on computers
The Quito nerve centre where the taskforce carried out its work. Edinburgh University

But despite these data strides, there are concerns about the inability of law enforcement to keep pace with a problem too large for officers to arrest their way out of. It is one enabled by emerging technological advances, including AI-generated abuse images, which threaten to overwhelm authorities with their scale.

In Quito, over a warming rainy season meal of encocado de pescado, a tasty regional dish of fish in a coconut sauce served with white rice, Lay explained:

This certainly isn’t to single out Latin America but it’s become clear that there’s an imbalance in the way countries around the world deal with data. There are some that deal with pretty much every referral that comes in, and if it’s not dealt with and something happens, people can lose their jobs. On the opposite side of the coin, some countries are receiving thousands of email referrals a day that don’t even get opened.

Now, we are seeing evidence that advances in technology can also be utilised to fight online sexual predators. But the use of such technology raises ethical questions.

Contentious AI tool draws on 40 billion online images

The powerful but contentious AI tool that left Lay speechless was a case in point: one of multiple AI facial recognition tools that have come onto the market, with multiple applications. The technology can help identify people using billions of images scraped from the internet, including social media.

AI facial recognition software like this has reportedly been used by Ukraine to debunk false social media posts, enhance safety at checkpoints and identify Russian infiltrators, as well as dead soldiers. It was also reportedly used to help identify rioters who stormed the US Capitol in 2021.

The New York Times Magazine reported on another remarkable case. In May 2019, an internet provider alerted authorities after a user received images depicting the sexual abuse of a young girl.

One grainy image held a vital clue: an adult face visible in the background, which the facial recognition company was able to match to an image on an Instagram account featuring the same man, again in the background – despite the fact that his face would have appeared about half the size of a human fingernail on screen. It helped investigators pinpoint his identity and the Las Vegas location where he was found to be creating child sexual abuse material to sell on the dark web. That led to the rescue of a seven-year-old girl and to him being sentenced to 35 years in jail.

For its part, the UK government recently argued that facial recognition software can allow police to “stay one step ahead of criminals” and make Britain’s streets safer – although, at the moment, the use of such software is not allowed in the UK.

When Lay volunteered to have his own features analysed, he was stunned that within seconds the app produced a wealth of images, including one that captured him in the background of a photo taken at the rugby match years before. Consider that investigators can just as easily match a distinctive tattoo or unusual wallpaper at a scene where abuse has occurred, and the potential of this as a crime-fighting tool is easy to appreciate.

Of course, it is also easy to appreciate the concerns some people have on civil liberties grounds, which have limited the use of such technology across Europe. In the wrong hands, what might such technology mean for a political dissident in hiding, for instance? One Chinese facial recognition startup has come under scrutiny from the US government for its alleged role in the surveillance of the Uyghur minority group.

Role of big tech

Similar points are sometimes made by big tech proponents of end-to-end encryption on popular apps: apps which are also used to share child abuse and exploitation files on an industrial scale – effectively turning the lights off on some of the world’s darkest crimes.

Why – ask the privacy purists – should anyone else have the right to know about their private content?

And so, it may seem to some that we have reached a Kafkaesque point where the right to privacy of abusers risks trumping the privacy and safety rights of the children they are abusing.

Clearly then, if encryption of popular file-sharing apps is to be the norm, a balance must be struck between the desire for privacy for all users and the proactive detection of child sexual abuse material online.

Meta has shown recently that there is potential for a compromise that could improve child safety, at least to some extent. Instagram, described by the NSPCC recently as the platform most used for grooming, has developed a new tool aimed at blocking the sending of sexual images to children – albeit, notably, authorities will not be alerted about those sending the material.

Alerting the authorities would involve so-called client-side scanning, which Meta believes undermines the chief privacy-protecting feature of encryption: that only the sender and recipient know the contents of messages. Meta has said it reports all apparent instances of child exploitation appearing on its site, from anywhere in the world, to NCMEC.

One compromise on the use of AI to detect offenders, Lay suggests, is simple: ensure it can be used only under the strict licence of child protection professionals, with appropriate controls in place. It is not “a silver bullet”, he explained to me. AI-based identification will always need to be followed up by old-fashioned police work, but anything that can “achieve in 15 seconds what we used to spend hours and hours trying to get” is worthy of careful consideration, he believes.

The Ecuador operation, combining AI with traditional police work, had an immediate impact in March. ICMEC reports that it led to a total of 115 victims (mainly girls, mostly aged six to 12 or 13 to 15) and 37 offenders (mainly adult men) being positively identified worldwide. Within three weeks, ICMEC said, 18 international interventions had taken place, with 45 victims rescued and seven abusers arrested.

An infographic showing the results of an investigation into online sexual abuse
Infographic showing the results of the ten-country taskforce investigation. Edinburgh University

One way or another, a compromise needs to be struck to deal with this pandemic.

Child sexual abuse is a global public health crisis that is steadily worsening thanks to advancing technologies which enable instantaneous production and limitless distribution of child exploitation material, as well as unregulated access to children online.

These are the words of Tasmanian Grace Tame: a remarkable survivor of childhood abuse and executive director of the Grace Tame Foundation, which works to combat the sexual abuse of children.

“Like countless child sexual abuse victim-survivors, my life was completely upended by the lasting impacts of trauma, shame, public humiliation, ignorance and stigma. I moved overseas at 18 because I became a pariah in my hometown, didn’t pursue tertiary education as hoped, misused alcohol and drugs, self-harmed, and worked several minimum wage jobs”. Tame believes that “a centralised global research database is essential to safeguarding children”.

If the internet and technology brought us to where we are today, the AI used in Quito to help save 45 children is a compelling demonstration of the power of technology for good. Moreover, the work of the ten-country taskforce is testament to the potential of global responses to a global problem on an internet that knows no national boundaries.

Greater collaboration, education, and in some cases regulation and legislation can all help, and they are needed without delay because, as Childlight’s mantra goes, children can’t wait.



To hear about new Insights articles, join the hundreds of thousands of people who value The Conversation’s evidence-based news. Subscribe to our newsletter.


Deborah Fry receives funding from the Human Dignity Foundation.

This article was originally published on The Conversation. Read the original article.
