The Street
Colette Bennett

How a major tech trend is being used to protect your kids from predators

In an age when adolescents and teenagers can easily meet someone dangerous online, parents and regulators alike are constantly searching for new ways to protect young people from ill-intentioned strangers.

While improving these practices is very much a work in progress, one new tool that some are turning to is a form of tech many already have mixed feelings about: AI.


The National Center for Missing & Exploited Children, which processed more than 21,000 missing-child cases in 2023, recently told Fox Business it saved more than 4,000 hours of review and investigation time by using artificial-intelligence technology made by eDiscovery firm Reveal.

"In the past most of our legal team would need to drop everything to complete the document review/redaction/production manually and meet the delivery deadline just a few weeks later, as required by the court so the criminal case can move forward and not risk any sanctions, or worse, to risk any possibility of the case being dismissed," Gavin Portnoy, vice president of the National Center’s strategic advancement and partnerships division, said in a statement.

The National Center's office of legal counsel "instead saved hundreds of lawyer hours by utilizing Reveal's Logikcull technology to quickly and efficiently handle it."

The National Center also said it chose to keep the AI tool's functions separate from any child sexual abuse material.

While Reveal was created to help process large amounts of legal data, founder and CEO Wendell Jisa told Fox Business he finds its use in catching predators all the more compelling because the technology "was not created to do that."

How AI aids the fight against child abuse

By age 18, one in five girls and one in 13 boys globally have experienced abuse and/or sexual exploitation, with many incidents taking place via online interactions, according to Unicef. Of the roughly 1.8 billion photos uploaded to the internet each day, around 720,000 are believed to be illegal images of children.

The number of predators to identify and catch is enormous. But the key to AI's power in this arena is its ability to process large quantities of data at a speed that would be impossible for a team of human investigators.

And the National Center is not alone in its efforts. In 2020, the United Nations Interregional Crime and Justice Research Institute's Centre for AI and Robotics partnered with the Ministry of Interior of the United Arab Emirates to launch the AI for Safer Children Initiative.

The research institute's AI for Safer Children Global Hub offers a centralized platform for law-enforcement agencies, including a catalog of AI tools, a learning center, and a networking feature that lets officers connect with one another, in hopes of encouraging wider use of the technology.

As hopeful as such news is, many will immediately think of the potential pitfalls. Amid all the buzz about AI in 2023, many errors have been reported, such as Google's AI assistant Bard making a factual error during its first demo.

The message is clear: AI is still in its infancy in many ways, and it needs work. But even in its current stage, it can still do tremendous good.

