Glasgow Live
Sophie Buchan

Apple to scan iPhone and iCloud accounts for images and videos of child abuse

Apple is set to scan iPhones and iCloud accounts for child sexual abuse content.

The new system, named neuralMatch in the US, is designed to detect identical or edited copies of images and videos by comparing them against an existing database provided by child protection agencies.

Once a match is flagged, human operators will review the pictures or videos and, if the match is confirmed, hand them over to the relevant authorities.
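In broad terms, that matching-and-review flow works like a fingerprint lookup: each image is reduced to a fingerprint, compared against a database of fingerprints for known material, and only matches are escalated to human reviewers. The sketch below is purely illustrative and is not Apple's implementation; the database contents and function names are assumptions, and a real system such as neuralMatch is reported to use a perceptual hash so that edited copies still match, whereas a plain SHA-256 exact match is used here only to keep the example self-contained.

```python
import hashlib

# Hypothetical database of fingerprints for known abuse images, standing in
# for the hash database child protection agencies would supply.
# The values below are made-up placeholders, not real data.
KNOWN_FINGERPRINTS = {
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
}


def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image.

    A real system would use a perceptual hash so that resized or lightly
    edited copies still match; SHA-256 is used here only for brevity.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def should_flag_for_review(image_bytes: bytes) -> bool:
    """Flag an image for human review if its fingerprint matches the database."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS


if __name__ == "__main__":
    sample = b"example image bytes"
    print(should_flag_for_review(sample))  # False: not in the placeholder set
```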

Apple says that the feature has "extremely high levels of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account".

Despite the good intentions, some worry that monitoring users' pictures and videos will amount to an invasion of privacy.

The system will be rolled out in the US first, but Apple says the latest versions of its iPhone and iPad operating systems, due for release later this year, will include the new scanning technology.

The feature will be rolled out to users later this year. (Apple)

In a post on its website, the technology company said: "Our goal is to create technology that empowers people and enriches their lives while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).

"First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

"Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

"Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics."

Siri will provide resources and help around searches related to CSAM. (Apple)

Matthew Green, a security researcher at Johns Hopkins University, said: "Regardless of what Apple's long-term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content."
