Daily Mirror
National
Kieren Williams

iPhone will alert parents if their children have explicit images or sex texts

iPhones will send sexting warnings to parents if their children send or receive explicit images, Apple has announced.

On top of this, the tech giant has said that it will automatically report child abuse images on its devices to the authorities.

In an effort to protect young people online, and prevent the spread of child abuse material, the company is unveiling a trio of new safety tools.

The tool, known as neuralMatch, will initially only be available in the US, but Apple plans to make it available in the UK and other countries soon.

New Messages safeguards will show a warning to a child when they are sent sexually explicit images.

Apple will introduce new safety measures to help protect children from child sex abuse materials (Getty Images)

It will automatically blur the image, reassure them that it's okay not to view the picture, and present them with helpful resources.

Under the plans, parents using linked family accounts will also be warned of sexting, and if children do choose to view the image, parents will be sent a notification.

Similar protections will be in place if a child attempts to send a sexually explicit image.

In addition to this, the new safety measures will allow the company to detect known child sex abuse materials stored in iCloud and report them to law enforcement agencies.

This will be combined with new guidance in Siri and Search to direct users to helpful resources when they make searches related to child sexual abuse images.

Apple has said the new detection tools are designed to protect user privacy and do not allow the company to scan or see a user's photo album.

Instead, the system looks for matches against a database of 'hashes' - a type of digital fingerprint - of known child sex abuse images provided by child safety organisations.

This means the system doesn't see the images themselves, just their digital fingerprints, and parents taking pictures of their children in the bath, for example, won't be flagged.
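The matching idea the article describes can be sketched in a few lines of Python. This is an illustration only: Apple's actual system uses a perceptual hash (NeuralHash) rather than a cryptographic one, and the matching involves on-device cryptography not shown here. The sample image bytes and database below are entirely hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a digital fingerprint (hash) of an image's raw bytes.

    Illustrative only: a cryptographic SHA-256 stands in for Apple's
    perceptual NeuralHash, which is not publicly available.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abuse images,
# supplied by child safety organisations.
KNOWN_HASHES = {fingerprint(b"example flagged image bytes")}

def is_known_match(image_bytes: bytes) -> bool:
    """Compare a photo's fingerprint against the database.

    The image content itself is never inspected, only its hash.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES

# An ordinary family photo produces a different fingerprint,
# so it is not flagged.
print(is_known_match(b"family holiday photo"))  # False
```

The point of the sketch is that only fingerprints are compared, which is why an unrelated photo, such as a child in the bath, produces no match.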

The tools will automatically scan users' phones for child sex abuse images and could report matches to the authorities (Getty Images)

If child sex abuse materials are confirmed, the user’s account will be disabled and the US National Center for Missing and Exploited Children will be notified.

This scanning will only take place when users try to upload an image to their iCloud photo library.

If a threshold of matches for harmful content is exceeded, Apple can manually review the content to confirm the match and send a report to safety organisations.
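The threshold step works roughly as follows: no single match triggers a report; only an accumulation of matches does. A minimal sketch, with a made-up threshold value since the article does not state one:

```python
# Hypothetical threshold; Apple has not confirmed a figure here.
MATCH_THRESHOLD = 30

def needs_manual_review(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Surface an account for human review only once the number of
    matched uploads exceeds the threshold, so that a single stray
    match does not trigger a report on its own."""
    return match_count > threshold

print(needs_manual_review(5))   # False: too few matches to act on
print(needs_manual_review(31))  # True: account queued for manual review
```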

The new tools will come into force as part of the iOS and iPadOS 15 software update due in the autumn.

The announcement is the latest in a series of major updates from the iPhone maker aimed at improving user safety.

Earlier ones include cutting down on third-party data collection.

There are some concerns the new system could be abused, by governments demanding data from Apple, or by people framing others with seemingly innocent pictures that trigger the digital matches.
