Apple is set to scan iPhones and iCloud accounts for child sexual abuse content.
The new system, known in the US as neuralMatch, is designed to match images and videos, including edited or near-identical copies, against an existing database of known material provided by child protection agencies.
Once a match is flagged to human operators, the pictures or videos will be reviewed and, if confirmed, reported to the relevant authorities.
Apple says that the feature has "extremely high levels of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account".
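Apple has not published the internals of its matching system, so as an illustrative sketch only, the much simpler "average hash" technique below shows the general idea of perceptual matching: an image is reduced to a short fingerprint, and edited copies, whose fingerprints differ by only a few bits, can still be matched against a database of known hashes. All function names and the threshold value here are hypothetical, not Apple's.

```python
# Illustrative sketch only: Apple's actual hashing system is proprietary.
# This uses a basic "average hash" to show how perceptual matching can
# flag edited copies of known images, not just exact duplicates.

def average_hash(pixels):
    """Hash an 8x8 grayscale image: one bit per pixel, set if the pixel
    is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def matches_database(pixels, known_hashes, threshold=5):
    """Flag the image if its hash is close to any known hash. A small
    edit changes only a few bits, so an exact match is not required."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

In a deployment like the one the article describes, a flagged match would not go straight to authorities; it would first be routed to human review, which is how a system can keep the false-positive rate Apple cites so low.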
Despite the good intentions, some worry that monitoring users' pictures and videos amounts to an invasion of privacy.
The system will be rolled out in the US first; Apple says the new scanning technology will be included in the latest versions of its iPhone and iPad operating systems, released later this year.
In a post on its website, the technology company said: "Our goal is to create technology that empowers people and enriches their lives while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).
"First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
"Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
"Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics."
Matthew Green, a security researcher at Johns Hopkins University, said: "Regardless of what Apple's long-term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content."