InnovationAus
Justin Hendry

Apple warns against mass file scanning proposal

Online safety standards designed to force tech giants to scan cloud storage for illegal and harmful content in Australia set a “dangerous global precedent” that could undermine security protections and lead to mass surveillance, according to Apple.

The tech giant made the comments in a submission to the eSafety Commissioner, which is developing binding standards for designated internet services (DIS) and relevant electronic services (RES).

The standards, which will carry fines of almost $700,000 a day, are a last resort for the eSafety Commissioner, who last year rejected two codes developed by industry to cover DIS and RES services for failing to meet minimum expectations.

The standards will require digital platforms to scan apps, websites, file and photo storage services like Apple iCloud and Google Drive, email services and some partially encrypted messaging services for child sexual abuse material (CSAM), according to a consultation paper released in November.

Image: ArliftAtoz2205 / Shutterstock.com

But Apple, which scrapped plans to automatically flag CSAM in 2022, has said that while it “shares concerns around the proliferation of abhorrent CSAM and pro-terror content”, the draft standards “risk undermining fundamental privacy and security protections” for its billions of users.

“We have serious concerns that the draft standards pose grave risks to the privacy and security of our users and set a dangerous global precedent,” it said in the submission, first reported by the Guardian.

Much of the concern stems from what Apple says is a requirement to “build backdoors into end-to-end encrypted services to monitor data”, despite eSafety explicitly stating it will “not require service providers to design systematic vulnerabilities or weaknesses”.

“That sentiment… is not explicitly stated anywhere in the actual draft RES and DIS standards,” Apple said, adding that this stands in “stark contrast to the clear protections for end-to-end encryption… in prior industry codes”.

The company has recommended that a “clear and consistent approach expressly supporting end-to-end encryption” be adopted by eSafety to prevent “uncertainty and confusion or potential inconsistency across codes and standards”.

It also said that the definition of what might be “technically feasible” is too narrow and a departure from other similar regimes overseas, such as that in the United Kingdom, as well as Australia’s Assistance and Access Act.

“We urge eSafety to revise the definition of technical feasibility so that it more closely aligns with other approaches in and outside of Australia that explicitly take into account risks to user security and privacy, among other factors,” Apple said.

Apple also said that scanning technology “opens the door for bulk surveillance of communications and storage systems” and that these capabilities would “inevitably expand to other content types” at a later date.

“A tool for one type of surveillance can be reconfigured to surveil for other content, such as that related to a person’s familial, political, professional, religious, health, sexual, reproductive, or other activities.

“Tools of mass surveillance have widespread negative implications for freedom of opinion and expression and, by extension, democracy as a whole.”

Apple is also concerned that its global reach means that if it designs the technology for one government, it could “spur other countries to follow suit”, including those that “lack the robust legal protections”.

“Surveillance ordered by governments unconstrained by civil rights and civil liberties protections harms the internationally recognised and fundamental human rights of people whose data would be collected by network scanning,” it said.

“If such governments know that service providers have put into place scanning systems pursuant to mandates from the Australian government, they will seek to use those systems for their own purposes: if we are forced to build it, they will come.”

Requiring service providers to “indiscriminately search vast streams of protected communications and large stores of personal information” could similarly “upend the balance struck by years of Australian jurisprudence, transforming providers into agents of the government”.

“Such a fundamental shift should be the subject of an Act of Parliament, not subordinate legislation, with appropriate protection for providers of services required to undertake such actions.”

Apple said that its current approach to CSAM seeks to tackle it “without raising the serious privacy and security concerns that come with compelled surveillance of each and every Australian user’s privately stored personal information and private communications”.

“Apple will continue investing in technologies to protect our users because it is the right thing to do, and we urge eSafety to preserve the strength of those technologies and the protection that they offer users,” it said.

In a statement responding to Apple’s submission, eSafety said the “draft standards do not require providers to indiscriminately search protected communications or personal information or seize content”, and do not “break encryption nor introduce systemic vulnerabilities”.

“The use of hash-matching or other highly specific tools to detect this highly harmful material (which is illegal in most jurisdictions) does not set a dangerous global precedent,” eSafety said.

“In fact, US tech companies are required by law to report child sexual abuse material on their platforms to NCMEC, and European policymakers have just affirmed that technology companies can scan for child sexual abuse material through to April 2026.

“Tech companies control the technology they deploy and the implementation of those technologies. These companies can clearly indicate in their policies that scanning is confined to known CSAM and regularly resist attempts by undemocratic governments to use tools for broad surveillance.

“eSafety has received close to 50 written submissions on the draft standards, is carefully considering these submissions, and will publish them on our website later this month.”
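
For context, the hash-matching eSafety describes works by comparing a compact fingerprint of a file against a database of fingerprints of already-verified material, rather than interpreting the file’s contents. The Python sketch below is a simplified illustration only: the known-hash set is a hypothetical placeholder, and production systems typically use perceptual hashes such as Microsoft’s PhotoDNA, which survive resizing and re-encoding, rather than the exact-match SHA-256 shown here.

```python
import hashlib

# Hypothetical placeholder set of hex digests of known, verified material.
# Real deployments source hash lists from organisations such as NCMEC and
# use perceptual hashes (e.g. PhotoDNA), not plain cryptographic digests.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of_file(path: str) -> str:
    """Hash a file in chunks so large files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_material(path: str) -> bool:
    """True only on an exact match against the known-hash set; the file
    is never inspected beyond computing its digest."""
    return sha256_of_file(path) in KNOWN_HASHES
```

The dispute in the article maps directly onto this pipeline: eSafety argues it is “highly specific” because a match can only fire on content already in the database, while Apple’s counterargument is that the same pipeline works unchanged if other content types are later added to the hash set.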

In 2022, Apple made just 234 reports of CSAM in the United States, according to the National Center for Missing and Exploited Children. In comparison, Facebook (Meta) made 21,165,208 reports, Instagram (Meta) made 5,007,902 reports, and Google made 2,174,548 reports.

Update: February 23, 2024

This article has been updated to include a statement from the eSafety Commissioner that was received after publication.
