InnovationAus
Joseph Brookes

Online industry reveals plan to regulate internet safety

Associations representing companies like Meta, Google and Telstra have revealed industry codes of conduct they say will make the internet safer, just days after being threatened with legal action by the online regulator.

The codes, which have been developed through industry consultation over a year, cover almost all online areas, from manufacturing and servicing internet connected devices to internet service providers, social media, and encrypted messaging.

The codes mandate that providers alert authorities about the sharing of the most dangerous content when they believe it poses a serious and immediate threat, and that they have systems in place to refer reports to authorities when such content is shared through encrypted communications.

A “safety feature” in the code will require providers of encrypted messaging services to make users register for the service using a phone number, email address or other identifier.

Companies that don’t adhere to the codes face penalties and content takedowns from the Australian online regulator, which was handed greater investigative and enforcement powers by the Parliament last year.

Credit: Twin Design / Shutterstock.com

The eSafety Commissioner last year directed Australia’s online industry to develop codes to deal with the online treatment of Class 1 and Class 2 material – internet content which would either be refused classification or considered R18+.

This includes things like online material depicting or promoting child sexual exploitation, terrorism, extreme crime and violence, crime and violence, drug use, and pornography.

The power for the Commissioner to do so, and then investigate and enforce adherence to the codes, comes from Australia’s Online Safety Act, which was passed last year with bipartisan support.

On Thursday, groups including DIGI, which represents Big Tech and social media companies, the Australian Mobile Telecommunications Association, and the video games sector’s Interactive Games and Entertainment Association released their draft codes for the most harmful Class 1 material.

The eight codes cover various online services like social media, gaming, search, messaging and app distribution, as well as internet carriage services and the manufacturing and supply of “any equipment that connects to the internet”.

Under the banner of onlinesafety.org.au, the six industry associations are now publicly consulting on the codes, which will then be submitted to the eSafety Commissioner for registration.

The draft codes include minimum compliance levels. For example, under the social media services code, all social media companies must report child sexual exploitation material (CSEM) or pro-terror material they believe is a serious and immediate threat to the safety of Australians to an appropriate entity within 24 hours or “as soon as reasonably practicable”.

Under the relevant electronic services code, which includes messaging and gaming, any company deemed an enterprise relevant electronic service – one that provides the service to a wide range of users – must have agreements with users regarding the distribution of material.

Closed communication and encrypted messaging services must require a user to register for the service using a phone number, email address or “other identifier”.

Under this code, providers of large and encrypted services must notify authorities about child exploitation material and pro-terror material when it is identified on their services, also within 24 hours or “as soon as reasonably practicable”.

All internet service providers will need to ensure children are unable to obtain an internet carriage service without parental or guardian consent, tell their customers not to produce online material that breaks Australian laws, and engage with the eSafety Commissioner on developing a protocol for blocking child sexual exploitation material, according to the relevant code.

Spokespeople for the groups said the industry codes will improve the safety of the online environment.

“The draft codes strengthen the safeguards across the online industry to help protect Australians from certain harmful content, and also make existing protections more consistent and transparent,” DIGI managing director Sunita Bose said.

Fellow tech group Communications Alliance’s chief executive John Stanton added: “The new codes are there to protect all Australians. It is important that we hear from many different stakeholders and users of the internet whether the Draft Codes work for them.”

The industry groups behind the codes are the Australian Mobile Telecommunications Association (AMTA), BSA The Software Alliance (BSA), Communications Alliance (CA), Consumer Electronics Suppliers Association (CESA), Digital Industry Group Inc. (DIGI) and Interactive Games and Entertainment Association (IGEA).

Public consultations on their draft codes are open until October 2. Further work on another code covering Class 2 adult content is set to commence as the next stage once the Class 1 code is registered.

On Tuesday, Australia’s eSafety Commissioner Julie Inman Grant ordered the tech giants behind some of the biggest social media and messaging apps to detail what measures they are taking to tackle child exploitation material. She did so by issuing legal notices under the new Basic Online Safety Expectations, a key part of the Online Safety Act 2021.

Correction: codes for Class 2 material will not necessarily be released this year, as an earlier version of this article stated.
