ABC News

Australia's eSafety Commissioner issues legal notices to Apple, Meta and Microsoft

Australian authorities have issued leading online platforms with legal documents ordering them to disclose how they are addressing the proliferation of child sexual exploitation material.

Australia's eSafety Commissioner has issued legal notices to Apple, Meta (the parent company of Facebook and WhatsApp), Microsoft, Snap and Omegle under the Australian government's new Basic Online Safety Expectations.

eSafety Commissioner Julie Inman Grant said the scheme was a world-leading tool designed to encourage fundamental online safety practices and drive transparency and accountability.

"They will help us lift the hood on what companies are doing — and are not doing — to protect their users from harm," she said. 

"Some of the most harmful material online today involves the sexual exploitation of children and, frighteningly, this activity is no longer confined to hidden corners of the dark web but is prevalent on the mainstream platforms we and our children use every day." 

Companies that do not respond to the notices within 28 days could face financial penalties of up to $555,000 a day.

Ms Inman Grant said it was feared child exploitation material would spread unchecked as more companies moved towards encrypted messaging services and live streaming. 

"Child sexual exploitation material that is reported now is just the tip of the iceberg," she said. 

"Online child sexual abuse that isn't being detected and remediated continues to be a huge concern."

The notices were issued based on the number of complaints eSafety had received, the reach of each service, and how little information was publicly available about a company's safety actions or interventions on its services.

eSafety plans to issue further notices to additional providers to build a comprehensive picture of online safety measures across a wide range of services.

Ms Inman Grant said eSafety has handled more than 61,000 complaints about illegal and restricted content since 2015, with the majority involving child sexual exploitation material. 

"We have seen a surge in reports about this horrific material since the start of the pandemic, as technology was weaponised to abuse children," she said. 

"The harm experienced by victim-survivors is perpetuated when platforms and services fail to detect and remove the content."

She said tools were available to stop such material being found and recirculated, but many tech companies published little information about where or how those tools were deployed.

"Industry must be up-front on the steps they are taking so that we can get the full picture of online harms occurring and collectively focus on the real challenges before all of us," Ms Inman Grant said. 

"We all have a responsibility to keep children free from online exploitation and abuse."
