The Guardian - AU
Business
Paul Karp

Digital code of conduct fails to stop all harms of misinformation, Acma warns

The Morrison government has released Acma’s June 2021 report on the digital platforms’ misinformation and disinformation code. Photograph: Dado Ruvić/Reuters

The code of conduct adopted by digital platforms, including Facebook and Google, is “too narrow” to prevent all the harms of misinformation and disinformation, Australia’s media regulator has warned.

The requirement that harm from social media posts must be both “serious” and “imminent” before tech companies take action has allowed longer-term “chronic harms”, including vaccine misinformation and the erosion of democracy, to persist, according to the Australian Communications and Media Authority.

The Morrison government released Acma’s June 2021 report on the misinformation and disinformation code on Monday, promising to boost the regulator’s power to demand information from digital platforms and to give it reserve powers to create new rules for the industry.

Labor accused the government of promising the new powers in the “dying days of the 46th parliament”.

The code was drawn up by the Digital Industry Group Inc after the digital platforms inquiry in 2019. It is a form of self-regulation adopted in February 2021 by Google, Facebook, Microsoft, Twitter, TikTok, Redbubble, Apple and Adobe.

Acma found that 82% of Australians report having seen Covid-19 misinformation over the past 18 months, warning that “falsehoods and conspiracies” online had undermined Australia’s public health response. Some 22% reported seeing “a lot” of misinformation online, with younger Australians most at risk.

Misinformation was most common on larger digital platforms, including Facebook and Twitter, but “smaller private messaging apps and alternative social media services are also increasingly used to spread misinformation or conspiracies due to their less restrictive content moderation policies”, it said.

Acma said misinformation “typically spreads via highly emotive and engaging posts within small online conspiracy groups” which were then “amplified” by local figures.

The celebrity chef Pete Evans and the prominent anti-vaccine campaigner Taylor Winterstein topped the list of “influencers sharing misinformation narratives”, according to research commissioned from We Are Social.

They have denied sharing misinformation, but Evans has been removed from Facebook and Instagram. The United Australia Party MP Craig Kelly, who was removed from Facebook for promoting unproven Covid treatments, also featured on the list. Kelly denies sharing misinformation and has accused the social media platforms of interfering with his duties as an MP because he was unable to communicate with constituents through them.

Acma said it was appropriate for digital platforms to apply a threshold that misinformation must be reasonably likely to cause “serious” harm before they censor posts.

But the requirement that misinformation must also be “imminent” allows a narrow interpretation that “would likely exclude a range of chronic harms that can result from the cumulative effect of misinformation over time, such as reductions in community cohesion and a lessening of trust in public institutions”.

It cited the 2021 Capitol riot in the US as “an example of the impact of longer-term chronic harms arising from the widespread belief in misinformation, and how this can spill over to the real world as incitement to commit violent acts”.

Digi dead-batted Acma’s call to remove the requirement that harm be “imminent” from the code, promising only to consider the recommendation when it reviews the code this year.

“It is important to note that the code’s current approach does not preclude action on what might be described as chronic harms, and we’ve certainly seen signatories report action on these in their transparency reports,” a Digi spokesperson said.

Acma asked for “formal information-gathering powers … to oversee digital platforms, including the ability to request Australia-specific data on the effectiveness of measures to address disinformation and misinformation” and “reserve powers” to introduce binding rules and codes of conduct.

The communications minister, Paul Fletcher, agreed to those requests, arguing the latter would encourage the platforms to be “more ambitious” when revising the voluntary code.

“Acma’s report highlights that disinformation and misinformation are significant and ongoing issues,” Fletcher said. “Digital platforms must take responsibility for what is on their sites and take action when harmful or misleading content appears.”

Labor’s shadow communications minister, Michelle Rowland, and assistant minister, Tim Watts, said the government had failed to empower Acma “to act on misinformation and disinformation, despite evidence of it circulating online during the Black Summer bushfires, the Covid-19 pandemic and around elections”.

Digi agreed “in principle” to the recommendations, including the introduction of “reserve powers”. The industry body called for Acma’s role to include an “appeals mechanism in the event of disagreements in the final outcomes of complaints raised through Digi’s complaints portal”.

Acma also called for private messaging services to be included within the scope of the code because they are “known vectors of disinformation and misinformation”.
