The Guardian - AU
Business
Josh Taylor

Georgie Purcell photoshop scandal shows why transparency is crucial when it comes to AI

Animal Justice Party MP Georgie Purcell had her photo edited to enlarge her breasts and turn her top into a crop top that wasn’t in the original. Photograph: Diego Fedele/AAP

It’s been five years since Australia’s last Photoshop scandal, involving then-prime minister Scott Morrison’s white shoes, but it feels like a world away.

This week the Animal Justice Party MP Georgie Purcell had her photo edited to enlarge her breasts and turn her top into a crop top that hadn’t been there in the original. Having previously been a victim of image-based abuse, Purcell said the incident felt violating, and that the explanation given by Nine News failed to address the issue.

For its part, Nine blamed an “automation” tool in Photoshop – the recently launched “generative fill”, which, as the name suggests, uses artificial intelligence to fill in the blanks of an image when it is resized. Nine said it had been working from an already-cropped version of the original image, and used the tool to expand beyond the image’s existing borders. But whoever altered the image presumably still exported the modified version without considering the impact of the changes.
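Mechanically, the resizing step Nine describes is simple. A minimal sketch in Python (using the Pillow library purely for illustration – Adobe’s actual pipeline is proprietary, and the file name and dimensions here are hypothetical) shows the canvas expansion that creates the “blanks” a generative model is then asked to fill:

```python
# Illustrative sketch only: the canvas-expansion step that precedes
# generative fill. Pillow handles the resize; the blank margins are
# where a generative model (not shown - Adobe's is proprietary)
# would synthesise new pixels.
from PIL import Image


def expand_canvas(path: str, new_width: int, new_height: int) -> Image.Image:
    """Paste an image onto a larger blank canvas, centred.

    Everything outside the original pixels is left empty - the region
    an AI outpainting tool would be asked to invent when a cropped
    photo is resized to fit a broadcast frame.
    """
    original = Image.open(path)
    canvas = Image.new("RGB", (new_width, new_height), color=(0, 0, 0))
    offset = ((new_width - original.width) // 2,
              (new_height - original.height) // 2)
    canvas.paste(original, offset)
    return canvas


# Hypothetical usage: fit a cropped photo into a 16:9 broadcast frame.
# frame = expand_canvas("cropped_photo.jpg", 1920, 1080)
```

The generative step happens after this, automatically and invisibly; unless someone compares the output against the original, nothing flags which pixels were invented.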

The Photoshop blunder feels like a harbinger for a media world that increasingly relies on artificial intelligence, where determining whether something was created by human or machine is ever more murky and AI becomes a convenient scapegoat to explain away mistakes.

The incident also reveals Nine is using AI on images it broadcasts without disclosing the AI manipulation.

In August, Nine’s CEO, Mike Sneesby, said he could “see potential for Nine to use AI to drive meaningful, longer term benefits in content production, operational efficiency and commercialisation throughout the business”.

Adobe’s generative fill tool no doubt offers “operational efficiency”, but should Nine have declared it had started using generative fill and flagged that in images put to air?

Although Nine has apologised and accepted responsibility, the incident appears to breach the (voluntary) Australian AI ethics principles, which advise that people using AI should be identifiable and accountable for the outcomes, and there should be human oversight.

The Media, Entertainment and Arts Alliance journalist code of ethics concurs, stating pictures and sound must be true and accurate, and “manipulation likely to mislead should be disclosed”.

On the tech side of things, the incident raises questions about the dataset Adobe uses to train its AI. Tests conducted by Guardian Australia this week suggested that applying Adobe’s generative fill to images of women would often produce shorter shorts, something Crikey was also able to replicate.

Adobe said in a statement it had trained its model on “diverse image datasets” and continually tested the model to mitigate against “perpetuating harmful stereotypes”. The company said it also relied on user reports of potentially biased outputs to improve its processes.

“This two-way dialogue with the public is critical so that we can work together to continue to make generative AI better for everyone,” the company said.

Part of the mess that AI tools create is not just the fake images, video and audio but the doubt they sow about everything else.

Australia is yet to see a scandal involving a politician claiming an inconvenient audio grab or video is an AI deepfake, but it likely won’t be long.

In the US, the right-wing political operative Roger Stone last month claimed leaked audio of him threatening to kill Democrats was AI-generated. At the same time, an AI-faked version of US president Joe Biden’s voice was being used in robocalls that spread misinformation about the New Hampshire primary.

When you can’t tell what is real and what is AI, suddenly everything is suspect. That means disclosure is crucial, for media companies at the very least, and for tech companies too.

Globally, legislators are still figuring out exactly how to implement guardrails, and progress has been piecemeal. In the United States, legislation has been introduced to criminalise the spread of nonconsensual, sexualised images generated by artificial intelligence, after deepfakes depicting Taylor Swift circulated online last week.

Australia will probably join this ban via codes enforced by the eSafety commissioner, but it too has largely been watching from afar. Last month, the government announced that an “expert panel” would be consulted on the best next steps for high-risk AI.

And some issues will be covered by existing law. Dr Rita Matulionyte, a senior lecturer in law at Macquarie University, has authored a paper on AI and moral rights. She told Guardian Australia that the Copyright Act, for instance, should prevent “derogatory treatment” of copyright works, such as alteration or mutilation by AI, although there were few cases where the argument had succeeded.

Matulionyte said it was also unclear whether such a law would help Purcell, given she was not the photographer, and the manipulation might not be substantial enough.

“If the person in the image was stripped of most/all of the clothes or a background were added that would mutilate the idea behind the picture, then the infringement of the right of integrity would be more likely to succeed,” she said.

In the end, it’s all about transparency.

The government has said it will work with industry to develop a “voluntary code” to label or watermark AI-generated content. Leaving it up to the goodwill of the large companies involved in this technology to do the right thing is clearly not a viable option.
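What labelling might look like in practice: the sketch below (Python with Pillow, purely illustrative – the “ai-disclosure” key is hypothetical, and real provenance schemes such as C2PA content credentials are cryptographically signed rather than plain text) embeds a disclosure note in an image’s metadata.

```python
# Illustrative only: attaching a plain-text AI-disclosure note to a PNG.
# Real provenance standards (e.g. C2PA content credentials) are signed
# and tamper-evident; this just demonstrates the labelling idea.
from PIL import Image
from PIL.PngImagePlugin import PngInfo


def label_as_ai_edited(src: str, dst: str, tool: str) -> None:
    """Save a copy of an image with a metadata note that AI editing occurred.

    dst should be a .png path; PNG text chunks carry the label.
    """
    image = Image.open(src)
    metadata = PngInfo()
    # "ai-disclosure" is a made-up key, not an established standard.
    metadata.add_text("ai-disclosure", f"Edited with generative AI tool: {tool}")
    image.save(dst, pnginfo=metadata)


# Reading the label back out:
# Image.open("labelled.png").text.get("ai-disclosure")
```

A voluntary plain-text tag like this is also trivially strippable, which is part of why relying on goodwill alone falls short.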
