
NSW is stepping up its response to technology-driven abuse, with the state government today introducing laws that make it a crime to create or share sexually explicit deepfakes. Deepfakes are fake images or audio of real people, generated with artificial intelligence to look, and sometimes sound, convincingly real.
The eSafety Commissioner reported in 2024 that explicit deepfakes online had increased by as much as 550 per cent year on year since 2019.
Under the changes to the Crimes Act 1900, making or distributing a deepfake designed to depict an identifiable person in a sexual way could see offenders jailed for up to three years. The same penalty applies to threatening to distribute such images or audio, not just to actually doing so.
The updated offences go further than before, also banning AI-generated sexually explicit audio, whether it replicates a real person's voice or is created entirely by a machine.

Until now, NSW offences centred on sharing real intimate images without consent, and the law didn't directly address fake, AI-generated content. With this change, convincingly depicting a real person in sexually explicit material and sharing it without their permission can land people in court.
Attorney General Michael Daley put it simply in today’s media release: “The NSW Government recognises the law needs to keep up with technology and we are moving to better protect people, particularly young women, from image-based abuse. This bill closes a gap in NSW legislation that leaves women vulnerable to AI-generated sexual exploitation.”
Women’s safety advocates have welcomed the law. According to NSW Women’s Safety Commissioner Hannah Tonkin, “Rapid developments in AI have made it easy to create extremely life-like, sexually explicit depictions of real people. These images are humiliating and degrading in themselves, and they can be shared widely and used for intimidation or extortion.”
Full Stop Australia, a frontline support service, highlights that the changes match what victim-survivors have been saying for some time.
CEO Karen Bevan said, “The new law directly acknowledges the serious impacts that production and distribution of this non-consensual material have on victim-survivors.”
NSW is now aligning its laws with those already in place in other parts of Australia, targeting non-consensual, AI-driven sexually explicit content involving adults.
Lead image: iStock
Help is available.
If you’re in distress, please call Lifeline on 13 11 14 or chat online. If it’s an emergency, please call 000.
Under 25? You can reach the Kids Helpline at 1800 55 1800 or chat online.
The post Creating And Sharing Sexually Explicit Deepfakes Is Now Illegal Under New NSW Laws appeared first on PEDESTRIAN.TV.