
In a major move to highlight the seriousness of deepfake images and their significant impacts on victims, Australia’s online safety regulator is seeking a $450,000 maximum penalty for a man accused of posting images of Aussie women to a deepfake pornography website.
In the first case of its kind heard in an Australian court, the eSafety commissioner has launched proceedings against Anthony Rotondo over his failure to remove deepfake “intimate images” of a number of prominent Australian women that he uploaded to the site.
The names of the women have been kept confidential.

An eSafety spokesperson said the regulator was seeking between $400,000 and $450,000 for the breaches of the Online Safety Act.
They explained the penalty submission stemmed from its view “of the seriousness of the breaches and the significant impacts on the women targeted”.
“Importantly, the penalty will deter others from engaging in such harmful conduct,” they told PEDESTRIAN.TV in a statement.
“eSafety remains deeply concerned by the non-consensual creation and sharing of explicit deepfake images which can cause significant psychological and emotional distress.”
The penalties hearing was held on Monday. The court has reserved its decision for a later date.
As reported by The Guardian, the court heard Rotondo initially refused to comply with the order to remove the images while he was based in the Philippines, but the commissioner launched the case once he returned to Australia.
Back in December 2023, he admitted he continued to publish similar images even after he was ordered by a court not to, and was fined $25,000 for contempt of court. He later shared his password so the deepfake images could be removed from the deepfake pornography site.
The site in question, MrDeepFakes, was a notorious AI-generated pornography site known for allowing users to upload explicit, digitally altered and nonconsensual content, particularly of celebrities. It was recently shut down.

Last year, the Australian federal government introduced new laws that ban the sharing of non-consensual deepfake sexually explicit material. This made it a serious crime to share fake sexual images or videos of someone without their consent, including if made using AI.
Under this legislation, people who share this kind of material can face up to six years in prison, while those also responsible for its creation could be looking at a higher penalty for the aggravated offence, of up to seven years.
In her opening statement to the Senate committee reviewing the bill in July last year, the eSafety commissioner, Julie Inman Grant, said deepfakes on the internet had increased by a considerable 550 per cent since 2019.
Pornographic videos made up a whopping 99 per cent of the deepfake material online, with the majority of that imagery consisting of women and girls.
“Deepfake image based abuse is not only becoming more prevalent but is also very gendered and incredibly distressing to the victim-survivor,” Inman Grant said, per The Guardian.
“Shockingly, thousands of open-source AI apps like these have proliferated online and are often free and easy to use by anyone with a smartphone.
“So these apps make it simple and cost-free for the perpetrator, while the cost to the target is one of lingering and incalculable devastation.”
In 2023–24, the eSafety Commissioner received over 7,270 reports about image-based abuse, including deepfakes. It issued removal requests for content hosted across more than 947 locations on 191 different platforms and services.