Federal Court issues fine for deepfake pornography

In a first-of-its-kind case, the Federal Court has ordered a man to pay AUD $343,500 for creating and publishing deepfake pornographic images of prominent Australian women without their consent.

This case marks a major step in how the law deals with image-based abuse, AI-generated content, and non-consensual pornography in Australia. Here’s what it means, what the law says now, and what to watch out for.

What Happened: The Case in a Nutshell

  • The individual, Anthony Rotondo, posted 12 deepfake images of six women to a now-defunct website between November 2022 and October 2023.

  • These images were sexually explicit, showing nudity or simulated sexual acts, created without the women’s permission.

  • The eSafety Commissioner initiated a civil action under the Online Safety Act, alleging multiple breaches.

  • The court found Rotondo in breach on 14 occasions and ordered the fine plus the eSafety Commissioner’s legal costs.

  • Privacy orders protect the identities of the women involved.

Justice Longbottom described the actions as “serious, deliberate and sustained.”

One victim impact statement stood out: even though the images were fake, the woman said she felt violated, vulnerable and without control.

Legal Context: Why This Case Matters

This ruling is significant for several reasons:

  1. First major penalty
    This is one of the first cases in Australia in which deepfake pornography has resulted in a substantial financial penalty.

  2. Affirming non-consensual deepfakes as serious harm
    The court accepted that creating and distributing these images causes real psychological, emotional, and reputational harm — even if the images are digitally manipulated and not “real”.

  3. Online Safety Act in action
    The case uses the Online Safety Act as a vehicle to hold creators of non-consensual deepfake content accountable.

  4. Deterrent and legal signal
    The size of the penalty and court commentary send a strong message that misuse of AI for non-consensual intimate content will not be ignored by the courts.

  5. Role of expert evidence
    In this case, expert testimony (for example from Prof Clare McGlynn) helped the court understand the nature of image-based abuse and the equivalence of harm between real and synthetic images.

What the Law Currently Allows — and Gaps to Watch

  • Under the Online Safety Act, the eSafety Commissioner can bring civil actions against someone who shares or fails to remove non-consensual intimate or sexually explicit images.

  • In 2024, Parliament passed new federal criminal laws specifically targeting deepfake sexual imagery (creation, sharing) to strengthen enforcement (The Guardian).

  • However, practical challenges remain:

    • Tracing perpetrators who operate across borders or use anonymising tools

    • Enforcing removal orders when websites are offshore

    • Ensuring victims can obtain relief quickly and with minimal retraumatisation

    • Bridging gaps between civil and criminal liability

What This Means for Individuals & Victims

  • If your image has been manipulated or published without consent, you may be able to seek redress under the Online Safety Act by making a complaint to eSafety or, where the circumstances support it, by pursuing civil action.

  • Courts are increasingly recognising that “fake” intimate images can inflict harms equivalent to real ones, so they will not shrug off claims just because the content was AI-generated.

  • Expert evidence (psychology, digital forensics, gender studies) may play a crucial role in proving harm and causation.

  • Timing and discretion matter: swift action may help with takedowns and reduce further distribution.

What We’re Watching

  • Will more courts follow suit and impose similar (or higher) penalties in deepfake cases?

  • How will the new criminal offence provisions be used in practice?

  • Will legislative or regulatory measures close gaps in cross-border enforcement?

  • Will technology platforms be required to more aggressively monitor, block or remove non-consensual AI-generated content?

This OYBlog was created using AI assistance, based on reporting from the ABC.
