Can I Sue Over Deepfakes or AI-Generated Images of Me?

Someone used artificial intelligence to create fake images or videos of you—perhaps putting your face on explicit content, making you appear to say things you never said, or using your likeness without permission. Deepfake technology has made it shockingly easy to create convincing fake media, and victims are fighting back with both criminal complaints and civil lawsuits.

This article is for informational purposes only and does not constitute legal advice. If you’re a victim of non-consensual intimate imagery, contact a qualified attorney and consider reporting to law enforcement.

The Short Answer

Yes, you likely have legal options if someone created deepfakes or AI-generated images of you without consent. Depending on your state and the nature of the images, you may have claims for invasion of privacy, defamation, intentional infliction of emotional distress, or violation of your right of publicity, as well as claims under new state laws specifically targeting AI-generated content. If the content is sexual, many states now have criminal laws as well.

What Are Deepfakes?

Deepfakes are synthetic media—images, videos, or audio—created using artificial intelligence to depict someone doing or saying something they never actually did. The technology has advanced rapidly:

  • Face-swapping: Placing your face onto someone else’s body in photos or videos
  • Lip-syncing: Making it appear you said words you never spoke
  • Voice cloning: Replicating your voice to create fake audio
  • Full-body generation: Creating entirely fake but realistic images of you
  • “Nudification” apps: AI tools that digitally remove clothing from photos

What once required Hollywood-level resources can now be done with free apps in minutes.

Types of Harmful Deepfakes

Non-Consensual Intimate Images (NCII)

The most common harmful use: creating fake pornographic images or videos using someone’s face. Studies suggest the vast majority of deepfakes online are non-consensual pornography, disproportionately targeting women.

Reputation Attacks

Deepfakes showing you committing crimes, using drugs, making offensive statements, or engaging in embarrassing behavior—used to damage personal or professional reputation.

Fraud and Impersonation

Using your likeness to scam others, create fake endorsements, or commit identity theft.

Harassment and Stalking

Creating deepfakes as part of a pattern of harassment, often by ex-partners or stalkers.

Legal Claims You May Have

Right of Publicity/Misappropriation

Most states recognize some form of right to control commercial use of your name, image, and likeness, whether by statute or common law. If someone uses your AI-generated likeness for commercial purposes—ads, products, content monetization—you may have a misappropriation claim.

Invasion of Privacy

Several privacy torts may apply:

  • False light: Depicting you in a way that’s highly offensive and false
  • Public disclosure of private facts: If the deepfake incorporates real private information
  • Intrusion upon seclusion: If images were obtained through privacy violations

Defamation

If the deepfake conveys false statements of fact about you (showing you committing a crime, for example), you may have a defamation claim. Because the creator fabricated the content, showing they knew it was false is usually straightforward; you'd still need to show the content was shared with others and damaged your reputation.

Intentional Infliction of Emotional Distress (IIED)

Creating and distributing harmful deepfakes may constitute extreme and outrageous conduct, done intentionally or recklessly, that causes severe emotional distress: the core elements of an IIED claim.

Copyright Infringement

If the deepfake was created using photos you own the copyright to (like selfies you took), the creator may have infringed your copyright.

State Deepfake Laws

A growing number of states have enacted laws specifically addressing deepfakes, many with civil remedies:

  • California: Civil cause of action for non-consensual deepfake pornography (AB 602) and use of deepfakes in political ads
  • Texas: Criminal penalties for creating deepfake pornography without consent
  • Virginia: One of the first states to criminalize deepfake pornography
  • New York: Right of publicity protections extended to digital replicas
  • Georgia, Florida, Minnesota, and others: Various deepfake-specific statutes

Criminal Options

Beyond civil lawsuits, deepfakes may violate criminal laws:

  • Revenge porn laws: Many states’ non-consensual pornography laws now cover AI-generated content
  • Harassment/stalking: If part of a pattern of harassment
  • Identity theft: If used to impersonate you for fraud
  • Child exploitation: AI-generated child sexual abuse material (CSAM) is a serious federal crime
  • Cyber harassment: Federal and state laws against online harassment

Report to local law enforcement and potentially the FBI’s Internet Crime Complaint Center (IC3) for serious cases.

Who Can You Sue?

  • The creator: Whoever made the deepfake—if you can identify them
  • Distributors: People who shared the content knowing it was fake
  • Websites/platforms: Largely shielded by Section 230 for user-posted content, though intellectual property claims such as copyright fall outside that immunity
  • AI tool providers: Potentially, if their tools facilitated the harm and they failed to implement safeguards

The biggest challenge is often identifying anonymous creators. Subpoenas to platforms for account information may be necessary.

What You Could Recover

  • Compensatory damages: For emotional distress, therapy costs, lost income, and reputational harm
  • Statutory damages: Set amounts under some state deepfake laws; California’s, for example, can reach $150,000 per violation for malicious conduct
  • Punitive damages: For particularly egregious conduct
  • Attorney’s fees: Some statutes provide for fee recovery
  • Injunctive relief: Court orders requiring removal of content

Steps to Take If You’re a Victim

  1. Document everything: Screenshot or save copies of the deepfake content, URLs, and any identifying information about the creator or poster
  2. Don’t engage with the creator: Avoid contact that might tip them off before you can preserve evidence
  3. Report to platforms: Use content reporting tools to request removal (most platforms prohibit deepfake pornography)
  4. File a DMCA takedown: If you own copyright in source photos, send DMCA notices to hosting platforms
  5. Contact the Cyber Civil Rights Initiative: They offer resources for victims of image-based abuse
  6. Report to law enforcement: Especially for intimate imagery—file a police report
  7. Consult an attorney: An attorney experienced in cyber harassment can advise on civil options and help unmask anonymous perpetrators
  8. Consider reputation management: Services can help suppress search results and monitor for new postings

Challenges in Deepfake Cases

  • Identifying creators: Anonymous posting makes it hard to find who to sue
  • Section 230: Platforms often have immunity for user-generated content
  • Jurisdiction: Creators may be in different states or countries
  • Proving harm: Documenting emotional and reputational damages
  • Viral spread: Once shared, content can be nearly impossible to fully remove

Frequently Asked Questions

Can I sue if the deepfake is “obviously fake”?

Yes. Even a deepfake that wouldn’t fool experts can harm your reputation if ordinary viewers believe it, and it can cause actionable emotional distress regardless. The realism of modern deepfakes means many viewers will be fooled.

What if I don’t know who created the deepfake?

An attorney can help subpoena platform records to identify anonymous accounts. Many perpetrators can be identified through their digital footprints.

Can I sue the AI company that made the tool?

This is an evolving area. If a company specifically markets tools for creating non-consensual intimate imagery, or completely fails to implement safeguards, there may be liability theories available. However, this is legally uncertain.

Is there a time limit to sue?

Yes. Statutes of limitations vary by state and claim type—typically 1-3 years from when you discovered the content. Act promptly.

What about deepfakes of celebrities?

Celebrities have the same legal protections, often with stronger right of publicity claims given their commercial interest in their image. Several celebrities have sued over AI-generated content using their likeness.

Can minors’ parents sue over deepfakes of their children?

Absolutely. Parents can bring claims on behalf of minor children. Deepfakes of minors, especially intimate imagery, are taken extremely seriously and may trigger federal law enforcement involvement.

The Bottom Line

Deepfake technology has outpaced the law, but victims aren’t without recourse. A combination of traditional legal claims (privacy, defamation, emotional distress) and new deepfake-specific laws give victims options for fighting back. The legal landscape is evolving rapidly in victims’ favor.

If you’ve been victimized by deepfakes or AI-generated images, document everything, report to platforms and law enforcement, and consult with an attorney who handles cyber harassment or privacy cases. You don’t have to face this alone.

Shaun Walker

Shaun Walker is a legal writer who helps readers understand their rights and navigate complex legal situations. He specializes in making the law accessible to everyday people facing real-world challenges.