Can I Sue Someone for Creating Deepfakes or AI Images of Me? (UK)


Someone has used artificial intelligence to create fake images or videos of you — perhaps intimate content, embarrassing scenarios, or footage that makes it look like you said things you never did. Deepfake technology has made this disturbingly easy. In the UK, you have several legal options to fight back.

This article is for informational purposes only and does not constitute legal advice.

Short Answer

Yes, creating or sharing deepfakes can be illegal in the UK under multiple laws. The Online Safety Act 2023 made sharing intimate deepfakes a specific criminal offence. You may also have civil claims for harassment, data protection breaches, defamation, and misuse of private information. The law is increasingly recognising the serious harm these images cause.

What Are Deepfakes?

Deepfakes use AI to create convincing fake images or videos. Common types include:

  • Intimate image deepfakes: Fake pornographic content using someone’s face
  • Face-swaps: Placing someone’s face in compromising situations
  • Voice cloning: AI-generated audio mimicking someone’s voice
  • Manipulated videos: Making it appear someone said or did something they didn’t
  • AI-generated images: Entirely artificial but realistic images using someone’s likeness

UK Legal Framework

Online Safety Act 2023 — Intimate Image Offences

The Online Safety Act created specific criminal offences for intimate deepfakes:

  • Sharing intimate deepfakes: Criminal offence to share AI-generated intimate images without consent
  • No intent requirement: Unlike some offences, prosecution doesn’t require proving intent to cause distress
  • Threatening to share: Also criminal to threaten to share such images

This means police can now prosecute people who share, or threaten to share, intimate deepfake images.

Protection from Harassment Act 1997

Creating or distributing deepfakes can constitute harassment if it:

  • Forms a course of conduct (two or more occasions)
  • Causes alarm or distress
  • Would be regarded as harassment by a reasonable person

This provides both criminal sanctions and civil remedies including injunctions and damages.

Malicious Communications Act 1988

It’s an offence to send a communication that is either:

  • Grossly offensive, indecent, obscene, or menacing, or
  • False and known by the sender to be false

where the sender’s purpose is to cause distress or anxiety.

UK GDPR and Data Protection Act 2018

Your image is personal data. Processing it without a lawful basis (such as your consent) is unlawful. You can:

  • Demand deletion under the “right to erasure”
  • Claim compensation for data protection breaches
  • Report to the Information Commissioner’s Office (ICO)

Misuse of Private Information

This civil claim protects your reasonable expectation of privacy. Creating sexualised or humiliating images of you, even if fake, can breach this right.

Defamation Act 2013

If a deepfake implies false facts about you that cause serious harm to your reputation, you may have a defamation claim. The fact that it’s AI-generated doesn’t prevent it being defamatory.

Copyright

If the deepfake was created from photographs in which you own the copyright (for example, photos you took yourself), you may also have intellectual property claims.

When You May Have a Claim

You likely have legal options if:

  • Intimate or sexualised deepfake images have been created of you
  • Deepfakes portray you in embarrassing or damaging situations
  • Fake videos show you saying things you never said
  • The content has been shared publicly or with others
  • You can identify who created or distributed the content
  • The content has caused you distress, reputational harm, or financial loss

When It’s More Difficult

Claims may be harder when:

  • The creator is anonymous and cannot be identified
  • Content is hosted outside UK jurisdiction
  • The deepfake is clearly satire or parody (though this has limits)
  • You can’t prove who made or shared it
  • The content has been deleted and you have no copies as evidence

Potential Remedies

Depending on your circumstances, you may obtain:

  • Injunction: Court order requiring removal and preventing further distribution
  • Damages: Compensation for emotional distress, reputational harm, and financial losses
  • Norwich Pharmacal orders: Court orders requiring platforms to reveal the identity of anonymous creators
  • Criminal prosecution: Police action against perpetrators
  • Platform takedown: Most platforms prohibit deepfakes and will remove reported content

Steps to Take

  1. Preserve evidence: Screenshot or save copies of the deepfake content, URLs, and any identifying information about the creator
  2. Report to the platform: Use abuse reporting tools — most platforms prohibit non-consensual intimate images
  3. Report to police: If the content is intimate/sexual or constitutes harassment, report to your local police or via the non-emergency 101 line
  4. Contact the Revenge Porn Helpline: They assist with intimate image abuse including deepfakes (0345 6000 459)
  5. Report to ICO: For data protection breaches
  6. Seek legal advice: A solicitor can advise on civil claims and obtaining court orders
  7. Document the impact: Keep records of how the deepfake has affected your mental health, relationships, and career

Platform Responsibilities

Under the Online Safety Act, platforms have duties to:

  • Remove illegal content including intimate deepfakes
  • Have effective reporting mechanisms
  • Take proactive measures to prevent harm

If platforms fail to act on reports, they may face regulatory action from Ofcom, and their inaction may strengthen your case.

Frequently Asked Questions

Is it illegal to create deepfakes even if they’re not shared?

Creating intimate deepfakes may not itself be an offence, but sharing them is. However, if creation is part of a harassment campaign or involves illegal access to images, other offences may apply. The law is evolving in this area.

What if the person is in another country?

UK criminal law has limited reach abroad, but you can still pursue civil claims if the content affects you in the UK. Platforms accessible in the UK must comply with the Online Safety Act regardless of where content originates.

Can I sue the AI company that made the tool?

This is largely untested. Currently, claims typically focus on the individual who created/shared the content. However, legal arguments about AI company liability are developing, particularly where tools lack adequate safeguards.

What if I’m a public figure?

Public figures have reduced privacy expectations for matters of public interest, but this doesn’t extend to sexualised content or clearly false and damaging material. Intimate deepfakes of public figures are still illegal.

How long do I have to take action?

Defamation claims must generally be brought within 1 year. Harassment and data protection claims have 6-year limitation periods. Criminal matters have no limitation for serious offences. Act promptly to preserve evidence.

Will reporting to police actually help?

Police are increasingly trained on intimate image abuse. The new criminal offences give them clear powers. While investigation can take time, reporting creates an official record and may identify serial offenders.

Need Legal Help?

If you’ve been targeted with deepfake content, consider consulting a solicitor who specialises in online abuse, privacy, or media law. The Revenge Porn Helpline also provides free support and can assist with takedowns.

Shaun Walker

Shaun Walker is a legal writer who helps readers understand their rights and navigate complex legal situations. He specialises in making the law accessible to everyday people facing real-world challenges.