The Campaign

Who We Are

#MyImageMyChoice aims to amplify the voices of intimate image abuse survivors, and create cultural change. 

Starting as a grassroots project, it has gone on to have an outsized impact, working with over 30 survivors as well as organizations including The White House, Bumble, World Economic Forum and survivor coalitions. 

We want to:

  • Overturn the shaming and silencing survivors face
  • Campaign for legislation 
  • Work to #BlockMrDeepfakes and the 3000+ sites dedicated to this abuse
  • Stimulate a digital phase of the #metoo movement
  • Educate on digital consent

#MyImageMyChoice is proud to partner with Synthesia, the world’s largest AI video generation platform, to safely bring anonymous stories to life with our AI avatars. Synthesia believes in the good that AI has to offer, and all content uploaded to its platform is consensual. The problem doesn’t lie with the technology, but with how we use it.

We would love for organizations to join our coalition, so please reach out!

Impact to Date

Huge grassroots support: We have built coalitions of survivors and advocates, including a network of ambassadors on college campuses, and created petitions gaining 100k+ signatures.

Global press coverage: With NPR, ABC, BBC, The Guardian, Vogue and more, we are driving this issue up the cultural agenda.

Tangible policy change: Our testimonies have been crucial to landmark reports by The White House, UK Law Commission, and others.

Viral digital content: We produced content including videos with GIBI ASMR (700k+ views) and NowThis (150k+ impressions).

Expertise: We have built trusted relationships with key stakeholders and been invited to be advisors to the World Economic Forum and others.

Upcoming Events

Screening in UK parliament with Jess Phillips MP — UK Parliament, January 22, 2024
Virtual Summit on Deepfakes — Virtual, March 19-20


Press Coverage

NowThis – Can your profile pic be used as deepfake porn?

ABC News – AI-Generated: How nonconsensual deepfake porn targets women

The Times – I stared at my face in a sex video. I was a deepfake porn victim.

The Guardian – Inside the Taylor Swift deepfake scandal: ‘It’s men telling a powerful woman to get back in her box’

Glamour UK – It’s not just Taylor Swift; all women are at risk from the rise of deepfakes

Context – Tough laws needed for deepfake porn, say women who suffer AI abuse

Cosmopolitan – The new law on deepfakes has officially kicked in: here’s all you need to know

CNN – Opinion: The rise of deepfake pornography is devastating for women

Vogue – Deepfakes Are Ruining Lives, This Form Of Abuse Must Be Stopped

BBC – #MyImageMyChoice – BBC News Interview with Sophie Compton and Sophie Mortimer

NBC – Found through Google, bought with Visa and Mastercard: Inside the deepfake porn economy

Channel 4 News – Is enough being done to protect women whose sexual images are posted online without their consent?

Glamour – Emma Barnett: Misuse and abuse of nude photos can utterly ruin your life, so what can we do about it?

Express – Revenge porn attacks doubled in last two years

HuffPost – ‘Revenge Porn’ Ruined My Teen Years. Now I’m Fighting Back

ITV News – What it feels like to be a victim of revenge porn

BBC – Revenge porn law: Norfolk victim of online intimate photos ‘distraught’

BBC – Stolen naked images traded in cities around the world

BBC – #MyImageMyChoice – BBC News with Victoria Derbyshire

MIT Technology Review – Deepfake porn is ruining women’s lives. Now the law may finally ban it

The Times – ‘I feel violated’ — website lists stolen nude pictures by women’s location

NowThis – This survivor-led campaign from @myimagemychoice is pushing for new laws against ‘revenge porn’