The Campaign
Who We Are
#MyImageMyChoice aims to amplify the voices of intimate image abuse survivors, and create cultural change.
Starting as a grassroots project, it has gone on to have an outsized impact, working with over 30 survivors as well as organizations including The White House, Bumble, the World Economic Forum, and survivor coalitions.
We want to:
- Overturn the shaming and silencing that survivors face
- Campaign for legislation
- Work to #BlockMrDeepfakes and the 3000+ sites dedicated to this abuse
- Stimulate a digital phase of the #metoo movement
- Educate on digital consent
#MyImageMyChoice is proud to partner with Synthesia, the world’s largest AI video generation platform, to safely bring anonymous stories to life with our AI avatars. Synthesia believes in the good that AI has to offer, and all content uploaded to its site is consensual. The problem doesn’t lie with technology, but in how we use it.
We would love for organizations to join our coalition; please reach out to myimagemychoice@gmail.com!
Impact to Date
Huge grassroots support: We have built coalitions of survivors and advocates, including a network of ambassadors on college campuses, and created petitions gaining 100k+ signatures.
Global press coverage: With NPR, ABC, BBC, The Guardian, Vogue and more, we are driving this issue up the cultural agenda.
Tangible policy change: Our testimonies have been crucial to landmark reports by The White House, UK Law Commission, and others.
Viral digital content: We produced content including videos with GIBI ASMR (700k+ views) and NOW THIS (150k+ impressions).
Expertise: We have built trusted relationships with key stakeholders and have been invited to serve as Advisors to the World Economic Forum and others.
Press Coverage
NowThis – Can your profile pic be used as deepfake porn?
ABC News – AI-Generated: How nonconsensual deepfake porn targets women
The Times – I stared at my face in a sex video. I was a deepfake porn victim.
The Guardian – Inside the Taylor Swift deepfake scandal: ‘It’s men telling a powerful woman to get back in her box’
Glamour UK – It’s not just Taylor Swift; all women are at risk from the rise of deepfakes
Context – Tough laws needed for deepfake porn, say women who suffer AI abuse
Cosmopolitan – The new law on deepfakes has officially kicked in: here’s all you need to know
CNN – Opinion: The rise of deepfake pornography is devastating for women
Vogue – Deepfakes Are Ruining Lives, This Form Of Abuse Must Be Stopped
BBC – #MyImageMyChoice – BBC News Interview with Sophie Compton and Sophie Mortimer
NBC – Found through Google, bought with Visa and Mastercard: Inside the deepfake porn economy
Channel 4 News – Is enough being done to protect women whose sexual images are posted online without their consent?
Express – Revenge porn attacks doubled in last two years
Huffpost – ‘Revenge Porn’ Ruined My Teen Years. Now I’m Fighting Back
ITV News – What it feels like to be a victim of revenge porn
BBC – Revenge porn law: Norfolk victim of online intimate photos ‘distraught’
BBC – Stolen naked images traded in cities around the world
BBC – #MyImageMyChoice – BBC News with Victoria Derbyshire
MIT Technology Review – Deepfake porn is ruining women’s lives. Now the law may finally ban it
The Times – ‘I feel violated’ — website lists stolen nude pictures by women’s location
NowThis – This survivor-led campaign from @myimagemychoice is pushing for new laws against ‘revenge porn’