The Issue

#MyImageMyChoice is a cultural movement tackling intimate image abuse from the creators of the documentary ANOTHER BODY.

Intimate image abuse can have devastating, even life-threatening, impacts. But governments and tech platforms aren’t doing anything to address it. Intimate image abuse websites, based on violating consent, have become thriving online businesses. Companies like Google, Visa, and Verizon are enabling and profiting from this abuse, and normalizing misogyny.

We are campaigning for governments and tech companies to #BlockMrDeepfakes and the 3000+ sites dedicated to online gendered abuse.

Press Coverage

Our Work To Date

#MyImageMyChoice aims to amplify survivor voices and advocate for change. Starting as a grassroots project, #MyImageMyChoice has gone on to have an outsized impact, working with over 30 survivors and grassroots groups, as well as governments, companies, and organizations such as The White House and Bumble.

Grassroots Organizing

We have built coalitions of survivors and advocates, including a network of ambassadors on college campuses, petitions gaining 100k+ signatures and viral digital content.

Awareness Raising

Content we helped create includes a video with GIBI ASMR that reached 700k+ views, a video with NowThis that received 150k+ impressions across all platforms, and a video with Alkiiwii that reached over 2.5 million users and gained 200k likes. We co-hosted the world’s first virtual summit on deepfake abuse, collaborating with 30+ speakers as well as 40 partners & supporters, and drawing 1500+ attendees.

#MyImageMyChoice co-founders were invited to speak on a panel at Adobe’s Content Authenticity Initiative Symposium hosted by the Stanford Data Lab; at the Mozilla Festival about the ways philanthropy can intervene to stop online sexual violence; at an event hosted by Control AI to inform parents about the dangers of deepfake technology, among others.

Tangible policy change

Testimonies from our survivor coalition have been crucial to landmark reports by The White House, UK Law Commission, and others. We have been invited to be Advisors to the World Economic Forum and to participate in roundtables at The White House, UK Parliament, and others.

The White House told us the film specifically galvanized Kamala Harris’ work on this issue, directly contributing to the inclusion of a section on deepfake abuse in the president’s historic AI Executive Order. In recognition of this, the White House invited the film team and participants Taylor and Gibi to attend its signing in October 2023. We developed a landmark dossier of research that has been crucial to many US policymakers’ work on this issue. 

#MyImageMyChoice testimonies also contributed to UK Law Commission reports, and subsequent policy change through the UK’s landmark Online Safety Bill, which includes the criminalization of non-consensual sharing of deepfake intimate images. We hosted a screening in the UK Parliament with MP Jess Phillips in January 2024, using the film to support efforts to ensure that protections against deepfakes outlined in the Online Safety Bill are effectively implemented. The UK Parliament has since announced plans to criminalize the very creation of deepfake abuse, regardless of intent to distribute.

The Facts

Download our landmark research dossier on the landscape of deepfake abuse here.

Survivor stories

The issue of deepfake sexual abuse is growing exponentially: it has risen by 3,000% since 2019. We created a film and a cultural movement to combat this.

#MyImageMyChoice is a cultural movement changing laws and culture around deepfake sexual abuse, from the creators of the documentary ANOTHER BODY.

ANOTHER BODY is a SXSW-winning and Emmy-nominated documentary that follows a US college student’s quest for justice after she finds deepfake pornography of herself online.


The Campaign

Who We Are

#MyImageMyChoice aims to amplify the voices of intimate image abuse survivors, and create cultural change. 

Starting as a grassroots project, it has gone on to have an outsized impact, working with over 30 survivors as well as organizations including The White House, Bumble, World Economic Forum and survivor coalitions. 

We want to:

  • Overturn the shaming and silencing survivors face
  • Campaign for legislation 
  • Work to #BlockMrDeepfakes and the 3000+ sites dedicated to this abuse
  • Stimulate a digital phase of the #metoo movement
  • Educate on digital consent

#MyImageMyChoice is proud to partner with Synthesia, the world’s largest AI video generation platform, to safely bring anonymous stories to life with our AI avatars. Synthesia believes in the good that AI has to offer; all content uploaded to the platform is consensual. The problem doesn’t lie with the technology, but with how we use it.

We would love for organizations to join our coalition. Please reach out to myimagemychoice@gmail.com!

Impact to Date

Huge grassroots support: We have built coalitions of survivors and advocates, including a network of ambassadors on college campuses, and created petitions gaining 100k+ signatures.

Global press coverage: With NPR, ABC, BBC, The Guardian, Vogue and more, we are driving this issue up the cultural agenda.

Tangible policy change: Our testimonies have been crucial to landmark reports by The White House, UK Law Commission, and others.

Viral digital content: We produced content including videos with GIBI ASMR (700k+ views) and NowThis (150k+ impressions).

Expertise: We have built trusted relationships with key stakeholders, and have been invited to be Advisors to the World Economic Forum and others.

Past Events

Screening in UK parliament with Jess Phillips MP — UK Parliament, January 22, 2024
Virtual Summit on Deepfakes — Virtual, March 19-20

NowThis – Can your profile pic be used as deepfake porn?

ABC News – AI-Generated: How nonconsensual deepfake porn targets women

The Times – I stared at my face in a sex video. I was a deepfake porn victim.

The Guardian – Inside the Taylor Swift deepfake scandal: ‘It’s men telling a powerful woman to get back in her box’

Glamour UK – It’s not just Taylor Swift; all women are at risk from the rise of deepfakes

Context – Tough laws needed for deepfake porn, say women who suffer AI abuse

Cosmopolitan – The new law on deepfakes has officially kicked in: here’s all you need to know

CNN – Opinion: The rise of deepfake pornography is devastating for women

Vogue – Deepfakes Are Ruining Lives, This Form Of Abuse Must Be Stopped

BBC – #MyImageMyChoice – BBC News Interview with Sophie Compton and Sophie Mortimer

NBC – Found through Google, bought with Visa and Mastercard: Inside the deepfake porn economy

Channel 4 News – Is enough being done to protect women whose sexual images are posted online without their consent?

Glamour – Emma Barnett: Misuse and abuse of nude photos can utterly ruin your life, so what can we do about it?

Express – Revenge porn attacks doubled in last two years

HuffPost – ‘Revenge Porn’ Ruined My Teen Years. Now I’m Fighting Back

ITV News – What it feels like to be a victim of revenge porn

BBC – Revenge porn law: Norfolk victim of online intimate photos ‘distraught’

BBC – Stolen naked images traded in cities around the world

BBC – #MyImageMyChoice – BBC News with Victoria Derbyshire

MIT Technology Review – Deepfake porn is ruining women’s lives. Now the law may finally ban it

The Times – ‘I feel violated’ — website lists stolen nude pictures by women’s location

NowThis – This survivor-led campaign from @myimagemychoice is pushing for new laws against ‘revenge porn’

Take Action

#MyImageMyChoice created a landmark research dossier on the landscape of deepfake abuse in collaboration with researcher Genevieve Oh. The research shows an exponential rise in deepfake abuse in 2023 and 2024 from previous years.

The recent rise is driven by a vast increase in user-friendly “nudifier” apps that generate deepfake abuse content in seconds, as well as the growing opportunity to monetize deepfake abuse via online marketplaces, among other factors. This growth is abetted by dozens of tech companies that host, process payments for, and drive internet users to this content, including Google, Microsoft, Amazon, Cloudflare, Apple, Visa, and Mastercard. And our legislative infrastructure is currently failing to prevent this abuse from moving further into the mainstream.

Access the full dossier by clicking the above button.

Join over 55,000 people in calling on governments and tech companies to #BlockMrDeepfakes and the 3000+ websites dedicated to image-based sexual abuse. Companies are profiting off this abuse, and these sites need to be shut down.

Help us communicate our work.

Sample letters to political representatives in the US, UK and EU.
Sample letters to key changemakers: Google, Verizon, Visa, PayPal, and Mastercard.

#MyImageMyChoice is creating a cultural movement where everyone can speak out about their experiences of image-based sexual abuse. It aims to overturn the shaming and silencing survivors face, and stimulate a new digital phase of the #metoo movement. Your story has immense power, and we support people to share it either anonymously or publicly.

Are you experiencing intimate image abuse?

Get Support

Here is a list of resources for people experiencing intimate image abuse, compiled by The Reclaim Coalition to End Online Image-based Sexual Violence. Please know that you are not alone; the responsibility for your consent being violated lies solely with the perpetrator, not with anything you did or said. There are many incredible organizations that might be able to help with advice, takedown requests, and legal support. Please don’t hesitate to reach out to myimagemychoice@gmail.com if you want more information.

Get support

988 Lifeline

RAINN

Love Is Respect

Sanar Institute

Chayn

Callisto & Callisto Vault

Content removal

Global resources

The Revenge Porn Helpline

The Cyber Civil Rights Initiative

Take It Down

Stop Non-Consensual Intimate Image (NCII) Abuse

Alecto AI

Legal support

C.A. Goldberg, PLLC

Learn more about IBSV and how to take action

The Reclaim Coalition