Press Coverage
Our Work To Date
#MyImageMyChoice aims to amplify survivor voices and advocate for change. Starting as a grassroots project, #MyImageMyChoice has gone on to have an outsized impact, working with over 30 survivors and grassroots groups, as well as governments, companies, and organizations including The White House and Bumble.

Grassroots Organizing
We have built coalitions of survivors and advocates, including a network of ambassadors on college campuses, and created petitions gaining 100k+ signatures and viral digital content.
Awareness Raising
Content we helped create includes a video with GIBI ASMR that reached 700k+ views, a video with NowThis that received 150k+ impressions across all platforms, and a video with Alkiiwii that reached over 2.5 million users and gained 200k likes. We co-hosted the world’s first virtual summit on deepfake abuse, collaborating with 30+ speakers as well as 40 partners & supporters, and drawing 1500+ attendees.
#MyImageMyChoice co-founders have been invited to speak at events including a panel at Adobe’s Content Authenticity Initiative Symposium hosted by the Stanford Data Lab; the Mozilla Festival, on the ways philanthropy can intervene to stop online sexual violence; and an event hosted by Control AI to inform parents about the dangers of deepfake technology.

Tangible policy change
Testimonies from our survivor coalition have been crucial to landmark reports by The White House, UK Law Commission, and others. We have been invited to serve as Advisors to the World Economic Forum and to participate in roundtables at The White House, UK Parliament, and elsewhere.
The White House told us the film specifically galvanized Kamala Harris’ work on this issue, directly contributing to the inclusion of a section on deepfake abuse in the president’s historic AI Executive Order. In recognition of this, the White House invited the film team and participants Taylor and Gibi to attend its signing in October 2023. We developed a landmark dossier of research that has been crucial to many US policymakers’ work on this issue.
#MyImageMyChoice testimonies also contributed to UK Law Commission reports, and subsequent policy change through the UK’s landmark Online Safety Bill, which includes the criminalization of non-consensual sharing of deepfake intimate images. We hosted a screening in the UK Parliament with MP Jess Phillips in January 2024, using the film to support efforts to ensure that protections against deepfakes outlined in the Online Safety Bill are effectively implemented. The UK Parliament has since announced plans to criminalize the very creation of deepfake abuse, regardless of intent to distribute.



The Facts
Deepfake abuse is a form of image-based sexual violence in which somebody’s image is inserted into sexually explicit content without their consent. It includes videos where one person’s face is placed on the body of a porn performer without consent, real images digitally altered to strip the body of all clothing, and AI-generated images that depict the victim in sexually explicit scenarios.
Language matters: we use the terms ‘deepfake sexual abuse’ or ‘deepfake abuse’ rather than ‘deepfake porn’, and ‘image-based sexual violence’ rather than ‘revenge porn’. This content is not pornography; it is a tech-enabled form of online sexual abuse.
Image-based sexual violence occurs when someone takes, creates or shares intimate images of another person without their consent, or threatens to do so. Content can include imagery that is sexual, nude, or suggestive, including:
- Sharing nude images (so-called ‘revenge porn’)
- Creating or sharing fake porn imagery like deepfake videos
- Spycam footage
- Videos of sexual assaults or rapes
- Threats to share (often for coercion)
In US surveys, 1 in 12 adults reported that they have been victims of image-based abuse, though real rates are likely much higher. The victims are overwhelmingly women (95%), and the perpetrators are mostly men (76%). It is not just the creators but also those who consume this content who give rise to this abuse.
In 2023, more deepfake abuse videos were shared than in all previous years combined. The top 40 sites dedicated to deepfake abuse now host over 270,000 videos, which have gained over 4 billion views. This is a 3000% increase from 2019.
In 2023 there was a new and very concerning development: the growth of “nudify” apps – apps that appear to strip clothes off people in pictures, or generate imagery that portrays someone in an explicit context. These apps have massively lowered the barrier to entry – now anyone can make deepfake abuse material on their smartphone.
Again the rate of increase has been vast: in January 2023 there were only a handful of these apps. There are now over 290, 80% of which launched in the past 12 months. One, Undress.ai, processed 600,000 photos of regular women in the first 21 days after it launched.
As a result of nudify apps, the most targeted group by number of victims is no longer female celebrities but ordinary women.
In most places across the globe, including most states in the USA, there are no laws preventing people from creating and sharing deepfake abuse material.
There are some movements to change this. The UK passed the Online Safety Act in October 2023, which made the sharing of deepfake abuse an offense and began the process of requiring tech platforms to take more accountability for the hosting and distribution of deepfake abuse material. In April 2024 the UK Parliament announced its proposal for a new law to criminalize the creation of deepfake abuse, regardless of the intent to distribute the material. As a result of this announcement, the largest and most notorious deepfake abuse website MrDeepfakes.com blocked access to users across the UK.
In the US, senators have introduced a new bill, the DEFIANCE Act of 2024, creating a federal civil right of action for survivors to seek justice against perpetrators who create, distribute, or solicit deepfake abuse material. In May 2024, the White House issued a Call to Action urging the tech industry to take steps to combat deepfake abuse. However, US representatives have yet to introduce federal legislation that holds accountable creators and distributors of deepfake abuse, or which holds companies to account for their role in facilitating the distribution of this content.
We cannot wait around for governments to tackle this; we need to send a clear message that this is not OK.
94% of deepfake abuse is hosted on sites dedicated to this practice – dedicated solely to violating consent. The biggest site, MrDeepFakes.com, gets 13.4 million hits a month. Over the past few years, deepfaking has gone from a fringe community of individuals on 4chan or Reddit to a mainstream industry emboldened by a total lack of accountability. This industry has become organized: creators are making serious money, some over $20k a month, and some are hiring assistants to help with the volume of commissions.
The tech sector in particular bears huge responsibility for the way this practice has been popularized. Our research indicates that Google Search drives 68% of all web traffic to deepfake sites, and directs users to sites, how-to tutorials, and apps even when they are not searching for this material.
This can have devastating and even life-threatening impacts. It can lead to serious mental health challenges and ‘social rupture’ such as stigmatization and isolation. This can be ‘life-shattering’. Helen, a 34-year-old teacher, started experiencing panic attacks after her abuse: “Every time I left the house, I got a sense of dread. Everyone I passed in the street, I just felt like they knew.” Chrissy, a YouTuber, said: “It felt like the image was kind of like ammunition that could be used against me for the rest of my life. It just became very easy to think about how death would be much easier than living this hell.”
The tragic reality is that several young women have taken their lives because of this. One, Basant, who was 17, wrote a letter to her mum starting with the line “Mom, believe me, the girl in those pictures is not me.”
What’s also chilling is the way swathes of society – in particular women in the public eye like politicians, journalists, Youtubers, actors – have come to expect this abuse as part of the job. This creates the ‘silencing effect’. Victims modify their behavior, retreat from online spaces, and are shut out from full participation in public discourse – especially online. Most people don’t want to risk speaking out about their experiences because this might provoke retaliation, or drive more viewers to their intimate content. This is turning away generations of women from pursuing the career they want and discovering their full potential.
On top of this, victims deal with a culture of victim blaming. Police, moderators, bosses, and lawmakers routinely downplay or dismiss victims’ experiences, or question the victim’s actions. Victims are scrutinized and judged for the pictures they chose to take, the clothes they chose to wear, the job they chose to do, the privacy settings they chose to use. But the people who should be scrutinized are the ones breaching the victim’s consent.
Download our landmark research dossier into the Landscape of Deepfake Abuse here.
Survivor stories
The issue of deepfake sexual abuse is growing exponentially, having risen by 3000% since 2019. We created a film and a cultural movement to combat this.
#MyImageMyChoice is a cultural movement changing laws and culture around deepfake sexual abuse, from the creators of the documentary ANOTHER BODY.
ANOTHER BODY is a SXSW-winning and Emmy-nominated documentary that follows a US college student’s quest for justice after she finds deepfake pornography of her online.
The Campaign
Who We Are
#MyImageMyChoice aims to amplify the voices of intimate image abuse survivors, and create cultural change.
Starting as a grassroots project, it has gone on to have an outsized impact, working with over 30 survivors as well as organizations including The White House, Bumble, World Economic Forum and survivor coalitions.
We want to:
- Overturn the shaming and silencing survivors face
- Campaign for legislation
- Work to #BlockMrDeepfakes and the 3000+ sites dedicated to this abuse
- Stimulate a digital phase of the #metoo movement
- Educate on digital consent
#MyImageMyChoice is proud to partner with Synthesia, the world’s largest AI video generation platform, to safely bring anonymous stories to life with our AI avatars. Synthesia believes in the good that AI has to offer; all content uploaded to its platform is consensual. The problem doesn’t lie with technology, but with how we use it.
We would love for organizations to join our coalition; please reach out to myimagemychoice@gmail.com!





Impact to Date
Huge grassroots support: We have built coalitions of survivors and advocates, including a network of ambassadors on college campuses, and created petitions gaining 100k+ signatures.
Global press coverage: With NPR, ABC, BBC, The Guardian, Vogue and more, we are driving this issue up the cultural agenda.
Tangible policy change: Our testimonies have been crucial to landmark reports by The White House, UK Law Commission, and others.
Viral digital content: we produced content including videos with GIBI ASMR (700k+ views) and NowThis (150k+ impressions).
Expertise: we built trusted relationships with key stakeholders, and have been invited to be Advisors to the World Economic Forum and others.
Past Events
NowThis – Can your profile pic be used as deepfake porn?
ABC News – AI-Generated: How nonconsensual deepfake porn targets women
The Times – I stared at my face in a sex video. I was a deepfake porn victim.
The Guardian – Inside the Taylor Swift deepfake scandal: ‘It’s men telling a powerful woman to get back in her box’
Glamour UK – It’s not just Taylor Swift; all women are at risk from the rise of deepfakes
Context – Tough laws needed for deepfake porn, say women who suffer AI abuse
Cosmopolitan – The new law on deepfakes has officially kicked in: here’s all you need to know
CNN – Opinion: The rise of deepfake pornography is devastating for women
Vogue – Deepfakes Are Ruining Lives, This Form Of Abuse Must Be Stopped
BBC – #MyImageMyChoice – BBC News Interview with Sophie Compton and Sophie Mortimer
NBC – Found through Google, bought with Visa and Mastercard: Inside the deepfake porn economy
Channel 4 News – Is enough being done to protect women whose sexual images are posted online without their consent?
Express – Revenge porn attacks doubled in last two years
Huffpost – ‘Revenge Porn’ Ruined My Teen Years. Now I’m Fighting Back
ITV News – What it feels like to be a victim of revenge porn
BBC – Revenge porn law: Norfolk victim of online intimate photos ‘distraught’
BBC – Stolen naked images traded in cities around the world
BBC – #MyImageMyChoice – BBC News with Victoria Derbyshire
MIT Technology Review – Deepfake porn is ruining women’s lives. Now the law may finally ban it
The Times – ‘I feel violated’ — website lists stolen nude pictures by women’s location
NowThis – This survivor-led campaign from @myimagemychoice is pushing for new laws against ‘revenge porn’
Take Action
#MyImageMyChoice created a landmark research dossier on the landscape of deepfake abuse in collaboration with researcher Genevieve Oh. The research shows an exponential rise in deepfake abuse in 2023 and 2024 from previous years.
The recent rise is driven by a vast increase in user-friendly “nudifier” apps that generate deepfake abuse content in seconds, as well as the growing opportunity to monetize deepfake abuse via online marketplaces, among other factors. This growth is abetted by dozens of tech companies that host, process payments for, and drive internet users to this content, including Google, Microsoft, Amazon, Cloudflare, Apple, Visa, and Mastercard. Meanwhile, our legislative infrastructure is currently failing to prevent this culture from spreading into the mainstream.
Access the full dossier by clicking the above button.
Join over 55,000 people in calling on governments and tech companies to #BlockMrDeepfakes and the 3000+ websites dedicated to image-based sexual abuse. Companies are profiting off this abuse, and these sites need to be shut down.
Sample letters to political representatives in the US, UK and EU.
Sample letters to key changemakers Google, Verizon, Visa, Paypal, Mastercard.
#MyImageMyChoice is creating a cultural movement where everyone can speak out about their experiences of image-based sexual abuse. It aims to overturn the shaming and silencing survivors face, and stimulate a new digital phase of the #metoo movement. Your story has immense power, and we support people to share it either anonymously or publicly.
Get Support
Here is a list of resources for people experiencing intimate image abuse, compiled by The Reclaim Coalition to End Online Image-based Sexual Violence. Please know that you are not alone; the responsibility for your consent being violated lies solely with the perpetrator, not with anything you did or said. There are many incredible organizations that may be able to help with advice, takedown requests, and legal support. Please don’t hesitate to reach out to myimagemychoice@gmail.com if you want more information.
Get support
Content removal
The Cyber Civil Rights Initiative
Stop Non-Consensual Intimate Image (NCII) Abuse
Legal support
Learn more about IBSV and how to take action