The Issue
#MyImageMyChoice is a cultural movement tackling intimate image abuse from the creators of the documentary ANOTHER BODY.
Intimate image abuse can have devastating, even life-threatening, impacts. But governments and tech platforms aren’t doing enough to address it. Intimate image abuse websites, built on violating consent, have become thriving online businesses. Companies like Google, Visa, and Verizon are enabling and profiting from this abuse, and normalizing misogyny.
We are campaigning for governments and tech companies to #BlockMrDeepfakes and the 3,000+ sites dedicated to online gendered abuse.
Our Work To Date
#MyImageMyChoice aims to amplify survivor voices and advocate for change. Starting as a grassroots project, #MyImageMyChoice has gone on to have an outsized impact, working with over 30 survivors and grassroots groups, as well as governments, companies, and international organizations such as The White House and Bumble.
Grassroots Organizing
We have built coalitions of survivors and advocates, including a network of ambassadors on college campuses; launched petitions that gained 100k+ signatures; and created viral digital content.
Awareness Raising
Content we helped create includes a video with GIBI ASMR that reached 700k+ views, a video with NowThis that received 150k+ impressions across all platforms, and a video with Alkiiwii that reached over 2.5 million users and gained 200k likes. We co-hosted the world’s first virtual summit on deepfake abuse, collaborating with 30+ speakers and 40 partners and supporters, and drawing 1,500+ attendees.
#MyImageMyChoice co-founders were invited to speak on a panel at Adobe’s Content Authenticity Initiative Symposium hosted by the Stanford Data Lab; at the Mozilla Festival about the ways philanthropy can intervene to stop online sexual violence; at an event hosted by Control AI to inform parents about the dangers of deepfake technology, among others.
Tangible Policy Change
Testimonies from our survivor coalition have been crucial to landmark reports by The White House, the UK Law Commission, and others. We have been invited to serve as Advisors to the World Economic Forum and to participate in roundtables at The White House, UK Parliament, and elsewhere.
The White House told us the film specifically galvanized Kamala Harris’ work on this issue, directly contributing to the inclusion of a section on deepfake abuse in the president’s historic AI Executive Order. In recognition of this, the White House invited the film team and participants Taylor and Gibi to attend its signing in October 2023. We developed a landmark dossier of research that has been crucial to many US policymakers’ work on this issue.
#MyImageMyChoice testimonies also contributed to UK Law Commission reports, and subsequent policy change through the UK’s landmark Online Safety Bill, which includes the criminalization of non-consensual sharing of deepfake intimate images. We hosted a screening in the UK Parliament with MP Jess Phillips in January 2024, using the film to support efforts to ensure that protections against deepfakes outlined in the Online Safety Bill are effectively implemented. The UK Parliament has since announced plans to criminalize the very creation of deepfake abuse, regardless of intent to distribute.
The Facts
Intimate image abuse is when someone takes, creates, or shares intimate images of another person without their consent, or threatens to do so. Content can include imagery that is sexual, nude, or suggestive, including:
- Sharing nude images (so-called ‘revenge porn’)
- Creating or sharing fake porn imagery like deepfake videos
- Spycam footage
- Videos of sexual assaults or rapes
- Threats to share (often for coercion)
1 in 12 U.S. adults reported that they have been victims of image-based abuse.
1 in 20 U.S. adults reported that they have been perpetrators of image-based abuse.
But real rates could be far higher: in the UK and Australia, 1 in 3 people reported being victims.
The victims are overwhelmingly women (95%).
The perpetrators are mostly men (76%).
And it’s not just the creators: those who consume this content also give rise to the abuse.
It’s not all ‘revenge’ – people share images because they want sexual gratification, control, money, or because of voyeurism, extortion, misogyny, obsession. Some want increased social status and feel entitled to share these images for a laugh. Research on unsolicited images shows that some people believe it’s flattering or flirtatious.
Each individual who watches or forwards a video may not realize that their actions are fueling the problem.
This can have devastating and even life-threatening impacts. It can lead to serious mental health challenges and ‘social rupture’ such as stigmatization and isolation. This can be ‘life-shattering’.
But what’s also chilling is how swathes of society, in particular women in the public eye such as politicians, journalists, YouTubers, and actors, have had to accept this abuse as part of the job. This creates a ‘silencing effect’.
Because of this abuse, many victims are silenced. They modify their behavior, retreat from online spaces, and are shut out from full participation in public discourse. Most don’t want to risk speaking out about their experiences, because doing so might provoke retaliation or drive more viewers to their intimate content.
On top of this, victims face a huge amount of victim blaming. Police, moderators, bosses, and lawmakers routinely downplay or dismiss victims’ experiences, or question their actions. Victims are scrutinized and judged for the pictures they chose to take, the clothes they chose to wear, the job they chose to do, and the privacy settings they chose to use. But the people who should be scrutinized are the ones breaching the victim’s consent.
Most governments aren’t taking action. Most don’t have laws, or their laws are full of loopholes. Most countries don’t have a framework establishing who is responsible for policing online spaces. So whole communities and online cultures are now thriving on this abuse, and people are making serious money from perpetrating it. Websites like AnonIB, where users request intimate images of women from particular towns or schools, and MrDeepfakes.com, the world’s biggest deepfake porn website, are allowed to grow and face no consequences.
Things might be changing. In the US, Kamala Harris introduced an online harassment task force. The EU is introducing laws that will begin requiring tech platforms to take more accountability. The UK’s Online Safety Bill criminalizes deepfakes and should force tech platforms to take action to remove them, though it remains unclear how this will be effectively implemented. We cannot wait around for governments to tackle this; we need to send a clear message that this is not OK.
Download our landmark research dossier into the Landscape of Deepfake Abuse here.
Survivor stories
The issue of deepfake sexual abuse is growing exponentially: it has risen by 3,000% since 2019. We created a film and a cultural movement to combat this.
#MyImageMyChoice is a cultural movement changing laws and culture around deepfake sexual abuse, from the creators of the documentary ANOTHER BODY.
ANOTHER BODY is a SXSW-winning and Emmy-nominated documentary that follows a US college student’s quest for justice after she finds deepfake pornography of herself online.