#MyImageMyChoice is creating a cultural movement where everyone can speak out about their experiences of image-based sexual abuse, and confronting creators and consumers with the consequences of their actions.
About
Our private, intimate images have been shared without our consent. We’ve been secretly filmed, threatened, deepfaked. We’ve been assaulted, and the footage has been distributed. Our images have been requested, traded, and purchased on forums, chat rooms, and pornsites. But we should all have a right to privacy and agency over images of our bodies.
Intimate image abuse can have devastating, even life-threatening impacts. But most victims receive no support from governments, police, tech platforms, or the law; often the only option left to them is to try to move on. Meanwhile, websites specifically dedicated to intimate image abuse (like AnonIB or MrDeepfakes) are thriving. These sites need to be shut down.
There’s a huge disconnect between how (mostly male) perpetrators and (mostly female) victims perceive this content. While victims experience it as devastating, creators and consumers often don’t see it as a big deal – they think it’s just a laugh, or flirtatious. That gap is also an opening to start a conversation. How can we bridge it?
As of Autumn 2022, the White House is leading a Task Force to Address Online Harassment and Abuse. Now is the time to act.
The Facts
Intimate image abuse is when someone takes, creates, or shares intimate images of another person without their consent, or threatens to do so. The content can be sexual, nude, or suggestive. This includes:
- Sharing nude images (so-called ‘revenge porn’)
- Creating or sharing fake porn such as deepfake videos
- Spy cam footage
- Filming a sexual assault or rape
- Threats to share images (often for coercion)
In the US, 1 in 12 adults report that they have been victims of intimate image abuse.
1 in 20 U.S. adults report that they have been perpetrators of intimate image abuse.
The victims are overwhelmingly female-identifying (95%).
The perpetrators are mostly male-identifying (76%).
And it’s not just the creators but also the consumers of this content who drive this abuse.
It’s not all ‘revenge’ – people share images for sexual gratification, control, or money, or out of voyeurism, extortion, misogyny, or obsession. Some feel entitled to share these images for a laugh or to gain social status. Research on unsolicited images shows that some people even believe sharing them is flattering or flirtatious.
As for those who consume this content, each individual who watches or forwards a video may not realize that their actions are also fueling the problem.
A growing number of sites are specifically set up to facilitate this abuse, like MrDeepfakes and AnonIB:
MrDeepfakes is the world’s biggest non-consensual deepfake pornsite. It has hundreds of thousands of members. Its videos gain millions of views. It’s the first thing that comes up when you google ‘deepfake porn.’ On its flourishing forum, perpetrators can learn how to make better fake porn videos.
AnonIB is a site where people request and share non-consensual intimate images, listing victims by town, college, and school. Its users trade hacked or leaked images of their exes, friends, and classmates. People ask for ‘nudes of her little sister’ or ‘more of her,’ alongside demeaning comments.
This abuse can have devastating, even life-threatening impacts. It can lead to serious mental health challenges and to ‘social rupture’ such as stigmatization and isolation. Survivors describe it as ‘life-shattering.’
What’s also chilling is the way swathes of society – in particular women in the public eye, like politicians, journalists, YouTubers, and actors – are expected to accept this abuse as part of the job. This creates a ‘silencing effect.’
Because of this abuse, many victims are silenced. They modify their behavior, retreat from online spaces, and are shut out from full participation in public discourse – especially online. Most people don’t want to risk speaking out about their experiences because this might provoke retaliation, or drive more viewers to their intimate content.
On top of this, victims face a huge amount of victim-blaming. Police, moderators, bosses, and lawmakers routinely downplay or dismiss victims’ experiences, or question their actions. Victims are scrutinized and judged for the pictures they chose to take, the clothes they chose to wear, the job they chose to do, the privacy settings they chose to use. But the people who should be scrutinized are those breaching the victim’s consent.
Most governments aren’t taking action. Most don’t have laws against this abuse, or their laws are full of loopholes, and most countries have no framework establishing who is responsible for policing online spaces. As a result, whole communities and online cultures now thrive on this abuse, and people make serious money perpetrating it. Dedicated websites like AnonIB, where users request intimate images of women from specific towns and schools, and MrDeepfakes, the world’s biggest deepfake porn site, are allowed to grow and face no consequences.
Things might be changing. In the US, Vice President Kamala Harris has launched the Task Force to Address Online Harassment and Abuse. The EU is introducing laws that will begin requiring tech platforms to take more accountability. The UK may follow suit with its Online Safety Bill, which could criminalize some of these offenses. But these proposals are controversial, patchy, and unclear, and they may not even pass. Governments must prioritize legal and policy reform as a critical step in tackling intimate image abuse. But we can’t wait for governments to act; we need to send a clear message now that this is not OK.
What we can do about it
We need to shut down websites dedicated to intimate image abuse.
Help us achieve this by signing our petition here!
The sites driving this culture are thriving. Pursuing each individual perpetrator won’t tackle the problem, but stopping these websites might. We ask that:
- MrDeepfakes is shut down.
- AnonIB is shut down.
- Sites dedicated to intimate image abuse are shut down.
- Google and other search engines de-index these websites.
- Internet providers block these websites.
We need to bridge the perception gap between perpetrators and victims.
Help us achieve this by sharing your story, publicly or anonymously.
This abuse aims to shame and silence. Survivors need to feel able to speak out about their experiences, and perpetrators need to know that their actions have consequences.
We need better survivor support.
Help us achieve this by emailing your local politician with our email draft here.
Survivors need support getting images taken down, as well as emotional support to rebuild their lives. We are calling for each country to create a statutory body with the power to enforce takedowns and the capacity to offer trauma-informed guidance.
Who We Are
My Image My Choice is a coalition of intimate image abuse survivors and advocates from across the globe. We aim to amplify survivor voices and to ensure that law and policy solutions are grounded in lived experience. Our collective includes a range of perspectives, united in our calls for survivor support and for dedicated intimate image abuse sites to be shut down.
My Image My Choice is proud to partner with Synthesia, the world’s largest AI video generation platform, to safely bring anonymous stories to life through AI avatars. Synthesia believes in the good that AI has to offer, and all content uploaded to its platform is consensual. The problem doesn’t lie with the technology, but with how we use it.
Our survivor testimonies have contributed to white papers and reports by the World Economic Forum, the Oxford Digital Ethics Lab, Equality Now, the UK Law Commission, Panorama Global, and others, and have been featured in global press from BBC News to NowThis, The Sunday Times, and US Vogue. The campaign has been supported by politicians from across the political spectrum, NGOs, activist groups, and many others. If your organization would like to join our coalition, please reach out to myimagemychoice@gmail.com!