The Issue
#MyImageMyChoice is a cultural movement tackling intimate image abuse from the creators of the documentary ANOTHER BODY.
Intimate image abuse can have devastating, even life-threatening, impacts. But governments and tech platforms are failing to address it. Websites built on intimate image abuse, on violating consent, have become thriving online businesses. Companies like Google, Visa, and Verizon are enabling and profiting from this abuse, and normalizing misogyny.
We are campaigning for governments and tech companies to #BlockMrDeepfakes and the 3000+ sites dedicated to online gendered abuse.
The Facts
Intimate image abuse is when someone takes, creates, or shares intimate images of another person without their consent, or threatens to do so. Content can include imagery that is sexual, nude, or suggestive, including:
- Sharing nude images (so-called ‘revenge porn’)
- Creating or sharing fake porn imagery like deepfake videos
- Spycam footage
- Videos of sexual assaults or rapes
- Threats to share (often for coercion)
In the US, 1 in 12 adults reported that they have been victims of image-based abuse.
1 in 20 US adults reported that they have been perpetrators of image-based abuse.
But the real rates could be a lot higher: in the UK and Australia, 1 in 3 people reported being victims.
The victims are overwhelmingly women (95%).
The perpetrators are mostly men (76%).
And it’s not just the creators: those who consume this content also drive the abuse.
It’s not all ‘revenge’. People share images for sexual gratification, control, or money, or out of voyeurism, extortion, misogyny, or obsession. Some want increased social status and feel entitled to share these images ‘for a laugh’. Research on unsolicited images shows that some people believe sharing them is flattering or flirtatious.
And on the consumption side, each individual who watches or forwards a video might not realize that their actions are also fueling the problem.
The impacts can be devastating, even life-threatening. Victims can face serious mental health challenges and ‘social rupture’ such as stigmatization and isolation. Survivors describe it as ‘life-shattering’.
But what’s also chilling is the way swathes of society, in particular women in the public eye such as politicians, journalists, YouTubers, and actors, have had to accept this abuse as part of the job. This creates a ‘silencing effect’.
Because of this abuse, many victims are silenced. They modify their behavior, retreat from online spaces, and are shut out from full participation in public discourse. Most don’t want to risk speaking out about their experiences, because doing so might provoke retaliation or drive more viewers to their intimate content.
On top of this, victims deal with a HUGE amount of victim blaming. Police, moderators, bosses, and lawmakers routinely downplay or dismiss victims’ experiences, or question the victim’s actions. Victims are scrutinized and judged for the pictures they chose to take, the clothes they chose to wear, the job they chose to do, the privacy settings they chose to use. But the people who should be scrutinized are the ones breaching the victim’s consent.
Most governments aren’t taking action. Most don’t have laws, or their laws are full of loopholes. Most countries don’t have a framework for who is responsible for policing online spaces. So whole communities and online cultures are now thriving on this abuse, and people are making serious money from perpetrating it. Websites like AnonIB, which is set up for users to request intimate images of women from specific towns or schools, or MrDeepfakes.com, the world’s biggest deepfake porn website, are allowed to grow and face no consequences.
Things might be changing. In the US, Kamala Harris launched an online harassment task force. The EU is introducing laws that will begin to require tech platforms to take more accountability. The UK is soon to introduce the Online Safety Bill, which criminalizes deepfakes and should force tech platforms to take action to remove them. However, it is unclear how this will be effectively implemented. We cannot wait around for governments to tackle this; we need to send a clear message that this is not OK.
Survivor stories
The issue of deepfake sexual abuse is growing exponentially: it has risen by 3000% since 2019. We created a film and a cultural movement to combat this.
#MyImageMyChoice is a cultural movement changing laws and culture around deepfake sexual abuse, from the creators of the documentary ANOTHER BODY.
ANOTHER BODY is a SXSW-winning, Emmy-nominated documentary that follows a US college student’s quest for justice after she finds deepfake pornography of herself online.