How to get help if you've been a victim of sexually explicit deepfakes
Sexually explicit deepfakes have skyrocketed in the past few years, with almost one in 10 people in the UK falling victim to this digital crime, according to ESET.
A deepfake is a digitally altered image or video, created or manipulated by AI, that replaces one person's face with another's. Deepfakes can be extremely distressing for victims whose likeness has been stolen, and they are a growing concern among women (61%), men (45%) and under-18s (57%) who worry they may fall victim to this form of abuse.
It's a subject that former Geordie Shore star Vicky Pattison explores in a controversial new documentary airing on Channel 4 this evening. Vicky Pattison: My Deepfake Sex Tape digs into deepfake abuse and its effect on women, as a staggering 99% of sexually explicit deepfakes depict women.
The 37-year-old British personality wanted to show how "vulnerable all of us are" to this digital abuse, so she created her own deepfake sex tape and released it online. Pattison said it gave her a "glimpse into the powerlessness" that victims experience.
Sexually explicit deepfakes of celebrities have also been circulated, including of global popstar Taylor Swift, who fell victim to AI-generated abuse on X last year.
It comes after Baroness Charlotte Owen urged the government to expedite a proposed law that would make it a criminal offence to create or share sexually explicit images without a person's consent, as the largest site dedicated to this abuse receives over 13.4 million hits per month. Under the new law, anyone found guilty of one of the offences would face a fine or a jail sentence of up to six months.
In the UK, sharing revenge porn (intimate, private photos or videos of another person shared without their consent) has been a sexual offence since 2015.
Professor Clare McGlynn, an expert in the legal regulation of image-based sexual abuse at Durham University, says: "It is now so easy to create and share sexually explicit deepfakes. Platforms such as Instagram profit from deepfake abuse by advertising nudify apps.
"Deepfake abuse has been normalised by platforms and search engines that facilitate this abuse. Google makes it easy to find 'how to' tutorials, and it highly ranks nudify apps and deepfake websites."
If you have been a victim of a sexually explicit deepfake, there are several things you can do.
What to do if you've been a victim of a sexually explicit deepfake
Gather evidence of the deepfake abuse
Start keeping a record of the deepfake abuse; if this feels triggering, ask a friend or someone you trust for help. Copy the URLs, take screenshots or save the video files so you have evidence of this AI image abuse.
AI generates these images in multiple ways: either by using an existing image of someone found online to create an entirely fake explicit video or image, or by superimposing that person's face onto a pre-existing explicit image.
If you can identify the image of you that was taken from your social media or online presence, include this in your evidence too, as it will all help when reporting the abuse.
Report it on the social media platforms
If the deepfake appears on a social media app, such as Instagram, X or TikTok, you can report the content directly for violating the app's community guidelines. If you feel comfortable asking friends, encourage them to report the content too, as this can lead to the account being suspended or the content being deleted. McGlynn adds: "They can report it to the platforms where the material is shared."
Report the deepfakes on search engines like Google
As well as reporting it on the social media platform, you can submit a request to Google and Bing to remove non-consensual deepfake imagery. McGlynn adds: "Victims can report a website to Google if it has been shared there. Google say that they will only downrank nudify apps and deepfake websites if they get lots of reports of abuse."
Speak to a helpline to help get the content removed
There are several helplines you can call for help getting the content removed online, such as SWGFL, the Revenge Porn Helpline or Victim Support. McGlynn explains that these organisations "will be able to help get material removed online. They can contact the websites or platforms where material has been shared." They can also offer support and guidance on what to do next and how to cope mentally with the experience of deepfake sexual abuse.
Report it to the police
The Met Police state that it is "illegal to share or threaten to share intimate photos or videos of someone without their consent and this includes deepfake images." You can report the abuse to the police online and start a criminal case, McGlynn says.
Read more about deepfakes:
UK government to crack down on sexually explicit deepfakes (Euronews, 2-min read)
Deepfakes of children: how the government can get to grips with them (The Conversation, 5-min read)
Vicky Pattison defends controversial deepfake documentary (Yahoo Entertainment, 2-min read)