How sexually explicit deepfakes undermine democracy and women’s role in the EU

Female politicians have been disproportionately impacted by AI-generated pornographic content.
[Photo: A solidarity protest of women in the UK against gender-based violence, 2021.]

By Roos Döll

Editorial Assistant at The Parliament Magazine

02 Oct 2024

Every iteration of digital innovation brings new threats to people and institutions. One gaining increasing attention is the sexually explicit deepfake – hyper-realistic, AI-generated videos and images depicting people in fabricated explicit scenarios. 

This form of online harassment disproportionately impacts women. A study in late 2023 found that 98 per cent of online deepfake videos were pornographic, and that 99 per cent of those targeted were women or girls. 

“The software to create these images is freely available and relatively easy to use,” Dean Fido, an associate professor in forensic psychology at the University of Derby in the UK, tells The Parliament. 

All it takes is a few images, often pulled from social media or public events, to create viral content designed to damage the reputations of its mostly female targets. Italian Prime Minister Giorgia Meloni, the UK’s Deputy Prime Minister Angela Rayner and Germany’s Foreign Minister Annalena Baerbock are among the more prominent examples of female politicians who have been targeted by sexualised deepfake campaigns. These often coincide with critical election periods.

As AI-powered image generators become more widely available, women are at increased risk of falling victim to this type of attack, which can “deeply harm their mental health and professional standing,” Fido says. 

In many instances, he adds, the victims are unaware that their manipulated images are circulating online, while the creators of these deepfakes rake in revenue from them. 

A ‘chilling effect’ on democracy 

The EU has stepped in with policies aimed at protecting women from online abuse. Legal efforts include the Gender-Based Violence Directive, which criminalises gender-based violence, including online harassment. The Digital Services Act (DSA) also addresses this issue. 

“It’s something that women have been calling for a long time,” Alexandra Geese, a Greens MEP, tells The Parliament. “There was a huge sense of urgency, especially as far as deepfake images are concerned, which are exploding.” 

Geese, who says she has not been the victim of deepfakes, has been a strong advocate in the European Parliament (EP) for tighter enforcement against them.

Both male and female politicians face online abuse, but the nature of these attacks differs. Men are often targeted based on their competence or policies, while women are more likely to be targeted in ways that exploit their bodies and sexuality, making this kind of deepfake a uniquely gendered form of harassment.  

“The targeting of women politicians is particularly concerning because it adversely impacts our democracy,” Clare McGlynn, a professor of law at Durham University in the UK, tells The Parliament. “More women and girls are going to be reluctant to go into politics. Those who are in politics have to constantly second-guess what they’re seeing online and worry about how even innocent interactions might be manipulated and weaponised against them.” 

However, current regulations do not cover the full range of potentially harmful content. Some doctored images may not be explicit enough to fall under enforcement guidelines. Also, the legal standard of “serious harm” can be difficult to prove. 

“Any woman knows that having a nude image of herself anywhere out there is the definition of serious harm,” MEP Geese says. “There shouldn’t be any need to prove this.” 

More enforcement needed 

Sexually explicit deepfakes are on the rise just as EU institutions push for greater gender parity within their ranks. In putting together her next Commission, President Ursula von der Leyen requested that each member state nominate both a male and a female candidate. In the end, she assembled a Commission that, if approved by the EP, will be approximately 40 per cent female. 

“If we don’t act soon, we risk losing a generation of female leaders. We could do so much more even with the laws and regulations we have, but we don’t yet have tech companies doing enough about it,” McGlynn says. 

Pushing platforms to act is what the DSA sets out to do. Article 34 specifically identifies gender-based online harm as a “systemic risk” that large online platforms must monitor and address. If they fail to do this, the European Commission can intervene, but the Commission has been criticised for moving too slowly.

Advocates for stronger enforcement also want to see improved AI detection tools, stricter identity verification processes for content uploads, and harsher penalties for platforms that do not act swiftly. 

“Without that, women will continue to suffer, and the creators of these deepfakes will operate with impunity,” Geese warns. 
