How Artificial Intelligence Is Fueling Incel Communities
In late January 2024, X was flooded with graphic, deepfaked images of Taylor Swift. While celebrities have long been victims of photo leaks and cyberattacks, this time was different: the images were generated using artificial intelligence.
The images were quickly reported by the “Shake It Off” singer’s fanbase and taken down after being live on the poster’s profile for less than a day. Still, that was enough time for them to go viral, despite the platform’s policies against non-consensual nudity. A report from disinformation research firm Graphika later found that the images had originated on 4chan, where users encouraged one another to generate sexually explicit deepfakes of famous women in an attempt to skirt content policies around nudity.
Unfortunately, Swift’s experience isn’t a one-off. Marvel actress Xochitl Gomez, who was only 17 years old at the time of reporting, said on the podcast The Squeeze that she struggled to get deepfakes of her taken down from X and described the toll they took on her mental health. Gomez and Swift are just two of the countless women who have recently become victims of deepfakes depicting them in sexual ways.
“People have always used media to try and defame people, that hasn’t changed. What’s changed is how accessible it’s now gotten,” Siwei Lyu, professor of computer science at the University at Buffalo, told The Daily Beast.
Late last year, the AI image generation platform CivitAI became popular for its “Bounties” feature, which encouraged users to create deepfakes in exchange for virtual rewards. Almost all of the bounties were for images of women, according to reporting from 404 Media, and some targeted not celebrities or public figures but private citizens.
Experts expect it to only get worse, especially as more incel communities online adopt these technologies. Henry Ajder, an expert and adviser on AI and deepfakes, told The Daily Beast that this has been a growing problem for years and that CivitAI is an example of a platform heavily linked to that evolution.
He said that CivitAI has become a “hotbed for not just artistically created content, but also content that’s erotic. It’s a specific place to find specific knowledge and people have started using it for pornographic content.”
Ajder also described the technology on the platform as “agnostic or dual use,” saying that once it’s there it can be used in any way, “while others are explicitly designed for creating pornographic content without consent.” The tools have only grown more popular within incel culture via platforms like Reddit and 4chan.
“There’s such a low threshold,” Hera Husain, founder of Chayn, a nonprofit supporting victims of gender-based violence and trauma, told The Daily Beast. “It’s an easy-to-access method which allows people to fulfill the darkest fantasies they may have. [...] They may feel it is victimless, but it has huge consequences for those people.”
It’s not just deepfakes that have penetrated incel culture, either. Research suggests that AI girlfriends could make incels even more dangerous. Because the technology lets them form and control their perception of a so-called “ideal woman,” there’s a danger they will push those expectations onto real women. When they find themselves unable to do so, or when a woman seems unattainable, as in the case of Swift or Gomez, incels begin deepfake campaigns. At least then, they can make these women do what they like.
“Governments are simply trying to play catch-up; the technology has gone faster than their ability to regulate,” Belinda Barnet, senior lecturer in media at Swinburne University, told The Daily Beast.
‘A different kind of trauma’
The danger grows when we look at global contexts. Patriarchal norms in many countries further endanger women who become victims of such campaigns. In more conservative countries, even a deepfake of a woman can be enough for her family to ostracize her or, in extreme cases, use violence against her. In late 2023, for example, an 18-year-old was killed by her father over an image of her with a man that police suspect was doctored.
It doesn’t matter that the image is fake; the mere association of a woman’s likeness with such a depiction is enough for society to ostracize her. “It’s not so much about people believing the images are real as it is about pure spite. It’s a different kind of trauma to revenge porn,” Ajder explained.
As AI generation becomes more accessible, it also lowers the barrier to entry for incels around the world who may once have struggled with language barriers. In South Asia, where Husain focuses much of her work, countering incel radicalization is also harder, both socially and on a policy level. “They don’t have as strong a counter to the radicalization they’re seeing in the incel community,” she explained.
Lyu said that because policies on free speech and access to technology vary across the world, the impacts vary too. “In the U.S., using AI generation tools to create content... is freedom of speech—but people can take advantage of that as well. Drawing that line becomes very hard. Whereas in China, there’s very strong limitations on the use of this technology, so that is possible but prevents positive uses of the same line of technology.”
Incel culture existed long before AI generation tools became popular. Now that the tools are mainstream, these communities will be quick to adopt them to cause further harm and trauma. The issue is sure to get worse before it gets better.
“In terms of incel culture, this is another weapon in their twisted arsenal to abuse women, perpetuate stereotypes, and further make visceral the twisted ideas they have about women,” Ajder said.