Women lawmakers 70 times more likely to be victims of deepfake porn, study finds
Women in Congress are more likely to be victims of deepfake pornography than their male counterparts, according to a national organization that combats disinformation online.
Deepfakes use artificial intelligence or machine learning to create believable pictures, audio and text of events that never happened, according to the Department of Homeland Security.
When such images are sexually explicit and created without consent, they are known as deepfake pornography, or AI-generated non-consensual intimate imagery, according to the American Sunlight Project, or ASP, the Washington, D.C., organization responsible for the study.
Deepfakes mainly targeted women
Researchers found 35,239 mentions of U.S. lawmakers across 11 websites that focus on sexual deepfake content. During the analysis, the team found that women were 70 times more likely to be depicted in deepfake images than men.
The findings were published in The 19th, a Texas-based nonprofit newsroom.
“The report underscores the urgent need for legislative action to address the rise of malicious AI technology,” wrote the American Sunlight Project in a news release on Wednesday.
Deepfakes were found on well-known websites
The ASP team gathered its data with a custom search engine, searching for each member's first and last name across 11 well-known deepfake websites and recording the number of hits, or mentions, each member received.
The team identified those websites by Googling terms such as “deepfake porn,” “deepfake celebrities” and “deepfake congress.”
The searches covered each member's formal name as well as abbreviations and nicknames.
According to the team, the searches included lawmakers from states across the country, including New York, Virginia, Ohio and California.
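For readers curious how such a tally might work in practice, here is a minimal sketch. It assumes a Google Programmable Search engine scoped to the target sites and the exact-phrase quoting the researchers describe in their limitations; the API key, engine ID and example names are placeholder assumptions, not ASP's actual tooling.

```python
# Minimal sketch: count search hits for quoted name variants of one lawmaker.
# Assumptions (not from the ASP report): a Google Programmable Search engine
# (the "custom search engine") restricted to the deepfake sites, queried via
# the Custom Search JSON API. Credentials and names below are placeholders.
import requests

API_KEY = "YOUR_API_KEY"      # hypothetical credentials
ENGINE_ID = "YOUR_ENGINE_ID"  # a search engine limited to the 11 sites

def count_mentions(name_variants):
    """Sum the estimated result counts across all quoted name variants."""
    total = 0
    for name in name_variants:
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": ENGINE_ID, "q": f'"{name}"'},
            timeout=30,
        )
        resp.raise_for_status()
        info = resp.json().get("searchInformation", {})
        total += int(info.get("totalResults", 0))
    return total

# Example with a fictional member's formal name, abbreviation and nickname
print(count_mentions(["Jane Q. Doe", "Jane Doe", "J. Doe"]))
```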
What else did the researchers find?
The American Sunlight Project found that 26 lawmakers in the House and Senate – 25 women and one man – currently or previously appeared on these websites.
“This isn’t a small difference,” the ASP wrote in its report. “It’s a massive disparity.”
As the researchers tracked how frequently each official’s name appeared on known deepfake websites, they also looked at factors such as age, political party, state representation, and self-identified ethnicity.
According to the ASP, older members of Congress are slightly less likely to appear in such imagery. The team found no link between the likelihood of being targeted and the party or state a member represents.
Women lawmakers had deepfakes removed within 48 hours
The ASP let members of Congress know about the deepfake content containing their likenesses, and less than 48 hours later, content targeting 14 members had been removed.
Still, for at least nine members, results remained on landing or search result pages, as well as in links indexed by Google.
“This highlights a large disparity of privilege,” the ASP wrote, adding that not all women have the resources to get content taken down so quickly.
The fact that people are creating these images of women in Congress is a symptom of a larger issue, the ASP said, arguing that gendered disinformation and harassment such as this make life more challenging for women everywhere.
Are deepfakes illegal?
While Congress has yet to pass bills targeting deepfake pornography, some states, such as California, Florida and Georgia, have passed laws that penalize sexual deepfake images.
Legislators have introduced such bills at the federal level.
Congresswoman Yvette Clarke (D-New York) introduced the DEEPFAKES Accountability Act in the House of Representatives in September 2023, calling for civil penalties of up to $150,000 for people who make deepfakes to harass others.
U.S. Representatives Madeleine Dean (D-Pennsylvania) and María Elvira Salazar (R-Florida) also introduced legislation this year to “federally protect individuals from AI-generated deepfakes,” called the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act.
“As AI’s prevalence grows, federal law must catch up,” Dean said in a news release in September. “By granting everyone a clear, federal right to control digital replicas of their own voice and likeness, the NO FAKES Act will empower victims of deepfakes.”
If signed into law, the NO FAKES Act would create “a nationwide solution to a patchwork of state laws and regulations by January 2, 2025.”
U.S. Senators Chris Coons (D-Delaware), Marsha Blackburn (R-Tennessee), Amy Klobuchar (D-Minnesota), and Thom Tillis (R-North Carolina) also introduced legislation in the Senate.
In their report, ASP researchers said the study comes at a time when Congress is considering bills to regulate the creation of deepfake pornography, encouraging “lawmakers to move quickly to pass these measures before the conclusion of the 118th Congress.”
What are the study’s limitations?
According to ASP researchers, the study has limitations that affect how broadly its findings can be generalized.
They note that the results represent just “a snapshot of a moment in time,” specifically early November 2024, and that the amount of published deepfake content will change as material is added and taken down.
Another limitation is that the data captures how frequently each member's name appears in videos on deepfake websites but does not include details about the context of those appearances.
“Factors such as how common a person’s name is in the general population could also lead to inflated counts or incorrect associations,” the researchers wrote.
To limit unrelated results, such as hits for people with similar names, the researchers searched for names in quotation marks. Exact-phrase searching lowers the likelihood of partial or unrelated matches, but it doesn't eliminate false positives altogether, the team said.
Lastly, the team said its statistical methods can identify associations but not causation, meaning the analysis cannot show that any one factor causes, increases or decreases another.
The researchers reiterated that having deepfake content removed within 48 hours the way the officials in their study did is a privilege, and it’s not one many women have been afforded.
“ASP urges the 118th congress to prioritize the passage of bills that aim to protect victims of (deepfakes), who include not only members of congress and other women in public life but also minors and private citizens who lack congressional resources,” the organization wrote.
Contributing: Kayla Jimenez, Elizabeth Weise and Jeanine Santucci, USA TODAY
Saleen Martin is a reporter on USA TODAY's NOW team. She is from Norfolk, Virginia – the 757. Follow her on Twitter at @SaleenMartin or email her at [email protected].