'Form of violence': Across globe, deepfake porn targets women politicians
From the United States to Italy, Britain, and Pakistan, female politicians are increasingly becoming victims of AI-generated deepfake pornography or sexualized images, in a troubling trend that researchers say threatens women's participation in public life.
An online boom in non-consensual deepfakes is outpacing efforts to regulate the technology globally, experts say, fuelled by a proliferation of cheap artificial intelligence tools, including photo apps that digitally undress women.
The intimate imagery is often weaponized to tarnish the reputation of women in the public sphere, jeopardizing their careers, undermining public trust, and threatening national security by creating conditions for blackmail or harassment, researchers say.
In the United States, the American Sunlight Project, a disinformation research group, identified more than 35,000 instances of deepfake content depicting 26 members of Congress -- 25 of them women -- across pornographic sites.
A study the group published last month showed that nearly one in six women in Congress has been the victim of such AI-generated imagery.
"Female lawmakers are being targeted by AI-generated deepfake pornography at an alarming rate," said Nina Jankowicz, chief executive of the ASP. "This isn't just a tech problem -- it's a direct assault on women in leadership and democracy itself."
ASP did not release the names of the female lawmakers depicted in the imagery, to avoid encouraging searches for it, but said it had privately notified their offices.
- 'Wage this war' -
In the United Kingdom, Deputy Prime Minister Angela Rayner was among more than 30 British female politicians found to be targeted by a deepfake porn website, according to a Channel 4 investigation published last year.
The high-traffic site, which the broadcaster did not name, appeared to use AI technology to "nudify" about a dozen of those politicians, turning photos of them into naked images without their consent, the investigation said.
Such advances have given rise to what researchers describe as an expanding cottage industry of AI-enhanced porn, in which widely available tools and apps let users digitally strip clothing from pictures or generate deepfakes from sexualized text-to-image prompts.
In Italy, Prime Minister Giorgia Meloni is seeking 100,000 euros ($102,950) in damages from two men accused of creating deepfake porn videos featuring her and posting them to American porn websites.
"This is a form of violence against women," Meloni told a court last year, according to the Italian news agency ANSA.
"With the advent of artificial intelligence, if we allow the face of some woman to be superimposed on the body of another woman, our daughters will find themselves in these situations, which is exactly why I consider it legitimate to wage this war."
- 'Silencing effect' -
In Pakistan, AFP fact-checkers debunked a deepfake video that showed lawmaker Meena Majeed publicly hugging an unrelated male minister, an act deemed culturally immoral in the conservative Muslim-majority country.
In a separate episode, Azma Bukhari, the information minister of the Pakistani province of Punjab, said she felt "shattered" after discovering a deepfake video online that superimposed her face on the sexualized body of an Indian actor.
"The chilling effect of AI-generated images and videos used to harass women in politics is a growing phenomenon," the nonprofit Tech Policy Press said last year, warning that the trend will have a "silencing effect on the political ambitions" of women.
The proliferation of deepfakes has outstripped regulation around the world.
Pakistan has no legislation to combat sexualized deepfakes. In the United Kingdom, sharing deepfake porn is a criminal offense, and the government has pledged to outlaw its creation this year, though it has yet to set a firm timetable.
A handful of US states, including California and Florida, have passed laws making sexually explicit deepfakes a punishable offense, and campaigners are urging Congress to pass a raft of bills regulating their creation and dissemination.
While high-profile politicians and celebrities, including singer Taylor Swift, have been victims of deepfake porn, experts say women not in the public eye are equally vulnerable.
After ASP notified the targeted US congresswomen, the fake AI-generated imagery was almost entirely removed from the websites, reflecting what the group called a "disparity of privilege."
"Women who lack the resources afforded to members of Congress would be unlikely to achieve such a rapid response from deepfake pornography sites if they initiated a takedown request themselves," ASP said.