The 8-Second Deepfake Industry Weaponizing Women's Photos
A single photo is now enough to generate an explicit 8-second video through industrialized deepfake abuse networks. With 1.4 million accounts on Telegram alone, the ecosystem is larger than most people realize.
One photo is all it takes. A few clicks later, you have an 8-second explicit video clip inserting women into graphic sexual situations they never consented to. "Transform any photo into a nude version with our advanced AI technology," reads the text on websites that have turned sexual harassment into a point-and-click operation.
The deepfake sexual abuse ecosystem has evolved far beyond what most people understand. It's no longer crude image manipulation—it's a sophisticated, industrialized network offering 65 different sexual scenario templates, AI-generated audio, and video quality that's becoming disturbingly realistic.
WIRED's investigation reveals an industry that's likely generating millions of dollars annually while causing immeasurable harm to women and girls worldwide.
1.4 Million Accounts on Telegram Alone
A WIRED review of more than 50 deepfake websites—receiving millions of monthly views—found nearly all now offer high-quality video generation with dozens of sexual scenarios. But the scale becomes truly staggering on Telegram, where 1.4 million accounts were signed up to 39 deepfake creation bots and channels.
After WIRED inquired about these services, Telegram removed at least 32 deepfake tools. "Nonconsensual pornography—including deepfakes and the tools used to create them—is strictly prohibited under Telegram's terms of service," a spokesperson said, noting they removed 44 million pieces of violating content last year.
Yet the ecosystem continues expanding. In June, one deepfake service promoted a "sex-mode," advertising: "Try different clothes, your favorite poses, age, and other settings." Another promised "more styles" were coming, allowing users to "create exactly what you envision with your own descriptions."
The Open Source Foundation of Abuse
The rapid proliferation stems from advances in generative AI and sophisticated open-source photo and video generators. "This ecosystem is built on the back of open-source models," says Stephen Casper, an AI safeguards researcher at MIT. "Oftentimes it's just an open-source model that has been used to develop an app that then a user uses."
Sexual deepfakes first emerged in late 2017, requiring technical knowledge to create. The generative AI revolution of the past three years has made the technology accessible to anyone. "It's no longer a very crude synthetic strip," says deepfake expert Henry Ajder, who's tracked the technology for over five years. "We're talking about a much higher degree of realism of what's actually generated, but also a much broader range of functionality."
Independent analyst Santiago Lakatos found that several of the larger deepfake websites have consolidated their market position and now offer APIs to other nonconsensual image and video generators. "They're consolidating by buying up other different websites or nudify apps. They're adding features that allow them to become infrastructure providers."
The Victims Are Almost Always Women and Children
Victims of nonconsensual intimate imagery (NCII), including deepfakes, are nearly always women. The false images and videos cause harassment, humiliation, and feelings of being "dehumanized." While celebrities and politicians face public abuse, the technology is also weaponized by men against colleagues and friends, and by boys in schools against classmates.
"Typically, the victims or the people who are affected by this are women and children or other types of gender or sexual minorities," says Pani Farvid, associate professor at The New School and founder of The SexTech Lab. "We as a society globally do not take violence against women seriously, no matter what form it comes in."
An Australian study led by researcher Asher Flynn interviewed 25 creators and victims of deepfake abuse. The researchers identified three factors driving the problem: increasingly easy-to-use tools, the normalization of creating nonconsensual sexual images, and the minimization of the harms involved. Of the 10 perpetrators interviewed, eight identified as men.
Their motivations included sextortion, causing harm to others, peer reinforcement, and curiosity about the technology. "You just want to see what's possible," one abuser told researchers. "Then you have a little godlike buzz of seeing that you're capable of creating something like that."
The Banality of Digital Violence
Unlike the public sharing that happened with nonconsensual images created using Grok on X, explicit deepfakes are more often shared privately with victims or their social circles. "I just simply used the personal WhatsApp groups," one perpetrator said. "And some of these groups had up to 50 people."
Multiple experts noted the "cavalier" attitude within communities developing these tools. "There's this tendency of a certain banality of the use of this tool to create NCII," says Bruna Martins dos Santos, policy manager at human rights group Witness.
For many abusers, the technology represents power and control—a digital manifestation of gender-based violence that lawmakers have been slow to address effectively.