Your Photos Aren't Safe: The Nudify App Invasion
Apple and Google app stores host over 100 AI-powered nudify apps that can turn innocent photos into explicit images. Here's what's really happening.
What if 100+ apps could turn your everyday photos into pornographic images? That's not a dystopian future—it's happening right now on Apple and Google's app stores.
A January investigation by watchdog group Tech Transparency Project (TTP) found 55 nudify apps on Google Play and 47 in the Apple App Store. These applications use AI to generate fake nude images from regular photos of people, primarily targeting women.
Too Little, Too Late
After TTP and CNBC raised the alarm, Apple announced on Monday that it had removed 28 apps identified in the report. But a follow-up review by TTP found that only 24 had actually been taken down. Some apps even returned to the store after developers submitted "fixed" versions that addressed Apple's guideline concerns.
Google said it suspended "several apps" for policy violations but declined to specify how many, claiming its investigation was ongoing. Both companies tout their commitment to "user safety and security," yet they've been hosting what amounts to digital sexual abuse tools.
These apps have collectively racked up more than 700 million downloads worldwide and generated $117 million in revenue, according to app analytics firm AppMagic. Both Apple and Google take a cut of every transaction.
The Technology Gets Scarier
The sophistication is alarming. TTP tested the apps using AI-generated images of fully clothed women and found two main types: apps that use AI to digitally strip the clothing from a photo, and "face swap" apps that superimpose a person's face onto an existing nude image.
"These are not just 'change outfit' apps," TTP director Katie Paul told CNBC. "These were definitely designed for non-consensual sexualization of people."
Of particular concern: 14 of the apps were based in China, raising additional security questions. "China's data retention laws mean that the Chinese government has the right to data from any company anywhere in China," Paul explained. "So if somebody's making deepfake nudes of you, those are now in the hands of the Chinese government."
Real Victims, Legal Gaps
This isn't theoretical harm. A CNBC investigation in September followed a group of women in Minnesota whose public social media photos were fed into nudify services without their consent. More than 80 women were victimized, yet no apparent crime was committed: the victims were adults, and the perpetrator never distributed the images.
The case highlights a critical gap: existing laws struggle to address AI-powered image manipulation that doesn't technically involve "real" nudity or distribution.
Regulatory Whack-a-Mole
The problem extends beyond app stores. Elon Musk's xAI faced backlash this month when its Grok AI tool generated sexualized images of children in response to user prompts. The European Commission opened an investigation into X over Grok's content policies on Monday.
In the US, the National Association of Attorneys General wrote to payment platforms including Apple Pay and Google Pay in August, asking them to cut off non-consensual intimate image services. Democratic senators have urged Apple and Google to remove X from their app stores entirely.
Yet the whack-a-mole continues. Google Play's developer policy explicitly prohibits "apps that claim to undress people or see through clothing, even if labeled as prank or entertainment apps." Apple's guidelines ban "overtly sexual or pornographic" material. But enforcement remains inconsistent, and developers find creative workarounds.
The Trust Problem
"The fact that they are not adhering to their own policies, which are designed to protect people from non-consensual nude imagery and non-consensual pornography, raises a lot of questions about how they can present themselves as trusted app platforms," Paul said.
Both companies profit from these apps through their revenue-sharing models, creating a potential conflict between safety and profit. The question isn't whether they can police their platforms—it's whether they truly want to.