The website promises to make “men’s dreams come true.”
Users upload a photo of a fully clothed woman of their choice, and in seconds, the site undresses her for free. With that one feature, it has exploded into one of the most popular “deepfake” tools ever created.
Far more advanced than the now-defunct “DeepNude” app that went viral in 2019, this new site has amassed more than 38 million hits since the start of this year, and has become an open secret in misogynist corners of the web. (HuffPost is not naming the site in order to avoid directing further traffic to it.) It went offline briefly Monday after HuffPost reached out to its original web hosting provider, IP Volume Inc., which quickly terminated its hosting services. But the site was back up less than a day later with a new host – as is often the case with abusive websites.
Launched in 2020, the site boasts that it developed its own “state of the art” deep-learning image translation algorithms to “nudify” female bodies, and that the technology is so powerful there is not a woman in the world, regardless of race or nationality, who is safe from being “nudified.” But it doesn’t work on men. When fed a photo of a cisgender man’s clothed body, the site gave him breasts and a vulva.
With female and female-presenting subjects, on the other hand, the results are strikingly realistic, often bearing no glitches or visual clues that the photos are deepfakes – that is, synthetically manipulated imagery. This drastically increases the potential for harm. Any vindictive creep, for example, could easily “nudify” and post photos of his ex online to make it seem, very convincingly, as if her actual nudes had been leaked. “Holy mother [of] god,” one user raved about the tool’s “amazing” abilities in an online forum where men consume and discuss deepfake porn. “Ive never seen [results like this] before ... the future is now.”
The website’s stunning success lays bare the grim and increasingly dangerous reality of being a woman on the internet as malicious deepfake technology continues to advance undeterred. Women were already being digitally inserted into porn and stripped naked against their will via similar, less powerful deepfake tools. But the problem is spiralling out of control, with the technology becoming ever more accessible and the imagery becoming ever more believable.
“The vast majority of people using these [tools] want to target people they know.”
The victims of deepfake porn, who are almost exclusively women and girls, are often left with little to no legal recourse as their lives are turned upside down. And although the new site claims that it doesn’t store any images, it does generate shareable links to every “nudified” photo, making it easy for users to spread these pictures all over the internet as well as consume them privately.
It’s unknown who is behind the site – which is riddled with spelling and syntax errors – or where its operators are based. The operators did not respond to multiple interview requests from HuffPost. Last month, the US was by far the site’s leading source of traffic, followed by Thailand, Taiwan, Germany and China. Now-deleted posts on blogging site Medium demonstrating how to use the tool featured before-and-after pictures of Asian women exclusively.
Despite the immeasurable harm this site and others like it pose to women everywhere, there has been little meaningful intervention to date. US lawmakers have shown scant concern about abusive deepfakes outside of their potential to cause political chaos. Social media companies are often slow to respond to complaints about nonconsensual pornographic content, real or fake, that spreads across their platforms, and they typically face zero liability for it.
Photos and links from this new, so far unrivalled “nudifier” have spread across Twitter, Facebook, Reddit, Telegram and other major platforms, in private and public channels alike.
“This is a really, really bleak situation,” said UK-based deepfake expert Henry Ajder. “The realism has improved massively,” he added, noting that deepfake technology is typically weaponised against everyday women, not just celebrities and influencers.
“The vast majority of people using these [tools] want to target people they know,” Ajder said.
There’s nothing stopping the owners of this website and those like it from profiting off the sexual humiliation of countless women. The site isn’t confined to the dark web, nor has it even been delisted from Google’s search index. It operates out in the open, and encourages users to promote it on social media for all to see.
A key part of the site’s explosive growth appears to be its “referral” program, through which users can publicly share a personalised link and earn rewards for each new person who clicks on it. Users are normally only able to “nudify” one picture for free every two hours; alternatively, they can pay a fee in cryptocurrency to skip the wait time for a specified number of additional photos, or earn up to 100 freebies without restrictions through their referral rewards.
This has incentivised thousands of people to advertise the website all over the internet, posting referral links sometimes accompanied by pictures of the ordinary, unwitting women they’ve chosen to “nudify” – female co-workers, friends, classmates, exes, neighbours, strangers. In this way, these users aren’t only drawing new attention to the site; they’re also further exploiting their victims. Hundreds of Reddit threads have been created expressly for people to share and click on each other’s links. “Help me and I’ll help you,” one poster recently begged. “Want to see your friend naked?” another offered. “You’re welcome boys,” a third wrote.
“As these apps get more sophisticated, it will be impossible to tell the difference between something that’s manipulated to make you look naked and an actual naked picture.”
From a business perspective, the referral program is a shrewd growth strategy that has proved to be highly effective. But it also has a clear vulnerability: Twitter, Facebook, Reddit, Telegram and the other social networks that play a crucial role in spreading the referral links could take an axe to the website’s traffic flow by banning its URL from their platforms outright. It’s a simple moderation step that would yield immediate results. But so far, only Facebook has done so, after being alerted by HuffPost to the site’s use of its platform.
“If they are a responsible platform and they care about this issue, they definitely should be taking that kind of action,” said Mary Anne Franks, a law professor at the University of Miami and president of the Cyber Civil Rights Initiative. She was alarmed that the social media companies had either overlooked or ignored the site for so long.
“It really is incumbent upon them to be proactively looking for threats like this before they become problems,” Franks said. “It shouldn’t be individuals or reporters or anyone else pointing this out.”
A spokesperson for Reddit said the platform’s “site-wide policies prohibit involuntary pornography across all types of content, including deepfakes,” and that “we will continue to remove content that violates our policies and action users that engage in such content.” Twitter is also taking down any violative content from the website, according to a spokesperson. But links and photos from the website can still be found on both platforms, and nothing is stopping users from posting more.
Telegram did not respond to a request for comment.
Google said it removes pages from its index when required by law. “For victims of involuntary synthetic pornographic imagery, we have a policy under which you can request the removal of web pages from Google Search,” a spokesperson said. “To address these challenges at scale, we also fundamentally design our systems to promote high quality information, and avoid showing harmful or explicit content unless someone is searching for it directly.”
Tech platforms enjoy immunity from liability for almost all user-generated content thanks to Section 230 of the Communications Decency Act, a controversial and decades-old law. As a result, these companies can independently decide what they feel like removing from their sites – and what they don’t. Though nearly all have policies explicitly prohibiting sexual harassment (and in some cases, deepfake porn specifically), such rules are often poorly enforced. As many women can attest, you’d likely have a much easier time getting a social media site to remove copyrighted material, as required by law, than deepfake nudes.
While social media giants have allowed the “nudifying” website to reach millions of potential new users through their platforms, two other web companies are doing their part to keep it online and functioning: Ab Stract, the site’s new and obscure Finnish hosting provider, and TLD Registrar Solutions, a domain registrar operating out of the UK.
Both are free to choose whether they wish to do business with a website that digitally removes women’s clothing without their consent en masse. Still, even if they did force it offline by yanking access to their services, as IP Volume did, the fix would likely only be temporary – underscoring the impunity with which malicious deepfake creators operate.
TLD Registrar Solutions did not respond to a press inquiry. Ab Stract could not be reached for comment.
In the not-so-distant past, forging highly realistic nudes of a woman would have required a considerable amount of time, skill and effort. Today, anyone with an internet connection can do it at no cost, and millions of people are. This means every woman and girl with publicly accessible images of herself – a Facebook selfie, a school yearbook photo, a TikTok video – is vulnerable. And as the “nudifier” website continues to blow up, it’s likely to earn more money and further improve its technology.
“As these apps get more sophisticated, it will be impossible to tell the difference between something that’s manipulated to make you look naked and an actual naked picture,” warned Franks, who’s drafting model criminal legislation focused on “digital impersonation forgery” that would punish those who create or knowingly distribute nonconsensual deepfake porn, and would limit platforms’ CDA 230 protections under specific circumstances.
Victims of revenge porn and other nonconsensual porn have lost jobs, relationships and more. Some have died by suicide. It no longer matters if these kinds of images are real – they create a new reality in which the victim must live.
The “nudifier” website is on track to reach its highest ever monthly traffic. With apathetic tech companies aiding its global spread and no legal framework to hold them accountable, the question isn’t how such a dangerous tool could become so popular, but rather, why it wouldn’t.