‘Nudify’ Apps Using AI to Make Deepfake Porn of Women Skyrocket in Popularity

A deepfake victim covers herself (AlexShalamov/Getty)

Apps that use AI to generate images of clothed women appearing to be naked, popularly known as deepfakes, are reportedly soaring in popularity.

In September alone, 24 million people visited websites offering “Nudify” AI tools, researchers with the social network analysis company Graphika found, according to a report by TIME.

Moreover, many of these AI undressing apps reportedly market themselves on popular social media platforms, where the number of links advertising "Nudify" tools has increased more than 2,400 percent since the beginning of this year.

Breitbart News previously reported that Facebook and Instagram allowed hundreds of ads for deepfake tools promising explicit images of Hollywood starlets to run on Mark Zuckerberg’s platforms:

The ad campaign, which ran on Sunday and Monday, rolled out more than 230 advertisements on Facebook, Instagram, and Messenger, the report noted.

While the ads did not feature any actual sex acts, they were suggestive in nature and were made to mimic the beginning of a porn video, even playing Pornhub’s intro track in the background.

Among the ads that swapped out other people’s faces with those of celebrities, 127 of them featured Emma Watson, and another 74 featured Scarlett Johansson.

Graphika also noted that the rise in popularity of such tools coincides with the launch of several open source AI platforms that can generate images far superior to those created just a few years ago.

“You can create something that actually looks realistic,” Graphika analyst Santiago Lakatos said, noting that previous deepfakes were often blurry.

A Google spokesperson insisted that the tech giant doesn’t allow ads “that contain sexually explicit content,” telling TIME, “We’ve reviewed the ads in question and are removing those that violate our policies.”

Meanwhile, a Reddit spokesperson told the magazine that the website forbids the non-consensual sharing of fake sexually explicit material, and has even banned several domains as a result of Graphika’s research.

As Breitbart News reported, deepfake porn continues to rise and now makes up 98 percent of all deepfake images. Meanwhile, authorities say there is nothing they can do about such deepfakes.

In one instance last month, however, a child psychiatrist in North Carolina was sentenced to 40 years in prison for using Nudify apps on photos of his patients, in the first prosecution of its kind under a law banning the deepfake generation of child sexual abuse material.

You can follow Alana Mastrangelo on Facebook and X/Twitter at @ARmastrangelo, and on Instagram.
