AI Apps That ‘Nudify’ Women Soar In Popularity

Thanks to advances in AI, deepfakes—images generated by the technology that make clothed women appear nude—are surging in popularity. Researchers at Graphika, a social network analysis firm, found that 24 million people visited websites offering “nudify” AI services in September. Social media links promoting apps with “nudify” features have increased 2,400% since the beginning of the year, suggesting that many of these undressing apps rely on popular platforms to market their wares.

Because the photos are often copied from social media without the subject’s knowledge, permission, or control, deepfake pornography raises significant ethical and legal challenges. The surge in popularity has coincided with the recent release of several open-source diffusion models—a kind of artificial intelligence—that can produce far better images than those of a few years ago. Because the models are open source, app developers can access them for free.

Most of the imagery targets celebrities. Psychologically, it is more titillating to believe you are seeing a known person nude than an anonymous one, and profiteers count on this.

Mark Zuckerberg’s social media platforms carried hundreds of ads for deepfake tools that promised to display pornographic images of Hollywood actresses. Over a single Sunday and Monday, the campaign ran more than 230 ads across Facebook, Instagram, and Messenger. The ads did not include explicit content, but they were highly provocative.

Authorities have reportedly said they can take no action over such deepfakes. In September, AI-generated nude images of teenage girls circulated in Spain; most were created from photos on the girls’ Instagram accounts in which they were fully clothed.

While it is against the law to share such images when they depict children, no federal statute forbids their creation.