Apps That Use AI to Undress Women in Photos Soar in Use

Apps and websites that use artificial intelligence to digitally undress women in photos are surging in popularity, recent research shows.

In September alone, 24 million people visited undressing websites, according to the social network analysis company Graphika. Many of these services, commonly known as "nudify" apps, rely on popular social networks for marketing. Since the start of the year, the number of links advertising undressing apps on social media platforms, including X and Reddit, has risen by more than 2,400%, the researchers said. The services use AI to manipulate images so that the subject appears nude, and many of them work only on images of women.

The services are part of a troubling trend of non-consensual pornography being developed and distributed because of advances in artificial intelligence, a type of fabricated media known as deepfake pornography. Its proliferation raises serious legal and ethical problems, as the images are often taken from social media and distributed without the subject's consent, control, or knowledge.

The rise in popularity corresponds with the release of several open-source diffusion models, artificial intelligence systems that can create images far superior to those produced just a few years ago, according to Graphika. Because the models are open source, app developers can access and use them freely.

Santiago Lakatos, an analyst at Graphika, said the ability to create realistic-looking images is a key factor behind the growing prevalence of these undressing apps: earlier deepfakes were often blurry, whereas the latest models produce much higher-quality results.

Despite their disturbing nature, these apps have become a lucrative business. Some services charge users $9.99 a month, and their websites claim to attract substantial numbers of customers. Lakatos said one undressing app advertises that it has more than a thousand users a day.

Privacy experts are increasingly alarmed by the rise of deepfake software, which AI advances have made more accessible and effective. Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, noted an increase in ordinary people using deepfake software against everyday targets, including high school and college students.

Victims of non-consensual pornography created with deepfake technology often never learn that such images exist. Even those who do may struggle to get law enforcement to investigate or to secure the funds for legal action.

There is currently no federal law in the US that specifically bans the creation of deepfake pornography. The US government prohibits generating such images of minors, but no legislation addresses non-consensual deepfake pornography involving adults. In a landmark case in November, a North Carolina child psychiatrist was sentenced to 40 years in prison for using undressing apps on photos of his patients, the first prosecution under laws banning the deepfake generation of child sexual abuse material.

In response to the growing issue, TikTok has taken measures to block the keyword "undress," a common search term associated with these services. The app warns users searching for this term that it "may be associated with behavior or content that violates our guidelines." Meta Platforms Inc. has also started blocking keywords related to searches for undressing apps.
