Mobile apps fueling AI-generated nudes of young girls: Spanish police

By Emmanuelle Saliba, ABC News
Monday, October 2, 2023

A town in Spain made international headlines after a number of young schoolgirls said they received fabricated nude images of themselves, created with an easily accessible "undressing app" powered by artificial intelligence. The case has raised a broader discussion about the harm these tools can cause.

"Today a smartphone can be considered as a weapon," Jose Ramon Paredes Parra, the father of a 14-year-old victim, told ABC News. "A weapon with a real potential of destruction and I don't want it to happen again."

More than 30 victims between the ages of 12 and 14 have been identified so far, and an investigation has been ongoing since Sept. 18, Spanish National Police told ABC News.

And while most of the victims are from Almendralejo, a town in the southwest of Spain at the center of this controversy, Spanish National Police say they have also found victims in other parts of the country.

A group of male perpetrators, who police say knew most of the victims, used photos taken from the social media profiles of female victims and uploaded them to a nudify app, authorities told ABC News.

"Nudify" is a term used to describe an AI-powered tool designed to remove clothing from a subject in a photo. In this case, the service can be used via Telegram or through an app downloaded to a phone.

These same perpetrators, also minors, created group chats on WhatsApp and Telegram to disseminate the non-consensual fabricated nude images, authorities told ABC News. The fake images were used to extort at least one victim on Instagram for real nude images or money, said the parent of one of the victims.

Telegram told ABC News that it actively moderates harmful content on its platform, including the distribution of child sexual abuse material (CSAM). "Moderators use a combination of proactive monitoring of public parts of the app and user reports in order to remove content that breaches our terms of service." Over the course of September, Telegram says, moderators removed 45,000 groups and channels related to child abuse.

A WhatsApp spokesperson told ABC News that the company would treat "this situation the same as any kind of CSAM we become aware of on our platform: we would ban those involved and report them to the National Center for Missing & Exploited Children."

"This is a direct abuse of women and girls, by technology that is specifically designed to abuse women and girls," said Professor Clare McGlynn, a law professor at Durham University in the U.K. and an expert on violence against women and girls.

ABC News reached out to the email address listed on the app's website and received a response. The team behind the app said its main reason for creating this type of service was to make "people laugh" by "processing their own photos and laugh together by processing each other's photos."

"By them laughing on it we want to show people that they do not need to be ashamed of nudity, especially if it was made by neural networks," the team explained via email.

When pressed on what safeguards were in place regarding the use of the app with photos of minors, the team responded that it has protections in place for photos of people under the age of 18. If a user tries to upload a photo of a minor, they will receive an error and be blocked after two attempts, the team added.

The team behind the app said it investigated how the app was used after news of the case in Spain broke, and found that the perpetrators had a workaround, likely using a combination of its app and another app to create the non-consensual nudes.

Experts told ABC News that all it takes to make a hyper-realistic, non-consensual deepfake is a photo, an email address and, for those who want to create them in bulk, a few dollars.

ABC News reviewed the nudify app Spanish authorities say was used to create these AI-generated explicit images of young girls. The app offers a free service that can be used through Telegram, as well as a downloadable phone app.

When ABC News reviewed the app, it offered a premium paid service that listed payment methods such as Visa, Mastercard, and PayPal. These payment methods, along with several others, were removed after ABC News reached out.

A Visa spokesperson told ABC News that the company does not permit its network to be used for illegal activity. "Visa rules require merchants to obtain consent from all persons depicted in any adult content, including computer-generated or computer-modified content, such as deepfakes," the spokesperson added.

A PayPal spokesperson told ABC News that the company "takes very seriously its responsibility to ensure that customers do not use its services for activities that are not allowed under its Acceptable Use Policy. We regularly review accounts and when we find payments that violate our policies, we will take appropriate action."

Mastercard did not respond to requests for comment.

Parra and his wife, Dr. Miriam Al Adib Mendiri, went directly to local police after, they said, their daughter confided in them that she had been targeted. They also decided to use Mendiri's large social media following to denounce the crime publicly.

"Here we are united to STOP THIS NOW. Using other people's images to do this barbarity and spread them, is a very serious crime," Mendiri shared in an Instagram video. "[...] Girls, don't be afraid to report such acts. Tell your mothers."

Mendiri's public appeal led many more victims to come forward to the police. Local authorities say some of the perpetrators are under 14 years old, meaning their cases will be handled under the law governing minors. Investigations are ongoing, Spanish National Police confirmed.

"If they do not understand what they did now, if they don't realize it, what they will become later?" said Parra. "Maybe rapist, maybe gender violent perpetrator... they need to be educated and to change now."

Experts like McGlynn believe the focus should be on how global search platforms rank non-consensual deepfake imagery and the apps that facilitate its creation.

"Google returns nudify websites at the top of its ranking, enabling, and legitimizing these behaviors," McGlynn said. "There is no legitimate reason to use nudify apps without consent. They should be de-ranked by search platforms such as Google."

Another expert, who founded a company to help individuals remove leaked private content online, agreed with McGlynn.

"Apps that are designed to essentially unclothe unsuspecting women have zero place in our society, let alone search engines," said Dan Purcell, founder of Ceartas. "We are entering an endemic of kids using AI to undress kids, and everyone should be concerned and outraged."

A Google spokesperson responded: "Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content. We also have well-developed protections to help people impacted by involuntary fake pornography; people can request the removal of pages about them that include this content."

The spokesperson added that as this space and technology evolve, the company is "actively working to add more safeguards to help protect people, based on systems we've built for other types of non-consensual explicit imagery."

Microsoft's Bing is another search engine where websites containing non-consensual deepfake imagery are easily searchable. A Microsoft spokesperson told ABC News, "The distribution of non-consensual intimate imagery (NCII) is a gross violation of personal privacy and dignity with devastating effects for victims. Microsoft prohibits NCII on our platforms and services, including the soliciting of NCII or advocating for the production or redistribution of intimate imagery without a victim's consent."
