Spain in shock as schoolboys create fake nudes using generative models

In a small Spanish town, several schoolboys used generative models to create fake nudes of their fellow pupils. Police, prosecutors, and parents are at a loss as to how to pursue a case that shows, once again, that women are the main victims of deepfakes.

Since deepfakes first appeared in 2017, generative models have been used to artificially undress women in photos and to create realistic pornographic videos. The problem has been known for just as long. Yet instead of being stopped, the tools built for this purpose are becoming ever more accessible. So much so that ten schoolboys in Almendralejo, a town in southwestern Spain, are under investigation after creating and disseminating fake nudes of their fellow pupils.

The case went public after the mother of one of the victims posted a video denouncing the situation on her Instagram account, where she has over 136,000 followers. Gynecologist Miriam Al Adib recounted how her 14-year-old daughter came to her after learning that pictures in which she appeared to be naked were circulating among her classmates. Once the incident became public, several other parents from the town came forward; their daughters all said that the photographs were fake.

Although the case is ongoing and the details of the investigation remain confidential, the boys allegedly used an online app that relies on generative models to create nudes from photographs in which the girls were fully dressed. Several Spanish media outlets have named specific applications with which the nude pictures could have been created, but police have not confirmed that any of them was the tool used in this case. All the platforms mentioned require users to be over 18, yet, as elDiario.es pointed out, there is clearly no safeguard in place to prevent minors from using them.

Violating women’s bodies

That this happened in a town of 34,000 inhabitants proves once more that new technology makes violating women’s bodies even easier. When deepfake technologies became popular, some expected the main use cases to be disinformation and propaganda, leading to the collapse of institutions and to war. Instead, women have been the main target.

This has been going on for a while: “Surprisingly, since the first fake pornographic videos and photos were created in 2017 using these techniques, this has been the main domain of application. Today, most studies estimate that around 90% of the fake content published online is pornographic,” writes Marta Beltrán, who holds a doctorate in computer science and mathematical modeling, in her book Mr. Internet.

The distressing case in Spain has put the spotlight back on a never-ending problem. Back in 2019, an application of the same kind, DeepNude, allowed users to remove clothes from pictures of women for 50 dollars. A year later, a Telegram bot appeared that let users receive fake nudes in return for sending in pictures of girls and women.

Both services were taken down after media reports sparked public uproar. But similar tools keep popping up like mushrooms.

The problem goes further, Marta Beltrán told AlgorithmWatch: “Some of these services advertise themselves directly with slogans like ‘strip whoever you want.’ Moreover, they are offered on certain TikTok channels, video game chat rooms, and similar platforms where minors and very young people are active. It is clear that they are the target audience, and they offer tutorials so that they can learn quickly and get an idea of what can be done. They show it to them as something fun even.”

These platforms are no longer for adults only. Instead, they are being advertised as an “open bar” for youngsters, Beltrán says. Creating artificial photos of someone we know is something that, with a little effort, can be done with well-known tools such as Photoshop. But an automated service that strips women of their clothes and can easily be used by 13-year-olds is something entirely different, and regulators should treat it accordingly.

In such cases, it is not the use of a particular technology that is punished but the outcome. Undressing underage girls with generative models is not in itself a criminal offense, which makes the legal case even more difficult.

According to El País, confusion remains as to which specific offenses should be prosecuted. Prosecutors could consider charging the perpetrators with “production or distribution of pornographic material” and “possession of child pornography”, although, as law professor Paz Lloria points out, the law requires the images to show an “explicit sexual act” or to display the minor’s genitals in a sexual way, which does not seem to apply in this case. Another legal avenue could be to press charges for violating the victims’ privacy and moral integrity.

The Spanish Data Protection Agency has also opened an ex officio investigation into a possible violation of data protection laws and is making use of its ‘Priority channel’, a free service that allows people in Spain to report the dissemination of non-consensual sexual photos or videos and requires platforms to take them down.

Naiara Bellio (she/her)

Head of Journalism

Naiara Bellio covers privacy, automated decision-making systems, and digital rights. Before joining AlgorithmWatch, she coordinated the technology section of the Maldita.es foundation, addressing disinformation related to people’s digital lives and leading international research on surveillance and data protection. She also worked for Agencia EFE in Madrid and Argentina and for elDiario.es, and has collaborated with organizations such as Fair Trials and AlgoRace on research into the use of algorithmic systems by public administrations.
