March 18th, Vladimir Putin’s nationalist mega-rally: If you zoom into a picture of the stage, you can see a man wearing big headphones and a blue raincoat pulled up over his mouth, lurking. Now run his face through the $5-per-month Russian facial recognition site FindClone, which lets users take pictures of people and find their faces on Russian social media in seconds. One match: a photo, now deleted, of a man with a cross around his neck, sitting on a bed with three other topless men.
After the rally, the Czech-based news outlet Radio Liberty used FindClone to try to identify the man in the headphones. They concluded that the man with the cross around his neck is most likely an undercover employee on Putin’s personal security team. “Not a sound engineer, but an employee of the Presidential Regiment of the FSO (Federal Guard Service),” they wrote. They also checked the match with Microsoft Azure’s face recognition, which assigned a 74% confidence score to the two faces being the same.
Not everyone is convinced, though. “Wrong OSINT (open source intelligence) about real people,” Uli Stopper, a software developer at Bosch, posted under the Radio Liberty article. It’s “completely impossible,” he told me, for any algorithm to “conclusively identify” someone from a low-resolution image in which their mouth, chin, jaw and ears are covered. It’s also unclear what face recognition software Microsoft Azure uses and what its “confidence score” actually means.
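A note on where such “confidence scores” generally come from: face recognition systems typically convert each face image into a vector of numbers (an “embedding”) and compare two faces by measuring how similar their vectors are. The percentage a vendor displays is its own mapping of that similarity onto a scale, not a probability that two photos show the same person. The sketch below uses made-up toy vectors and an arbitrary threshold–it is not Azure’s actual model or scoring formula:

```python
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors: 1.0 = same direction, 0.0 = unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for the output of a real face recognition model.
face_in_headphones = [0.8, 0.1, 0.55]
photo_with_cross = [0.7, 0.2, 0.6]

similarity = cosine_similarity(face_in_headphones, photo_with_cross)

# Turning similarity into a percentage "confidence" is a vendor choice,
# as is the cutoff for declaring a "match" - both are design decisions.
confidence = round(similarity * 100)
is_match = similarity > 0.7  # threshold picked by the vendor, not a law of nature
```

Because each vendor trains its own model and picks its own scale and threshold, two services can report very different scores for the same pair of photos–one reason a single number like Azure’s 74% cannot “conclusively identify” anyone.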
Concerns about facial recognition aren’t new. Clearview AI got a lot of attention for offering law enforcement the power to search for a stranger’s face in a database of 20 billion pictures–illegally scraped from Facebook and other social media. FindClone offers the same service using VKontakte, the state-controlled Russian Facebook. But they sell it to anyone who pays. (Because of Western sanctions, you now have to pay from a bank account in Russia or a friendly country like Belarus.)
Remember how a Google CEO said in 2011 that the company could release a tool that would allow users to take pictures of people and identify them in seconds, but wouldn’t, because of the harm it could cause? Clearview AI in public hands has been described as the privacy “nightmare scenario.” With FindClone and similar services like PimEyes and Search4faces, this scenario is already a reality: We found cases where people are being misidentified by experienced OSINT researchers, where anonymous online forum users are helping each other identify, harass and stalk people, and where protesters are being doxxed (having their personal information exposed).
Live True Crime Style Threads on Twitter
Also in March: A woman appears in a video posted by the Russian Embassy in London. She says that she is in the besieged city of Mariupol in the south of Ukraine and repeats the Russian state narrative that a Ukrainian Azov battalion blew up a theater where civilians were sheltering. Half an hour later, Dutch journalist Henk van Ess tweets a wrong identification of her as a woman in the Russian town of Magadan, based on a search result from the $29.99-per-month face recognition website PimEyes and a “confidence score” from Microsoft Azure.
“This is not the right match,” van Ess later posts. But the picture of the random Magadan woman, along with a link to a summer camp she once attended, is still in the thread.
“It’s always good to share your methods with peers so they can comment,” van Ess wrote to me. He isn’t the only researcher to do live Twitter investigations, sharing preliminary results and then debunking them in follow-up tweets.
“My feed is about the path that you have to travel,” he adds. “I show each step, hoping people who are new learn from it… the best way for me to practice OSINT [open source intelligence] is being vulnerable by sharing and allowing people to comment.”
This is not an approach shared by everyone.
The biggest risk of using FindClone and similar tools in OSINT investigations is that researchers rely on a facial recognition match alone to “accuse innocent people of having committed serious crimes” so that “other online users who see that information might start online harassment towards them”, says Johanna Wild, a researcher at Bellingcat, an open-source investigative organization.
Aric Toler, Director of Training & Research at Bellingcat, says he looks for “unique or semi-unique facial details, such as moles, ear lobe shape, freckles, scars”, as well as “contextual information” and “other data sources like matching a phone number from VK” to verify a face recognition match. Toler says he only knows of three public face recognition tools: FindClone, PimEyes (which, unlike FindClone, scrapes websites but not social media), and Search4faces (which, like FindClone, includes results from VKontakte, as well as TikTok, Instagram and the Russian social media site OK.ru).
Dvach: Can you find my dad?
“Gentlemen, anons, I have 40 FindClone searches left, and my subscription will expire soon–it would be a pity for them to go to waste. So I’m offering my search services.” For years now, posts like this have appeared again and again on the 4chan-esque Russian anonymous forum 2ch.hk (otherwise known as “Dvach”). Replies stream in: Users post pictures of teens, stills from porn videos or homemade sex tapes, a lady who “refused to hire me”, “him please”, “her please.”
“I bought a bunch of requests in FindClone, for the sake of one, the rest are not needed and will simply expire...” Replies stream in. Someone posts a picture of a teenager in a rainbow backpack being interviewed at a demonstration.
“I have 40 searches left in FindClone, I will help you find your chicks…” Replies stream in. One anonymous user (or “anon”) asks: “Help me find my dad, he left me and my mother 10 years ago.” The FindClone donor’s response: “No thanks.”
“Because I don’t search more than [a few times] a month on average, I can search for someone for you for free… In return I would be grateful to the anon who authorizes me in the GetContact application [a phone contact book app].” Replies stream in. One anon posts a picture, writing “I want to ingratiate myself and then make fun of her baldness.”
Dvach is hosted by the state-affiliated Mail.ru Group, which also operates VKontakte (the Russian Facebook). Since 2010, it has been run by a user called Abu, whose real identity is Nariman Namazov, a 27-year-old Azerbaijani SEO specialist from Moscow; he once said he bought the site for $10,000.
When internet users first discovered the FindClone site in February 2019, dvachers started using the service to look for the VKontakte pages of women in porn videos or escort dating services. They shared tips on how to “expose” them: whom to message (the women’s friends, family, their children’s school friends), what to say (introduce yourself as a journalist), and what to do next (“blackmail and harassment”).
“Hi all. Some people probably already know, but most don’t. A new face recognition service… has appeared,” a poster using the name Katya wrote in an online forum for escorts in December 2019. “They take a screenshot of your face from a video or photo, enter it into the search and it displays your profiles in VK, if there is a photo of your face there. Then they write to all friends and relatives, send out screenshots and photos and other information…”
In 2020, Dvach users doxxed the Russian actresses who starred in a porn horror film by Rammstein singer Till Lindemann. There are no signs they used facial recognition. “If someone wanted to harass women, they would have identified and harassed them 10 years ago without these tools,” one Russian feminist, who received death threats after defending the women in the Lindemann video, told me, arguing that face recognition tools “can also serve good purposes, for example to recognize criminals and crooks”.
FindClone origins: For everyone, except Russian law enforcement
When FindClone first got attention in 2019, one journalist from a Russian independent news outlet used the service to identify, well, the developers of the software themselves: two young mathematicians from Dagestan, called Gadzhi Saidov and Yuri Zdanovich.
In the resulting interview, the journalist seems very excited about FindClone (“I explain to (Zdanovich) that justice, anonymity and search tools can be either for everyone or for no one”) and asks the inventors if they plan to sell their algorithm to law enforcement, like “most of their competitors." Saidov told him that he was unlikely to sell his software to the state, which would use it to fight political opponents, rather than crime.
One competitor the journalist was referring to: NTechLab, a Russian subsidiary of a Cyprus-based company, which in 2016 released an app much like FindClone. The app got a lot of publicity, for example when people used it to dox sex workers and protesters at anti-corruption rallies. Two years later, NTechLab announced it would withdraw its face-matching algorithm from public access, and sold it, among other buyers, to the Moscow government, which installed it in the city's 200,000 CCTV cameras.
Moscow’s new surveillance network has become notorious for being used by police against protesters, political opponents and lockdown rulebreakers. What has hardly been reported, however, is that police also use images from the cameras to help abusive husbands track down wives who have fled to the capital from elsewhere in Russia–all the men have to do is file a “missing persons report.” This happens “even if a woman or her lawyer asks the police not to look for her,” Olga Gnezdilova, a lawyer from Voronezh who has advised a number of domestic violence victims, told me.
Alena Eltsova, who runs Moscow’s domestic violence shelter Kitezh, has witnessed several such cases. She told me about one woman who was detained after being filmed on the subway. Another, who helped a friend run away to Moscow, was filmed in the city’s chamber of commerce. The police visited her and demanded to know where her friend was. In another case, a woman was tracked after being filmed at the entrance to a building in Moscow.
Optimism about FindClone: Doxxing police officers
“These services help identify who killed [Boris] Nemtsov and to investigate war crimes by the Russian military in Ukraine,” an administrator of the anti-corruption project Municipal Scanner told me in April, when I asked about their thoughts on FindClone. (Boris Nemtsov was an opposition politician who was assassinated in Moscow in 2015.)
On July 27, 2019, Municipal Scanner’s Twitter profile started a campaign to post the names and VKontakte profiles of the police officers, equipped with riot shields and truncheons, who’d been photographed beating up Muscovites demonstrating for democratic elections earlier that day. Ruslan Leviev, head of the research group Conflict Intelligence Team, suggested that the tools used to “deanonymize” the policemen were FindClone and the reverse image search function of the Russian search engine Yandex (which allows users to link image results to VK accounts). Several officers complained about receiving threats. The head of Moscow’s city police reportedly ordered employees to delete any online photos in which their faces were visible.
A few days later, a website called "Criminal Justice Info" popped up, claiming to identify the July 27 Moscow protesters and posting links to their VKontakte profiles.
“It was the peak [of using face recognition to expose violent police],” says Anatoly Reshetnikov, a political scientist and activist, referring to July 2019. But afterwards, police started hiding their faces behind tinted helmets and masks. And officers began using face recognition to ID protesters: “This way they now manage to avoid the ugly image of beatings in public squares and simply come to activists' private homes to detain them a few hours after the event.”
The other nightmare scenario: a state monopoly on digital services
“When information about the existence of a face recognition search machine first appeared, many users were very worried,” says Sarkis Darbinyan, founder of the Moscow-based Digital Rights Center. He points out that almost 100,000 people have signed a petition against the use of facial recognition technology by “law enforcement and supervisory authorities”.
In contrast, there hasn’t been much outrage about FindClone and similar tools, says one digital rights activist, who for security reasons does not want to be named in this article: “those who understand facial recognition are more concerned about the monopolization of facial recognition by the state.”
NTechLab, for example, now appears to be under total control of state companies, ever since the startup’s co-founder left in early 2022. Another big player installing facial recognition software in Russia is Sberbank, a majority state-owned bank. In 2021, Russia’s unified biometric system, which was created in 2018 to allow the use of facial recognition for online banking, received “state system” status.
VKontakte has threatened to sue FindClone for scraping its user database. Not because the company wants to protect the privacy of its users, Darbinyan says, but because VKontakte wants to be the only company allowed to mine its user database to provide new services.
Since Putin invaded Ukraine, several researchers have used FindClone to find the social media pages of dead or captured Russian soldiers whose images appeared on a Telegram channel labeled “rf200_nooow” (200 is Russian military slang for “cargo with dead bodies”). They then contacted and interviewed the soldiers’ relatives to find out what they knew about their loved ones’ fates, in light of Russian state propaganda that refuses to acknowledge the war.
In May, journalists from the Russian outlet IStories Media used FindClone to help identify a Russian soldier involved in the execution of nine unarmed civilians in Bucha. The soldier had appeared in surveillance camera footage that was made available to the New York Times. An IStories Media reporter called the soldier’s wife, who said that she considered the New York Times images “fake”, because her husband is “clean-shaven” in them, which apparently meant he couldn’t have been in Bucha.
“There is always a concern” that services like FindClone will be blocked, says Andrey, a Russian investigative journalist, who, for security reasons, does not want his full name published. But, he adds, “There is so much personal data about Russian citizens available already, that facial recognition is just one of the many tools journalists, law enforcement and private investigators use to violate people’s privacy to do their jobs.”
Did you like this story?
Every two weeks, our newsletter Automated Society delves into the unreported ways automated systems affect society and the world around you. Subscribe now to receive the next issue in your inbox!