At least 11 police forces use face recognition in the EU, AlgorithmWatch reveals

The majority of the police forces that answered AlgorithmWatch's questions said they use or plan to introduce face recognition. Use cases vary greatly across countries, but almost all share a lack of transparency.

Nicolas Kayser-Bril
Reporter

Updated on 18 June 2020.

Police departments have long attempted to acquire, structure and store data on the populations they keep watch on. Frenchman Alphonse Bertillon pioneered the use of anthropometrics by the police in the 1870s, creating a collection of tens of thousands of cards recording the measurements of homeless people (then thought to be more likely to engage in crime), including the now-famous mug shot. His work was the foundation on which biometrics grew over the 20th century.

Current projects have far surpassed what Bertillon could have imagined. Face recognition is used to find missing children and to spot violent supporters in football stadiums. In Lyon, France, a man was caught stealing a car by a CCTV camera last October. His face matched a picture in a database, and he was subsequently arrested and sentenced to one and a half years in jail.

Of 25 member states of the European Union reviewed by AlgorithmWatch, at least eleven have a police force that uses face recognition. Eight plan to introduce it in the coming years. Just two countries, Spain and Belgium, do not allow it yet. Two police forces have yet to answer our requests.

Ready to go

In Kortrijk, Belgium, and Marbella, Spain, the local police deployed “body recognition” technology. These systems track individuals by their gait or clothing. Both systems could recognize faces, but the feature is disabled for now, pending legal authorization to turn it on.

Face recognition is mostly used in criminal investigations, as in the Lyon car theft. Automated, real-time face recognition is spreading, too. In several countries, it is used around football stadiums to find people who were put on lists of violent supporters. In Ireland, it is routinely used to verify welfare claims.

The technology raises privacy concerns, which are well covered by privacy-focused organizations such as Privacy International and Bits of Freedom. Face recognition also enables automated decisions that bring problems of their own.

False positives

Even if face recognition can match a face with 99% accuracy, the sheer number of faces available in police databases makes false positives inevitable. (The 1% error rate means that if 10,000 people who are not wanted by the police undergo face recognition, 100 will be flagged as wanted.)
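To make the arithmetic concrete, here is a minimal sketch in Python of the same back-of-the-envelope calculation. The crowd size, the number of wanted people in it, and the error rates are purely illustrative assumptions, not figures from any real deployment.

```python
# Back-of-the-envelope estimate of alerts from a face recognition screening run.
# All numbers below are illustrative assumptions, not figures from any real system.

def screening_outcome(crowd_size, wanted_in_crowd, false_positive_rate, true_positive_rate):
    """Return expected false alerts, true alerts and precision for one screening run."""
    not_wanted = crowd_size - wanted_in_crowd
    false_alerts = not_wanted * false_positive_rate      # innocent people flagged
    true_alerts = wanted_in_crowd * true_positive_rate   # wanted people correctly flagged
    precision = true_alerts / (true_alerts + false_alerts)
    return false_alerts, true_alerts, precision

# A "99% accurate" system (1% false positives) screening 10,000 people,
# of whom only 10 are actually wanted by the police.
false_alerts, true_alerts, precision = screening_outcome(
    crowd_size=10_000,
    wanted_in_crowd=10,
    false_positive_rate=0.01,
    true_positive_rate=0.99,
)

print(f"False alerts: {false_alerts:.0f}")  # ~100 innocent people flagged
print(f"True alerts:  {true_alerts:.0f}")   # ~10 wanted people flagged
print(f"Precision:    {precision:.0%}")     # only ~9% of alerts concern wanted people
```

Under these assumed numbers, roughly nine out of ten alerts concern people who are not wanted at all, which is why the share of true positives reported from real trials tends to be so low.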

In the Netherlands, the police have access to a database of pictures of 1.3 million people, many of whom were never charged with a crime. A Vice investigation reported that, in 2017, 93 suspects were “matched” against people registered in that database.

How many of these matches were false positives is not known. In London, a test run in 2018 resulted in 104 matches, only two of which were true positives. (Another trial in Wales produced similar results.) In Buenos Aires, Argentina, face recognition in the city’s subway system led to 1,227 alerts in the second quarter of the year, of which 226 were true positives. However, some arrests were based on bogus data. One person was arrested on the basis of a 2004 court order; the case had been dismissed, but someone had forgotten to cancel the arrest warrant. Another person was arrested because a typo in the warrant matched his ID number.

Black boxes

In the case of the Lyon car thief, the defense attorney argued that the face recognition evidence was inadmissible because the algorithm used for the match was unknown. The argument was rejected, but all face recognition systems in use in Europe are, indeed, black boxes.

Opacity is conducive to misuse. An investigation in the United States published last May showed that, when no match was found, some police officers fed the faces of celebrity lookalikes (an actor who looks like the suspect, for instance) to the face recognition algorithm in place of the suspect's photo.

Some police forces disclosed the names of the companies providing their face recognition software, but others, such as the Finnish and Croatian police, consider it “classified information.” A spokesperson for the Lithuanian police even refused to say whether the force uses face recognition at all.

This review by AlgorithmWatch is not comprehensive. Many EU countries have more than one police force, and their practices may differ. Even so, the review shows that face recognition is widely used in a variety of contexts, with little or no transparency. It shows, too, that a registry of all automated decision-making processes is sorely needed.

Maris Männiste, Jose Miguel Calatayud and Eleftherios Chelioudakis contributed to this report.
