Spain’s largest bus terminal deployed live face recognition four years ago, but few noticed
Madrid South Station’s face recognition system automatically matches every visitor’s face against a database of suspects, and shares information with the Spanish police.
Around 20 million travellers passed through Madrid’s South bus terminal, known to locals as Méndez Álvaro Station, last year. All of them had their faces scanned as they entered the station and were tracked as they walked to the bays where their buses were parked, before leaving the Spanish capital, unless the station’s face recognition system produced an alert and they were arrested.
The terminal is a key transport exchange not only for Madrid, but for the whole country. It connects with subway stations and with Renfe, the national train service. Until 2010, the terminal did not have a security unit that was specifically tasked with coordinating the response to petty crime.
Running since 2016
The station is one of the few public buildings in Spain to have deployed a live face recognition system. Miguel Angel Gallego, the station’s chief of security between 2010 and 2019, decided to deploy face recognition after being contacted in 2016 by a Spanish start-up working with this type of software.
Mr Gallego faced an uphill battle. The Spanish police, who were not used to face recognition at the time, were not enthusiastic, and neither was Avanza ADO, the company that has been running the bus terminal since 2003. But he remained undeterred. The technology has been running for four years, without much scrutiny from privacy organizations or from the state.
A question of consent
I went to Madrid’s South bus station. Few people seemed aware of the face recognition system. Not even those running small stores inside the station knew it had been operating for four years.
A jeweler, whose profession requires a keen sense of security, told me she did not know such technology was operating inside the building. She had been working at the store, which she owned, since 2014.
One of her neighbours, an older woman who has been selling pastries at the bus terminal for the past 16 years, said she felt there were more security guards in the station, but she also said, while knocking on wood, that her store had always been free from theft.
Claiming success
The firm behind the software that runs the bus terminal’s face recognition system is Barcelona-based Herta Security, which has since expanded to Los Angeles, Montevideo and Singapore. Another company, Axis Communications, installed the hardware. Both companies are keen to stress that security at the bus station has improved since the system was deployed.
In 2019, Herta Security released a report, which AlgorithmWatch was given access to, detailing the “successful case” that the station represented. According to the numbers provided by the operators of the station, incidents in its facilities decreased by 75%.
A report by Axis Communications claims that the number of incidents went from “five a day to five a month” after the system was deployed, but provides no detailed data.
Laura Blanc, Chief Marketing Officer at Herta Security, claims that cases of vandalism started to diminish when the face recognition system was switched on in 2016. She believes that this kind of surveillance alone is enough to chase away delinquents who mug and harass people at the station.
“Once experience tells you that you will get caught in a specific place, it seems a good reason to reconsider whether you still want to rob inside it, or to move on to the next building and commit crime somewhere else,” Ms Blanc told AlgorithmWatch. The system is believed to act like a scarecrow in a garden, whether it produces positive results or not. (Ms Blanc did not explain why shop-owners failed to see the scarecrow.)
9 cameras
Although the station has around 100 surveillance cameras, only nine are used by the face recognition system. They are deployed at strategic points in the facilities, such as entrance and exit points and connections with subway tunnels, Ms Blanc told AlgorithmWatch.
The cameras record constantly. The software analyzes the video feed in real time, taking a snapshot of people’s faces every time they enter the frame. The images are analyzed in the center of operations, created between 2014 and 2016 as part of a series of safety improvements at the Méndez Álvaro station that also included changes to the building’s layout and better lighting.
In the center of operations, screens show the live video feeds from the surveillance cameras. One screen in particular is split into two halves: the left side constantly runs the live feed, displaying a column of snapshots of the faces of people walking through the station. If there is a “match” with one of the images stored in the database of suspects, an alarm pops up on the right half of the screen, alerting operators that an identification has been made.
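To make the description more concrete, here is a minimal sketch of how such a real-time matching pipeline is commonly structured. It uses the open-source face_recognition library as a stand-in for Herta Security’s proprietary software; the camera index, the watchlist files and the alert logic are illustrative assumptions, not details of the station’s actual system.

```python
# Minimal sketch of a real-time face-matching pipeline, using the open-source
# face_recognition library as a stand-in for the station's proprietary software.
import cv2
import face_recognition

# Hypothetical watchlist: one reference photo per person of interest.
watchlist = {"suspect_001": "suspect_001.jpg", "suspect_002": "suspect_002.jpg"}
known_ids, known_encodings = [], []
for suspect_id, photo_path in watchlist.items():
    image = face_recognition.load_image_file(photo_path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        known_ids.append(suspect_id)
        known_encodings.append(encodings[0])

capture = cv2.VideoCapture(0)  # stands in for one of the station's nine cameras
while True:
    ok, frame = capture.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    # Take a "snapshot" of every face that enters the frame and compare it
    # against the watchlist, raising an alert when a face is close enough.
    for encoding in face_recognition.face_encodings(rgb):
        distances = face_recognition.face_distance(known_encodings, encoding)
        for suspect_id, distance in zip(known_ids, distances):
            if distance < 0.6:  # library default tolerance; operators would tune this
                print(f"ALERT: possible match with {suspect_id} (distance {distance:.2f})")
capture.release()
```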
50% certainty
The software gives each match a score from 0 to 100, indicating its reliability. In what seems a logical contortion, Ms Blanc says that a score of 50-60% means that “the system is sure about it”. Operators can adjust this threshold as they wish.
By moving the threshold up, operators limit the number of possible false positives that the system will generate (people who are mistaken for faces in the database). Conversely, this increases the number of possible false negatives (people who are in the database but are not matched).
Human operators then decide whether or not to stop the person whose face produced a match.
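The trade-off between false positives and false negatives can be illustrated with a short sketch. The scores and the watchlist membership below are invented for illustration and are not data from the station’s system.

```python
# Illustration of the alert-threshold trade-off described above. Each tuple pairs
# a visitor with an invented score (0-100) for their best watchlist match and
# whether they are actually in the suspect database.
observations = [
    ("visitor_a", 92, True),   # suspect, high score
    ("visitor_b", 55, True),   # suspect, borderline score
    ("visitor_c", 58, False),  # innocent look-alike
    ("visitor_d", 30, False),  # innocent, low score
]

def count_errors(threshold):
    """Count false positives and false negatives at a given alert threshold."""
    false_positives = sum(score >= threshold and not in_db
                          for _, score, in_db in observations)
    false_negatives = sum(score < threshold and in_db
                          for _, score, in_db in observations)
    return false_positives, false_negatives

for threshold in (50, 60, 90):
    fp, fn = count_errors(threshold)
    print(f"threshold {threshold}: {fp} false positive(s), {fn} false negative(s)")
# Raising the threshold from 50 to 90 removes the false positive (visitor_c)
# but misses the borderline suspect (visitor_b), which is the trade-off operators face.
```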
Face masks
Officials from the bus terminal did not comment on how the system currently works, now that the Covid-19 pandemic has limited citizens’ movements and face masks hinder face recognition technologies.
Even so, Ms Blanc insists on the system’s efficacy: “In March, we launched a new algorithm that allowed face recognition even though the person exhibited a big occlusion in the face, like a mask. We were already developing it before the coronavirus pandemic because we have clients in Asia, for example, where wearing one is usual. We also work in football stadiums, where people usually wear caps, scarfs, etc.”
She admits that working in such environments is “difficult” because the less information there is in the video feed, the more accuracy the identification loses. Nevertheless, she maintains that they managed to overcome this problem and that, because the pandemic has made face masks compulsory in enclosed public spaces, there were “even more reasons to commercialize it”.
City transit stops face recognition
Other operators of face recognition reacted differently to the pandemic. Madrid’s city council suspended a pilot project where people were invited to pay using face recognition in some buses of the city’s public transit network.
It was announced in late 2019, but recent rules making face masks compulsory on public transport in Madrid forced the authorities to put the brakes on the project, on the grounds that the system cannot yet reliably recognize individuals wearing a mask.
Operators at Madrid’s Méndez Álvaro Station declined to provide AlgorithmWatch with precise data, or an audit, which would show that their system performs well with people wearing face masks.
A private-public partnership
At the Méndez Álvaro Station, a pilot study was conducted before the system was deployed in 2016, in order to test its effectiveness. But a source with detailed knowledge of the operation, who asked not to be named, said the pilot had a second goal: training the program itself. Employees from the security department of Madrid’s South bus terminal would upload pictures of themselves wearing caps, glasses, scarves and the like to the database of suspects, in order to test the system and fine-tune it.
Despite their initial reluctance, both the company running the bus terminal and the police, which has a presence in the building, changed their minds about the system once it was installed.
The security team of Méndez Álvaro Station realized that cooperation with the national police was essential for the system to run properly. Law enforcement agencies provide the station’s security center with the details of people with outstanding warrants, and the station alerts the police when a match occurs. (This procedure is reserved for dangerous criminals or terrorists, whose pictures are sometimes made available by Interpol.)
But the database holds more than pictures of suspects with outstanding arrest warrants. Some of the pictures it holds come from recordings made by the surveillance cameras at the station itself. If a person is caught committing theft, he or she can be identified in the video recordings and their face can then be added to the database, so that the software can spot them across the station, even if a judge has not ruled on the case.
According to our source, the police sometimes come to the station and ask for personal information about people whom the security department has catalogued on its own. In other words, the police can rely on matches obtained from biometric data that includes people selected, with absolute discretion, by a private company.
Lost children
The surveillance system deployed at the Méndez Álvaro station works in real time, but it can also be used on past video footage.
This is how the security center exploits the “social” objective of automated surveillance, as Mr Gallego, the former head of security, described it. “The face recognition system is not only used to prevent crime, but also with a social objective: looking for lost children, people with Alzheimer and other collaborations with the security forces in something that goes beyond common vandalism”, he said in a recorded interview for Herta Security in 2019.
A mother turned to the police one day saying that her daughter had disappeared from home and that there was a chance that she had run to the station to catch a bus. The station’s security told her to bring a picture of the girl. (Our source did not provide details such as the precise date of the event).
The girl’s picture was entered into the database and automatically searched against the morning’s video recordings. Even though they estimated that she could have arrived at the station at around 11 am, the system found her wandering through the building at 9:10 am.
Security staff virtually retraced her steps through the facilities and saw which adults she had spoken to and which bus she had boarded. The police were able to stop that bus and bring the girl home. People who ran the system at the time said this would never have happened if they had had to check all of that morning’s video recordings manually.
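A retrospective search of this kind can be sketched in a few lines. As before, this is only an illustration using the open-source face_recognition library in place of the station’s software; the reference photo, archive path and assumed frame rate are invented for the example.

```python
# Sketch of a retrospective search over archived footage, as in the lost-child case.
# The archive path, reference photo and 25 fps frame rate are assumptions.
import glob
import cv2
import face_recognition

reference = face_recognition.load_image_file("missing_person.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

FPS = 25  # assumed frame rate of the archived recordings
for video_path in sorted(glob.glob("archive/morning/*.mp4")):
    capture = cv2.VideoCapture(video_path)
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        frame_index += 1
        if frame_index % FPS != 0:  # sample roughly one frame per second
            continue
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        for encoding in face_recognition.face_encodings(rgb):
            if face_recognition.compare_faces([reference_encoding], encoding)[0]:
                print(f"{video_path}: possible sighting around {frame_index // FPS} s")
    capture.release()
```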
Security justifies the means
The system is based on the defense of public and legitimate interest, two grounds that count as special conditions in the General Data Protection Regulation (GDPR). This provides the station’s operators with a great margin of discretion to act, according to Rahul Uttamchandani, a data protection lawyer working for the Spanish law firm Legal Army.
“If images used in the database are from people that are being pursued by the justice authorities, then they are protected by public interest”, he states. The fact that GDPR entered into force in 2018, two years after the system was started, did not make much of a difference, according to the people who built the system: they justify its use on grounds of ‘public interest’ in terms of security.
The use of personal data in surveillance carried out by law enforcement agencies is subject to Directive 2016/680, which was approved in 2016 along with GDPR. But Spain has yet to transpose it into national law, although the deadline to do so was May 2018.
The main problem Mr Uttamchandani sees is that the snapshots the cameras take of every single face entering the station could be used to train the system. “People need to know all the purposes for which their biometric data is processed, and when you pass by a surveillance camera you may think that you are being observed or recorded, but you cannot know that the shape and points of your face are being used to build a better technological model”, Mr Uttamchandani said.
Sources contacted by AlgorithmWatch said that the snapshots of the faces of people who are not flagged as thieves are kept for 30 days with the original recordings and then erased, as legally required.
Operating in the shadows
Very few concerns have been raised since automated surveillance was deployed at the South Station. AEPD, the Spanish Data Protection Authority, is not aware of any complaint on the matter, according to a statement to AlgorithmWatch.
The Méndez Álvaro Station remained silent throughout the writing of this article, arguing that the system is classified as “critical infrastructure” and that therefore no information about it can be disclosed. This argument is quite disingenuous, given that plenty of interviews and infomercials have been published in the media and on Herta’s and Axis’ official channels since 2016.
Under Spanish regulation, an infrastructure is “critical” when it is considered a strategic installation that provides an essential service and for which no alternative can serve the same purpose; its destruction or disruption would therefore have “a great impact on essential services”. Citizen security is among those “essential services”, according to the law.
Despite twelve days of practically daily calls to the Administration and Communication departments of the Méndez Álvaro station, and four emails, the station’s personnel did not answer our questions. The head of the station’s administration repeatedly assured us that the chief operator would attend to us, which never happened. Instead, one of the operators called my number during his holidays to state that they would not disclose any information and that that was all he had to say to me. No further questions.
Face recognition in the supermarket
Just a few weeks ago, the supermarket chain Mercadona announced the installation of live face recognition in 40 stores across three cities in order to keep suspected thieves from entering. The announcement provoked an immediate response: the media asked questions (few were answered), privacy experts openly questioned whether the surveillance was legal, and the AEPD announced an inquiry into the issue.
Few details are available about Mercadona’s face recognition system. Some media reported that Mercadona would build its own database of thieves based on the footage captured in its stores.
Naiara Bellio (she/her)
Head of Journalism
Naiara Bellio covers privacy, automated decision-making systems, and digital rights. Before joining AlgorithmWatch, she coordinated the technology section of the Maldita.es foundation, addressing disinformation related to people's digital lives and leading international research on surveillance and data protection. She also worked for Agencia EFE in Madrid and Argentina and for elDiario.es, and collaborated with organizations such as Fair Trials and AlgoRace in researching the use of algorithmic systems by administrations.