Let the Games Begin: France’s Controversial Olympic Law Legitimizes Automated Surveillance Testing at Sporting Events

Building on a decade of surveillance creep, the new French law is yet another example of how sporting events are used to normalize automated surveillance systems in public spaces.

On 23 March 2023, while public attention was consumed by months of protests against Emmanuel Macron’s controversial pension reform, the French National Assembly passed a new law allowing algorithmic video surveillance during the 2024 Olympic and Paralympic Games. Professing to provide greater security for an estimated 600,000 Olympic visitors, the law permits trials of these invasive technologies beyond the Olympic Games until the end of 2024.

Unprecedented in the EU, Article 7 of the Olympic law also allows the algorithm-driven surveillance of concerts and festivals deemed to carry a high risk of security threats or terrorism. Now that the Constitutional Council has approved the law, the first deployment of algorithm-driven surveillance cameras is already planned for the Rugby World Cup this September.

No facial recognition, no problem?

Under the Olympic law, the processing of personal data is excluded and the use of facial recognition software is banned. Instead, the plan is to adopt a ‘low-risk’ automated surveillance system that scans for unusual behavior or objects such as left-behind luggage. Once a ‘suspicious event’ has been identified, the system alerts security personnel to the location. The same technology was deployed at last year’s FIFA World Cup in Qatar to detect crowd swells.
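No technical specification of the systems France will deploy has been published, but the alerting logic described above (detect an anomaly, notify a human) can be illustrated with a minimal sketch. The following Python example is purely hypothetical: names such as `Detection` and `find_abandoned_luggage`, and thresholds like the 60-second window, are our illustrative assumptions, not details of any real deployment.

```python
from dataclasses import dataclass

# Hypothetical detection record: a real system would receive these from a
# computer-vision model running on live CCTV footage.
@dataclass
class Detection:
    object_id: str          # stable ID assigned by the tracker
    kind: str               # e.g. "luggage" or "person"
    position: tuple         # (x, y) coordinates in the camera frame
    timestamp: float        # seconds since the stream started

ALERT_AFTER_SECONDS = 60    # how long luggage may sit unattended
NEARBY_RADIUS = 3.0         # how close a person must be to count as "attending"

def is_nearby(a, b, radius=NEARBY_RADIUS):
    """True if two (x, y) positions are within `radius` of each other."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 <= radius

def find_abandoned_luggage(detections):
    """Flag luggage that has had no person nearby for too long.

    The system only raises an alert with a location; the decision to act
    stays with security personnel, mirroring the 'human in the loop'
    design described for the Olympic deployments.
    """
    frames = {}                      # timestamp -> detections in that frame
    for d in detections:
        frames.setdefault(d.timestamp, []).append(d)

    unattended_since = {}            # object_id -> first unattended timestamp
    alerted = set()
    alerts = []
    for t in sorted(frames):
        people = [d for d in frames[t] if d.kind == "person"]
        for d in frames[t]:
            if d.kind != "luggage" or d.object_id in alerted:
                continue
            if any(is_nearby(d.position, p.position) for p in people):
                unattended_since.pop(d.object_id, None)   # owner is back
            else:
                first = unattended_since.setdefault(d.object_id, t)
                if t - first >= ALERT_AFTER_SECONDS:
                    alerts.append((d.object_id, d.position, t))
                    alerted.add(d.object_id)
    return alerts

# A bag sits alone for three minutes: one alert after the 60-second window.
frames = [Detection("bag-1", "luggage", (10.0, 5.0), float(t))
          for t in range(0, 180, 30)]
print(find_abandoned_luggage(frames))   # [('bag-1', (10.0, 5.0), 60.0)]
```

Everything contentious hides in parameters like these: how long is ‘too long’, how near is ‘nearby’, and which behaviors count as ‘unusual’ are choices made by developers rather than by the law.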

However, the law does not specify what constitutes suspicious behavior. In addition, its vagueness about what counts as biometric data or the processing of personal data leaves room for wide interpretation. Although the use of live facial recognition software has been banned, crowd-analyzing software bears many similarities: it also analyzes live footage from surveillance cameras and makes decisions based on bodily features. While the software cannot identify an individual the way facial recognition does, concerns remain over its impact on privacy and human rights.

In a public letter against the Olympic law, signed by AlgorithmWatch and 37 rights groups, activists warn that crowd-analyzing CCTV, which scans and processes people’s physiologies, can also track individuals and is potentially biased against certain groups of people. They also warn of a potential chilling effect as public spaces move toward constant surveillance.

Alouette, a pseudonymous member of the French citizen advocacy group La Quadrature du Net, warns: “We think it is as dangerous as facial recognition. It’s being used in public spaces and can be used on CCTV all over the city. It can analyze our bodies and our behaviors. It still can track people and target individuals. Even without knowing someone’s identity, it can still lead to harassment.”

Caroline Lequesne Roth, a professor of public law and digitalization at the Côte d’Azur University, cautions that algorithm-driven CCTV can be particularly discriminatory toward vulnerable members of society such as people with disabilities, disadvantaged communities, or homeless people. For example, an ableist bias within AI surveillance software that defines what ‘normal behavior’ looks like can easily lead to people with disabilities being flagged as behaving suspiciously. Similar biases have been found in software used to monitor student examinations during Covid-19, where disabled students were disproportionately flagged as potentially cheating.

Sporting Events as AI Surveillance Labs

The use of algorithm-driven surveillance software in sports stadiums has mushroomed all over the world. As early as 2001, the city of Tampa, Florida, secretly used facial recognition during the Super Bowl. Russia used facial recognition in Moscow’s metro during the 2018 FIFA World Cup and has since deployed it against protesters. Other mega-sporting events, such as the 2020 Tokyo Olympic Games and the 2022 Beijing Winter Olympics, have also deployed facial recognition.

The connection between algorithm-driven surveillance and sports is no coincidence. “Sporting events are attractive because they create a setting in which you have got a lot of people all aligned, sitting still for the most part, looking in the same predictable direction. So for systems that want to identify faces, for example, it’s a pretty good testing ground,” explains Mark Andrejevic, co-author of “Facial Recognition” and professor of media studies at Monash University. “Sporting events are, in a way, exceptional sites that have associated security concerns. Historical occurrences include everything from terrorist attacks to hooliganism and various forms of disruptions. The stadium is a site that has been targeted for securitization.”

In Europe, smaller sporting events have also been used to experiment with algorithmic video surveillance of crowds. In France, the football club FC Metz used an experimental facial recognition system to identify people subject to stadium bans and to detect abandoned objects; the deployment was later found to be illegal. Similarly, facial recognition was used at the Brøndby stadium in Denmark to keep ‘troublemakers’ out.

However, the most significant use of algorithm-driven CCTV in Europe can be found in the UK, where sporting events were the springboard for testing the technology for day-to-day policing. The first proposal to use facial recognition at a stadium came from the Scottish Football League in 2016. Backlash followed quickly, as fans and supporters’ groups protested against what they saw as an invasive technology that promoted the harmful image of football fans as criminals.

In the following years, live facial recognition cameras were trialed by South Wales Police and the Metropolitan Police. During the 2017 Champions League final in Cardiff, live facial recognition wrongly identified more than 2,000 spectators as potential criminals. South Wales Police made 450 arrests based on these faulty results. “What we’ve seen, basically, in total contradiction to football fans’ and supporters’ groups’ wishes, is that sporting events have been particularly targeted by facial recognition in these trials,” says Madeleine Stone, Legal and Policy Officer at Big Brother Watch.

In the UK, the use of facial recognition at sports events has since spread to the surveillance of shopping malls, museums, and cultural events such as London’s Notting Hill Carnival, and most recently to ordinary public spaces such as Highbury and Islington Underground station on 20 April 2023.

In April 2023, the National Physical Laboratory (NPL) published an independent report on the accuracy of the algorithms used by the Metropolitan Police. The Met said in a press statement: “At the setting we have been using it, there is no statistically significant bias in relation to race and gender and the chance of a false match is just 1 in 6000 people who pass the camera.”

However, as Stone says: “The police have framed it as giving them the green light to use facial recognition again. But once you dig into the report, it shows that there is significant racial and gender bias within the technology. However, police can adjust settings in order to mitigate this.”
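To put the ‘1 in 6000’ figure in perspective, a short back-of-the-envelope calculation (our illustration, not the NPL’s) shows how a per-person false match rate scales with crowd size, assuming everyone passing the camera is scanned once. The crowd sizes below are hypothetical round numbers:

```python
# Illustrative arithmetic only. The 1-in-6000 false match rate is the
# Met's own figure; the crowd sizes are hypothetical round numbers.
FALSE_MATCH_RATE = 1 / 6000

for crowd in (10_000, 100_000, 500_000):
    expected = crowd * FALSE_MATCH_RATE
    print(f"{crowd:>7,} people scanned -> ~{expected:.0f} innocent people flagged")
```

At the scale of the crowds the Met went on to scan, a rate that sounds negligible per person still implies dozens of wrongly flagged individuals.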

Since the report was published, the Metropolitan Police has also used live facial recognition at King Charles III’s coronation, the biggest deployment to date. The Met scanned the faces of hundreds of thousands of spectators and was criticized for its repressive actions against anti-monarchy protesters. South Wales Police swiftly followed suit, scanning the faces of thousands of fans attending Beyoncé’s concert in Cardiff.

With the Olympic law, France could soon see a similar expansion of algorithmic surveillance technologies in public spaces as its neighbor across the Channel. Several French cities have already invested in algorithm-driven CCTV over the past few years, and the Olympic law is likely to push the door open further.

A profitable industry

The French government has been working on such a proposal for years, and French police have been pushing to introduce these technologies to assist their work. Lequesne Roth says: “The Olympics are accelerating the dynamics to adopt this kind of technology, but the dynamics were already there.” She adds that the law “is going to legitimize these tools and give legal grounds that will facilitate the deployment for the police and have a normalizing effect.”

Lequesne Roth is also concerned about the lack of safeguards and the haste with which the law was passed. “In order to use this kind of system you need authorities that understand how these technologies work and that can mitigate the risks in terms of fundamental rights. And that takes time.”

The AI surveillance industry is currently estimated to be worth 15 billion US dollars and is forecast to grow to 52 billion by 2030. These numbers fit neatly into Emmanuel Macron’s plans to make France a leader in AI and establish “la French tech” as a competitor to the US and Chinese tech industries.

According to an investigation by La Quadrature du Net, the French National Research Agency, in conjunction with the Ministry of the Interior and the 2024 Olympic Organizing Committee, awarded several French companies research funding of up to €500,000 to develop innovative security projects for the Olympic Games. Alouette describes the law as “a gift for French companies because they were given money from the state to test these technologies. The population will be treated like guinea pigs for these companies to experiment with their technologies.” While it remains to be seen which companies will be contracted to provide the surveillance software during the Olympics, indications are that French and European companies will be preferred. With such large-scale investments, infrastructure, and legal precedents put in place through the law, it is unlikely that algorithm-driven cameras will simply disappear from French streets and stadiums at the end of 2024.


Dr. Jennifer Krueckeberg (she/her)

Former Fellow, Algorithmic Accountability Reporting

Jennifer has recently completed an EU-funded PhD in anthropology in which she explored how digital media affect young people’s personal memory practices. Before embarking on her PhD, she worked as Lead Researcher at a London-based non-profit organization researching facial recognition, data exploitation, and surveillance in schools. As part of her fellowship, Jennifer investigated the impacts of algorithms on education, surveillance, and people’s everyday lives.