The year algorithms escaped quarantine: 2020 in review

Amid the Covid pandemic, governments and corporations stepped up their deployment of automated systems. Civil society initiatives attempted to keep some of them in check.

28 December 2020

2020 probably began on 31 December 2019, when BlueDot, a self-styled “human and artificial intelligence” company, detected a cluster of pneumonia cases in Wuhan. Or maybe the year began on 30 December 2019, when ProMED, a monitoring service run by the non-profit International Society for Infectious Diseases, sent an alert on the same topic. Given that BlueDot’s claim remains unaudited, it is impossible to know whether their automated system independently spotted the first signs of Covid-19 or whether they simply forwarded ProMED’s warning.

In the wake of the pandemic, companies and governments alike opened the gates to automated systems. They probably hoped that technology would help them regain the upper hand. Cunning entrepreneurs were quick to ride the techno-solutionist wave. They offered companies systems that constantly film employees and automatically monitor their body temperature (never mind that not all infected people run a fever, and that such systems can be used to detect pregnancies). They came up with wearables that track bodily functions (in use on some US campuses), and with cameras that automatically spot people not wearing face masks.

The Automating Society report, published by AlgorithmWatch and Bertelsmann Stiftung in October, shows how widespread the usage of automated systems has become in Europe. The automation of welfare and health management stood out, but the report documents dozens of cases impacting all aspects of our lives.

Backlash

The acceleration of the automation of society did not go unopposed. In August, British students took to the streets chanting that the algorithm that automatically assigned their grades was not legitimate (they used another word). As a result, the British government backed down and kept the grades teachers had given.

Regulators and courts throughout the European Union clamped down on some experiments in automation. The cameras that spotted people not wearing face masks in France were deactivated after the country’s data protection authority declared them illegal. In the Netherlands, a court ruled against SyRI, a system that attempted to automatically spot fraudsters in welfare services, calling it disproportionate. In Austria, an algorithm that attempted to sort unemployed people based on their “employability” was suspended.

Ramming through

Governments were unimpressed by these decisions. In the Netherlands, a bill was introduced to set up a new SyRI-like system, on a much larger scale. The automated grading of students was allowed in several countries, including Germany, despite grave concerns about its fairness.

In several EU countries, such as France and Greece, the police are preparing to deploy live face recognition. This would allow law enforcement to identify individuals in a crowd, in real time. Singling out protesters and taking away the protection afforded by anonymity in collective action strips the freedom of assembly of any substance.

Despite calls for a ban on such weapons, a short but intense war in Nagorno-Karabakh demonstrated the usefulness of autonomous weapons. Azerbaijani forces used at least three different models of drones capable of identifying and destroying a target automatically. Some analysts believe that the tactical innovation brought by these new weapons contributed to Azerbaijan’s victory.

Inside the black boxes

Several groups of activists and journalists did shed light on some of the black boxes used by governments and corporations. A taz investigation revealed that the photo booths used by German authorities refused to take pictures of citizens with darker skin. The BBC came up with similar results in the United Kingdom. AlgorithmWatch revealed several cases of discrimination by automated systems against people with darker skin, People of Color, and gay people, as well as cases of blatant sexism.

Although most of these stories uncovered illegal behavior, regulators and enforcement agencies have yet to take action. In most cases, they do not have the necessary resources to do so. AlgorithmWatch published several recommendations that could, if put into practice, remove this bottleneck by forcing operators of automated systems to give researchers access to their data. Amsterdam and Helsinki took steps in this direction by launching a repository that lists some of the automated systems they use.

The other catastrophes

The pandemic eclipsed the climate catastrophe in our news feeds, but the latter continued at an increased pace. Humans burnt a record area of the Amazon in 2019, and destroyed even more in 2020. The Atlantic hurricane season was the most active on record, and thousands died from floods, droughts and other events brought on by man-made global heating.

Crucially, we could find no evidence that any of these events disrupted automated systems. On the contrary, there is evidence that they were more resilient than other infrastructure. As typhoon Ulysses wreaked havoc in the Philippines, traditional means of communication were severed. Telephone lines were saturated, and the television network that used to gather and verify information had been shut down by the government. As a result, both citizens and authorities relied on online social networks to organize. This probably left the prioritization of rescue operations up to the newsfeed algorithms.

2021

The year ended with a US election and a botched coup attempt. Contrary to our predictions, few journalists claimed that big tech’s algorithms influenced the results. However, there is ample evidence that Facebook and YouTube, in particular, are still knowingly propagating falsehoods.

As citizens spend even more time on their mobile devices due to the various lockdowns, platform power does not look set to diminish. How they address several key events of 2021 – the beginning of the new US administration, the Covid vaccination campaign and the election in Germany, to name a few – will profoundly impact our societies.

It is more important than ever to shed light on their black boxes. We, as a society, must hold them, and any operator of automated systems, to account. As long as this is not possible we, as AlgorithmWatch, will keep fighting, however modestly, to ensure that decision-makers work towards making this vision reality.

Nicolas Kayser-Bril

Reporter

Nicolas is a data journalist working for AlgorithmWatch as a reporter. He pioneered new forms of journalism in France and in Europe and is one of the leading experts on data journalism. He regularly speaks at international conferences, teaches journalism in French journalism schools and gives training sessions in newsrooms. A self-taught journalist and developer (and a graduate in Economics), he started by building small interactive, data-driven applications for Le Monde in Paris in 2009. He then built the data journalism team at OWNI in 2010 before co-founding and managing Journalism++ from 2011 to 2017. Nicolas is also one of the main contributors to the Data Journalism Handbook, the reference book for the popularization of data journalism worldwide.