Europeans can’t talk about racist AI systems. They lack the words.

In Europe, several automated systems, either planned or operational, actively contribute to entrenching racism. But European civil society literally lacks the words to address the issue.

Nicolas Kayser-Bril
Reporter

In February, El Confidencial revealed that Renfe, the Spanish railway operator, had issued a public tender for a system of cameras that would automatically analyze the behavior of passengers on train platforms. Among the characteristics the system was required to assess was “ethnic origin”.

Ethnic origin can mean many things. But in the context of an automated system that assigns people a category based on their appearance, as captured by a camera, the term is misleading. “It seems to me that ‘ethnic origin’ is code for a crude essentialist (biological) notion of ‘race’,” Norma Möllers, an assistant professor of sociology at Queen’s University who focuses on the intersections of science, technology and politics, told AlgorithmWatch. “Take the following example: my mother is Batak, an ethnic indigenous minority in Indonesia. Renfe would likely not recognize that she’s Batak; it would recognize that she’s a brown woman. Hence, ‘ethnic origin’ appears to be a colorblind racist term to build race into the system without talking about race,” she said.
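
To see concretely why the label is misleading, consider what such a pipeline can actually compute. The sketch below is entirely hypothetical: the names, categories and thresholds are invented for illustration and do not come from Renfe’s tender or any real product. All a camera-based classifier has to work with are pixels, so whatever the output field is called, it can only bucket people by appearance:

    # Hypothetical sketch of a camera-based "ethnic origin" classifier.
    # All names, categories and thresholds are invented for illustration;
    # none of this comes from Renfe's tender or any real product.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Detection:
        skin_tone_index: float  # 0.0 (light) to 1.0 (dark), estimated from pixels
        face_embedding: List[float] = field(default_factory=list)  # opaque model output

    # The output claims to be an ethnicity, but the input carries no information
    # about ethnicity (culture, language, self-identification): only appearance.
    APPEARANCE_BUCKETS = ["bucket_a", "bucket_b", "bucket_c"]

    def classify_ethnic_origin(d: Detection) -> str:
        """Despite its name, this returns an appearance bucket, not an ethnic
        origin: a Batak woman lands in the same bucket as anyone else with a
        similar skin tone."""
        if d.skin_tone_index < 0.33:
            return APPEARANCE_BUCKETS[0]
        if d.skin_tone_index < 0.66:
            return APPEARANCE_BUCKETS[1]
        return APPEARANCE_BUCKETS[2]

Whatever the field is called in the tender, this is the shape of the computation: race coded as appearance, under another name.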

Renfe’s plan is not an exception. Several other projects rely on race without saying so. In the Dutch town of Roermond, the police use automated systems to track “mobile banditry”, a crime category they created that applies only to Roma people, a report by Amnesty International revealed last year. And European hospitals routinely modify the scores of Black patients on some tests, making them appear healthier than they are and possibly denying them treatment. The practice is based on flawed research, as AlgorithmWatch Switzerland showed.
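
One well-documented example of such a correction is the 2006 MDRD equation for estimated kidney function (eGFR), whose published coefficients multiply the score of patients recorded as Black by 1.212. The sketch below uses those published coefficients; the code itself is our illustration, not any hospital’s software, and the specific tests AlgorithmWatch Switzerland examined may differ:

    # The 2006 MDRD eGFR equation, a clinical formula with an explicit
    # race coefficient. The coefficients are published; this code is an
    # illustration, not any hospital's actual software.
    def egfr_mdrd(creatinine_mg_dl: float, age: int,
                  female: bool, black: bool) -> float:
        """Estimated glomerular filtration rate (mL/min/1.73 m^2)."""
        egfr = 175.0 * creatinine_mg_dl ** -1.154 * age ** -0.203
        if female:
            egfr *= 0.742
        if black:
            # The race "correction": the same blood test scores 21% higher
            # for a patient recorded as Black, potentially disqualifying
            # them from care that the lower score would have triggered.
            egfr *= 1.212
        return egfr

    # Same patient, same blood test, different race field:
    print(egfr_mdrd(1.4, 50, female=False, black=False))  # ~54: stage 3 kidney disease
    print(egfr_mdrd(1.4, 50, female=False, black=True))   # ~65: milder stage 2 on paper

A higher eGFR means healthier-looking kidneys, so the multiplier makes Black patients appear healthier on paper, which can delay or deny care.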

Taboo

Although racism was born of European practices (the Atlantic slave trade) and European thinkers, race disappeared from European public discourse after the Second World War. The topic became taboo, probably because European governments were eager to distance their own racist policies from Nazi Germany’s. The parallel between the two was all too clear at the time. Aimé Césaire, a French intellectual, wrote in 1950 that “[European governments] tolerated Nazism before it was inflicted on them […] because, until then, it had been applied only to non-European peoples”.

Unlike the United States and, to some extent, the United Kingdom, no country in the European Union has a national museum of slavery. Films and other cultural productions on colonialism or slavery are rare. The philosopher Achille Mbembe said in 2010 that France never decolonized itself and “kept intact the mental structures that legitimated” a domination based on race. This probably applies to every country of the EU.

Lacking words

As a result, talking about racial inequalities is close to impossible. Because so few European scholars study race, no vocabulary has emerged to replace the explicitly racist terms of the 20th century. There is no way to say “racial justice” in French or German. “Rassengerechtigkeit”, the literal German translation, sounds like Nazi language, Ms Möllers said.

While critical race theory is a growing field in many countries, it remains shunned in European academia. The French government even condemned scholars of race, arguing that they “corrupted” society. Such opposition makes it unlikely that we, Europeans, will develop a widely accepted language to talk about racial justice anytime soon.

Lacking action

This lack of words has real consequences. In 2017, the European Commission announced a review of how €42 billion earmarked for anti-discrimination between 2014 and 2020 was spent. When I asked the Commission how much of it went to racial justice, their spokesperson not only failed to answer the question but managed not to mention “racial justice” at all in his 1,400-word email. Instead, he spoke in general terms of “structural discrimination” and “reducing bias”.

Equinox, an initiative that brings together several civil society organizations, published a report in March highlighting the lack of a racial justice policy at EU institutions. The current tangle of responsibilities (antisemitism and racism fall under the remit of two different commissioners, for instance) is such that the European Commission simultaneously finances projects that attempt to reduce racial bias in technical systems, such as Sienna, and projects that further entrench racism, such as Identity. The latter published a research article claiming to identify race automatically, the kind of pseudo-science that racist systems such as Renfe’s rely on.

A holistic view

Concrete action on racial justice, in automated systems and elsewhere, is unlikely to move forward as long as there are no words for it. “If you can’t name a problem, you can’t fix it,” Ms Möllers, the sociology professor, said.

Some initiatives exist, especially among English speakers. The Digital Freedom Fund and European Digital Rights, two European non-profits, recently launched a program called “Decolonising Digital Rights”. The project aims to initiate a process that challenges the historical, structural causes of the oppression that automated systems are known to replicate and amplify.

Nani Jansen Reventlow, the director of the Digital Freedom Fund, told AlgorithmWatch that racial justice is rarely the focus of digital rights organizations. In her view, racial justice must be brought into the general conversation, and campaigners should develop a more holistic view of what “justice” means. They too often consider the average person to be white, male and able-bodied, she said.

As long as this does not happen, and as long as Europeans do not develop an adequate vocabulary to discuss race, white privilege will remain. “When developers say that they don’t use race, what it may actually mean is that they code default whiteness into the system,” Ms Möllers said. This “colorblind” racism is no different from avowed racism. It is just harder to fight.
