Image classification algorithms at Apple, Google still push racist tropes

Automated systems from Apple and Google label cartoon characters with darker skin tones “Animals”.

Roy A., a Berlin-based lawyer who also campaigns against discrimination, recently told AlgorithmWatch that some pictures depicting Black people appeared under the label “Animal” (“Tier” in German) on his iPhone 8.

Apple added the automated labeling feature to iOS in 2016. According to the company, the process takes place entirely on the user’s device. Perhaps due to the phone’s limited resources, it labels images erratically. An AlgorithmWatch colleague, who is white, reported that her children had been labeled “Animal” in the past.

However, a slight modification to the image reported by Mr A. was enough for Apple’s software to stop applying the “Animal” label.

While both images were labeled “People”, only the characters with darker skin tones were labeled “Animals” by Apple.

The behavior is not unique to Apple’s software. We processed both images in Google Vision, an online service for image labeling. The image depicting characters with darker skin tones was labeled “Mammals”, the other was not. Other labels, including “Human Body”, were similar for both pictures.

While the label is biologically accurate for humans, a cursory search on Google Images shows that, in Google’s taxonomy, “Mammal” applies only to non-human animals.

We tested other cartoon characters in Google Vision. In most cases, a dark overlay on the character’s skin was enough for the system to label it an animal instead of a human.
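AlgorithmWatch has not published the exact procedure used for these tests. As a minimal sketch, one way to apply a dark overlay is a per-pixel alpha blend toward black; the function name, opacity value, and sample color below are illustrative assumptions, not the actual methodology.

```python
def apply_dark_overlay(pixel, opacity=0.5):
    """Blend an RGB pixel toward black: each channel is scaled by (1 - opacity).

    This mimics compositing a semi-transparent black layer over the image.
    """
    return tuple(round(channel * (1 - opacity)) for channel in pixel)

# Hypothetical light skin tone, before and after a 50% dark overlay
skin_tone = (240, 200, 170)
darkened = apply_dark_overlay(skin_tone, opacity=0.5)
print(darkened)  # (120, 100, 85)
```

In practice such an overlay would be applied to every pixel of the character’s skin before resubmitting the image to the labeling service and comparing the returned labels.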

Tracy Frey, a director for Responsible AI at Google, told AlgorithmWatch that "this result [was] unacceptable" and that "multiple teams ... [were] investigating to understand what went wrong. [Machine Learning] models learn from existing data collected from the real world, and so an accurate model may learn or even amplify problematic pre-existing biases in the data based on race, gender, religion or other characteristics. Avoiding the creation or propagation of unfair bias is one of our core AI Principles and the work of fighting unfair bias is never complete; we are always striving to be better," Ms Frey added.

Apple did not answer requests for comment sent by email on 12 May.

No isolated case

Google’s image labeling services have a history of producing discriminatory and racist outputs. In 2015, Google Photos labeled individuals with dark skin tones “gorillas”. The company apologized but, according to a report by Wired, did not solve the problem. Instead, it simply stopped returning the “gorilla” label, even for pictures of that specific mammal.

In 2020, AlgorithmWatch revealed that a hand with dark skin holding an infrared thermometer was labeled “gun”, while the same hand with a light overlay was labeled “tool”. The company apologized and now only displays the “gun” label when its software recognizes one with a high level of certainty.

It seems, from these examples, that Google has yet to address the root cause of the racist biases in its image labeling services.

Lacking diversity

There are many reasons why Apple’s and Google’s image labeling systems display racist results. It could be that the training data sets are unbalanced and have few Black people in them. It could be that the people who tagged images in these training data sets purposefully used racist tropes in their tagging. It could be that the models developed by the engineers do not optimize for fairness. Absent an independent audit of these systems, the precise causes cannot be identified.

Both Apple and Google remain overwhelmingly white. The proportion of Black workers in tech and management roles at both companies increased in the past seven years, but remains in the low single digits. At Apple, the proportion of Black workers in tech roles remained flat.

Old prejudice

Denying racialized people their humanness is a core component of modern racism. Several racist scholars of the 18th and 19th centuries, such as Christoph Meiners or Louis Agassiz, claimed that non-whites had a separate origin, effectively treating racialized people as a different species. In general, proponents of racism agreed that racialized people were closer to animals than whites.

Viewing racialized others as non- or less human had concrete manifestations. Racialized people were regularly displayed in ethnographic exhibitions, behind fences or alongside animals. The last such event in Europe was organized in 1994, in France.

Labeling Black characters “Mammals” or “Animals” is not a bug that can be “fixed”. It is the continuation of over two centuries of institutional racism.

A man from the Malabar Coast of India in Paris in 1902.

Edited 14 May 21:10 to add Google's statement.

Nicolas Kayser-Bril

Reporter

Photo: Julia Bornkessel, CC BY 4.0
Nicolas is a data journalist working for AlgorithmWatch as a reporter. He pioneered new forms of journalism in France and in Europe and is one of the leading experts on data journalism. He regularly speaks at international conferences, teaches journalism in French journalism schools and gives training sessions in newsrooms. A self-taught journalist and developer (and a graduate in Economics), he started by building small interactive, data-driven applications for Le Monde in Paris in 2009. He then built the data journalism team at OWNI in 2010 before co-founding and managing Journalism++ from 2011 to 2017. Nicolas is also one of the main contributors to the Datajournalism Handbook, the reference book for the popularization of data journalism worldwide.