#discrimination (17 results)

Page 1 of 2

Publication, 7 September 2022

AutoCheck workshops on Automated Decision-Making Systems and Discrimination

Understanding causes, recognizing cases, supporting those affected: documents for implementing a workshop.


Publication, 21 June 2022

How to combat algorithmic discrimination? A guidebook by AutoCheck

We encounter automated decision-making systems almost every day, and they may discriminate without our even knowing it. A new guidebook helps to recognize such cases and support those affected.


Position, 24 March 2022

Algorithmic Discrimination – How to adjust German anti-discrimination law

In its coalition agreement, the new German government signaled its intention to evaluate the German anti-discrimination law (Allgemeines Gleichbehandlungsgesetz – AGG). We call on the government to account for the special features of algorithmic discrimination, for instance by introducing collective redress mechanisms to better protect the rights of those affected.


Story, 4 February 2022

Costly birthplace: a discriminatory insurance practice

Two residents of Rome with exactly the same driving history, car, age, profession, and years of holding a driving license may be charged different prices for car insurance. Why? Because of their place of birth, according to a recent study.


Story, 13 January 2022

Fixing Online Forms Shouldn’t Wait Until Retirement

A new Unding survey is investigating discrimination in online forms. But operators are already getting angry emails. Behind some of them: a recently retired IT consultant with one of the most common surnames in the world and 30 years' experience of not being able to sign up.


Story, 31 August 2021

LinkedIn automatically rates “out-of-country” candidates as “not a fit” in job applications

A feature on LinkedIn automatically rates candidates applying from another EU country as “not a fit”, which may be illegal. I asked six national and European agencies about the issue. None seemed interested in enforcing the law.


Interview with Jessica Wulf, 25 May 2021

“We’re looking for cases of discrimination through algorithms in Germany.”

The AutoCheck project investigates the risks of discrimination inherent in automated decision-making systems (ADMS). In this interview, project manager Jessica Wulf talks about the search for exemplary cases and how the project will support counseling centers and further education on the topic.


Story, 6 April 2021

Europeans can’t talk about racist AI systems. They lack the words.

In Europe, several automated systems, either planned or operational, actively contribute to entrenching racism. But European civil society literally lacks the words to address the issue.


Project, 9 March 2021

AutoCheck – Mapping risks of discrimination in automated decision-making systems


Story, 4 December 2020

Health algorithms discriminate against Black patients, also in Switzerland

Algorithms used to assess kidney function or predict heart failure use race as a central criterion. Continue reading the story on the AlgorithmWatch Switzerland website.
