How to combat algorithmic discrimination? A guidebook by AutoCheck

We encounter automated decision-making systems almost every day, and they might be discriminating against us without our even knowing it. A new guidebook helps to better recognize such cases and support those affected.

Our new report “Automated Decision-Making Systems and Discrimination” takes a deeper look into algorithmic discrimination: what it is, how it is caused, and what can be done about it. With comprehensive instructions, diagrams, and case studies, the guidebook gives anti-discrimination counselors as well as affected people a basic introduction to the topic, and supports them in seeking help with checklists, sources, and contact information.

One of the major problems in fighting discrimination by algorithmic systems is that it often goes unnoticed: in most cases, those affected do not know they have been discriminated against. But when algorithmic systems make or suggest discriminatory decisions, many people are potentially affected, because all decisions follow the same pattern. These risks are systematic, and the overall societal impact can be large.

This publication is part of the AutoCheck project, which is funded by the German Federal Anti-Discrimination Agency. The aim of the project is to reduce this lack of awareness by developing concrete and comprehensive instructions and tools for anti-discrimination counseling centers. We want to help employees of anti-discrimination counseling offices better recognize and assess risks, and thus better support the individuals affected. Jessica Wulf, project manager and author of the report, began researching cases and conducting interviews with anti-discrimination counselors and experts in February 2021. The guidebook as well as training courses are based on this comprehensive body of work.

The case studies discussed in the report show how urgently German anti-discrimination laws need to be updated. In the 21st century, where algorithmic decision-making permeates our everyday lives (whether we are shopping online or applying for a loan), the legal framework for taking action against algorithmic discrimination must respond to its special characteristics, especially its invisibility and the frequent absence of clearly responsible parties. Anti-discrimination bodies need more rights and resources to better support those affected.

How should Germany’s General Equal Treatment Act (Allgemeines Gleichbehandlungsgesetz, AGG) be reformed?

Collective redress mechanisms
Through their specialized work, interest groups and associations have the capacity to uncover algorithmic discrimination, draw attention to it, and take legal action against it. However, this requires collective redress mechanisms, so that it is not only those affected who can take action against possible discrimination under the AGG.

More transparency and clear responsibilities in cases of suspected discrimination
More transparency and clear responsibilities must be ensured in cases of suspected discrimination by ADM systems, for example by strengthening the obligations to provide information to anti-discrimination bodies and affected persons, and by creating audit procedures and appeal options for those affected.

Expand the areas of application and protected characteristics in the AGG
Expanding the areas of application and the protected characteristics of the AGG is crucial with regard to algorithmic discrimination. So-called proxy variables can be used in ADM systems in place of the characteristics protected under the AGG as a basis for decision-making. For example, an ADM system used in job application processes might not reject individuals based on their age, since age is a protected characteristic under the AGG. As a proxy variable, however, the system could use the length of previous work experience to identify older people anyway and exclude them from the application process.
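The proxy mechanism described above can be made concrete with a small sketch. This is an illustrative toy example, not taken from the report: the applicant data, names, and threshold are invented, and the screening rule stands in for whatever logic a real ADM system might use. The point is that a rule which never reads the protected attribute can still reproduce discrimination through a correlated variable.

```python
# Toy example (hypothetical data): how "years of work experience" can act
# as a proxy for the protected characteristic "age" in a screening rule.

applicants = [
    {"name": "A", "age": 28, "years_experience": 5},
    {"name": "B", "age": 52, "years_experience": 28},
    {"name": "C", "age": 31, "years_experience": 8},
    {"name": "D", "age": 58, "years_experience": 33},
]

def screen(applicant):
    # The rule never touches "age". But because long work histories
    # correlate strongly with higher age, rejecting applicants with more
    # than 15 years of experience filters out mostly older people.
    return applicant["years_experience"] <= 15

shortlisted = [a["name"] for a in applicants if screen(a)]
print(shortlisted)  # only the younger applicants remain: ['A', 'C']
```

In this sketch, an audit that only checks whether the protected attribute is an input would find nothing, which is exactly why the report argues for audit procedures that examine outcomes, not just inputs.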

The German government must recognize the need for action with regard to algorithmic discrimination and create sufficient protection for those affected as part of the planned evaluation of the AGG.

Interested in more reports like this? Support our work!

Help us unpack and analyze algorithmic systems, uncover their harms and debunk their promises by funding more research projects and in-depth reporting.