Algorithmic Discrimination – How to adjust German anti-discrimination law

In their coalition agreement, the new German government has signaled its intention to evaluate the German anti-discrimination law (Allgemeines Gleichbehandlungsgesetz – AGG). We call on the government to account for the special features of algorithmic discrimination, for instance by introducing collective redress mechanisms to better protect the rights of those affected.

Automated decisions increasingly permeate our everyday lives: for example, when our creditworthiness is assessed while shopping online, when we are shown personalized advertising based on online profiles, or when social benefits are calculated automatically. In all of these settings, discriminatory decisions occur time and again. Both human decisions about which criteria an automated decision-making system (ADM system) relies on and biased data sets, i.e. data sets that reflect discriminatory tendencies, can lead to discriminatory decisions by ADM systems.
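As a minimal illustration of the second mechanism, the following sketch (a hypothetical example we constructed, with invented districts and figures, not taken from any real system) shows how a system that simply learns from biased historical decisions reproduces the discrimination contained in that data.

```python
# Hypothetical illustration of bias in training data; the decisions below are
# invented and deliberately skewed against district "B".
historical_decisions = [
    {"district": "A", "repaid": True,  "approved": True},
    {"district": "A", "repaid": False, "approved": True},
    {"district": "A", "repaid": True,  "approved": True},
    {"district": "B", "repaid": True,  "approved": False},
    {"district": "B", "repaid": True,  "approved": False},
    {"district": "B", "repaid": False, "approved": False},
]

def learned_approval_rate(district: str) -> float:
    """A naive 'model' that simply mirrors past approval rates per district."""
    rows = [d for d in historical_decisions if d["district"] == district]
    return sum(d["approved"] for d in rows) / len(rows)

# Repayment behaviour is comparable in both districts, yet the learned scores
# inherit the historical bias instead of correcting it.
for district in ("A", "B"):
    print(district, learned_approval_rate(district))
# Prints: A 1.0 and B 0.0
```

The point of the sketch is that no one has to program the system to discriminate: learning from a skewed record of past decisions is enough to carry the discrimination forward.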

A legal framework to protect against discrimination was introduced in Germany in 2006 with the Allgemeines Gleichbehandlungsgesetz (AGG). It can also be applied in cases of discrimination through ADM systems. However, algorithmic discrimination places special demands on anti-discrimination law. Discrimination through algorithms often goes unnoticed, so those affected do not know they have been discriminated against. Under the AGG, however, only those affected can take legal action against discrimination. Algorithmic discrimination thus often goes unchallenged.

Algorithmic discrimination is structurally embedded in an ADM system; it therefore takes place systematically. It is precisely here that the legal framework for taking action against algorithmic discrimination must be adjusted. The German Data Ethics Commission (Datenethikkommission) has likewise pointed out that the AGG's areas of application in their current form are not comprehensively geared to algorithmic discrimination and that adjustments are necessary.

In the coalition agreement, the new federal government promised to evaluate the AGG, improve legal protection, and adjust its scope of application. The following policy recommendations are intended to ensure that the AGG becomes an effective instrument against algorithmic discrimination.

Policy Recommendations

1. Support for anti-discrimination bodies and awareness-raising

Educational work and awareness-raising with regard to algorithmic discrimination must be promoted, including through specific support for anti-discrimination bodies.

2. Expand the areas of application and protected characteristics in the AGG

Expanding the AGG's areas of application and the characteristics it protects is crucial with regard to algorithmic discrimination. So-called proxy variables can be used in ADM systems as a basis for decision-making in place of the characteristics protected under the AGG. For example, an ADM system used in job application processes might not reject individuals based on their age, since age is a protected characteristic under the AGG. As a proxy variable, however, the system could use the length of previous work experience to nevertheless identify older applicants and exclude them from the application process.
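The following minimal sketch (a hypothetical example constructed for illustration, not an actual hiring system and not part of the AGG) shows how such a proxy works: the screening rule never references age, yet a cut-off on years of work experience effectively removes the older applicants from the shortlist.

```python
# Hypothetical illustration of a proxy variable; the applicants, the cut-off
# value, and the screening rule are invented for this sketch.

applicants = [
    {"name": "P1", "age": 28, "years_experience": 4},
    {"name": "P2", "age": 31, "years_experience": 7},
    {"name": "P3", "age": 52, "years_experience": 27},
    {"name": "P4", "age": 58, "years_experience": 33},
]

def screen(applicant: dict) -> bool:
    # The rule never looks at "age", but rejecting everyone above a certain
    # length of work experience acts as a proxy for age.
    return applicant["years_experience"] <= 15

shortlisted = [a["name"] for a in applicants if screen(a)]
print(shortlisted)  # ['P1', 'P2']: only the younger applicants remain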

3. Introduce collective redress mechanisms in the AGG

Interest groups and associations have the capacity, through their specialized work, to uncover algorithmic discrimination, draw attention to it, and take legal action against it. This, however, requires collective redress mechanisms, so that under the AGG it is not only those affected who can take action against possible discrimination. Berlin's state anti-discrimination law (LADG) could serve as a model here.

4. More transparency and clear responsibilities in cases of suspected discrimination

More transparency and clear responsibilities must be ensured in cases of suspected discrimination by ADM systems, for example by strengthening obligations to provide information to anti-discrimination bodies and affected persons, and by creating audit procedures and appeal options for those affected.

As part of the planned evaluation of the AGG, the German government must recognize the need for action on algorithmic discrimination and create sufficient protection for those affected.

Read more on our policy & advocacy work on ADM in the public sector.
