Press release

Algorithmic discrimination: New reporting form to collect cases

19 May 2025

#discrimination

[Image: small pixel-style computer icon next to the words "Report algorithmic discrimination"]

Berlin, 19 May 2025. The German NGO AlgorithmWatch has launched a new online reporting form to document incidents of algorithmic discrimination.

More and more institutions and companies rely on automated or semi-automated decisions to sort job applications, grant loans and social benefits, or “predict” behavior and crimes. With this expanding use, the risk of systemic discrimination increases.

With the new reporting tool, AlgorithmWatch not only gives people a platform to report incidents of algorithmic discrimination; the NGO will also investigate incoming reports to uncover discrimination and confront providers of such systems and policymakers with the evidence.

All over Europe, there have recently been repeated reports of discrimination by algorithmic systems.

“Algorithmic systems are neither neutral nor objective. They reproduce the biases that exist in society. Data used for the systems’ ‘machine learning’ reflects existing social disruptions and prejudices. As a result, algorithmic systems are prone to reinforcing discriminatory patterns,” says Pia Sombetzki, Senior Policy Manager at AlgorithmWatch.

More information:

The introduction of the reporting tool marks the launch of the campaign “Algorithmic Discrimination.” Over the summer, AlgorithmWatch will focus on raising awareness of discrimination by algorithmic systems in the context of work, migration, and gender issues.