Press release
Algorithmic discrimination: New reporting form to collect cases

Berlin, 19 May 2025. German NGO AlgorithmWatch just launched a new online reporting form to document incidents of algorithmic discrimination.
More and more institutions and companies rely on automated or semi-automated decisions to sort job applications, grant loans and social benefits, or “predict” behavior and crimes. With this expanding use, the risk of systemic discrimination increases.
With this new reporting tool, AlgorithmWatch not only gives people a platform to report incidents of algorithmic discrimination; the NGO will also investigate incoming reports to uncover discrimination and confront the providers of such systems, as well as policymakers, with evidence.
All over Europe, there have recently been repeated reports of discrimination by algorithmic systems, for example:
- In France [1], 13 million households were affected by a social welfare system that marked people with disabilities, single mothers, and people with low income as a fraud risk. In many cases, this led to benefits being automatically cancelled.
- In Austria [2], a chatbot used by the Public Employment Service AMS reinforced antiquated gender roles: it recommended that women take up gender studies while steering men toward IT.
- In the Netherlands [3], the Vrije Universiteit Amsterdam used face recognition software to prevent students from cheating in examinations. The system flagged students with darker skin tones disproportionately often.
“Algorithmic systems are neither neutral nor objective. They reproduce the biases that exist in society. The data used for the systems’ ‘machine learning’ reflects existing social rifts and prejudices. As a result, algorithmic systems are prone to reinforcing discriminatory patterns,” says Pia Sombetzki, Senior Policy Manager at AlgorithmWatch.
More information:
The launch of the reporting tool marks the start of the campaign “Algorithmic Discrimination.” Over the summer, AlgorithmWatch will focus on raising awareness of discrimination by algorithmic systems in the contexts of work, migration, and gender.