Report algorithmic discrimination!

AlgorithmWatch wants to shine a light on where and how algorithmic discrimination can take place. Do you have reason to believe that algorithmic discrimination may have taken place? If so, please report it to us: your report helps us better understand the extent of the issue and the havoc algorithmic systems can wreak on our lives. Your hints can help us make algorithmic discrimination more visible and strengthen our advocacy for appropriate guardrails.

Want to learn more about this topic? You can read more about the causes of algorithmic discrimination and our demands for better protection from it here.

FAQs

What is algorithmic discrimination?

Institutions throughout our society rely on automated decisions. Algorithmic decision-making systems process tax returns, evaluate job applications, make recommendations, and predict crimes or the chances of refugees being integrated into the labor market. Such systems are not neutral: people, with their particular assumptions and interests, influence how the systems are developed and used, and the systems reproduce patterns of discrimination that already exist in society. Some people can therefore be discriminated against when they have to deal with an algorithmic system. You can find more information on this here.

What can I do if I suspect that I was discriminated against by an algorithmic system?

Report the incident to us! We will help investigate it further. When we report on such incidents, more and more people come across this important issue and become aware of what they can do if they are affected by algorithmic discrimination. With increased awareness, we can advocate for better safeguards and put pressure on politicians to curb the problem.

You can also contact anti-discrimination counselling offices to better assess whether discrimination has occurred. The Federal Anti-Discrimination Agency can help or refer you to a counselling office near you. In some cases, it may make sense to contact consumer protection agencies.

Not all cases of discrimination can be brought before a court. In Germany, the General Equal Treatment Act applies. It primarily protects people who fall into specific protected categories. This list is currently limited to six grounds: race or ethnic origin, gender, religion or belief, disability, age, and sexual identity. If you think that the unequal treatment has to do with one of these personal characteristics, this might be a case in which you could take legal action.

However, the General Equal Treatment Act only applies to certain areas of life, such as access to goods and services, education, and work. People are not yet sufficiently protected against discrimination by state authorities.

Where could the use of automated decision-making systems lead to unjustified or unequal treatment?

Automated decision-making systems can be found almost everywhere nowadays. A creditworthiness check, for example, is usually initiated automatically when we request installment payments in an online shop. Increasingly, companies also employ algorithms for making personnel decisions and coordinating work processes, and authorities use algorithms both overtly and covertly – for instance, to generate tax assessments or allocate daycare spots, as well as in policing, in the justice system, and so on.

Unequal treatment often becomes evident when comparing the experiences of two individuals engaging with the same system. If one person was offered a different price for a service than another, or if one request was declined while a similar one was accepted, this might suggest that an automated risk assessment – such as one estimating the likelihood of filing an insurance claim – was conducted differently. This can, of course, also happen when a person, not an automated system, is responsible for assessing an application. It would be our job to investigate further whether this is happening on a large scale because an automated system is in use.

Many discriminatory effects are not immediately evident. If, for example, the police use a "predictive policing" tool that automatically generates patrol plans based on predictions and crime statistics, certain neighborhoods may end up being patrolled more frequently. In such a case, it is not apparent to outsiders that an automated system is the reason why the police are intensifying their presence in a specific area.

What information is necessary to be able to investigate my case?

Please share your experience with us and briefly explain in what ways you perceived the incident to be unjust or discriminatory. Feel free to also upload any files that support the points you’re making, e.g., photos of letters or screenshots of websites, forms, or web applications, or send us a short audio recording or video in which you explain the case instead.

Here is a rough example of what a message to us could sound like:

Hello AlgorithmWatch Team,

I recently completed an online form and my insurance company X provided a quote at a higher price than someone I know has to pay. I suspect that the automated pricing system associated my personal data and the characteristics of my profile with a higher risk of claims. I assume that my personal situation is factored in...

Please note: Algorithms may discriminate differently from humans and in ways that don't necessarily make sense to us. Your personal data might, for example, be correlated in ways that aren't immediately apparent, or you might fall into a very specific group category that is discriminated against, such as "female dog owners above the age of 40 living in postal code area 00000." It is therefore most valuable to reflect on the information you provided in a specific situation, such as when filling out a form for a particular service or purchase.

What can AlgorithmWatch do about my individual experience?

We operate as an advocacy and research organization. We conduct journalistic investigations, do academic research, and advocate for improved protection from algorithmic discrimination. This may include holding companies accountable, drawing the attention of the media and the public to problems, and reminding politicians that they must uphold the protection of fundamental rights. Your firsthand experience will help us raise awareness, use evidence to exert pressure on those employing discriminatory systems, and advocate for necessary regulation.

Who can I contact for personal consultations in cases of unjust treatment or discrimination in general?

AlgorithmWatch can advise you on incidents specifically concerning algorithmic discrimination and analyze what is behind the phenomenon. We follow up on cases when we receive indications that the use of AI systems is leading to discrimination. However, AlgorithmWatch is not an official counselling organization and cannot provide legal advice. We therefore recommend that you always contact equality or counselling offices that offer legal advice. Below this FAQ section, you will find a list of equality bodies in Germany and the EU that you are advised to contact in addition and that can offer consultation in cases of unjust treatment or discrimination. If you are located in Switzerland, please see the information provided by our colleagues at AlgorithmWatch.ch.

How will AlgorithmWatch handle my data?

Any personal data you provide, such as your name, email address, or phone number, will be used solely to facilitate communication with you. You can report to us anonymously, use a pseudonym, or simply not provide any contact information if you do not wish to be contacted further. Please indicate your preference in the options provided in the form and tell us whether, and to what extent, we can use the information you share. Please also take the privacy of others into account: do not pass on data from third parties to us. For additional details, refer to our privacy policy available here.

Can I report a potential case that happened anywhere in the world?

The AlgorithmWatch team based in Germany can assist you if you are located in the EU, while the team based in Switzerland can help you if you are located there. If you are not located in either of these regions or are unsure about an incident you’d like to report, you can still get in touch with us. We may be able to direct you to colleagues who can provide you with expertise on your region and give assistance with your case. See our contact information at the bottom of this page.

We would like to thank the Federal Anti-Discrimination Agency of Germany for their feedback regarding the development of our reporting form.

Spread the word about our campaign and share it on social media.

Would you like to stay informed about our efforts to combat algorithmic discrimination and other work by AlgorithmWatch? Then you can subscribe to our newsletter here.