Austria’s employment agency rolls out discriminatory algorithm, sees no problem

AMS, Austria's employment agency, is about to roll out a sorting algorithm that gives lower scores to women and to disabled people. The system is very likely illegal under current anti-discrimination law.

Story

6 October 2019

#publicsector #work

Nicolas Kayser-Bril
Reporter

Update, 21 August 2020: The Austrian data protection authority declared the system illegal; its deployment must be suspended. Read more at Der Standard.

The Austrian employment agency, known by its German acronym AMS, is a state-owned company in charge of helping job seekers. In 2016, it started a program to evaluate the chances of specific groups on the labor market.

Three years later, satisfied with the results of a statistical analysis carried out by Synthesis Forschung, an external contractor, at a cost of €240,000, AMS announced that it would start automatically assigning a score to each job seeker based on several attributes. Depending on the score, job seekers will land in one of three groups: group A for people who need no help finding a new job, group B for people who might benefit from retraining, and group C for people deemed unemployable, who will receive less help from AMS and may be referred to other institutions.
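Functionally, this triage amounts to cutting a predicted score at two thresholds. The sketch below shows how such a three-way split could work; the single-score simplification and the cutoff values are assumptions for illustration, since AMS has not published its exact thresholds.

```python
# Illustrative sketch of a threshold-based triage like the one described
# above. The cutoffs (0.66 and 0.25) are hypothetical placeholders; AMS
# has not published the thresholds it actually uses.

def assign_group(job_finding_chance: float) -> str:
    """Map a predicted job-finding probability to group A, B, or C."""
    if job_finding_chance >= 0.66:
        return "A"  # high chances: no extra support needed
    if job_finding_chance < 0.25:
        return "C"  # deemed "unemployable": less support from AMS
    return "B"      # in between: candidate for retraining measures

for chance in (0.80, 0.45, 0.10):
    print(chance, "->", assign_group(chance))
```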

Efficiency gains

The algorithm consists of a series of statistical models based on past employment records. Staff at Synthesis Forschung ran statistical regressions to find out which factors best predicted an individual’s chances of finding a job.
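To make the approach concrete, here is a minimal sketch of that kind of model: a logistic regression fitted on historical records to predict whether a person found a job. The feature names and the toy data are invented for illustration; the actual models, features, and training data have not been fully published.

```python
# A toy version of the modeling approach: fit a logistic regression on
# (invented) past employment records and inspect the learned weights.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per job seeker: [is_female, is_disabled, over_30]
X = np.array([
    [0, 0, 0], [0, 0, 1], [1, 0, 0], [1, 0, 1],
    [0, 1, 1], [1, 1, 1], [0, 0, 0], [1, 1, 0],
])
# 1 = found a job within the observation window (invented outcomes)
y = np.array([1, 1, 1, 0, 0, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# The fitted coefficients play the role of the published "weights":
# a negative coefficient lowers the predicted chance of finding a job.
for name, coef in zip(["is_female", "is_disabled", "over_30"], model.coef_[0]):
    print(f"{name}: {coef:+.2f}")
```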

According to Johannes Kopf, who sits on the board of AMS, the algorithm increases efficiency by ensuring that the agency does not waste resources on supporting people who will not, in the end, benefit from it. The scheme does not violate Article 22 of the General Data Protection Regulation, which prohibits decisions about individuals based solely on automated processing, Mr Kopf wrote, because AMS case workers can overrule the algorithm’s judgement.

Discrimination

Parts of the analysis have been made public. One document shows that, under a certain model, women are given a negative weight, as are disabled people and people over 30. Women with children are also negatively weighted but, remarkably, men with children are not.

An excerpt from the AMS algorithm’s documentation.
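To illustrate the effect of such weights, the sketch below applies invented coefficient values in a logistic score. The numbers are not the published ones, but the mechanism is the same: identical qualifications, different demographic attributes, lower score.

```python
import math

# Invented coefficients for illustration; not the values AMS uses.
WEIGHTS = {
    "is_female": -0.14,
    "is_disabled": -0.34,
    "over_30": -0.13,
    "woman_with_children": -0.15,  # no matching penalty for fathers
}
INTERCEPT = 0.6

def job_finding_chance(attributes):
    """Logistic score: sum the weights of the attributes that apply."""
    z = INTERCEPT + sum(WEIGHTS[a] for a in attributes)
    return 1 / (1 + math.exp(-z))

# Same qualifications, different demographic attributes:
print(job_finding_chance(["over_30"]))                         # father over 30
print(job_finding_chance(
    ["is_female", "over_30", "woman_with_children"]))          # mother over 30
```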

In other words, the AMS algorithm is more likely to assign an unemployed woman to a lower group even if her experience and qualifications match a man’s. For Catherine Barnard, a professor at Cambridge University’s faculty of law and a specialist in discrimination law, such a procedure is likely to fall afoul of Directive 2006/54, the main anti-discrimination instrument in the European Union. That case workers have some discretion over the algorithm “would be no defense,” she said, since any form of discrimination, direct or indirect, is prohibited.

“No justification”

Miriam Kullman, an assistant professor at the Vienna University of Economics and Business and an expert in algorithmic discrimination, concurs. “As far as I know, there is no way to justify [the AMS algorithm],” she said. EU law and case law define discrimination broadly and allow exceptions only in very specific cases, such as a theater recruiting for a male role, Ms. Kullman added.

In a written rebuttal to his critics, Mr Kopf did not deny that the algorithm was discriminatory but contended that AMS was committed to spending half of its resources on supporting women, and that women were underrepresented among those deemed unemployable in group C.

In an email statement to AlgorithmWatch, an AMS representative stated that the algorithm was “consistent with all anti-discrimination regulations”.

Lack of transparency

The AMS algorithm has been widely criticized in Austria since the release of the documentation paper mentioned above. Despite promises of transparency, AMS has released only two of the 96 statistical models it claims to use to assess job seekers.

An analysis of the available evidence by Paola Lopez, a doctoral candidate at the mathematics faculty of the University of Vienna, found that the algorithm’s thresholds stem from technical considerations rather than employment policy and could have been chosen very differently. The algorithm simply mirrors the discrimination that different groups face on the Austrian labor market, she wrote.

Experts and activists, including AlgorithmWatch, have long warned that algorithms, under a veneer of scientific objectivity, run the risk of replicating structural injustice and prejudice – at scale. When the AMS algorithm enters into force next year, Austrian job seekers who face discrimination will be the latest to suffer the consequences.

Edited on 7 October to better reflect the nature of group C. Wolfie Christl, a digital rights activist involved in the topic, pointed out to AlgorithmWatch that AMS has yet to detail its plans for that group, beyond the fact that it will be allocated fewer resources than group B.

Edited on 11 October to include AMS's answer.
