Poland: Government to scrap controversial unemployment scoring system

The Polish government has been forced to scrap a controversial scoring system for the unemployed after criticism from campaigners, judges and a human rights watchdog. The automated system makes life-changing decisions about what support individuals receive, based on their personal data and their answers in interviews at job centres. Critics say the system is discriminatory, lacks transparency and infringes data protection rights. As a result, the government has decided to end its experiment with profiling the unemployed, and the system will be scrapped by December 2019.

In 2014 the Polish government introduced a complex reform of the public employment service, Publiczne Służby Zatrudnienia (PSZ), which runs job centres in almost 350 towns and cities. These local offices are where people register when they become unemployed, and where they receive cash benefits and help getting back into the job market. But as in so many parts of the welfare system, resources at PSZ are scarce and assistance cannot be provided to everyone in need.

To tackle this problem, the government introduced a profiling mechanism that was supposed to tailor support to individual needs, reduce bureaucratic inefficiency and deliver better value for money.

How does it work?

The idea was simple: unemployed people are divided into three groups, based on how close they are to finding a job, and each group is given a different type of assistance adapted to its situation. The categorisation is carried out in a semi-automated way, with the help of a scoring system that assigns each person to one of three profiles based on 24 data points.

Eight data points are collected during registration at the job centre, including age, gender, disability and duration of unemployment. Further data is gathered during a computer-based interview. This consists of questions that sound open-ended, although in reality the answers are recorded by selecting from a list. For example, PSZ staff have to match the open answer to “What is the main obstacle to returning to the job market?” against one of 22 predefined options.
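
To make the data flow concrete, here is a minimal Python sketch of how those inputs might be represented. The field names and the question ID are assumptions made for illustration; only the categories of data (age, gender, disability, duration of unemployment, interview answers coded against predefined options) come from the published descriptions of the system.

```python
from dataclasses import dataclass

# Hypothetical sketch of the inputs to the scoring system.
# Field names are assumptions; the real system collected eight
# items at registration and the rest during the interview.

@dataclass
class RegistrationData:
    age: int                # collected at registration
    gender: str             # collected at registration
    disability: bool        # collected at registration
    months_unemployed: int  # duration of unemployment
    # ...four further registration items in the real system

# Interview answers are stored as the ID of a predefined option,
# not as free text: e.g. the answer to "What is the main obstacle
# to returning to the job market?" is coded as one of 22 options.
InterviewAnswers = dict[str, int]  # question ID -> chosen option ID

example_interview: InterviewAnswers = {
    "main_obstacle": 7,  # hypothetical: option 7 of the 22 choices
}
```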

Based on the final score, an algorithm decides which category the unemployed person is assigned to. This determines the scope of assistance the person can apply for. Each profile contains groups with different demographics and problems. For example, the first profile includes people with a high level of education who are active and have enough professional qualifications to find a job relatively quickly. They do not require much attention from PSZ, but they might still receive funds to open a business or vouchers for training. According to official statistics, between 2% and 5% of the unemployed are assigned to this profile.

People assigned to the second profile are considered to have more difficulty re-entering the labour market. They might lack skills or education, but they are seen as promising, which is why PSZ directs the greatest resources to this group. Around two-thirds of the unemployed are assigned to this category and can apply for a variety of different types of assistance.

And then there is the third category, containing around 30% of the unemployed: those facing serious difficulties such as chronic disease, disability or addiction. In theory, they are supposed to be able to apply for some form of assistance. In practice, financial and organisational problems mean local job centres offer little to those in this category. They end up being written off as a helpless group that is not worth investing in.
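
Taken together, the mechanism boils down to a threshold rule over a summed score. The Python sketch below illustrates that shape; the point values and cut-offs are invented for illustration, since the real scoring rules were only made public later (as described below) and are not reproduced here.

```python
# Hypothetical illustration of the profile assignment. The real
# point values and cut-offs differ; only the three-profile shape
# and the approximate shares come from official statistics.

PROFILE_I_CUTOFF = 20   # assumed threshold: closest to the job market
PROFILE_II_CUTOFF = 45  # assumed threshold: needs active support

def assign_profile(item_scores: list[int]) -> int:
    """Sum the points awarded for the 24 data items and map the
    total score to one of the three assistance profiles."""
    total = sum(item_scores)
    if total <= PROFILE_I_CUTOFF:
        return 1  # ~2-5% of the unemployed: little assistance needed
    if total <= PROFILE_II_CUTOFF:
        return 2  # ~two-thirds: most PSZ resources go here
    return 3      # ~30%: serious difficulties, little offered in practice
```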

When the algorithm enters a complex environment

The profiling mechanism was initially intended to be an advisory tool, with staff retaining the final say over which group to put someone in. At the same time, the Ministry of Labour wanted decision-making procedures on the ground to be more standardised.

This contradiction influenced the design of the system and its decision rules. After a person’s profile is calculated, the system allows clerks to accept or reject the decision made by the computer. But early statistics indicate that clerks chose to override the result in just under 1 in 100 cases. Mostly, staff lacked the time to consider decisions in detail, but they also feared repercussions from supervisors if a decision was later called into question.
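
The human-in-the-loop step described above can be sketched as a simple accept-or-override flow. The names below are assumptions, not the real PSZ interface; the point is that accepting the computed profile is the path of least resistance.

```python
from dataclasses import dataclass

# Hedged sketch of the clerk's review step. In practice the
# computed profile was overridden in just under 1% of cases.

@dataclass
class ProfilingDecision:
    suggested_profile: int  # profile proposed by the scoring system
    final_profile: int      # profile recorded after the clerk's review
    overridden: bool        # True if the clerk rejected the suggestion

def review(suggested: int, clerk_choice: int | None = None) -> ProfilingDecision:
    """Accept the computed profile by default; record an override
    only when the clerk actively selects a different profile."""
    if clerk_choice is None or clerk_choice == suggested:
        return ProfilingDecision(suggested, suggested, overridden=False)
    return ProfilingDecision(suggested, clerk_choice, overridden=True)
```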

These conflicting aims and expectations have created a situation where staff use the system in very different ways, depending on the local organisational culture. In many offices, the computer is the ultimate decision-maker. In others, it is just one part of a broader diagnostic process. And in some cases decisions might be adjusted to meet the expectations of the unemployed person.

So in practice, the way this automated system is used depends on various factors: organisational issues, individual preferences and the design of the tool itself. Because of this, the profiling system has not achieved its creators’ goal of serving as a centralised categorising machine.

Algorithmic battles

The organisational shortcomings inside PSZ are not the only problems with the profiling system. Even before it was launched, the system was criticised by civil society and Poland’s data protection authority. The main concerns related to the lack of transparency about how decisions are made and to problems with the processing of personal data.

Under the new law that brought in automated profiling, unemployed people have no right to information about how the computer system determines their situation (such as the logic behind it, exactly what data is used and how that data affects the decision). Nor can they challenge the decision made by the computer or demand human intervention.

The Panoptykon Foundation (Fundacja Panoptykon), a leading digital rights organisation in Poland, used freedom of information provisions to request essential details of the profiling mechanism. After several disputes with the Ministry of Labour, including one that was only resolved in court, the foundation was able to publish the questions asked during the computer-based interview and the scoring rules. This intervention has helped unemployed people to better understand the system (and also, in some cases, to game the final results), and it set a precedent for treating computer-based interviews as public information.

In the meantime, the Supreme Audit Office (Najwyższa Izba Kontroli), which oversees the state budget, public spending and the management of public property, carried out a general review of PSZ, part of which was dedicated to the profiling mechanism. The review found that the system is ineffective and can lead to discrimination. Under the scoring rules, women are assessed differently from men. People belonging to the most vulnerable segments of society (single mothers, people with disabilities, and rural residents) are also more likely to be assigned to the third profile. So in practice their chances of receiving assistance from PSZ are reduced.

PSZ staff are also unhappy with the profiling system. According to an official government evaluation, 44% of local job centres said that profiling is useless in their day-to-day work, and 80% concluded that the system should be changed. Many unemployed people have also lodged complaints about the outcomes of the profiling mechanism under administrative law.

Another crucial critic of the profiling system was the Human Rights Commissioner, who referred the case to Poland’s Constitutional Court. The Commissioner’s main objection was formal: the scope of data used by the profiling tool should have been set out in a legal act adopted by parliament, not decided by the government. In 2018, the Court ruled that this was indeed a breach of the Polish constitution.

After this criticism and the Court’s ruling, the government decided to end its experiment with profiling the unemployed. It has tabled an amendment to the law that will abolish the scoring system by December 2019.

This story offers two interesting lessons on the development of automated decision-making systems.

First, democratic institutions can successfully debate and contest technology. Courts, human rights commissioners, civil society organisations and citizens have legal and political powers that can be applied to the challenges created by algorithms. Very often, however, they need to approach existing laws with greater creativity, trying to understand how the automation of decision-making may be regulated by different legal regimes. In the Polish case, the rules on access to public information, welfare law and administrative law became crucial elements in the protection of fundamental rights vis-à-vis algorithms.

Second, the organisational environment and the everyday use of automated systems are crucial elements that should be taken into account by designers and critics alike. There can be a huge gap between how we imagine people will interact with a technology and how it is actually used. The story of algorithmic systems in practice is more complex and nuanced than we expect.

More about the profiling mechanism can be found in this report by the Panoptykon Foundation (PDF).


Jedrzej Niklas, PhD, works at the intersection of data-driven technology, the state and human rights. His research focuses on the use of data and new technologies by public administrations, and on the social and legal impact of digital innovations, particularly for the rights of marginalised communities and social justice. He is a Postdoctoral Researcher at the University of Leeds and a Visiting Fellow at the London School of Economics and Political Science.
