How Dutch activists got an invasive fraud detection algorithm banned

The Dutch government has been using SyRI, a secret algorithm, to detect possible social welfare fraud. Civil rights activists took the matter to court, got the system halted, and pushed public organizations to think about less repressive alternatives.

View of the intersection of Europaweg and Boerhaavelaan in Haarlem.

This story is part of AlgorithmWatch's upcoming report Automating Society 2020, to be published later this year.

In its fight against fraud, the Dutch government has been cross-referencing citizens' personal data across various databases since 2014. This system, called SyRI (for “system risk indication”), is meant to find “unlikely citizen profiles” that warrant further investigation. Despite major objections from the Dutch Data Protection Authority and the Council of State, SyRI has been implemented without any transparency for citizens about what happens with their data.

The idea is this: if a government agency suspects fraud involving benefits, allowances or taxes in a specific neighborhood, it can make use of SyRI. Municipalities, the employee insurance agency (UWV), the social security bank (SVB), inspectors of the Ministry of Social Affairs and Employment and the tax authority have access to the system. SyRI decides which citizens in the neighborhood need to be investigated further.

SyRI has not been a success for the government. In its first five years, five municipalities requested an analysis of a neighborhood. Only two of these projects were actually executed; the other three were canceled. According to research by the Dutch newspaper De Volkskrant in 2019, none of these algorithmic investigations detected any new cases of fraud.

False positives

There is a detailed procedure for government agencies that want to use SyRI. Two agencies must cooperate and ask the Ministry of Social Affairs and Employment (SZW in its Dutch acronym) to conduct an analysis. Before a SyRI project starts, SZW publishes a notice in the online version of the official gazette. “The municipality has no obligation to inform citizens of a neighborhood that they are being analyzed”, says Ronald Huissen from the Platform Bescherming Burgerrechten (Platform for Civil Rights Protection). “And if they are informed, it is by a city bulletin that is not necessarily read by them, and in very vague terms, without the details of what data SyRI uses and how.”

The agency that asked for the analysis cannot simply penalize the citizens who are flagged for an unlikely combination of data: it has to investigate, for every flagged citizen, whether an actual case of fraud took place. Moreover, the flagged citizens are first screened for false positives at the Ministry of Social Affairs and Employment. Data on citizens who are deemed false positives is not handed over to the agency that asked for the analysis.

No transparency

But even with these checks in place, the lack of transparency is still a big issue. Residents of whole neighborhoods were put under a magnifying glass without even knowing which privacy-sensitive data SyRI held about them. Each ‘risk indication’ is logged in a register that citizens can consult on request. But citizens are not automatically warned when SyRI flags them as a fraud risk, and they cannot access the reasons why they have been flagged.

In early 2018, the Platform Bescherming Burgerrechten, together with a couple of other Dutch civil rights organizations, filed a case against the Dutch state to stop the use of SyRI. At the same time, they wanted to spark a public debate about SyRI with their media campaign Bij Voorbaat Verdacht (Suspected from the outset).

According to the official resolution that is the legal basis for SyRI, the system can cross-reference data about work, fines, penalties, taxes, properties, housing, education, retirement, debts, benefits, allowances, subsidies, permits and exemptions, and more. These categories are described so broadly that in 2014 the Council of State concluded, in its negative opinion on SyRI, that there is “hardly any personal data that cannot be processed”.

Black box

SyRI pseudonymizes the data sources it uses with a ‘black box’ method: in each data source that is linked, every citizen’s name is replaced by a unique identifier for that individual. The identifier makes it possible to link data about the same citizen across the various sources. The result of the analysis is a list of identifiers representing possibly fraudulent beneficiaries; these identifiers are then translated back to their real names.
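The government has never disclosed how this identifier scheme works in practice. Purely as an illustration of the general pattern described above, the Python sketch below uses an invented keyed hash as a stand-in for whatever identifier SyRI actually uses, along with invented data sources: each source is pseudonymized, records are linked on the pseudonym, and only the flagged identifiers are translated back to names.

```python
# Illustrative sketch only: SyRI's real implementation is secret. The key,
# data sources and flagging rule below are invented for demonstration.
import hashlib
import hmac

SECRET_KEY = b"held-by-the-analysis-unit"  # hypothetical key, never shared with agencies

def pseudonymize(citizen_name: str) -> str:
    """Replace a citizen's name with a stable pseudonymous identifier."""
    return hmac.new(SECRET_KEY, citizen_name.encode(), hashlib.sha256).hexdigest()

# Hypothetical extracts from two linked data sources.
benefits = {"Citizen A": {"benefit": "single-person allowance"},
            "Citizen B": {"benefit": "housing allowance"}}
water = {"Citizen A": {"yearly_water_m3": 4},
         "Citizen B": {"yearly_water_m3": 95}}

# 1. Pseudonymize each source; the reverse map stays inside the 'black box'.
reverse_map = {pseudonymize(name): name for name in benefits}
linked = {pseudonymize(name): {**benefits[name], **water.get(name, {})}
          for name in benefits}

# 2. The risk analysis runs on pseudonyms only (placeholder rule).
flagged = [pid for pid, record in linked.items()
           if record.get("yearly_water_m3", 0) < 10]

# 3. Only the flagged identifiers are translated back to real names.
print([reverse_map[pid] for pid in flagged])  # -> ['Citizen A']
```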

In the case of Platform Bescherming Burgerrechten versus the Dutch state, the latter gave some examples of “discrepancies” that could lead to a risk indication. One of these discrepancies is low usage of running water. This could be a sign that someone who receives benefits is actually living with someone else at another address and thus is not entitled to the higher benefit for singles. However, there are many other possible causes for low water usage, such as using rainwater, frugal living, or even a broken water meter.
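To see why such a rule generates false positives, consider a deliberately naive sketch of the water-usage discrepancy. The threshold, fields and households below are invented for illustration and say nothing about SyRI's real risk model.

```python
# Illustrative only: a naive version of the water-usage "discrepancy" described
# above, to show why it produces false positives.
from dataclasses import dataclass

@dataclass
class Household:
    receives_single_benefit: bool
    yearly_water_m3: int
    note: str

LOW_USAGE_THRESHOLD = 15  # hypothetical cut-off in cubic metres per year

def risk_indication(h: Household) -> bool:
    # The rule only sees the data, not the reason behind it.
    return h.receives_single_benefit and h.yearly_water_m3 < LOW_USAGE_THRESHOLD

households = [
    Household(True, 5, "actually living with a partner elsewhere"),  # intended target
    Household(True, 8, "collects rainwater"),                        # false positive
    Household(True, 12, "simply frugal"),                            # false positive
    Household(True, 0, "broken water meter"),                        # false positive
]

print([h.note for h in households if risk_indication(h)])
# All four households are flagged; only the first involves fraud, and telling
# them apart still requires a human investigation.
```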

A secret sauce

It is still unclear what happens inside this ‘black box’, and the Dutch government has blocked all attempts by concerned parties to shed light on it. In 2017, the Ministry of Social Affairs and Employment decided that the risk models it used should be kept secret. In 2018, the political party D66 drafted a motion (but did not file it) to publish SyRI’s algorithms, or to conduct a technical audit if publishing the algorithms is not possible.

Tamara van Ark, State Secretary for Social Affairs and Employment, strongly advised against filing the motion (which is why it was never put to a vote), warning that potential offenders could adapt their behavior if the state disclosed SyRI’s risk models. But many of the factors in the risk models are already known or expected, or were already used to detect fraud before SyRI existed, such as low water usage. It is hard to imagine that someone who commits fraud to get a higher benefit will leave the faucet open to increase their water usage.

Primarily used in low-income neighborhoods

There’s another problem with SyRI: freedom of information requests by the Platform Bescherming Burgerrechten revealed that the system has been used primarily in low-income neighborhoods. This exacerbates bias and discrimination: if the government only runs SyRI’s risk analysis in neighborhoods that are already deemed high-risk, it is no wonder that it finds more high-risk citizens there.

Philip Alston, United Nations Special Rapporteur on extreme poverty and human rights, expressed his concerns about SyRI in a letter to the Dutch court on 26 September 2019: “Whole neighborhoods are deemed suspect and are made subject to special scrutiny, which is the digital equivalent of fraud inspectors knocking on every door in a certain area and looking at every person’s records in an attempt to identify cases of fraud, while no such scrutiny is applied to those living in better off areas.”

A repressive position

Mr Alston does not question that welfare fraud exists and that it should be punished, but he warns that SyRI’s focus seems to be wrong: “If the focus on fraud seen to be committed by the poor is highly disproportionate to equivalent efforts targeted at other income groups, there is an element of victimization and discrimination that should not be sanctioned by the law.”

Maranke Wieringa, a PhD candidate at Utrecht University doing research on algorithmic accountability in Dutch municipalities, adds another problem: “One goal of municipalities using SyRI for specific neighborhoods is to improve their living standards. However, SyRI is not designed for that purpose. If you take a national instrument that is designed for fraud detection and then apply it with a social purpose of improving living standards and social cohesion in a neighborhood, you can question whether you should depart from the same repressive position for both goals.”

SyRI is not necessary

On 29 November 2019, SyRI won the Big Brother Award of the Dutch digital rights organization Bits of Freedom. This prize is awarded to the biggest privacy intrusion of the year. When Director General Carsten Herstel accepted the award on behalf of the Ministry of Social Affairs and Employment, he told the audience: “I find it logical that the government gets alerted when someone gets a rental allowance and owns the house at the same time.”

According to Mr Huissen from the Platform Bescherming Burgerrechten, the government does not need this kind of mass surveillance to prevent fraud: “The government already has information about who owns which house, so it could check this before granting the person a rental allowance. For all big fraud scandals in social security we have seen in the past decades it became clear afterwards that they could have been prevented with simple checks beforehand. That happens far too little. It is tempting to look for solutions in secret algorithms analyzing big data sets, but often the solution is far simpler.”

Ms Wieringa agrees that this is a better way. “SyRI has been introduced from the point of view of a repressive welfare state: it does not trust the citizens. But that is just one stance of many possible ones. For instance, the government could check, perhaps even while using fewer data sources, who has the right to an allowance.”

No fair balance

On 5 February 2020, the District Court of The Hague ordered the immediate halt of SyRI because it violates Article 8 of the European Convention on Human Rights (ECHR), which protects the right to respect for private and family life. Article 8 requires that any legislation strike a “fair balance” between the social interests it serves and the intrusion into citizens’ private lives it entails.

SyRI’s goal, or “social interest”, is to prevent and fight fraud. The Dutch state claimed that the SyRI legislation offered sufficient guarantees to do this while protecting the privacy of citizens, but the court disagreed: the legislation is insufficiently transparent and verifiable, and there are not enough safeguards against privacy intrusions, the judges wrote.

The Dutch government can appeal the decision until 5 May 2020. In a reaction on its website, the ministry said it would study the decision thoroughly.

According to Ms Wieringa, the court’s decision makes it clear that the biggest problem of SyRI is not that it wants to battle fraud (the court says this is a legitimate aim), but the way it does it: “The system is deemed too opaque by the judges. If the government wants to ‘fix’ this problem, it will have to add more transparency to SyRI. A ‘SyRI 2.0’ will likely be SyRI but less opaque. Previous experiences point to that course of action. The ‘Waterproof’ system, a forerunner of SyRI, was deemed illegal in 2007 on privacy grounds. Later, the government simply passed a law to circumvent the problem, thus creating SyRI. Another ‘lazy fix’, this time geared towards increasing transparency, would be a logical step for SyRI.”

A new way of dealing with algorithms

On the other hand, two public organizations in the Netherlands, the UWV and the tax authority, have reacted to the court’s decision by reassessing their own algorithmic systems for fraud detection. “This is a sign that the court’s decision is urging everyone to find a new way of dealing with algorithms in the public sector”, Ms Wieringa maintains.

Tijmen Wisman, chairman of the Platform Bescherming Burgerrechten, hopes that the government will do more. “Just adapting SyRI to be more transparent will still result in information asymmetry. Citizens no longer want to give information to their government if the latter can use this information against them in all possible ways.”

According to Mr Wisman, public organizations need a change in their data management: “Data must no longer be allowed to roam freely, but must reside in an authentic data source. Each organization should keep logs for every time that their data is consulted. Citizens should be able to easily access these logs. This way it becomes clear to citizens what their data is used for, and they can challenge this use. This requirement of transparency also follows from the court’s ruling in the SyRI case, as well as the GDPR.”

In the Netherlands, welfare fraud is estimated at 150 million euros a year. Municipalities, the UWV and the social security bank together detected fraudulent claims worth more than 744 million euros between 2013 and 2018. This compares to an estimated 22 billion euros lost to tax fraud each year.
