Dutch MP Kees Verhoeven wants a registry of “heavy” algorithms – but it shouldn’t be public



On September 10, Dutch MP Kees Verhoeven put forward a motion in the Dutch parliament, together with MP Harry van der Molen, to create a mandatory register for all public-sector algorithms. He explained to AlgorithmWatch how such a register could be implemented. The interview, done by telephone on December 4, was edited for clarity.

What are the algorithms that would fall under the definition brought forward in your motion?

We put in a motion because we saw all kinds of algorithms being deployed in different fields, whether by municipalities or by the national government and other public organizations. There is this invisible growth of algorithms that we don’t have any control over.

We want to know when “heavy” algorithms are used. By “heavy,” I mean algorithms that make decisions that really affect people, for example decisions about the money they receive from the government for healthcare or social services. It’s about automated decision-making systems that have an impact on the lives of people, not about an Excel sheet or really simple systems that have been automated. It is about algorithms that have a heavy impact on households.

So Excel documents would be excluded? Should the definition of a heavy algorithm include the software? One could easily program a heavy algorithm in an Excel sheet.

I know, but many people ask me “do you mean all algorithms?” No, I don’t mean all algorithms because I know that in a lot of organizations algorithms are already in use and they don’t have a big impact. It’s really about big decision-making systems and not about smaller technical solutions.

What would the reporting obligation be like? Would organizations have to disclose the inner workings of their algorithms or just their existence?

The register would also include an explanation. When a municipality wants to deploy an algorithm that predicts behavior such as burglaries or school drop-out rates, we want the municipality to make clear that it uses an algorithm for such a prediction and how it was built. We want to know when an algorithm has been deployed in a certain field of policy, what its goals are and how it works.

Would the register be public?

I don’t think a public register would be a good thing. With a public register, people might try to avoid certain algorithms or might behave differently to avoid control. There should be a body that controls the register in a non-public way. A municipality, for example, would declare that it uses an algorithm, and then the control body should control it. But it should not be a public register.

What about police algorithms? Would they also have to be registered?

Yes, the police should register their algorithms like other organizations.

You mentioned a control body. Which institution would that be?

That’s the second part of my motion. The first part was about the register. In the second part, I wanted two things: to make clear which laws already exist to control algorithms, for instance the GDPR, and to research the possibilities of a control body. It could be the existing data protection authority but it could also be a new body. I want the government to assess which institution should have oversight of the register and control that the algorithms are within the law.

Would the control body that you call for have the power to check that the algorithms work as described in the register?

Yes, that’s exactly what I want. Registering an algorithm would be the responsibility of an organization, and the control body should have the possibility to verify that the implementation of the algorithm actually matches what the organization has written down.

Should the control authority be available to other regulators (competition authority, media regulator etc.) that want to investigate an algorithm?

I think that the authority should be the one that decides if they investigate or not. One could always give a signal to the control body and that signal should be taken seriously. But in the end, that authority should be the one to decide whether or not they investigate.

In 2016, France passed a law, effective in 2017, that requires administrative services to tell citizens when a decision was made automatically. So far, few if any administrations comply with it. How would you ensure that Dutch administrations do contribute to the register?

It’s a good attempt at being more transparent toward citizens and showing which decisions are made automatically. It’s a real problem if administrations don’t comply, but I don’t know the situation in France. In the Netherlands, the Ministry of Social Affairs uses a very controversial system [AlgorithmWatch reported on SyRI in the Automating Society report]. The public debate is a very important part of the discussion. If there’s a lot of public pressure, control bodies and politicians will be able to keep algorithms in check. It means that you need a strong control body, you need a strong government and you need strong politicians who do this control.

You said that public debate is important, but also that the registry should not be public. How do you resolve this conundrum?

The control body should report on how things are going. If it sees that nobody is registering or that nobody does what is required, it could communicate that publicly. The register itself should not be public, but whether municipalities or the government are following the rules, that should be made public.

What are the current plans for implementation of the control mechanism you described?
