A Swedish town bought an AI to spot children at risk, but decided against deploying it

The Swedish municipality of Norrtälje bought an automated system to handle a spike in reports about children at risk. But the system was shelved after concerns emerged about its lawfulness and potential for bias.


Between 2014 and 2019, reports concerning at-risk children increased by two thirds in Norrtälje, a city of 60,000 one hour north of Stockholm. Such reports, filed by teachers, neighbors or friends, usually describe issues like child neglect or a parent’s addiction.

“You don't get more money in the social services, and there’s always a lot to do,” said Jessica Gjertz, who is in charge of processes at Norrtälje's social services.

To deal with the spike, the municipality looked for alternatives to handling every report manually. In 2019, it launched a Robotic Process Automation (RPA) project with an Artificial Intelligence (AI) component, at a cost of 2.7 million Swedish crowns (ca. 270,000 euro). The system was supposed to shorten the time spent on each report, streamline decisions, and free up time for more meetings with families.

When it approved the investment, the municipality estimated that, spread over seven years of use, the system would cost an average of 333,000 crowns (ca. 33,000 euro) per year, according to internal documents consulted by AlgorithmWatch. It compared this to the higher cost of one full-time and one part-time employee – 1.2 million crowns annually (ca. 120,000 euro) – whom it would otherwise have needed to hire to handle the increased caseload.

Legal anchoring

The automated component of Norrtälje’s system compares words in new reports with earlier reports and, based on past actions, suggests whether to open an investigation. Social workers make the final decision.
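The system’s code has not been made public. But a minimal sketch of the general technique described here – comparing the words of a new report with past reports and suggesting the decision taken in the most similar case – could look like the following Python. All reports, labels and the suggest function are invented for illustration; the actual Norrtälje implementation may differ substantially.

```python
# A minimal, hypothetical sketch of a word-comparison decision-support
# system. It represents past reports as TF-IDF vectors and suggests the
# decision taken for the most similar past report. All data is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

past_reports = [
    "child often absent from school, parent smells of alcohol",
    "neighbour heard loud arguments, child seemed frightened",
    "teacher reports child is tired but otherwise well cared for",
]
# What social workers decided for each past report (hypothetical labels).
past_decisions = ["investigate", "investigate", "no investigation"]

vectorizer = TfidfVectorizer()
past_matrix = vectorizer.fit_transform(past_reports)

def suggest(new_report: str) -> str:
    """Suggest an action by copying the decision of the most similar past report."""
    new_vector = vectorizer.transform([new_report])
    similarities = cosine_similarity(new_vector, past_matrix)[0]
    return past_decisions[similarities.argmax()]

print(suggest("parent drinking, child misses school"))  # likely "investigate"
```

Even in this toy version, the suggestion is only ever a copy of what was done before – which is precisely why the system’s legality and its tendency to reproduce past decisions became the central questions.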

But after a legal review, the decision-support system was not deployed.

“When we started the project, it was not clear if we could use it or not,” Ms Gjertz admitted.

Legal concerns prompted Norrtälje to contact SKR, the umbrella organisation for Swedish municipalities and regions, which opened a legal investigation. But by the time the final review was available, the system had already been developed and trained.

In April 2021, SKR found that the Norrtälje system did not comply with Swedish data privacy laws. Social services cannot use data from reports that have not led to an investigation, the review said. But the decision-support system would need to process exactly this kind of information to function fully. Swedish privacy law is designed to ensure that parents who have been falsely reported do not suffer consequences, which makes processing their data unlawful. A government agency proposed changing the law in 2019, but the changes have yet to be approved.

Promise of algorithms

Simon Vinge, chief economist at Akademikerförbundet SSR, the union for social workers, argues that public-sector estimates of the benefits of algorithms are often poorly grounded. “There are huge expectations, mainly to save money,” he said. But it is hard to quantify how much money such solutions really save, according to Mr Vinge.

A recurrent promise of automation is that social workers get more time to meet people.

Mr Vinge paints a murkier picture. New technological tasks are created, and the number of cases each social worker is expected to handle can grow. “Our members say they often get many more clients” after an automated system is deployed, he said.

Reproducing decisions

Using automated systems to detect children at risk raises questions beyond legality. Marta Nannskog, senior advisor at SKR, is positive about Swedish municipalities exploring AI, but points out that municipalities handle reports of concern differently.

“If you build an AI in a municipality where you open an investigation on 25 percent of the reports, versus 65 percent in another municipality, you get two very different algorithms.”

Building AI this way, Ms Nannskog said, “would be like cementing the differences that exist between the municipalities today”.
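A deliberately simplified sketch shows why: the same model, trained on two municipalities’ historical decisions, returns opposite suggestions for an identical report, purely because the base rates in the training data differ. The figures below are invented for illustration.

```python
# A toy illustration of how different investigation rates produce
# different algorithms. Municipality A historically opened investigations
# on 25% of reports, municipality B on 65%. The features are deliberately
# uninformative, so only the base rate drives the prediction.
from sklearn.dummy import DummyClassifier

labels_a = [1] * 25 + [0] * 75  # 1 = investigate, 0 = don't
labels_b = [1] * 65 + [0] * 35
features = [[0]] * 100  # identical features for both municipalities

model_a = DummyClassifier(strategy="prior").fit(features, labels_a)
model_b = DummyClassifier(strategy="prior").fit(features, labels_b)

new_report = [[0]]
print(model_a.predict(new_report))  # [0] -> suggests no investigation
print(model_b.predict(new_report))  # [1] -> suggests investigation
```

Real systems use richer features than this, but the underlying mechanism is the same: whatever a municipality did in the past becomes what its algorithm suggests in the future.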

Using previous decisions reproduces hidden values and norms, which "can lead to undesirable effects for social work such as exclusion and embedded beliefs", according to a report on social services automation by Lupita Svensson, a senior lecturer at Lund University.

Norrtälje, said Ms Gjertz, discussed internally the risk that prejudice could be built into its automated system. However, the outcome of the discussion was simply to “be aware of [bias] at all times.”

Exploring grey zones

The Norrtälje project was mainly funded by the municipality itself, but also received a 500,000-crown (ca. 50,000 euro) grant from the Swedish innovation agency, Vinnova. 

According to Daniel Rencrantz, head of the innovation unit at Vinnova, it is difficult to foresee obstacles at the start of such projects. “Sometimes you have to start digging to see what data we have and what solutions.”

He added that projects must operate within the law, but that a project’s legal assessment was ultimately the applicant’s sole responsibility.

Risk of black box effect

Another municipality, Strängnäs (population 37,000), received funding for a similar project but took a different approach. After looking at Norrtälje’s experience, it decided against automating the process.

"At the start we were inspired by the Norrtälje project," explained Frida Fallström, social worker in Strängnäs. "But then, we saw a black box effect and we didn't want to repeat it. That's why we decided to go for text analysis instead of automated decision making."

The software in Strängnäs analyzes reports and alerts social workers to signs of violence in the text. Once a report is flagged, the social worker checks the case individually. This way, it "is a bit simpler for social workers to control that the machine conclusions are accurate," added Frédéric Rambaud, the project's lead manager in Strängnäs.

"A black box is not an option for the chosen work process, we aim to avoid this effect and create transparency," he said. 

Now the municipality is “calibrating the model” and will know more in the fall.

In Norrtälje, Ms Gjertz remains optimistic about the investment in the decision-support system. “We think that in the future, they have to change the law. And then we have this technology already.”

The mayor of Norrtälje did not respond to a request for comment. 

Camille Schyns contributed to this story.
