Controversial service that ranked job seekers based on personal emails folds following AlgorithmWatch investigation

A Finnish company that automatically parsed the personal emails of job applicants to assess their corporate “fit” discontinued its service after reports by AlgorithmWatch and others raised questions about its legality.

In 2017, two Finnish psychologists created Digital Minds, a company that develops ‘third-generation’ assessment technology for employee recruitment. The company was featured in the Finnish chapter of the Automating Society report published by AlgorithmWatch in January 2019.

After the report was published, AlgorithmWatch’s Matthias Spielkamp and Nicolas Kayser-Bril wrote an opinion piece in Politico, underlining the need for critical reflection:

“A Finnish company has rolled out a new product that lets potential employers scan the private emails of job applicants to determine whether or not they would be a good fit for the organization. The company, Digital Minds, portrays its offering as something innocuous. If applicants give their consent, what’s the harm? The truth is: We don’t know the answer to that question. And that’s what makes new, potentially intrusive and discriminatory technologies like this one so scary.”

Legally problematic

For Juho Toivola, co-founder of Digital Minds, the piece in Politico was a “German take on the matter.” The critique of their service failed to understand the problem they were trying to solve, he said.

Digital Minds' goal was to make employee assessments more convenient and efficient by using data that already existed rather than collecting it separately for each recruitment, he emphasized. To assess a job seeker’s “fit” with the recruiting company, Digital Minds performed a personality analysis using off-the-shelf software from IBM. Among other things, it looked at how active applicants were on social media and how quickly they responded to emails.
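To make the idea of such behavioral signals more concrete, here is a minimal, purely hypothetical sketch of how one such feature, average email reply latency, could be computed from mailbox metadata. Digital Minds never published its actual method, so the data structure, function name, and values below are invented for illustration only.

```python
# Hypothetical illustration only: one way a behavioral feature such as
# "how quickly someone replies to email" could be derived from mailbox
# metadata. This is NOT Digital Minds' actual method, which was not made public.
from datetime import datetime, timedelta

# Invented toy data: (message_id, in_reply_to, timestamp) tuples standing in
# for parsed mailbox headers.
messages = [
    ("msg-1", None, datetime(2019, 1, 7, 9, 0)),
    ("msg-2", "msg-1", datetime(2019, 1, 7, 9, 40)),   # reply after 40 minutes
    ("msg-3", None, datetime(2019, 1, 8, 14, 0)),
    ("msg-4", "msg-3", datetime(2019, 1, 9, 8, 30)),   # reply after ~18.5 hours
]

def average_reply_latency(msgs):
    """Return the mean time between a message and its reply, or None if no replies."""
    sent_at = {mid: ts for mid, _, ts in msgs}
    latencies = [
        ts - sent_at[parent]
        for _, parent, ts in msgs
        if parent is not None and parent in sent_at
    ]
    if not latencies:
        return None
    return sum(latencies, timedelta()) / len(latencies)

print(average_reply_latency(messages))  # 9:35:00 for the toy data above
```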

Mr Kayser-Bril reported Digital Minds to the Finnish data protection ombudsman in late 2018 to get an assessment of the service’s possible harms and legality. In May 2019, Finland’s national public service broadcaster YLE aired a story about the ongoing process with the ombudsman, suggesting that Digital Minds' offering was legally problematic.

The Finnish data protection ombudsman, Reijo Aarnio, told YLE he suspected that a personality assessment based on parsing email correspondence violated the Finnish Labor Privacy Act. Mr Aarnio also questioned whether a job seeker’s consent to such an analysis could be considered freely given: candidates in a vulnerable position who need the job might not feel able to decline access to their data. Moreover, like letters, emails are covered by the Confidentiality of Correspondence Act, he said.

Fewer than ten assessments conducted

YLE also revealed that fewer than ten job seekers had used Digital Minds' service. The fact that the company had very few actual clients gave the story a new twist. Mr Toivola, Digital Minds' co-founder, explained that he had been talking about ‘a proof of concept’ rather than an actual service. Either intentionally or unintentionally, he had been mixing contexts: the founders of Digital Minds had experience with dozens of clients and had been involved in thousands of assessments of job applicants, but these had relied on conventional analysis methods.

Shortly after the YLE report aired, the Digital Minds service was discontinued. Mr Toivola said he was rethinking how to position the service in an ethically sound manner.

Taking the organizational environment into account

The case shows how much details can matter when investigating automated decision-making (ADM) systems. When companies promote their services, they may alter details and exaggerate numbers to support their marketing claims. Journalists and researchers need to engage in a critical dialogue with company representatives, and that dialogue can only be nurtured in a culture that values openness. Developers of ADM systems should understand that they are participants in an ongoing societal debate that extends beyond their own products.

Another lesson from the Digital Minds story is the importance of clarifying how ADM systems can have real-life consequences. Like other companies automating human resources, Digital Minds deflected responsibility by suggesting that it did not make actual hiring decisions but merely automated the assessments on which employers based those decisions.

Ultimately, the organizational environment and the divisions of labor that support ADM systems determine whether automation is just and ethically robust. Machines do not care about real-life consequences. The professionals who create and maintain ADM systems should take this into account.
