In April 2018, the National Non-Discrimination and Equality Tribunal of Finland ruled that Svea Ekonomi AB must stop using a specific statistical method in credit scoring decisions. The Non-Discrimination Ombudsman had asked the tribunal to investigate whether the company was guilty of discrimination in a case from July 2015, when it denied a loan to a man who tried to buy building materials online. Svea Ekonomi is a so-called factoring company that collects claims other companies hold against customers.
Having received the loan rejection, the credit applicant, a Finnish-speaking man in his thirties from a sparsely populated rural area of Finland, asked the company to justify the negative decision. The company first responded that its decision required no justification, and then argued that the decision had been based on a credit rating produced by a credit scoring service using statistical methods that combine data from various databases. Such services do not take the creditworthiness of individual credit applicants into account, and the assessments they produce may therefore differ significantly from the profile of the individual applicant. This, the credit company conceded, may seem unfair to a credit applicant.
The credit applicant petitioned the Non-Discrimination Ombudsman, who then investigated the case for more than a year. The decision to reject the loan application was based on Svea Ekonomi’s own data, information from the credit data file – a register recording a potential loan applicant’s payment defaults – and the score from a scoring system provided by an external service provider. Since the applicant had no prior payment defaults in Svea Ekonomi’s internal records or in the credit data file, the scoring system assigned him a score based on factors such as his place of residence, gender, age and mother tongue. Svea Ekonomi did not investigate the applicant’s income or financial situation, nor was this information required for the credit application.
Instead, the automated system calculates, based on population data, the percentage of people with a poor credit history in each demographic group and awards points according to how common deficient credit records are in the group in question. Because men default on payments more often than women, men receive fewer points in the scoring system than women. Similarly, those with Finnish as their first language receive fewer points than Swedish-speaking Finns. Had the applicant been a woman, or Swedish-speaking, he would have met the company’s criteria for being granted the loan.
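The group-based logic described above can be sketched in a few lines of code. To be clear, the group default rates and the approval threshold below are invented for illustration; the actual model and figures used by the scoring service have not been made public.

```python
# Hypothetical sketch of group-based credit scoring as described in the case.
# The default rates and the approval cut-off are invented illustrations,
# NOT the actual values used by the scoring service.

# Assumed share of people with payment defaults in each demographic group
DEFAULT_RATES = {
    ("male", "Finnish"): 0.12,    # hypothetical
    ("male", "Swedish"): 0.07,    # hypothetical
    ("female", "Finnish"): 0.08,  # hypothetical
    ("female", "Swedish"): 0.05,  # hypothetical
}

APPROVAL_THRESHOLD = 0.90  # hypothetical cut-off


def group_score(gender: str, language: str) -> float:
    """Score rises as payment defaults become rarer in the applicant's group."""
    return 1.0 - DEFAULT_RATES[(gender, language)]


def decide(gender: str, language: str) -> bool:
    """Grant the loan purely on the group score, ignoring the individual."""
    return group_score(gender, language) >= APPROVAL_THRESHOLD


# An otherwise identical applicant is rejected or approved depending
# solely on gender and mother tongue:
print(decide("male", "Finnish"))   # False
print(decide("male", "Swedish"))   # True
print(decide("female", "Finnish")) # True
```

The sketch makes the tribunal's objection concrete: nothing about the individual applicant's own credit history enters the decision, so two people with identical finances receive different outcomes based only on group membership.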
After failing to reconcile the case, the Non-Discrimination Ombudsman brought it before the tribunal, arguing that Svea Ekonomi was guilty of discrimination under the Non-Discrimination Act. Based on its investigation, the tribunal agreed that the company had acted in a discriminatory manner, because the scoring method it used rested on a discriminatory use of statistics.
The fact that Svea Ekonomi relied solely on the applicant’s age, gender (male), mother tongue (Finnish) and place of residence (a rural area) in the decision not to grant the loan constituted a case of multiple discrimination.
This means that a credit request may only be granted on the basis of scores that incorporate information about the actual credit history of the specific person. If such information about a person is not available, a score can still be calculated, but the decision on granting the credit cannot be based solely on it.
The tribunal found it remarkable that the applicant would have been granted the loan if he had been a woman or had spoken Swedish as his mother tongue. Moreover, Svea Ekonomi’s assessment treated him less favorably than someone living in a more urban area, even though neither his nor anyone else’s ability to repay credit, nor any other matter related to credit payment, can be deduced from a place of residence, even with statistical methods.
The National Non-Discrimination and Equality Tribunal prohibited Svea Ekonomi from continuing its discriminatory practice and imposed a conditional fine of 100,000 euros to enforce the decision. After receiving the decision, Svea Ekonomi’s country manager issued a statement underlining that while the company adheres to equality principles, it must retain the right to make its own overall assessment of whom to grant credit to; in that assessment, no outside body should be able to define the credit criteria. The company representative also emphasized that the decision concerns not only Svea Ekonomi but the whole industry that uses statistical modeling and scoring as part of its credit assessment. The company said it would appeal the decision in administrative court, but no appeal was ultimately filed, and the decision is now final.
Financial services providers continue to discuss the decision’s consequences for the legality of scoring mechanisms. The decision has also been received with great interest by European equality bodies, as it offers a reference case for potential future disputes. The ombudsman’s office actively participates in ongoing discussions about algorithmic systems, stressing that automation as such is not a problem, but that it must be implemented with an understanding of its limitations and consequences. Companies granting consumer credit may investigate loan applicants’ creditworthiness by means of automated systems, but the selection of clients must be non-discriminatory, and statistical data cannot carry more weight in the assessment than factors specific to the individual applicant.
A description of the case and the justification for the ruling can be found in English on the website of the Finnish Tribunal for Non-Discrimination and Equality.
This story was made possible with the support of Bertelsmann Stiftung
This article is an excerpt from the report Automating Society - Taking Stock of Automated Decision Making in the EU, published in January 2019. In this study, AlgorithmWatch and researchers from twelve EU member states investigate which systems of automated decision making (ADM) are used in Europe and which measures for their regulation exist or are being discussed in the respective countries. The report was produced in cooperation with Bertelsmann Stiftung and is funded by Open Society Foundations.