Can Algorithms Close My Bank Account?

Banks use automated systems to monitor customers and their transactions in order to fight money laundering and the financing of terrorism. But sometimes these systems make mistakes that lead to people’s bank accounts being blocked or closed. The phenomenon is called de-banking.


Ecuadorian small business owner Atilio Andrade, in Spain, and Imam Koffi Agodjro, in France, share a troubling experience: Their bank accounts were abruptly closed, and they had to navigate an administrative maze to understand why.

Together with Agence France-Presse (AFP), AlgorithmWatch and former fellows from our Algorithmic Accountability Reporting Fellowship investigated the extent to which automated systems triggered the alerts that led to these closures. We turned our findings into the podcast False Positives, which is now available in English, French, and German.

False positives occur when data-driven automated systems make mistakes, for instance by falsely flagging a bank customer as a money-laundering suspect among thousands of legitimate ones. Algorithms scan customers’ financial profiles and transactions and automatically compare them with data from a myriad of sources. This process saves countless hours of manual work, but it also leads to opaque decision-making pathways that can deprive legitimate customers of essential banking services.
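To make the pattern concrete, here is a minimal, purely illustrative sketch of the kind of rule-based screening logic described above. The rules, thresholds, and names (such as flag_for_review or the "high-risk" category list) are our own assumptions for illustration; real bank systems are proprietary, far more elaborate, and typically combine such rules with statistical models.

```python
# Illustrative sketch only: a simplified, hypothetical rule-based screening
# check. All names, categories, and thresholds are invented for illustration
# and do not describe any particular bank's actual system.

from dataclasses import dataclass

@dataclass
class Transaction:
    customer_id: str
    amount_eur: float
    category: str           # e.g. "remittance", "currency_exchange"
    destination_country: str

# Hypothetical rule set: categories and destinations treated as "high risk".
HIGH_RISK_CATEGORIES = {"currency_exchange"}
HIGH_RISK_COUNTRIES = {"XX"}          # placeholder country code
AMOUNT_THRESHOLD_EUR = 10_000

def flag_for_review(tx: Transaction) -> bool:
    """Return True if the transaction should be queued for manual review."""
    if tx.category in HIGH_RISK_CATEGORIES:
        return True
    if tx.destination_country in HIGH_RISK_COUNTRIES:
        return True
    if tx.amount_eur >= AMOUNT_THRESHOLD_EUR:
        return True
    return False

# A small, legitimate remittance that has been mislabeled as a currency
# exchange gets flagged just like genuinely suspicious activity.
mislabeled = Transaction("customer-001", 250.0, "currency_exchange", "EC")
print(flag_for_review(mislabeled))  # True, even though the activity is legitimate
```

In this toy example, a mislabeled but legitimate transaction is flagged exactly like suspicious activity, which is the kind of false positive described in the cases below.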

Purposeful omissions

Banks are extremely secretive about how they determine who gets “de-banked.” Koffi Agodjro, one of the chaplains designated to provide religious support to athletes at the 2024 Paris Olympic Games, was at the Olympic Village when he discovered that the bank handling his mosque’s financial operations had closed his account. He learned from the French Ministry of the Interior that his was not an isolated case. The bank did not provide any explanation. Atilio Andrade’s bank likewise never explained why it blocked the accounts he used to run a small remittance business in a Valencia (Spain) neighborhood. After he switched to a new bank, that bank almost immediately froze his accounts as well.

After months of persistent inquiries, Atilio learned that the second bank had mistakenly labeled his money-transfer operations as currency exchange. One of the experts we spoke to told us that currency exchanges are considered high-risk and may have triggered the automated system screening the account, leading to repeated freezes of some services.

The first bank Atilio used denied using artificial intelligence to decide on account closures. However, we obtained compliance documents proving that the bank used automated systems to “detect unusual and potentially suspected operations of money laundering or terrorism financing.” When confronted, the bank changed its stance: While algorithms were indeed used for decision-making, a person would always have the final say. Such a combination of denial and confusion makes it impossible to reliably confirm whether an algorithm caused Atilio’s or Koffi’s account closures.

Non-compliant algorithms

Banks have the legal right to close accounts without explanation if, for example, they become inactive, too many checks bounce, or account policies are violated. They are also required to implement mechanisms to fight money laundering and the financing of terrorism; failing to do so can result in fines amounting to billions. As a result, banks drop thousands of legitimate accounts, preferring to deal with displeased customers rather than face reprimands, according to industry insider Mariola Marzouk.

“They [bank investigators] choose to err on the safe side because if later on something happens, and indeed a money laundering case is reported, they don't want to be held responsible,” she said.

To operate at such a massive scale, banks rely on automated systems to flag suspicious activities, which are then allegedly reviewed by specialized teams. Jacques Sudre, Chief Compliance Officer at the French bank La Banque Postale, said that between 17,000 and 20,000 people in Paris alone are dedicated to assessing algorithmic compliance alerts across all banks. He argued that these specialists protect customers and minimize errors.

Defenseless

However, this investigation shows that errors such as those that affected Atilio Andrade and Koffi Agodjro are very common and that there is no easy way to contest the decisions. Atilio battled for two years, and Muslim groups like Koffi’s had to repeatedly open new accounts with different banks to keep their funds from being frozen or their accounts closed. In the European Union, the law provides a less-than-ideal backup solution: Customers can turn to the central bank, which can designate a bank for them and force it to open a new account. Such accounts, however, only offer restricted services: no checks or overdrafts, no debit cards, no transfers, and so on.

Tracing the chain of malpractice back to the automated system that triggered it is as difficult in banking as it is in other sectors like welfare or hiring. Industry insiders reported that banks often cannot, and in any case are not inclined to, make the effort to correct their algorithms’ mistakes. Meanwhile, the people affected don’t know why they are denied a service or how to appeal the decision. You can explore these and other stories by listening to the three-episode podcast False Positives here.


This is an excerpt from the Automated Society newsletter, a bi-weekly round-up of news on automated decision-making in Europe. Subscribe here.

Naiara Bellio (she/her)

Reporter

Photo: Studio Monbijou, CC BY 4.0

Naiara Bellio covers privacy, automated decision-making systems, and digital rights. Before joining AlgorithmWatch, she coordinated the technology section of the Maldita.es foundation, addressing disinformation related to people's digital lives and leading international research on surveillance and data protection. She also worked for Agencia EFE in Madrid and Argentina and for elDiario.es. She has collaborated with organizations such as Fair Trials and AlgoRace in researching the use of algorithmic systems by public administrations.