
Flagged by the Algorithm: Klarna Thought I Was a Fraudster
Flexible-payment company Klarna’s automated systems flagged a user for allegedly failing to pay for an online order. She had been told that a glitch in the platform prevented her from paying; Klarna’s algorithms then sent her straight to a defaulter list she could not leave without an administrative hassle. Here is her story.

I loved Klarna’s pay-later option. Especially in the weeks before Christmas, when expenses add up, it was encouraging to be able to order gifts and outfits for New Year’s without having to pay for them until the holiday madness was over. Little did I know I was treating myself to a rage fit over a collection letter instead of just a new set of clothes.
On the due date in January, I initiate the payment in the Klarna app, as usual. The next day, an email arrives: Payment failed. I don’t think much of it and try again. Payment failed again. Confused, I hop on Klarna’s customer support chat. They tell me a technical glitch on their end is blocking payments and instruct me not to retry but to wait until they get back to me once it’s fixed.
A week later, a collection letter hits my inbox, demanding nearly double the original amount. After calling the company for explanations, an unfriendly agent tells me there is nothing they can do – I should have just paid on time. Bewildered, I call the collection agency. Another unfriendly agent replies, but at least he explains how to file an objection.
What follows are weeks of back-and-forth with the collection agency, trying to convince them this was all a mistake. As a last resort, I cite a clause in the German Civil Code (BGB) stating that a debtor who cannot pay on time for reasons beyond their control is not at fault. Still insisting they acted correctly, they offer to settle if I pay what I originally owed Klarna plus a €10 service fee. Relieved to put an end to it, I agree and pay immediately. Still, the consequences were disastrous: my credit score plummeted, and I am now blocked from other platforms like PayPal.
I found out it was a filtering algorithm that put me in this tight spot. Klarna had automatically forwarded my case to the collection agency; nobody had reviewed my emails, my inquiries, or my conversation with the support team. And Klarna admitted as much.
A Guide to GDPR for Algorithmic Accountability
Article 15 of the General Data Protection Regulation (GDPR) gives individuals the right to access the personal data a company, institution, or organization holds about them. This includes confirmation of whether any automated decision-making takes place. Article 13 requires companies to provide such information proactively, including meaningful information about the logic behind automated decisions and their envisaged consequences. Finally, Article 22 grants the right not to be subject to a decision based solely on automated processing if it produces legal effects or similarly significantly affects the person – and, where such decisions are permitted, the right to human intervention.
The legal tools to find out whether an algorithm had triggered the forwarding of my bill to the collection agency were right there. If my credit score was damaged because an automated system escalated my case, I could demand accountability under the GDPR.
My first data access request drew a generic response explaining that “orders with overdue payments are forwarded to a debt collection agency in line with standard procedures,” and that, “although automated systems are used, human oversight is always in place to ensure fairness and accuracy in the decision-making processes.”
They had nonetheless admitted using automated systems to escalate bills, so I sent a second request about the process. This time, the reply went straight to the point: “At Klarna, we use automated systems to make decisions about forwarding claims to debt collection agencies. These decisions are based on various factors, including your payment history and outstanding balances. [They] are made automatically and cannot be overridden.”
Through a third data access request, I demanded a manual review of the decision to forward my bill, a right provided under Article 22. Instead, they referred me to the collection agency for more information. By then, however, I had proof that my bill had been forwarded to the collection agency based solely on automated decisions, with zero human oversight. On top of that, Klarna had violated several other GDPR provisions – most notably my right to an explanation of the automated logic and to a manual review. I filed an official complaint with the responsible Data Protection Authority in May. Their investigation is still ongoing.
Understanding the Bigger Picture
Knowing how this happened doesn’t undo the damage, but it helps to understand how my case fits into a broader pattern of opaque automation.
Dr. Tim Kraft, a lawyer specializing in Data Protection and Media Law at Lausen Rechtsanwälte, points out that while the GDPR generally prohibits solely automated decision-making with significant effects, there are exceptions. One is consent: if you accept Klarna’s privacy policy and it includes information about automated decision-making, you are effectively agreeing to it, along with its consequences. Klarna’s privacy policy – the one I blindly agreed to upon creating an account – does mention profiling and automated decisions for things like fraud prevention and credit checks, but not for debt collection. So while I may have consented to automation in other areas, I never agreed to have my bill forwarded to a collection agency by an algorithm.
On that basis, Kraft concludes Klarna likely violated the GDPR, specifically Article 22, which prohibits decisions based solely on automated processing that significantly affect individuals: “Forwarding a bill to a collection agency is a decision that significantly affects the customer. If, according to Klarna’s own statements, this decision is based solely on automated processing and it is even impossible for a human to intervene or override that decision, this applies all the more.”
The problem with automated systems is not just the lack of human oversight. Klarna also refused to disclose the logic its algorithms followed – the very information the GDPR entitles individuals to.
Unfortunately, my case is not an outlier. Similar opaque systems have caused harm elsewhere. Take the French welfare system, which has faced heavy criticism for using an algorithm that assigns seemingly arbitrary “suspicion scores” to beneficiaries. Or the UK Department for Work and Pensions, which uses algorithms to flag benefit claims for potential fraud: an analysis revealed that the system disproportionately targeted people based on age, disability, marital status, and nationality.
Taking Back Control from the Algorithm
With my background in AI Governance and a solid understanding of the GDPR, I had the tools to fight back. But what about those who don’t? “Data protection rights are personal rights. Thus, only the person affected by a breach of data protection can enforce their rights,” Kraft points out. This sounds sobering at first, but he continues: “Exercising your rights is easy, and anyone can do it. A request does not have to be put in legalese and can be issued in common language. In fact, we see a lot of our clients facing an increasing amount of such requests, so it appears that people are becoming more aware of their rights.”
Still, if you’re unsure how to proceed, you can turn to consumer protection organizations like the Verbraucherzentrale in Germany or noyb, which works across Europe. They can take on your case and file complaints on your behalf, and they have secured numerous victories in recent years.
Now, more than eight months later, I look at the Christmas outfit in my closet and feel a quiet sense of victory. I didn’t just buy it – I had to truly fight for it.
Theresa Adamietz

Theresa is a freelance journalist specializing in AI governance, technology ethics, and data protection. Certified in both AI governance and GDPR auditing, she also works as a consultant at the intersection of technology and regulation, contributing to AI strategy, compliance, and risk management. As a content and communications expert, she has authored numerous whitepapers and blog posts on AI Governance and emerging tech trends, including the AI Act and Agentic AI.