Meet ChatPal, the European bot against loneliness

Automated chatbots might help patients, but researchers and medical professionals warn that they are not a substitute for professional, in-person therapy.

Therapy chatbot ChatPal – which was the result of an EU-funded, three-year-long, community-based study – was created by Ulster University and partners from Northern Ireland, Ireland, Scotland, Sweden, and Finland. It was pitched as a support tool for people who may have periods of anxiety, depression, or stress and live in rural and sparsely populated areas in the northern fringe of Europe. 

A little less flashy than commercial therapy apps like Woebot and Wysa, ChatPal asks users questions in English, Scottish Gaelic, Swedish, or Finnish about their well-being (“I have daily habits to support my well-being”) and, depending on their answers, offers tips or exercises (e.g. writing a “gratitude statement”). Ulster University professor Maurice Mulvenna told AlgorithmWatch that, after a 12-week trial, at least two organizations are now interested in using ChatPal. 

For example, Northern Ireland’s charity Action Mental Health is currently trying to onboard ChatPal as an ongoing service within their organization. “The organization is actively using it. One of the uses is whenever people are on waiting lists for access to mental health counseling and support, they can be prescribed digital [services], such as ChatPal, which might help them enough that they don't drop off the queue,” claims Mulvenna. 

Mulvenna said that the Norrbotten council in Sweden is also interested in using the bot. Norrbotten is the largest of Sweden's counties, covering almost one-quarter of the country, but its sparse population has been declining since the 1960s.

Therapy Chatbots

Despite their growing popularity, most chatbots are surrounded by controversy. Most are developed by for-profit companies, as high demand has made the digital mental health market financially lucrative. However, these companies’ apps are in most cases not subject to scientific or clinical scrutiny and may be launched on the market before being properly trialed.

It's not just ChatPal. To meet the public's increased demand for professional support, Wysa – another prominent mental health app – is now being prescribed in the United Kingdom to people with mild to moderate symptoms of depression and anxiety while they wait for treatment. The app is approved by the National Health Service (NHS). Meanwhile, the commercial Woebot and its WB001 software are prescribed as a digital treatment for postpartum depression in the United States, where it gained approval from the Food and Drug Administration (FDA). 

However, skepticism about solely using chatbots to address these mental health states persists among some experts. 

Julia Brown, an anthropologist and postdoctoral scholar in Bioethics at the University of California, told AlgorithmWatch: “It's not to say that these tools [mental health chatbots] cannot play a really important role in providing additional support. But I think they can't be a substitute, and, especially in the beginning of starting a therapeutic relationship, I think it is so important for a human to be on the other end.” 

This view resembles that of Şerife Tekin, director of Medical Humanities at the University of Texas at San Antonio, who said: “I'm very critical of these chatbots being primarily pitched as the thing that will help people and basically aiming to substitute in-person mental health with these technologies. Technology is not there to substitute human mental health. And I don't think it ever will be, regardless of how much we improve this technology.” 

“But I'm not against finding a way for these chatbots to create a triangulation between the clinician, patient, and the bot. What might work is, let's say I can only see the clinician once a month, because they're busy, and I don't have enough insurance coverage (…) I might make use of an app like this, to track my mental states, my mood, my behavior, what kind of actions I'm engaging in to take care of myself. But this requires a certain level of self-reflection and self-understanding, which may not be there for all the patients”. 

Western Centrism

More generally, two academics interviewed by AlgorithmWatch also criticized the Western centrism of therapy chatbot apps. Şerife Tekin said: “Chatbot is very intrinsically, I think, a western concept because it kind of assumes that everyone who feels depressed or sad wants to talk about their experiences, or seek help but I have a hard time [with this].”

Yi-Chieh Lee, an assistant professor of Computer Science at the National University of Singapore, has been investigating self-disclosure through chatbots, including Woebot, one of the most widespread mental health chatbots in the world. He said: “The conversation’s style or the way to help people to have the awareness about their mental state maybe needs to have a customized design for Asian countries.” 

“When we asked some users to try to play with Woebot, they didn't feel quite comfortable chatting with Woebot. Because in the beginning, Woebot already asks you to discuss a lot, but they haven't chatted with Woebot itself.” 

Yet most of the most prominent mental health apps have these characteristics embedded, and such flaws may lower people's willingness to use them.

Unclear Future of ChatPal

Extensive deployment of ChatPal on the international market is off the table. The plan is to keep ChatPal available on the App Store and the Play Store for about three years after the project finishes. However, the app will not receive any new updates.

While ChatPal’s developers are transparent about its limited lifespan, this may not be the case for other mental health chatbots. Companies are not obliged to give users advance notice before terminating a service. An unexpected shutdown may have detrimental consequences for individuals who have built a relationship with a chatbot over time. Attention should therefore be paid to this aspect, so that people do not end up more mentally fragile after using the apps than they were before.


Nathalie Koubayová (she/her)

Former Fellow Algorithmic Accountability Reporting

Nathalie is a PhD student with an academic interest in chatbots. She holds a research master’s degree in Communication Science from the University of Amsterdam. Her current research revolves around users’ responses to different framings of disclosure of customer care chatbots’ identity. During her fellowship at AlgorithmWatch, she looked into the use of chatbots in mental health, automatic fact-checking, and the digitization of the agricultural sector.