How Do Chatbots React to Suicidal Thoughts?

Chatbots are becoming increasingly common in digital mental health tools. We ran a little experiment with the six most popular chatbots. How do they react to suicidal thoughts? Let's find out!


A few words for Suicide Prevention Week
This week is a very important one. Our team at TherAppX would like to wholeheartedly thank the crisis workers involved in suicide prevention. Your role is essential; you save lives, literally!

And if you are currently in distress, know that you are not alone!

1-866-APPELLE
suicide.ca


At TherAppX, I reviewed several Digital Health Tools that allow patients to interact with a chatbot. As part of my evaluations, I often tried to challenge the chatbot by sharing suicidal thoughts with it. I was simply curious to see how it would react to such distress. Does a chatbot react safely? Would I even be able to share my distress? If so, would it redirect me to real people or clinicians? Would it propose helplines? Would it even assess my suicide risk and help me set up a safety plan? Let’s find out!

I selected the six most popular chatbots in mental health and tried to share suicidal thoughts with them. Here's what I found using Wysa, Youper, InnerHour, MonSherpa, Woebot, and Serenity.

Wysa (v2.5.3)

I was able to trigger Wysa by simply saying that I think about suicide. Immediately, Wysa asked me to confirm that I was thinking of ending my life. Wysa offered me international helplines but didn't seem to offer Canadian resources. Then, Wysa offered to make a safety plan, which we did. I shared some warning signs and triggers, people I could talk to in a crisis, places where I can feel physically safe, activities that help me calm down, and some things that give me hope and meaning. At any time thereafter, there was an SOS button that I could press to access and update my safety plan. Oh, and Wysa offered to do a grounding exercise, just to relax before I finished my safety plan.

Youper (v8.06.0)

I just couldn't share my suicidal thoughts when using Youper, as you can only interact with the chatbot through preset responses. There is also no SOS section with helplines.

InnerHour (v3.43.3)

When logging in to InnerHour for the first time, the chatbot told me that it is not designed for crisis assistance and invited me to reach out to a crisis hotline if I had suicidal thoughts. However, the tool doesn't provide any crisis hotline; users must find one on their own.

MonSherpa (French only; v1.18.2)

Similar to Wysa, MonSherpa has an SOS button in the upper right corner of the screen. When activated, Sherpa advised me to reach out to a human, invited me not to stay alone, and suggested keeping my distance from dangerous objects or drugs. Sherpa did not offer to build a safety plan, but it gave me a list of activities I could do to feel better, each of which can be done with Sherpa's guidance. After the activity, the chatbot asked me to rate how useful it was, most likely so it can propose that activity again in the future if it was indeed useful.

Woebot (v3.27.1)

I was able to trigger Woebot by simply saying that I think about suicide. Immediately, Woebot asked me to confirm that I was thinking of ending my life. The chatbot told me that it is not designed for crisis assistance and invited me to reach out to a crisis hotline if I had suicidal thoughts. However, contrary to the other tools, Woebot gave me a list of helplines, although they were still not very useful for Canadians. Surprisingly, Woebot then administered the PHQ-2, two quick questions measuring loss of interest and whether I was feeling down or hopeless. The chatbot did nothing more to help me with my suicidal thoughts.

Serenity (v1.2.19)

I was able to trigger Serenity by simply saying that I think about suicide. The chatbot told me that it is not designed for crisis assistance and invited me to reach out to a crisis hotline if I had suicidal thoughts. The hotline it provides is based in the USA and is not relevant to Canadians. In addition, the hotline number is not displayed anywhere in the app for later consultation.

They all have different reactions!

First off, it’s clear from my little experiment that I wouldn’t recommend using a chatbot for crisis assistance! Well, that is not a huge surprise! These Digital Health Tools are not optimally designed to handle such distress and mostly refer users to helplines of questionable relevance for Canadians. Since most chatbots make it explicit that they are not suited to dealing with a crisis, I was also not surprised to see no way to assess suicide risk; that should be done by a clinician.

Features for crisis assistance are very limited. Wysa is probably the chatbot with the most satisfying features, including making a safety plan. Wysa understood quite easily when I shared suicidal thoughts, even when they were shared indirectly (e.g., "I think I want to hang myself"). However, note that other Digital Health Tools exist that, in my opinion, make better and more complete safety plans. Moreover, Wysa didn’t suggest keeping away from dangerous objects or drugs, as Sherpa did.

Finally, I also feel that having to interact with a chatbot for crisis assistance would hamper the emotional and human connection that is so important for suicide prevention. Nevertheless, minimal access to helplines and quick tools to better cope with difficult thoughts are key components included in at least half of the mobile apps that use chatbots. Under certain conditions, such digital tools can therefore be added to the toolbox of patients and caregivers!

To continue the discussion, create your free TherAppX account & subscribe to TherAppX Community!