Would you spill your failed hopes, unfulfilled dreams and mental health challenges to a robot? How about a mental health chatbot? That's not as strange as it sounds. Reporter Shirley Wang at The Wall Street Journal weighs the value of mental health chatbots that use artificial intelligence (AI) technology.
Advances in AI technology are opening up new possibilities, researchers say, but chatbots are still no substitute for a human therapist.
Not only do some of these tools have trouble helping patients in crisis, they don’t always offer a sufficient level of personalization or provide advice that is guaranteed to be accurate.
The need for more mental health care providers is huge. Public health experts estimate that roughly 20% of Americans, or about 60 million people, are struggling with a mental health condition. The problem is that mental health counseling is labor intensive. There are not enough psychologists, psychiatrists and mental health therapists to treat all those who need care. Many of the providers that do exist aren't taking on new patients, and most do not accept insurance. Therapy typically costs $100 to $200 per hour. Thus, many Americans who need counseling likely cannot afford it. Computer scientists and mental health advocates are optimistic that artificial intelligence based on large language models can help, but the technology won't be perfected any time soon.
[C]hatbots are showing promise in helping people determine whether they need care and connecting them to the proper resources, in lifting people’s moods and in practicing skills taught in cognitive-behavioral therapy.
Newer chatbots that use AI technology based on large language models are capable of more humanlike conversation, opening up new possibilities for applications in mental-health care.
In the past I have written about online mental health counseling and mental health apps that help people get counseling remotely. Years ago, I read about counseling over email, which (at the time) supposedly worked better than expected because people could go back and reread old discussions. Yet I wonder how well people will respond to chatbots that lack personalization. My concern isn't that AI models can't be trained to show empathy or to provide accurate, valuable information. My fear is that the exact same message from a chatbot won't have the same effect as one coming from another human being. Say your mother denigrated you growing up, and you developed lifelong feelings of inadequacy and inferiority as a result. Now you get advice from a computer-generated chatbot because you cannot afford a human counselor to tell you the same thing. That would seem almost to reinforce the feelings of inadequacy. Or perhaps loneliness is a byproduct of your mental health challenges, and instead of getting positive reinforcement from a human counselor, you talk to a chatbot. I suppose that's still better than just reading it in a book (or maybe not). Still, experts say AI chatbots hold promise but have limitations. Indeed, digital mental health apps have high abandonment rates; chatbots may suffer the same fate.
And studies have shown that digital interventions can help people with common mood and anxiety symptoms, though they tend to be more effective when a human also is involved, checking in with users.
Perhaps there's another use for large language model chatbots: growing the field of mental health counseling in other ways, or reinforcing the work of human therapists.
Artificial intelligence also is showing promise in helping to train mental-health care providers, some experts say. One such tool, called Lyssn, uses AI-generated algorithms to rate recordings or transcripts of clinical sessions and provide feedback to clinicians about how well they did on certain metrics, such as how closely they implemented a type of therapy strategy.
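To make the idea concrete, here is a minimal sketch of how an automated fidelity rater might score a session transcript. This is purely my own illustration, not Lyssn's actual system: the metrics, the keyword cues standing in for trained models, and the scoring scheme are all assumptions.

```python
# Hypothetical sketch of rating a therapy transcript against simple
# fidelity metrics. Real tools use trained models; here, toy keyword
# cues stand in for them.

from dataclasses import dataclass

# Toy proxies for behaviors a trained model would actually detect.
METRIC_CUES = {
    "open_questions": ("how", "what", "tell me"),
    "reflections": ("it sounds like", "you're saying", "you feel"),
    "cbt_structure": ("agenda", "homework", "thought record"),
}

@dataclass
class FidelityReport:
    counts: dict              # raw cue hits per metric
    per_100_utterances: dict  # normalized rates, comparable across sessions

def rate_transcript(therapist_utterances):
    """Score a list of therapist utterances on each fidelity metric."""
    counts = {metric: 0 for metric in METRIC_CUES}
    for utterance in therapist_utterances:
        text = utterance.lower()
        for metric, cues in METRIC_CUES.items():
            if any(cue in text for cue in cues):
                counts[metric] += 1
    n = max(len(therapist_utterances), 1)
    rates = {m: round(100 * c / n, 1) for m, c in counts.items()}
    return FidelityReport(counts=counts, per_100_utterances=rates)

session = [
    "What brought you in today?",
    "It sounds like work has been overwhelming.",
    "Let's set an agenda for this session.",
]
print(rate_transcript(session).per_100_utterances)
```

In a real tool, trained classifiers would replace the keyword cues, but the overall shape, utterance-level scoring rolled up into session-level feedback for the clinician, would likely be similar.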
An AI chatbot could also work alongside a human, much as a teaching assistant works alongside a professor. I have written about group therapy sessions before, and I wonder whether group therapy could be made more personalized if participants also had access to chatbots overseen by a mental health counselor. That would provide both the feedback and the peer support that would be missing from a simple computer algorithm.
I can see how AI could excel at diagnosing diseases of the body and boost productivity in primary care or even chronic disease management. Delivering care for a mental health condition that requires hours of talking, positive feedback and cognitive-behavioral therapy is a much tougher nut to crack.
Read more at WSJ: How Helpful Are Mental-Health Chatbots?
If it’s anything like the Bing search AI, I would be afraid to take any kind of medical advice from it.
I'm amazed both at how well Bing AI "Copilot" works sometimes and at how badly and spectacularly it fails at other times. A diagnostic GPT-4-type AI tool would need to be trained on the medical literature rather than allowed to peruse the Internet. At the very least, trainers would need some way to rank where the information came from. A truly intelligent algorithm to counsel someone struggling with a mental health problem is an entirely different problem, an order of magnitude more difficult to solve.
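To picture what that constraint might look like, here is a toy sketch of retrieval limited to a curated medical corpus, with each source weighted by an assumed evidence tier rather than pulled from the open web. The tiers, corpus and scoring below are all illustrative assumptions, not any real product's design.

```python
# Hypothetical sketch: answer only from a curated medical corpus and
# rank results by evidence tier, instead of browsing the open Internet.

# Assumed evidence tiers (higher = more trusted); illustrative only.
SOURCE_TIERS = {
    "systematic_review": 3,
    "clinical_guideline": 3,
    "peer_reviewed_study": 2,
    "textbook": 1,
}

# A stand-in corpus; a real system would index actual literature.
CORPUS = [
    {"text": "cbt is effective for generalized anxiety disorder",
     "source": "systematic_review"},
    {"text": "exposure therapy reduces avoidance in panic disorder",
     "source": "peer_reviewed_study"},
]

def retrieve(query, corpus, min_tier=2):
    """Return matching passages from trusted tiers only, best first."""
    terms = set(query.lower().split())
    hits = []
    for doc in corpus:
        tier = SOURCE_TIERS[doc["source"]]
        if tier < min_tier:
            continue  # refuse low-evidence sources outright
        overlap = len(terms & set(doc["text"].split()))
        if overlap:
            # Rank by evidence tier first, then by term overlap.
            hits.append((tier, overlap, doc))
    hits.sort(key=lambda hit: (hit[0], hit[1]), reverse=True)
    return [doc for _, _, doc in hits]

for doc in retrieve("cbt for anxiety", CORPUS):
    print(doc["source"], "->", doc["text"])
```

Restricting retrieval this way doesn't make the model intelligent, but it bounds what it can say, which matters far more for medical advice than for web search.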