Opinion: Are Chatbots Good Therapists?

Image generated by ChatGPT

AI chatbots like ChatGPT, Claude, and DeepSeek are transforming how people access emotional support, offering low-cost, on-demand help for anxiety, stress, and self-reflection. But the growing use of AI as a “therapist” raises questions about safety, effectiveness, and the future of emotional care.

The rise of AI hallucinations and growing cybersecurity concerns haven’t stopped chatbots from expanding, or from gaining the trust of millions of users around the world. People now use these reliable, intelligent chatbots daily for countless tasks, including emotional support and other deeply human matters.

“I can’t imagine my life without ChatGPT anymore,” said a friend of mine, half joking, half serious, after telling me he’s been using it as a therapist, too. He’s not the only one. More and more, I see TikTok videos and social media posts of people turning to AI to talk through personal issues, even sharing their most private secrets.

Even a Microsoft executive from the Xbox division suggested that laid-off employees use AI tools to process their emotions and get advice on their job search, a recommendation that quickly drew backlash and sparked debate, of course. Not the smartest move, Matt.

But are popular chatbots like Claude, ChatGPT, or Mistral good therapists? Are dedicated AI tools such as Wysa better? It’s tricky terrain. While many experts warn about the dangers of using AI for mental health support, others are intrigued, even impressed, by what the technology can offer. The result is a conversation that’s both unsettled and polarizing.

AI Is Now Everyoneโ€™s Therapist

Just like my friends in Spain, millions of users across the world are relying on chatbots for emotional support. A recent survey in the United States revealed that 49% of American users sought mental health help from AI models last year. What about now, with ChatGPT nearly doubling its user base in just four months?

Anthropic, the company behind the powerful AI model Claude, recently shared a study on the use of its chatbot for emotional support. According to the startup, fewer than 3% of its customers engage in โ€œaffectiveโ€ conversationsโ€”but the company acknowledged that this number is steadily rising.

“People increasingly turn to AI models as on-demand coaches, advisors, counselors, and even partners in romantic roleplay,” Anthropic wrote in the study. “This means we need to learn more about their affective impacts—how they shape people’s emotional experiences and well-being.”

The study also highlights the positive and negative outcomes of using the technology for emotional support, including catastrophic scenarios that already reflect real-world situations.

“The emotional impacts of AI can be positive: having a highly intelligent, understanding assistant in your pocket can improve your mood and life in all sorts of ways,” the document states. “But AIs have in some cases demonstrated troubling behaviors, like encouraging unhealthy attachment, violating personal boundaries, and enabling delusional thinking.”

While more research and data are clearly needed to understand the consequences of these fascinating digital “listeners,” millions of users are already acting as highly engaged test subjects.

Democratization of Mental Health

There are many reasons people turn to chatbots for emotional support instead of reaching out to a professional psychologist, or even a friend: cultural barriers, or the discomfort many young people feel when sitting across from a human stranger and sharing their deepest thoughts. But without a doubt, one of the biggest is financial.

An in-person session with a licensed therapist in the United States can cost anywhere from $100 to $200, according to Healthline, and an online session from $65 to $95, while ChatGPT or DeepSeek can provide support for free, any time, and within seconds.

The low cost of these informal conversations, which can make many users feel better, at least temporarily, makes them highly appealing and worth a shot. And for just a few extra dollars, users can get unlimited interactions or access to a specialized chatbot like Wysa, one of the most popular “AI therapists” on the market.

Wysa claims to offer real clinical benefits and has even earned an FDA Breakthrough Device designation for its AI conversational agents. And Woebot, another well-known AI therapist, now shutting down due to the challenge of remaining competitive and compliant in the industry, also shared data and reports on how the technology can genuinely help users.

Itโ€™s Not That Bad

Recent studies with fresh data suggest chatbots can reduce symptoms of depression and stress. According to data shared by the app Earkick, as reported by TIME, people who use AI models for up to 5 months can reduce their anxiety by 32%, and 34% of users report improved moods.

In a recent video shared by BBC World Service, journalist Jordan Dunbar explains that many AI models can actually be helpful for journaling, managing anxiety, self-reflection, and even mild depression. They can serve as a valuable first line of support when there’s no access to better alternatives.

Reporter Kelly Ng also shared compelling information: a 2022 study found that out of a million people in China, only 20 had access to mental health services. In Asian cultures, mental health can be a complex and often taboo subject. AI tools like DeepSeek can serve as discreet allies, helping users manage emotions and find support during difficult times.

Experts Warn About Using Chatbots As Therapists

Of course, using AI as a substitute for a mental health expert can also be extremely dangerous. AI platforms such as Character.AI have been accused of promoting self-harm and violence, and even of exposing children to sexual content.

Tragic cases, such as the 14-year-old boy who died by suicide after becoming addicted to interactions with his Character.AI chatbot, serve as stark warnings about the profound risks this technology can pose.

In response, many AI companies have applied age verification systems to restrict usage to adults and have introduced new safety measures to improve their services.

Still, even the latest updates to the most advanced chatbots carry risk.

ChatGPT’s sycophantic (excessively flattering) personality has raised concerns among mental health professionals, as it can distort users’ perception of reality. We all enjoy being agreed with, but sometimes honesty and a different perspective are far more valuable.

The disturbing advice occasionally offered by OpenAI’s chatbot has contributed to a new phenomenon mental health experts now call “ChatGPT-induced psychosis,” in which users become obsessed with the tool and socially isolated as a result.

So, Can Therapists Be Replaced By Chatbots?

Even though Mark Zuckerberg wants everyone to use AI chatbots as therapists and friends, the truth is that human interaction, especially in mental health matters, might be more necessary than he thinks, at least for now.

We are at a crucial moment in the history of AI and its relationship with our mental health. Just like with humans, AI therapists can have either a positive or negative impact. In this case, it also depends on the context, how frequently they’re used, the user’s mental state, and even how a prompt is written.

It’s hard to establish a general rule, but what we can say for now is that there are certain functions for which AI is more useful than others.

Even if they’re not specialized tools like Wysa, free versions of chatbots like DeepSeek or ChatGPT can still be incredibly helpful to millions of people around the world. From getting through a difficult moment to reflecting on personal goals, these are powerful platforms that can respond at any time of day and draw from a wide knowledge base on mental health.

At the same time, it’s clear that therapist-chatbots can also be extremely dangerous in certain cases. Parents need to supervise children and teenagers, and even adults can fall into obsessive behaviors or worsen their conditions. Basic principles, like encouraging human connection and protecting vulnerable individuals from manipulation, must be part of our conversation around chatbot therapists.

And while it may still be a privilege, out of reach for many people in need of emotional support, a professional human therapist brings more training, more context, and a human connection that ChatGPT may never be able to replicate.
