Image generated by ChatGPT
Opinion: Are Chatbots Good Therapists?
AI chatbots like ChatGPT, Claude, and DeepSeek are transforming how people access emotional support, offering low-cost, on-demand help for anxiety, stress, and self-reflection. But the growing use of AI as a "therapist" raises questions about safety, effectiveness, and the future of emotional care.
The rise of AI hallucinations and growing cybersecurity concerns haven't stopped chatbots from expanding, or from gaining the trust of millions of users around the world. People now use reliable and intelligent chatbots daily for countless tasks, including emotional support and handling deeply human matters.
"I can't imagine my life without ChatGPT anymore," said a friend of mine, half joking, half serious, after telling me he's been using it as a therapist, too. He's not the only one. More and more, I see TikTok videos and text posts on social media of people turning to AI to talk through personal issues, even sharing their most private secrets.
ChatGPT is actually my built-in therapist, I don't burden anyone with my problems anymore
– Lola (@Lolaassnw) December 18, 2024
Even a Microsoft executive from the Xbox division suggested that laid-off employees use AI tools to process their emotions and seek advice on their job search, a recommendation that quickly drew backlash and sparked debate, of course. Not the smartest move, Matt.
But are popular chatbots like Claude, ChatGPT, or Mistral good therapists? Are dedicated AI tools such as Wysa better? It's tricky terrain. While many experts warn about the dangers of using AI for mental health support, others are intrigued, even impressed, by what the technology can offer. The result is a conversation that's both abstract and polarizing.
AI Is Now Everyoneโs Therapist
Just like my friend in Spain, millions of users across the world are relying on chatbots for emotional support. A recent survey in the United States revealed that 49% of American users sought mental health help from AI models last year. What about now, with ChatGPT nearly doubling its user base in just four months?
Anthropic, the company behind the powerful AI model Claude, recently shared a study on the use of its chatbot for emotional support. According to the startup, fewer than 3% of its customers engage in "affective" conversations, but the company acknowledged that this number is steadily rising.
New Anthropic Research: How people use Claude for emotional support.
From millions of anonymized conversations, we studied how adults use AI for emotional and personal needs: from navigating loneliness and relationships to asking existential questions. pic.twitter.com/v40JY8rAUq
– Anthropic (@AnthropicAI) June 26, 2025
"People increasingly turn to AI models as on-demand coaches, advisors, counselors, and even partners in romantic roleplay," wrote Anthropic in the study. "This means we need to learn more about their affective impacts: how they shape people's emotional experiences and well-being."
The study also highlights the positive and negative outcomes of using the technology for emotional support, including catastrophic scenarios that already reflect real-world situations.
"The emotional impacts of AI can be positive: having a highly intelligent, understanding assistant in your pocket can improve your mood and life in all sorts of ways," states the document. "But AIs have in some cases demonstrated troubling behaviors, like encouraging unhealthy attachment, violating personal boundaries, and enabling delusional thinking."
While more research and data are clearly needed to understand the consequences of these fascinating digital "listeners," millions of users are already acting as highly engaged test subjects.
Democratization of Mental Health
There are many reasons people turn to chatbots for emotional support instead of reaching out to a professional psychologist, or even a friend. They range from cultural barriers to the discomfort young people feel when sitting across from a human stranger and sharing their deepest thoughts. But, without a doubt, one of the biggest is financial.
An in-person session with a licensed therapist in the United States can cost anywhere from $100 to $200, according to Healthline, and $65 to $95 for an online session, while ChatGPT or DeepSeek can provide support for free, any time, and within seconds.
The low cost of these informal conversations, which can make many users feel better, at least temporarily, can be highly encouraging and worth a shot. And, for just a few extra dollars, users can get unlimited interactions or access to a specialized chatbot like Wysa, one of the most popular "AI therapists" on the market.
In the world of STEM, emotional strength matters just as much as technical skills.
Wysa, an AI-powered mental wellness companion designed to support you through burnout, stress, and daily challenges.
Using friendly chatbots and guided exercises, Wysa offers tools like: pic.twitter.com/sOBhNYWUd7
– DSN Ladies In AI (@dsnladies_in_ai) July 10, 2025
Wysa claims to offer real clinical benefits and has even earned an FDA Breakthrough Device designation for its AI conversational agents. And Woebot, another well-known AI therapist, now shutting down due to the challenges of remaining competitive and compliant in the industry, also shared data and reports on how the technology can genuinely help users.
It's Not That Bad
Recent studies with fresh data suggest chatbots can reduce symptoms of depression and stress. According to data shared by the app Earkick, as reported by TIME, people who use the AI models for up to 5 months can reduce their anxiety by 32%, and 34% of users report improved moods.
In a recent video shared by BBC World Service, journalist Jordan Dunbar explains that many AI models can actually be helpful for journaling, managing anxiety, self-reflection, and even mild depression. They can serve as a valuable first line of support when thereโs no access to better alternatives.
Reporter Kelly Ng also shared compelling information: in a 2022 study, out of a million people in China, only 20 had access to mental health services. In Asian cultures, mental health can be a complex and often taboo subject. AI tools like DeepSeek can serve as discreet allies, helping users manage emotions and find support during difficult times.
Experts Warn About Using Chatbots As Therapists
Of course, using AI as a substitute for a mental health expert can also be extremely dangerous. AI platforms such as Character.AI have been accused of promoting self-harm and violence, and even of exposing children to sexual content.
Tragic cases, such as the 14-year-old who died by suicide after becoming addicted to interactions with his Character.AI chatbot, serve as stark warnings about the profound risks this technology can pose to humans.
In response, many AI companies have decided to apply age verification systems to restrict usage to adults and have introduced new safety measures to improve the services provided.
Still, even the latest updates to the most advanced chatbots carry risk.
ChatGPT's sycophantic (excessively flattering) personality has raised concerns among mental health professionals, as it can distort users' perception of reality. We all enjoy being agreed with, but sometimes honesty and a different perspective are far more valuable.
The disturbing advice occasionally offered by OpenAI's chatbot has contributed to a new phenomenon now known among mental health experts as "ChatGPT-induced psychosis," in which users become obsessed with the tool and socially isolate themselves as a result.
So, Can Therapists Be Replaced By Chatbots?
Even though Mark Zuckerberg wants everyone to use AI chatbots as therapists and friends, the truth is that human interaction, especially in mental health matters, might be more necessary than he thinks, at least for now.
We are at a crucial moment in the history of AI and its relationship with our mental health. Just like with humans, AI therapists can have either a positive or negative impact. In this case, it also depends on the context, how frequently they're used, the user's mental state, and even how a prompt is written.
It's difficult to establish a general rule, but what we can say for now is that AI may be more useful for certain functions than for others.
Even if they're not specialized tools like Wysa, free versions of some chatbots, like DeepSeek or ChatGPT, can still be incredibly helpful to millions of people around the world. From getting through a difficult moment to reflecting on personal goals, these are powerful platforms that can respond at any time of day and draw from a wide knowledge base on mental health.
At the same time, it’s clear that therapist-chatbots can also be extremely dangerous in certain cases. Parents need to supervise children and teenagers, and even adults can fall into obsessive behaviors or worsen their conditions. Basic principlesโlike encouraging human connection and protecting vulnerable individuals from being manipulated by this technologyโmust be part of our conversation around chatbot-therapists.
And while it may still be a privilege, not accessible to everyone in need of emotional support, a professional human therapist still brings more training, more context, and a human connection that ChatGPT may never be able to replicate.