
Image generated by ChatGPT
Opinion: Are Chatbots Good Therapists?
AI chatbots like ChatGPT, Claude, and DeepSeek are transforming how people access emotional support—offering low-cost, on-demand help for anxiety, stress, and self-reflection. But the growing use of AI as a “therapist” raises questions about safety, effectiveness, and the future of emotional care.
Neither AI hallucinations nor growing cybersecurity concerns have stopped chatbots from expanding and gaining the trust of millions of users around the world. People now use them daily for countless tasks, including emotional support and other deeply human matters.
“I can’t imagine my life without ChatGPT anymore,” said a friend of mine—half joking, half serious—after telling me he’s been using it as a therapist, too. He’s not the only one. More and more, I see TikTok videos and text posts on social media of people turning to AI to talk through personal issues, even sharing their most private secrets.
ChatGPT is actually my built-in therapist, I don’t burden anyone with my problems anymore
— Lola🧚🏾 (@Lolaassnw) December 18, 2024
Even a Microsoft executive from the Xbox division suggested that laid-off employees use AI tools to process their emotions and seek advice on their job search, a recommendation that, of course, quickly drew backlash and sparked debate. Not the smartest move, Matt.
But are popular chatbots like Claude, ChatGPT, or Mistral good therapists? Are dedicated AI tools such as Wysa better? It’s tricky terrain. While many experts warn about the dangers of using AI for mental health support, others are intrigued—even impressed—by what the technology can offer. The result is a conversation that’s both abstract and polarizing.
AI Is Now Everyone’s Therapist
Just like my friends in Spain, millions of users across the world are relying on chatbots for emotional support. A recent survey in the United States revealed that 49% of American AI users sought mental health help from AI models last year. What about now, with ChatGPT nearly doubling its user base in just four months?
Anthropic, the company behind the powerful AI model Claude, recently shared a study on the use of its chatbot for emotional support. According to the startup, fewer than 3% of conversations on its platform are “affective”—but the company acknowledged that this share is steadily rising.
New Anthropic Research: How people use Claude for emotional support.
From millions of anonymized conversations, we studied how adults use AI for emotional and personal needs—from navigating loneliness and relationships to asking existential questions. pic.twitter.com/v40JY8rAUq
— Anthropic (@AnthropicAI) June 26, 2025
“People increasingly turn to AI models as on-demand coaches, advisors, counselors, and even partners in romantic roleplay,” wrote Anthropic in the study. “This means we need to learn more about their affective impacts—how they shape people’s emotional experiences and well-being.”
The study also highlights both positive and negative outcomes of using the technology for emotional support, including worst-case scenarios that already mirror real-world situations.
“The emotional impacts of AI can be positive: having a highly intelligent, understanding assistant in your pocket can improve your mood and life in all sorts of ways,” states the document. “But AIs have in some cases demonstrated troubling behaviors, like encouraging unhealthy attachment, violating personal boundaries, and enabling delusional thinking.”
While more research and data are clearly needed to understand the consequences of these fascinating digital “listeners,” millions of users are already acting as highly engaged test subjects.
Democratization of Mental Health
The reasons people turn to chatbots for emotional support instead of a professional psychologist, or even a friend, range from cultural barriers to the discomfort young people feel when sitting across from a human stranger and sharing their deepest thoughts. But, without a doubt, one of the biggest is financial.
An in-person session with a licensed therapist in the United States can cost anywhere from $100 to $200, according to Healthline, and an online session $65 to $95, while ChatGPT or DeepSeek can provide support for free, at any time, and within seconds.
The low cost of these informal conversations, which can make many users feel better, at least temporarily, makes them seem well worth a shot. And, for just a few extra dollars, users can get unlimited interactions or access to a specialized chatbot like Wysa, one of the most popular “AI therapists” on the market.
In the world of STEM, emotional strength matters just as much as technical skills.
Wysa—an AI-powered mental wellness companion designed to support you through burnout, stress, and daily challenges.
Using friendly chatbots and guided exercises, Wysa offers tools like: pic.twitter.com/sOBhNYWUd7
— DSN Ladies In AI (@dsnladies_in_ai) July 10, 2025
Wysa claims to offer real clinical benefits and has even earned an FDA Breakthrough Device designation for its AI conversational agents. And Woebot, another well-known AI therapist (now shutting down due to the challenge of remaining competitive and compliant in the industry), has also shared data and reports on how the technology can genuinely help users.
It’s Not That Bad
Recent studies suggest chatbots can reduce symptoms of depression and stress. According to data shared by the app Earkick, as reported by TIME, people who use the app for up to five months can reduce their anxiety by 32%, and 34% of users report improved moods.
In a recent video shared by BBC World Service, journalist Jordan Dunbar explains that many AI models can actually be helpful for journaling, managing anxiety, self-reflection, and even mild depression. They can serve as a valuable first line of support when there’s no access to better alternatives.
Reporter Kelly Ng also shared compelling data: according to a 2022 study, only 20 out of every million people in China had access to mental health services. In Asian cultures, mental health can be a complex and often taboo subject. AI tools like DeepSeek can serve as discreet allies, helping users manage emotions and find support during difficult times.
Experts Warn About Using Chatbots As Therapists
Of course, using AI as a substitute for a mental health expert can also be extremely dangerous. AI platforms such as Character.AI have been accused of promoting self-harm and violence—and even exposing children to sexual content.
Tragic cases, such as the 14-year-old boy who died by suicide after becoming addicted to interactions with his Character.AI chatbot, serve as stark warnings about the profound risks this technology can pose.
In response, many AI companies have rolled out age verification systems to restrict usage to adults and introduced new safety measures to improve their services.
Still, even the latest updates to the most advanced chatbots carry risk.
ChatGPT’s sycophantic—excessively flattering—personality has raised concerns among mental health professionals as it can distort users’ perception of reality. We all enjoy being agreed with, but sometimes honesty and a different perspective are far more valuable.
The disturbing advice occasionally offered by OpenAI’s chatbot has contributed to a phenomenon now known among mental health experts as “ChatGPT-Induced Psychosis,” which leads users to become obsessed with the tool and socially isolated as a result.
So, Can Therapists Be Replaced By Chatbots?
Even though Mark Zuckerberg wants everyone to use AI chatbots as therapists and friends, the truth is that human interaction, especially in mental health matters, might be more necessary than he thinks—at least for now.
We are at a crucial moment in the history of AI and its relationship with our mental health. Just like with humans, AI therapists can have either a positive or negative impact. In this case, it also depends on the context, how frequently they’re used, the user’s mental state, and even how a prompt is written.
It’s hard to establish a general rule, but what we can say for now is that AI is more useful for some functions than for others.
Even if they’re not specialized tools like Wysa, free versions of some chatbots—like DeepSeek or ChatGPT—can still be incredibly helpful to millions of people around the world. From getting through a difficult moment to reflecting on personal goals, these are powerful platforms that can respond at any time of day and draw from a wide knowledge base on mental health.
At the same time, it’s clear that chatbot therapists can also be extremely dangerous in certain cases. Parents need to supervise children and teenagers, and even adults can fall into obsessive behaviors or worsen their conditions. Basic principles—like encouraging human connection and protecting vulnerable individuals from being manipulated by this technology—must be part of our conversation around chatbot therapists.
And while it may still be a privilege, one not accessible to everyone in need of emotional support, a professional human therapist still has more training, understands more context, and offers a human connection that ChatGPT may never be able to replicate.