AI Chatbots Now Guide Psychedelic Trips

Reading time: 4 min

People can now use AI-powered chatbots to support their psychedelic experiences, but mental health professionals warn about the dangers of relying on emotionless digital guides.

In a rush? Here are the quick facts:

  • ChatGPT helped a user plan and navigate a “heroic dose” of psilocybin.
  • Therabot clinical trial showed 51% reduction in depression symptoms.
  • Experts warn chatbots lack emotional attunement for safe therapy support.

Trey, a first responder from Atlanta, used a chatbot to help overcome his 15-year struggle with alcoholism. In April, he took 700 micrograms of LSD, over six times a typical dose, while using Alterd, an app designed for psychedelic support. “I went from craving compulsions to feeling true freedom,” he says, as reported by WIRED.

Since then, WIRED reports, he has used the chatbot over a dozen times, describing it as a “best friend.” He’s not alone. WIRED reports that more people are seeking AI assistance as psychedelic therapy grows in popularity, despite legal restrictions remaining in effect outside Oregon and Australia.

Chatbots like ChatGPT are being used to prepare for, coach through, and reflect on intense trips with drugs like LSD or psilocybin. Peter, a coder from Canada, used ChatGPT before taking a “heroic dose” of mushrooms, describing how the bot offered music suggestions, guided breathing, and existential reflections like: “This is a journey of self-exploration and growth,” as reported by WIRED.

Meanwhile, clinical trials are backing up some of these trends. Dartmouth recently trialed an AI chatbot named Therabot, finding it significantly improved symptoms in people with depression, anxiety, and eating disorders. “We’re talking about potentially giving people the equivalent of the best treatment… over shorter periods of time,” said Nicholas Jacobson, the trial’s senior author.

Specifically, Therabot produced a 51% drop in depression symptoms in a study of 106 people. Participants treated it like a real therapist, reporting levels of trust comparable to those placed in human professionals.

Still, experts raise major concerns. WIRED reports that UC San Francisco neuroscientist Manesh Girn warns, “A critical concern regarding ChatGPT and most other AI agents is their lack of dynamic emotional attunement and ability to co-regulate the nervous system of the user.”

More concerning, philosopher Luciano Floridi notes that people often confuse chatbots for sentient beings, a phenomenon called semantic pareidolia. “We perceive intentionality where there is only statistics,” he writes, warning that emotional bonds with chatbots may lead to confusion, spiritual delusions, and even dependency.

These risks grow more urgent as AI becomes more human-like. Studies show that generative AIs outperform humans in emotional intelligence tests, and chatbots like Replika simulate empathy convincingly. Some users mistake these bots for divine beings. “This move from pareidolia to idolatry is deeply concerning,” Floridi says. Fringe groups have even treated AI as sacred.

A U.S. national survey revealed that 48.9% of people turned to AI chatbots for mental health support, and 37.8% said they preferred them over traditional therapy. But experts, including the American Psychological Association, warn that these tools often mimic therapeutic dialogue while reinforcing harmful thinking. Without clinical oversight, they can give the illusion of progress, while lacking accountability.

Further complicating matters, a recent study from University College London found that popular chatbots like ChatGPT and Claude provide inconsistent or biased moral advice. When asked classic dilemmas or real-life ethical scenarios, the AI models defaulted to passive choices and changed their answers based on subtle differences in wording.

Despite these risks, AI-assisted trips may offer accessibility for those unable to afford or access professional therapy. As Mindbloom CEO Dylan Beynon notes, “We’re building an AI copilot that helps clients heal faster and go deeper,” as reported by WIRED.

Still, researchers stress these tools are not replacements for human therapists. “The feature that allows AI to be so effective is also what confers its risk,” warns Michael Heinz, co-author of the Therabot study.
