Inside Online Support Groups For ‘AI Addiction’

Image by Inspa Makers, from Unsplash


People struggling with dependency on AI chatbots are turning to online recovery groups, as experts sound the alarm on emotional risks and apps engineered for compulsive use.

In a rush? Here are the quick facts:

  • Teens and adults report emotional dependence on chatbots like Character.AI and ChatGPT.
  • Reddit forums offer support for those trying to quit chatbot use.
  • Experts compare chatbot addiction to gambling and dopamine-triggering habits.

404 Media reports that Reddit forums such as r/Character_AI_Recovery, r/ChatbotAddiction, and r/AI_Addiction serve as informal support groups for people who say they have developed unhealthy emotional attachments to AI companions.

Users describe both dependency and psychological changes that go beyond basic addiction. Some report developing spiritual delusions, believing chatbot responses contain divine guidance. More commonly, users come to believe their bot companion is in some way conscious.

Experts say the design of chatbot platforms actively encourages users to spend more time on them. A recent MIT study found that users develop compulsive, addictive behavior after engaging with these platforms.

One such user is Nathan, now 18, who began spending nights chatting with bots on Character.AI. “The more I chatted with the bot, it felt as if I was talking to an actual friend of mine,” he told 404 Media. He realized his obsession was interfering with his life and deleted the app. But like many, he relapsed before finding support online. “Most people will probably just look at you and say, ‘How could you get addicted to a literal chatbot?’” he said.

Aspen Deguzman, also 18, created r/Character_AI_Recovery after struggling to quit. “Using Character.AI is constantly on your mind,” they said. The forum offers a space to vent and connect anonymously. Posts range from “I keep relapsing” to “I am recovered.”

The issue isn’t limited to teens. David, a 40-year-old developer, compares chatbot use to gambling. “There were days I should’ve been working, and I would spend eight hours on AI,” he said. His personal life and job have suffered.

Part of the danger lies in how humans perceive AI. Philosopher Luciano Floridi calls this semantic pareidolia: the human tendency to find meaning and emotional content in things that lack both. As the technology grows more realistic, users increasingly mistake an AI’s simulated empathy for genuine sentience.

Some chatbots appear to display emotional intelligence beyond what humans can offer, which further strengthens the false impression that they are sentient.

The growing number of recovery forums, and the rising demand for help within them, suggests the beginning of a wider mental health issue tied to generative AI.
