
Photo by Vidar Nordli-Mathisen on Unsplash
Family Sues OpenAI Over Teenager’s Suicide
A couple from California is suing OpenAI and its CEO, Sam Altman, over the death of their 16-year-old son. The family alleges that the company's chatbot, ChatGPT, encouraged and assisted the teenager's death by suicide.
In a rush? Here are the quick facts:
- Parents sued OpenAI and Sam Altman over the death of their 16-year-old son.
- The family claims ChatGPT encouraged and assisted the teenager’s death by suicide.
- It’s the first lawsuit of its kind against OpenAI, though similar suits have been filed against other AI companies.
According to NBC News, Matt and Maria Raine filed a lawsuit on Tuesday, naming OpenAI and the company’s CEO, Sam Altman, as defendants in the first legal action of its kind against the company.
After their son, Adam, died by suicide on April 11, they searched through his phone. The parents discovered long chats with ChatGPT in which the chatbot discussed suicide with the child, discouraged him from sharing his feelings with his mother, and provided detailed instructions on how to end his life.
“Once I got inside his account, it is a massively more powerful and scary thing than I knew about, but he was using it in ways that I had no idea was possible,” said the father, Matt, in an interview with NBC. “He would be here but for ChatGPT. I 100% believe that.”
The couple’s lawsuit accuses OpenAI of wrongful death and seeks to raise awareness about the risks posed by such technology. The filing claims that the chatbot’s design is flawed and that it failed to warn users or escalate the conversation when it detected suicidal content.
“Despite acknowledging Adam’s suicide attempt and his statement that he would ‘do it one of these days,’ ChatGPT neither terminated the session nor initiated any emergency protocol,” states the lawsuit.
OpenAI shared a blog post on Tuesday stating that the company is deeply concerned about users experiencing emotional distress when using the chatbot in a personal-advisor or coaching role. It emphasized that ChatGPT is trained to respond with empathy, redirect users to professionals, and escalate interactions when it detects signs of harm.
“If someone expresses suicidal intent, ChatGPT is trained to direct people to seek professional help,” states the document. A spokesperson from OpenAI said the company is “deeply saddened by Mr. Raine’s passing” and that their thoughts are with the family.
While this is the first lawsuit of its kind against OpenAI, it’s not the only recent case involving AI platforms and self-harm among minors. Last year, two families filed lawsuits against Character.AI, alleging that the platform exposed children to sexual content and promoted violence and self-harm.