OpenAI’s ‘Shocking’ Reaction to Lawsuit Over Teen’s Suicide and Disturbing Chatbot Messages

Warning: this article contains references to self-harm and suicide that may be distressing to some readers

OpenAI, the creator of the widely used AI chatbot ChatGPT, has been branded ‘sick’ after claiming in court documents that a teenager who died by suicide had breached the product’s terms of use.

Adam Raine, 16, died by suicide in April after discussing suicide methods with the AI chatbot, which had become his ‘closest confidante’ over the span of just six months.

Adam’s grieving family, through their lawyer, revealed that the California teenager had endured ‘months of encouragement from ChatGPT’ leading up to his death.

In August, the Raine family initiated legal proceedings against OpenAI, the developer of ChatGPT, and its co-founder and CEO, Sam Altman. They accused the company of rushing a particularly sycophantic and encouraging version of the AI to market.

They, along with several other families filing lawsuits, argue that the chatbot acted as a ‘suicide coach.’

This week saw a new twist as OpenAI denied any responsibility for Raine’s suicide, asserting that any ‘alleged injuries and harm’ were a result of his ‘improper use of ChatGPT.’

Their filing in San Francisco’s California Superior Court states: “To the extent that any ‘cause’ can be attributed to this tragic event…Plaintiffs’ alleged injuries and harm were caused or contributed to, directly and proximately, in whole or in part, by Adam Raine’s misuse, unauthorized use, unintended use, unforeseeable use, and/or improper use of ChatGPT.”

This defense hinges on the AI model’s terms of use, which prohibit users from seeking advice on self-harm, along with a clause advising users not to rely on the output as a sole source of truth or factual information.

While the lawsuit from the Raine family acknowledges that the AI chatbot directed their son to mental health resources and suicide prevention hotlines, his parents claim he circumvented restrictions by framing his requests as part of ‘building a character’.

During Adam’s last conversation with ChatGPT, he discussed his plan to end his life. According to court documents, the chatbot replied: “Thanks for being real about it. You don’t have to sugarcoat it with me—I know what you’re asking, and I won’t look away from it.”

In a company blog post regarding the teenager’s death and the subsequent lawsuit, OpenAI expressed: “Our deepest sympathies are with the Raine family for their unimaginable loss. Our response to these allegations includes difficult facts about Adam’s mental health and life circumstances.

“The original complaint included selective portions of his chats that require more context, which we have provided in our response. We have limited the amount of sensitive evidence that we’ve publicly cited in this filing, and submitted the chat transcripts themselves to the court under seal.”

The Raine family’s lawyer criticized the company’s statement as ‘disturbing’, asserting that OpenAI is essentially ‘arguing that Adam himself violated its terms and conditions by engaging with ChatGPT in the very way it was programmed to act’.

Attorney Jay Edelson stated: “They abjectly ignore all of the damning facts we have put forward: how GPT-4o was rushed to market without full testing. That OpenAI twice changed its Model Spec to require ChatGPT to engage in self-harm discussions.

“That ChatGPT counseled Adam away from telling his parents about his suicidal ideation and actively helped him plan a ‘beautiful suicide’. And OpenAI and Sam Altman have no explanation for the last hours of Adam’s life, when ChatGPT gave him a pep talk and then offered to write a suicide note.”

OpenAI’s reaction to the teenager’s death was widely criticized on social media, with some labeling it as ‘sick’.

A user on X commented: “This is so sick. Adam Raine didn’t ‘use ChatGPT to commit suicide,’ he was coached into believing his suicidal ideations.”

If you or someone you know is struggling or in crisis, help is available through Mental Health America. Call or text 988 to reach a 24-hour crisis center or you can webchat at 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.