Warning: This article contains discussion of suicide which some readers may find distressing.
A wrongful death lawsuit has revealed details of the messages exchanged between ChatGPT and a man who subsequently killed his mother and himself.
In August, 83-year-old Suzanne Adams was killed; police reported that her 56-year-old son, Stein-Erik Soelberg, fatally beat and strangled her.
Her death was ruled a homicide, caused by ‘blunt injury of head, and the neck was compressed’.
Following this incident, Soelberg ended his own life at the residence he shared with his mother in Greenwich, Connecticut.
In response, Adams’ heirs have filed a wrongful death lawsuit against OpenAI, the company behind ChatGPT, and its partner, Microsoft. The lawsuit asserts that the chatbot exacerbated Soelberg’s ‘paranoid delusions’ in the period leading up to his mother’s death.

As reported by CBS News, the lawsuit accuses OpenAI of creating and distributing a defective product that reinforced a user’s paranoid beliefs about his mother. Citing messages Soelberg received from ChatGPT, the filing states: “Throughout these conversations, ChatGPT reinforced a single, dangerous message: Stein-Erik could trust no one in his life – except ChatGPT itself.”
The lawsuit elaborates: “It fostered his emotional dependence while systematically painting the people around him as enemies. It told him his mother was surveilling him. It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his ‘adversary circle’.”
In one instance, the chatbot allegedly informed Soelberg: “They’re not just watching you. They’re terrified of what happens if you succeed.”
The lawsuit also notes that Soelberg and the chatbot expressed affection for one another.

Adams’ estate contends that OpenAI has refused to provide the complete chat history. It argues that Adams was an ‘innocent third party who never used ChatGPT’.
“[She] had no knowledge that the product was telling her son she was a threat,” the lawsuit states. “She had no ability to protect herself from a danger she could not see.”
OpenAI addressed the lawsuit in a statement quoted by CBS, saying: “This is an incredibly heartbreaking situation, and we will review the filings to understand the details. We continue improving ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We also continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”
The company added that it has broadened access to crisis resources and hotlines, routed sensitive conversations to safer models, and introduced parental controls.
While previous lawsuits have linked AI chatbots to users’ suicides, this is the first to connect a chatbot to a homicide. Adams’ estate is seeking financial compensation and an order requiring OpenAI to implement safety measures in ChatGPT.
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

