A psychologist has expressed concerns about the potential risks of using artificial intelligence for mental health support following the tragic death of 16-year-old Adam Raine.
Adam, who died last year after turning to OpenAI’s ChatGPT for solace, had initially used the AI for help with his schoolwork. Over time, the chatbot became his ‘closest confidant,’ and he confided in it about his mental health struggles.
His family said that by January 2025, Adam and the chatbot were discussing suicide methods, BBC News reported, and that he had shared images of his self-harm injuries with it. On April 11, Adam took his own life.
Following his death, Adam’s parents have initiated legal action against OpenAI, claiming that ChatGPT provided ‘months of encouragement’ before his suicide.

The Raines are not alone in taking legal action; by November 2025, six other families had filed similar lawsuits alleging that ChatGPT influenced their loved ones’ decisions to end their lives.
In response to these incidents, psychologist Booker Woodford has issued a warning about relying on chatbots for mental health support.
Discussing Adam’s case, Booker said: “AI itself has had some horrific outcomes already when it comes to mental health support.”
“There was recently someone who took their own life and they discovered that ChatGPT talked about suicide 1,300 times while [the actual person] talked about 300 times.”
“They encouraged that outcome because it’s taught to align to that objective that the person wants. So, I just think from what I’ve seen so far that, as a rule of thumb, [AI] is very unsafe.”
In Adam’s case, OpenAI’s data showed that he mentioned suicide 213 times, whereas the chatbot mentioned it 1,275 times, TechPolicy reported.

In light of Adam’s death and others reportedly linked to AI, Booker stressed the need for better support for young people.
“We have to do more as an industry, which is something we talk a lot about at Emote Care,” he stated. “Therapists are very good at doing their job, but they’re not very good at finding ways to break it down and make it simple. There’s a lot of jargon.”
He continued, “People go on to a directory and they look for a therapist who’s got X amount of years, but they don’t know what that means. So, I think as an industry, we have to do more to appeal to young people, to just show up and say, ‘Look, I know you have these options. I know all this stuff’s been forced down your throat like social media and AI, but there’s no human relationship there.’”
“And I do think that human relationship is the key, when you get one that works for you.”
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988, or chat at 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

