Health experts are sounding the alarm after a new study argued that compulsive reliance on AI could be serious enough to warrant recognition as a mental illness.
Since chatbots went mainstream in 2023, they’ve quickly become a tool many people lean on throughout the day.
From quick fact-checks and travel planning to everyday tasks and advice on personal problems, AI assistants have become a go-to resource for all kinds of needs.
Some users even describe their interactions in therapeutic terms, citing the way chatbots can appear to listen, “reason,” and respond in a considerate way.
But researchers say they’re now seeing a rise in cases where this use shifts into dependency, prompting renewed concern from health professionals.
Dr Dongwook Yoo, who authored a recent paper examining AI addiction, said the biggest worries center on younger users who are testing the limits of what these systems can do.
The research suggests that some young people are using AI for emotionally charged roleplay, venting, and attempts to form a sense of connection with a non-human companion.

The health expert warned: “AI addiction is a growing problem causing many harms, yet some researchers deny it’s even a real issue.
“And deliberate design decisions by some of the corporations involved are contributing, keeping users online regardless of their health or safety.”
Efforts to formally recognise digital addictions have struggled in recent years, in part because a behaviour must meet strict clinical thresholds before researchers will classify it as an addiction.
Those criteria include: salience (it becomes the most important thing in a person’s life), tolerance (the amount of use increases), mood modification, conflict (it leads to problems with other people), withdrawal symptoms, and relapse.
Online communities dedicated to the issue have also grown, including a forum on Reddit where hundreds of people discuss feeling dependent on chatbot platforms—many of them teenagers.
One wrote: “At first I just thought it was interesting that I could get a response out of saying basically anything.
“Aside from being able to have basically any conversation I wanted, they also said whatever I wanted to hear. I think that spoke to the part of me that didn’t always feel listened to or understood.”

They added: “I neglected other parts of my life in favour of it, especially socially. It didn’t feel that different from talking to a real person at times, so I’d sometimes talk to it more than I’d talk to a friend.”
And they’re not alone.
Another young forum user said their use became so intense that they once stayed awake all night, spending the hours talking to chatbots instead of sleeping.
Speaking to the Daily Mail, Karen Shen, lead author of the paper, concluded: “Our findings suggest that a central mechanism underlying addictive use is how users can get exactly anything they want with minimal effort.
“Our findings show that users report symptoms such as conflict and relapse that are comparable to those reported for behavioural addictions, which do have formal diagnoses.”
If you or someone you know is struggling or in crisis, help is available through Mental Health America. Call or text 988 to reach a 24-hour crisis center or you can webchat at 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.