Alert raised over risky ChatGPT guidance following hospitalization of man, 60

A 60-year-old man recently discovered the potential dangers of using the AI tool ChatGPT for medical advice. According to a report in the Annals of Internal Medicine: Clinical Cases, the man sought dietary guidance from the chatbot, which led to a serious health issue requiring hospitalization.

The man was concerned about his sodium chloride (table salt) intake due to its known health risks. ChatGPT suggested he replace it with sodium bromide, a recommendation that proved hazardous. After three months of incorporating sodium bromide into his diet, he arrived at the hospital convinced that his neighbor was poisoning him, and doctors eventually traced his condition back to the substitution.

In the hospital, the patient became wary of the liquids provided, informing the medical staff that he followed strict dietary restrictions and distilled his own water.

His situation escalated, and he began experiencing severe paranoia and both auditory and visual hallucinations. The case report, published in an American College of Physicians journal, highlighted the risks of AI, stating, “It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”

The report noted, “While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information. It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.”

Doctors treated the man with fluids, antipsychotics, and electrolytes after he tried to flee the hospital. He was later transferred to an inpatient psychiatric unit. His symptoms were identified as bromism, a toxic syndrome caused by excessive exposure to bromide; sodium bromide is a compound often used in industrial cleaning.

As his condition improved, he was able to report other symptoms he had experienced, including fatigue, acne, lack of muscle coordination (ataxia), and excessive thirst (polydipsia).

OpenAI, the developer of ChatGPT, advises against using the technology for health diagnoses. Their Service Terms clearly state that their services are not intended for diagnosing or treating health conditions.

Despite this warning, a survey by Talker Research for The Vitamin Shoppe’s annual Trend Report revealed that 35 percent of Americans use AI technology to manage their health and wellness. Of the survey’s 2,000 participants, 63 percent reported relying on AI for health advice, ahead of social media platforms (43 percent) and influencers (41 percent).

Fortunately, the survey found that 93 percent of respondents still turned to medical professionals, while 82 percent sought advice from friends. LADbible Group has reached out to OpenAI for further comment.