Man Hospitalized With Psychiatric Symptoms After Following AI Health Advice

Credit: Depositphotos

Modern AI tools excel at suggesting restaurants and drafting emails, but when it comes to medical guidance, their limitations can be dangerous.

A recent case highlights this risk: a man who followed a chatbot-generated health plan landed in the hospital with a rare form of chemical poisoning.

From Sodium Reduction to a Dangerous Substitute

The patient began his journey with good intentions—he wanted to improve his health by cutting back on salt (sodium chloride). Seeking alternatives, he turned to ChatGPT for advice.

The AI reportedly recommended sodium bromide, which he purchased online and began using in his diet. While sodium bromide can technically substitute for sodium chloride in some contexts, it is typically used to sanitize hot tubs, not to season food. This crucial context, however, was missing from the chatbot’s response.

After three months, the man arrived at the emergency department suffering from paranoid delusions, convinced that his neighbor was trying to poison him.

“Within the first 24 hours of admission, his paranoia intensified, accompanied by auditory and visual hallucinations,” the treating physicians reported. An attempted escape led to an involuntary psychiatric hold for grave disability.

Once treated with antipsychotic medication, he was able to explain his AI-influenced dietary change. Combined with lab results, this led to a diagnosis of bromism—excessive accumulation of bromide in the body.

A Rare but Once-Common Condition

Healthy individuals typically have bromide levels below 10 mg/L. This patient’s levels were an astonishing 1,700 mg/L.

Bromism was relatively common in the early 20th century, accounting for up to 8% of psychiatric hospital admissions. Its prevalence dropped sharply in the 1970s and 1980s, when bromide-containing medications were phased out.

Bromide salts were once common over-the-counter medications. (Bromo-Seltzer/Wikimedia Commons/Public Domain)

The patient underwent three weeks of treatment and was discharged without lasting complications.

The case underscores a broader issue: while AI can provide quick information, it is no substitute for trained medical expertise—especially in matters of health.

“ChatGPT and similar AI systems can produce scientific inaccuracies, lack the ability to critically assess their own outputs, and may inadvertently spread misinformation,” the authors warned. “It’s highly unlikely that any medical professional would have recommended sodium bromide as a dietary replacement for table salt.”
Read the original article on: Science Alert

Read more: New Schizophrenia Medication Shows Promise Beyond Current Treatments