A 60-year-old man landed in the hospital after unknowingly poisoning himself by following diet advice from ChatGPT. According to the case report ‘A case of bromism influenced by use of artificial intelligence’, published in Annals of Internal Medicine: Clinical Cases, the US resident replaced regular table salt (sodium chloride) with sodium bromide, a chemical once used in old medicines but now known to be toxic in high doses. Three months later, he was hospitalised with severe neuropsychiatric symptoms, paranoia, and skin changes, revealing the hidden dangers of relying solely on AI for health guidance.
What exactly happened in the ChatGPT advice case?
The man, who had no prior medical or psychiatric history, was admitted to the hospital after becoming convinced his neighbour was poisoning him. At first, he did not reveal any unusual dietary habits. But further questioning uncovered that he had been following a highly restrictive vegetarian diet, distilling his own water, and consuming sodium bromide instead of salt for three months.
He got the idea of replacing table salt with sodium bromide after asking ChatGPT how to remove salt from his diet. While the AI did mention that “context matters”, it suggested bromide as a chloride substitute without providing a health warning.
He then bought the chemical online, unaware of its dangers.
What is bromism poisoning?
Bromism is a form of poisoning caused by excessive bromide intake. In the early 1900s, bromide salts were common in over-the-counter remedies for anxiety, insomnia, and hysteria. But over time, doctors realised bromide could cause neurological, psychiatric, and skin problems, from hallucinations and paranoia to acne and coordination issues.
The US banned bromide in ingestible products between 1975 and 1989. However, with bromide-containing substances easily available online, rare cases have reappeared in recent years.
During his hospital stay, the man developed:
- Severe paranoia and hallucinations
- Insomnia and fatigue
- Acne and cherry angiomas on his face
- Ataxia (coordination problems)
- Extreme thirst
Lab tests revealed a dangerously high bromide level of 1,700 mg/L, compared with the normal range of 0.9 to 7.3 mg/L. He also had abnormal electrolyte readings, including hyperchloremia and a negative anion gap, which eventually helped doctors suspect bromism.
According to the case report, treatment at the University of Washington in Seattle involved immediately stopping bromide intake and giving intravenous fluids to flush it out of his system. His electrolyte levels and mental state gradually improved over three weeks. He was weaned off psychiatric medication before discharge and remained stable during follow-up.
Role of AI in this medical emergency
The case authors tested ChatGPT themselves, asking what could replace chloride. The AI also suggested bromide, but without explaining its health risks or questioning why the user wanted the substitution. This highlights a critical limitation: AI can present technically correct but contextually dangerous information without medical oversight.
The case report stressed that while AI tools can be helpful for general education, they are not a replacement for professional medical advice. The authors warn that AI can generate inaccurate or decontextualised health suggestions because it lacks the critical judgment of a trained medical professional, and that self-experimentation based on AI advice can have serious health consequences. If you are considering dietary or lifestyle changes, especially those involving chemicals or supplements, consult a qualified doctor first.
This content is for informational purposes only and is not a substitute for professional medical advice.