Man Poisons Himself After Taking ChatGPT's Dietary Advice
A 60-year-old man developed bromism after replacing table salt with sodium bromide based on ChatGPT advice, resulting in psychosis and a three-week hospital stay, medical researchers reported.
- A case study published this week in Annals of Internal Medicine: Clinical Cases reports that a 60-year-old man developed bromism after ChatGPT suggested replacing table salt with sodium bromide.
- After the patient read about the risks of sodium chloride, ChatGPT suggested sodium bromide as a substitute without offering any health warnings, and long-term use led to bromism.
- The patient's symptoms included paranoia, hallucinations, psychosis, facial acne, thirst, fatigue, insomnia, and poor coordination, all attributed to bromism from prolonged sodium bromide exposure.
- The patient was treated with intravenous fluids, electrolytes, and antipsychotic medication, and spent three weeks in the hospital before stabilizing.
- According to Harvey Castro, healthcare providers should implement safeguards such as medical knowledge bases and human oversight as AI tools become more widespread, to prevent cases like this one.
38 Articles
His psychosis was a mystery—until doctors learned about ChatGPT's health advice
A 60-year-old man arrived at a Seattle hospital convinced his neighbor was poisoning him. Though medically stable at first, he soon developed hallucinations and paranoia. The cause turned out to be bromide toxicity, triggered by a health experiment he began after consulting ChatGPT. The case, published in Annals of Internal Medicine: Clinical Cases, highlights a rare but reversible form of psychosis that may have been influenced by generative artificial intelligence.
The case was described in a paper in the journal Annals of Internal Medicine: Clinical Cases. The 60-year-old man ended up poisoning himself with sodium bromide.
Coverage Details
Bias Distribution
- 48% of the sources are Center