Man Hospitalized After Following ChatGPT Advice to Swap Table Salt with Chemical
A 60-year-old man developed severe bromism with hallucinations after consuming sodium bromide daily for three months following AI chatbot advice, highlighting risks of unverified health guidance.
- Amid rising concerns over AI misinformation, three physicians from the University of Washington published a case report in the Annals of Internal Medicine earlier this month describing a 60-year-old man who was hospitalized after following ChatGPT’s advice to replace table salt with sodium bromide.
- Seeking a salt alternative after reading about the risks of chloride, the man consulted ChatGPT, which suggested sodium bromide, a neurotoxic compound once used in sedatives.
- After three months of daily sodium bromide intake, he developed bromism with paranoia and delusions, and doctors diagnosed the condition accordingly.
- Despite the initial psychosis, the patient recovered over a three-week hospital stay as his electrolytes normalized and the psychosis resolved; the authors warn about the risks of online health advice.
- Amid wider scrutiny of AI in healthcare, researchers warn that AI-generated health advice can lack accuracy and critical judgment, citing OpenAI’s disclaimers and CEO Sam Altman’s promotion of the technology.
14 Articles
Man hospitalized after following ChatGPT advice to swap table salt with chemical
A 60-year-old man spent three weeks in the hospital after swapping table salt for a chemical once used in sedatives. According to a case published in the Annals of Internal Medicine, the man made the switch after seeking medical advice from the artificial intelligence chatbot ChatGPT. AI’s role in a rare medical case The study’s authors say the case raises questions about how artificial intelligence can influence real-world health choices. Inves…
Man asks ChatGPT for advice on how to cut salt, ends up in hospital with hallucinations
A 60-year-old man asked ChatGPT for advice on how to replace table salt, and the substitution landed him in the emergency room suffering from hallucinations and other symptoms. In a case report published this month in the Annals of Internal Medicine, three doctors from the University of Washington in Seattle used the man’s case to explain how AI tools, as they are designed right now, are not always the most reliable when it comes to medicine. “I…


Man who asked ChatGPT about cutting out salt from his diet was hospitalized with hallucinations
A case study in a medical journal reported that the 60-year-old man replaced sodium chloride with sodium bromide after consulting the AI bot.
Health advice from ChatGPT left a 60-year-old man with paranoia and hallucinations. He had replaced his table salt with a bromide salt.
In a new case of artificial intelligence (AI) misuse, a man nearly died after replacing common salt with sodium bromide, following advice from ChatGPT. Man nearly lost his life after following AI advice. The 60-year-old man asked the generative AI chatbot for suggestions because he was concerned about the adverse effects of salt consumption on the body, according to research published in the Annals of Internal Medicine on August 5. The recom…
Coverage Details
Bias Distribution
- 50% of the sources lean Left