ChatGPT advice lands 60-year-old man in hospital; the reason will surprise you

A New York man was hospitalized with dangerously low sodium levels after following a strict, AI-generated diet plan from ChatGPT. The AI tool suggested a toxic substitute, sodium bromide, leading to severe health issues, including hallucinations and neurological symptoms. This case underscores the critical need for professional medical consultation when interpreting AI health advice to avoid misinformation and potential harm.
A 60-year-old man in New York was hospitalised after following a strict salt-reduction regimen suggested by ChatGPT. According to doctors, the man abruptly cut sodium from his diet to nearly zero over several weeks, leading to dangerously low sodium levels, a condition known as hyponatraemia. His family said he relied on an AI-generated health plan without consulting a physician. The case, recently published in the American College of Physicians journal, highlights the risks of applying AI health advice without professional oversight, particularly when it involves essential nutrients like sodium. The man recovered after spending three weeks in hospital.

ChatGPT advice leads to dangerous substitute

According to the report, the man asked ChatGPT how to eliminate sodium chloride (commonly known as table salt) from his diet. The AI tool suggested sodium bromide as an alternative, a compound once used in early 20th-century medicines but now recognised as toxic in large doses. Acting on this advice, the man purchased sodium bromide online and used it in his cooking for three months.

With no previous history of mental or physical illness, the man began experiencing hallucinations, paranoia, and extreme thirst. Upon hospital admission, he displayed confusion and even refused water, fearing contamination. Doctors diagnosed him with bromide toxicity, a condition now almost unheard of but once common when bromide was prescribed for anxiety, insomnia, and other ailments. He also exhibited neurological symptoms, acne-like skin eruptions, and distinctive red spots known as cherry angiomas, all classic signs of bromism.
Hospital treatment focused on rehydration and restoring electrolyte balance. Over the course of three weeks, the man’s condition gradually improved, and he was discharged once his sodium and chloride levels returned to normal.

AI misinformation risks

The authors of the case study stressed the growing risk of health misinformation from AI tools. “It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, cannot critically discuss results, and ultimately fuel the spread of misinformation,” the report warned.

OpenAI, ChatGPT’s developer, explicitly states in its Terms of Use: “You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice.” The terms also clarify that the service is not intended for diagnosing or treating medical conditions.


A global conversation about AI responsibility

This case highlights the urgent need for critical thinking when interpreting AI-generated advice, especially in matters involving health. Experts say AI tools can be valuable for general information but should never replace professional consultation. As AI adoption grows, so too does the responsibility to ensure that its outputs are accurate, safe, and clearly understood by the public.
About the Author
TOI Tech Desk

The TOI Tech Desk is a dedicated team of journalists committed to delivering the latest and most relevant news from the world of technology to readers of The Times of India. TOI Tech Desk’s news coverage spans a wide spectrum across gadget launches, gadget reviews, trends, in-depth analysis, exclusive reports and breaking stories that impact technology and the digital universe. Be it how-tos or the latest happenings in AI, cybersecurity, personal gadgets, or platforms like WhatsApp, Instagram and Facebook, TOI Tech Desk brings the news with accuracy and authenticity.
