
Man Poisoned Himself After Following ChatGPT’s Health Advice

Man Hospitalized After Following ChatGPT’s Advice to Cut Chloride From His Diet – A recent case study details how a man in the United States suffered severe health consequences, including psychosis, after using OpenAI’s ChatGPT for advice on eliminating chloride from his diet, which led him to ingest a dangerous chemical.

Researchers investigating the incident found that the chatbot failed to exhibit the caution a medical professional would, never questioning why the user wanted to alter his diet. Anecdotal and clinical evidence alike suggest AI can be beneficial in healthcare, but this case highlights the risks of relying on large language models (LLMs) for self-diagnosis and treatment.

The patient, whose identity has not been released to protect his privacy, acted on ChatGPT’s suggestions and purchased sodium bromide. While used to treat epilepsy in dogs, sodium bromide is also a potent pool cleaner and pesticide. Over three months, he consumed the substance, developing “paranoia and auditory and visual hallucinations.”

Bromism, a syndrome caused by bromide toxicity, is uncommon today but was historically notable. It was prevalent in the 19th century, with a 1930 study revealing that up to 8% of patients in psychiatric hospitals suffered from it. FDA regulation of bromide between 1975 and 1989 considerably reduced the incidence of bromism.

The case study indicates the patient consulted either ChatGPT 3.5 or 4.0 when researching methods to reduce chloride in his diet. The incident occurred in the spring of 2025, according to researchers at the University of California, San Francisco, who published the findings in the journal *Digital Medicine* on August 1st, 2025.

In a product launch livestream on Thursday, August 7th, 2025, OpenAI CEO Sam Altman unveiled ChatGPT 5, touting it as “the best model ever for health.” Altman announced new “safe completions” designed to address potentially harmful or ambiguous queries. He also featured a testimonial from an OpenAI employee and his wife, who used ChatGPT to navigate a cancer diagnosis, understand medical reports, and make informed decisions about treatment, including radiation therapy. Altman emphasized the goal of empowering users to be “active participants in their own care journey.”
