ChatGPT user poisons himself after 'using AI for medical advice'

Published 15:35 12 Aug 2025 GMT+1

The man found himself in hospital with severe paranoia and hallucinations after following advice from ChatGPT

Emma Rosemurgey

They say the worst thing you can do is Google your symptoms when you're unwell, but turning to ChatGPT for medical advice could also have some pretty dire consequences.

A 60-year-old man learned this the hard way when he ended up in hospital after poisoning himself on the AI chatbot's advice.

The man, whose case is detailed in the American College of Physicians Journals, was concerned about the amount of salt in his diet and the negative impact it could be having on his health, so he decided to consult ChatGPT about cutting out sodium chloride.

The AI bot suggested he start consuming bromide instead, which can be found in small amounts in seawater and in certain minerals. It was previously used as an ingredient in a number of pharmaceutical products; however, it has since been found to be toxic to humans in larger quantities.


The man replaced sodium chloride with sodium bromide on the advice of ChatGPT (Getty Stock Images)

Unaware of this, the man began replacing salt with bromide he ordered from the internet, and after about three months he started experiencing severe paranoia and hallucinations, which led to him being hospitalised.

The man, who had no previous history of poor mental or physical health, initially suspected his neighbour of poisoning him. However, after being treated with fluids and electrolytes, he reported other symptoms, including new acne and cherry angiomas, leading doctors to conclude he was experiencing bromism.

Bromism, which is caused by excessive exposure to bromine, can cause neurological symptoms like seizures, tremors, confusion and even comas. It can also cause anxiety, depression, psychosis, fatigue and anorexia, among other symptoms.

"Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet," the case report explained.


He replaced table salt with 'sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning'.

ChatGPT's Terms of Use state that information is not always correct (Cheng Xin/Getty Images)

After three weeks in hospital, the man was discharged, and the author of the case report has warned others not to make the same mistake of taking medical information from AI sources such as ChatGPT.

They wrote: "It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation."

Meanwhile, OpenAI, the developer behind ChatGPT, notes in its Terms of Use that information 'may not always be accurate'.


The terms state: "You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice."

The company’s Service Terms also say: “Our Services are not intended for use in the diagnosis or treatment of any health condition.”

LADbible has contacted OpenAI for further comment.

Featured Image Credit: Getty Stock Images

Topics: Technology, Health, AI, Mental Health

Emma Rosemurgey

Emma is an NCTJ-accredited journalist who recently rejoined LADbible as a Trends Writer. She previously worked on Tyla and UNILAD before going on to work at the Mirror Online. Contact her via [email protected]
