Published 15:35 12 Aug 2025 GMT+1

ChatGPT user poisons himself after 'using AI for medical advice'

The man found himself in hospital with severe paranoia and hallucinations after following advice from ChatGPT

Emma Rosemurgey

They say the worst thing you can do is Google your symptoms when you're unwell, but turning to ChatGPT for medical advice could also have some pretty dire consequences.

A 60-year-old man discovered this for himself when he ended up in hospital after poisoning himself on the AI chatbot's advice.

The man, whose case is detailed in the American College of Physicians Journals, was concerned about the amount of salt in his diet and the negative impact it could be having on his health, so he decided to consult ChatGPT about cutting out sodium chloride.

The AI bot suggested he start consuming bromide instead, which can be found in small amounts in seawater and in certain minerals. It was previously used as an ingredient in a number of pharmaceutical products, but it has since been found to be toxic to humans in larger quantities.


The man replaced sodium chloride with sodium bromide on the advice of ChatGPT (Getty Stock Images)

Unaware of this, the man began replacing salt with bromide he ordered from the internet, and after about three months he started experiencing severe paranoia and hallucinations, which led to him being hospitalised.

The man, who had no previous history of poor mental or physical health, initially suspected his neighbour of poisoning him. However, after being treated with fluids and electrolytes, he shared other symptoms, including new acne and cherry angiomas, leading doctors to conclude he was experiencing bromism.

Bromism, which is caused by excessive exposure to bromide, can cause neurological symptoms such as seizures, tremors, confusion and even coma. It can also cause anxiety, depression, psychosis, fatigue and anorexia, among other symptoms.

"Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet," the case report explained.

He replaced table salt with 'sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning'.

ChatGPT's Terms of Use state that information is not always correct (Cheng Xin/Getty Images)

After three weeks in hospital, the man was discharged, and the author of the case report has warned others not to make the same mistake of taking medical information from AI sources such as ChatGPT.

They wrote: "It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation."

Meanwhile, OpenAI, the developer behind ChatGPT, says in its Terms of Use that information 'may not always be accurate'.

The terms state: "You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice."

The company’s Service Terms also say: “Our Services are not intended for use in the diagnosis or treatment of any health condition.”

LADbible has contacted OpenAI for further comment.

Featured Image Credit: Getty Stock Images

Topics: Technology, Health, AI, Mental Health

Emma Rosemurgey

Emma is an NCTJ accredited journalist who recently rejoined LADbible as a Trends Writer. She previously worked on Tyla and UNILAD, before going on to work at the Mirror Online. Contact her via [email protected]
