ChatGPT user poisons himself after 'using AI for medical advice'

Published 15:35 12 Aug 2025 GMT+1

The man found himself in hospital with severe paranoia and hallucinations after following advice from ChatGPT

Emma Rosemurgey

They say the worst thing you can do is Google your symptoms when you're unwell, but turning to ChatGPT for medical advice could also have some pretty dire consequences.

A 60-year-old man discovered this the hard way when he ended up in hospital after poisoning himself on the AI chatbot's advice.

The man, whose case is detailed in the American College of Physicians Journals, was concerned about the amount of salt in his diet and the negative impact it could be having on his health, so he decided to consult ChatGPT about cutting out sodium chloride.

The AI bot suggested he start consuming bromide instead, which is found in small amounts in seawater and in certain minerals. It was previously used as an ingredient in a number of pharmaceutical products; however, it has since been found to be toxic to humans in larger quantities.

The man replaced sodium chloride with sodium bromide on the advice of ChatGPT (Getty Stock Images)

Unaware of this, the man began replacing salt with sodium bromide he ordered from the internet. After about three months, he started experiencing severe paranoia and hallucinations, which led to him being hospitalised.

The man, who had no previous history of poor mental or physical health, initially suspected his neighbour of poisoning him. However, after being treated with fluids and electrolytes, he reported other symptoms, including new acne and cherry angiomas, leading doctors to conclude he was experiencing bromism.

Bromism, which is caused by excessive exposure to bromide, can cause neurological symptoms like seizures, tremors, confusion and even comas. It can also cause anxiety, depression, psychosis, fatigue and anorexia, among other symptoms.

"Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet," the case report explained.

He replaced table salt with 'sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning'.

ChatGPT's Terms of Use states information is not always correct (Cheng Xin/Getty Images)

After three weeks in hospital, the man was discharged, and the author of the case report has warned others not to make the same mistake of taking medical advice from AI sources such as ChatGPT.

They wrote: "It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation."

Meanwhile, OpenAI, the developer behind ChatGPT, notes in its Terms of Use that information 'may not always be accurate'.

The terms state: "You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice."

The company’s Service Terms also say: “Our Services are not intended for use in the diagnosis or treatment of any health condition.”

LADbible has contacted OpenAI for further comment.

Featured Image Credit: Getty Stock Images

Topics: Technology, Health, AI, Mental Health

Emma Rosemurgey

Emma is an NCTJ accredited journalist who recently rejoined LADbible as a Trends Writer. She previously worked on Tyla and UNILAD, before going on to work at the Mirror Online. Contact her via [email protected]
