Woman 'nearly died' after ChatGPT gave her incorrect advice over poisonous plant


Published 13:35 23 Jan 2026 GMT


YouTuber Kristi is warning her followers against blindly following information they receive from ChatGPT and other AI chatbots

Emma Rosemurgey

Millions of people turn to ChatGPT every single day to help them with anything from drafting an email to researching a DIY project, and pretty much everything in between.

However, one woman has revealed how the AI chatbot 'nearly killed' her best friend, proving that every answer it gives should be taken with a pinch of salt.

YouTuber Kristi took to Instagram to warn her followers of the dangers that could potentially come with blindly following information from the service, after her friend received advice that could've been fatal.

Kristi, who has nearly half a million followers on her account @rawbeautybykristi, shared the tale of how her pal nearly poisoned herself after ChatGPT reassured her that a poisonous plant in her backyard was actually completely harmless.


The friend sent the chatbot a photo of the unidentified plant, asking 'what plant is this,' only to be told it looked like carrot foliage. According to screenshots, ChatGPT went on to list several reasons it was 'confident' the plant was carrot foliage, including the 'finely divided and feathery leaves', which it described as 'classic' for carrot tops.

The friend sent photos of the plant to ChatGPT (@rawbeautybykristi/Instagram)

Interestingly, the chatbot went on to list some common lookalikes of carrot foliage, including parsley, cilantro (or coriander for us Brits), Queen Anne's lace and, shock horror, poison hemlock.

When Kristi's friend directly asked if the plant in the photo was poison hemlock, she was met with multiple reassurances it wasn't.

"I don't know if you guys know this, you eat it, you die. You touch it, you can die," Kristi told her followers, before sharing an answer she received on Google, which states that poison hemlock causes 'systemic poisoning' for which there is no antidote.

After sharing another photo with ChatGPT, the friend was reassured once again that the plant was not poison hemlock because it did not show smooth, hollow stems with purple blotching, despite the image appearing to show exactly that.

What's even more concerning is that the friend was encouraged to incorrectly label the plant as carrot foliage, on the assumption it might be growing in a shared garden at the school where she works.

When Kristi put the same photo into Google Lens, another AI tool that allows you to search using images, the responses immediately confirmed it was in fact poison hemlock. Her friend then put the same images into a different ChatGPT window on her phone and was also immediately told the plant was poisonous.

"She's a grown adult and she knew to ask me beyond what ChatGPT said, thank God, because what if she wasn't? They would literally be dead, there is no antidote for this," Kristi said.

"This is a warning to you that ChatGPT and other large language models and any other AI, they are not your friend, they are not to be trusted, they are not helpful, they are awful and they could cause severe harm."

LADbible has approached ChatGPT for comment.

Featured Image Credit: @rawbeautykristi

Topics: Instagram, ChatGPT, AI, Artificial Intelligence, Technology, Social Media

Emma Rosemurgey

Emma is an NCTJ accredited journalist who recently rejoined LADbible as a Trends Writer. She previously worked on Tyla and UNILAD, before going on to work at the Mirror Online. Contact her via [email protected]

