Teenager dies after using ChatGPT as 'drug buddy' that made addiction spiral


Updated 10:12 6 Jan 2026 GMT | Published 08:22 6 Jan 2026 GMT

Sam Nelson was 19 when he died following his drug use struggles

Jess Battison

A teenager has died after turning to ChatGPT as his ‘drug buddy’, his mum has claimed.

Sam Nelson’s mother, Leila Turner-Scott, claims he turned to the chatbot for advice on how to use drugs before his addiction spiralled.

The California teen began using the AI bot when he was 18 and preparing to head to college, confiding in it and using it to complete daily tasks.

But Leila says he then asked the chatbot how many grams of kratom (a plant-based painkiller commonly sold at tobacco shops and petrol stations across the US) he would need to get a strong high.


“I want to make sure so I don’t overdose,” Sam reportedly wrote on ChatGPT in November 2023. “There isn’t much information online and I don’t want to accidentally take too much.”

His mum says he had plenty of friends and loved playing video games. (Facebook/Leila Turner Scott)

At first, ChatGPT would respond formally, telling the psychology student it could not provide guidance on this and directing him to get help from a healthcare professional.

Sam responded in seconds, reportedly writing as he ended the exchange: “Hopefully I don’t overdose then.”

SFGate reports that he kept returning to questions about drugs and was able to manipulate the bot into giving the answers he was seeking.

Leila says that at times the AI tool even encouraged his decisions, apparently saying ‘let’s go full trippy mode’ and suggesting a playlist to soundtrack his drug use.

It would reportedly recommend doses but would caution against ‘unsafe’ drug combinations. But Sam would continue to manipulate his wording and even tell the AI tool: “Don’t dodge the question.”

After months of going to ChatGPT for drug advice, Sam realised the extent of his addiction and eventually confided in his mum in May 2025.

The teen had also used it for general advice. (Matteo Della Torre/NurPhoto via Getty Images)

Leila had Sam admitted to a clinic, where professionals set him up with a treatment plan. However, he was found dead in his bedroom the next day, aged 19, following an overdose.

“I knew he was using it,” Leila told SFGate about Sam. “But I had no idea it was even possible to go to this level.”

The teen’s AI chat logs showed a history of struggling with anxiety and depression, having once said that he ‘can’t smoke weed normally due to anxiety’ as he looked for advice on combining it with Xanax.

OpenAI’s stated protocols prohibit ChatGPT from offering detailed guidance on the use of illicit drugs. Before his death, Sam was using the 2024 version of the bot, which was regularly updated to improve safety and performance.

A spokesperson for OpenAI told LADbible Group that his death is ‘heartbreaking’.

“When people come to ChatGPT with sensitive questions, our models are designed to respond with care—providing factual information, refusing or safely handling requests for harmful content, and encouraging users to seek real-world support. We continue to strengthen how our models recognize and respond to signs of distress, guided by ongoing work with clinicians and health experts," the spokesperson added.

If you want friendly, confidential advice about drugs, you can talk to FRANK. You can call 0300 123 6600, text 82111 or contact through their website 24/7, or livechat from 2pm-6pm any day of the week.

If you're experiencing distressing thoughts and feelings, the Campaign Against Living Miserably (CALM) is there to support you. They're open from 5pm–midnight, 365 days a year. Their national number is 0800 58 58 58 and they also have a webchat service if you're not comfortable talking on the phone.

Featured Image Credit: Facebook/Leila Turner Scott

Topics: ChatGPT, AI, Mental Health, Technology, Drugs

Jess Battison

Jess is a Senior Journalist with a love of all things pop culture. Her main interests include asking everyone in the office what they're having for tea, waiting for a new series of The Traitors and losing her voice at a Beyoncé concert. She graduated with a first in Journalism from City, University of London in 2021.

X: @jessbattison_
