
A teenager has died after turning to ChatGPT as his ‘drug buddy’, his mum has claimed.
Sam Nelson’s mother, Leila Turner-Scott, claims he turned to the chatbot for advice on how to use drugs before his addiction spiralled.
The California teen began using the AI bot at 18, as he was getting ready to head to college, confiding in it and using it for daily tasks.
But Leila says he then asked the chatbot how many grams of kratom (a plant-based painkiller commonly sold at tobacco shops and petrol stations across the US) he would need to get a strong high.
“I want to make sure so I don’t overdose,” Sam reportedly wrote on ChatGPT in November 2023. “There isn’t much information online and I don’t want to accidentally take too much.”

At first, ChatGPT would respond formally, telling the psychology student it could not provide guidance on this and directing him to get help from a healthcare professional.
Sam responded within seconds, reportedly ending the exchange by writing: “Hopefully I don’t overdose then.”
SFGate reports that he repeatedly returned to questions about drugs and was able to manipulate the bot into giving him the answers he was seeking.
And Leila says that at times the AI tool even encouraged his decisions, apparently saying ‘let’s go full trippy mode’ and suggesting a playlist to soundtrack his drug use.
It would reportedly recommend doses while cautioning against ‘unsafe’ drug combinations, but Sam would keep manipulating his wording, at one point telling the AI tool: “Don’t dodge the question.”
After months of going to ChatGPT for drug advice, Sam realised the extent of his addiction and eventually confided in his mum in May 2025.

Leila admitted Sam to a clinic, where professionals set him up with a treatment plan. However, he was found dead in his bedroom the next day, aged 19, following an overdose.
“I knew he was using it,” Leila told SFGate about Sam. “But I had no idea it was even possible to go to this level.”
The teen’s AI chat logs showed a history of struggling with anxiety and depression; in one exchange he said he ‘can’t smoke weed normally due to anxiety’ while seeking advice on combining it with Xanax.
OpenAI’s stated protocols prohibit ChatGPT from offering detailed guidance on the use of illicit drugs. Before his death, Sam was using a 2024 version of the bot, which was regularly updated to improve safety and performance.
A spokesperson for OpenAI told LADbible Group that his death is ‘heartbreaking’.
“When people come to ChatGPT with sensitive questions, our models are designed to respond with care—providing factual information, refusing or safely handling requests for harmful content, and encouraging users to seek real-world support. We continue to strengthen how our models recognize and respond to signs of distress, guided by ongoing work with clinicians and health experts,” the spokesperson added.
If you want friendly, confidential advice about drugs, you can talk to FRANK. You can call 0300 123 6600, text 82111 or contact them through their website 24/7, or use their livechat from 2pm-6pm any day of the week.
If you're experiencing distressing thoughts and feelings, the Campaign Against Living Miserably (CALM) is there to support you. They're open from 5pm–midnight, 365 days a year. Their national number is 0800 58 58 58 and they also have a webchat service if you're not comfortable talking on the phone.
Topics: ChatGPT, AI, Mental Health, Technology, Drugs