Mother says teenage son took his own life after 'falling in love' with Daenerys Targaryen AI chatbot

Published 17:28 24 Oct 2024 GMT+1 | Updated 17:36 24 Oct 2024 GMT+1

The 14-year-old was 'obsessed' with 'Dany'

Jess Battison

Warning: This article contains discussion of suicide which some readers may find distressing.

The mother of a teenage boy says he took his own life after ‘falling in love’ with a Daenerys Targaryen AI chatbot.

Megan Garcia says her son, Sewell Setzer III, became emotionally attached after chatting to the Game of Thrones character.

The 14-year-old killed himself in February this year after beginning to use Character.AI chatbots in April 2023.

The Florida, US, mum has since filed a lawsuit against the tech company, accusing it of negligence, wrongful death and deceptive trade practices.

Garcia claims her son had ‘fallen in love’ with the Daenerys chatbot in particular.

Setzer became obsessed with the bots, interacting with them every night, and his school work began to slip as a result.

Setzer died aged 14. (CBS Mornings)

The teen wrote in his journal about how connected he felt with ‘Dany’ compared to ‘reality’ and that the things he was grateful for included: “My life, sex, not being lonely, and all my life experiences with Daenerys.”

Garcia says her son was diagnosed with mild Asperger’s syndrome as a child and would spend hours talking to the chatbot, texting it from his phone when he was away from the house.

He was diagnosed earlier this year with anxiety and disruptive mood dysregulation disorder and even opened up to the bot about thoughts of taking his own life.

Setzer apparently told it he ‘think[s] about killing [himself] sometimes’.

And the chatbot responded: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?"

His mum is raising awareness of the potential dangers (CBS Mornings)

When it told him not to ‘talk like that’ and it would ‘die’ itself if it ‘lost’ him, the teen replied: “I smile. Then maybe we can die together and be free together."

He died by suicide on 28 February; his last message to the bot said he loved her and would 'come home', to which it allegedly responded 'please do'.

Garcia claimed in a press release: "A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life.

"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot…was not real."

Character.AI has since issued a statement on X: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.

"As a company, we take the safety of our users very seriously and we are continuing to add new safety features."

He was reportedly obsessed with the bots (CBS Mornings)

In a release shared on its site on 22 October, the company explained it has introduced 'new guardrails for users under the age of 18', including changes to its 'models' that are 'designed to reduce the likelihood of encountering sensitive or suggestive content', alongside 'improved detection, response, and intervention related to user inputs that violate our Terms or Community Guidelines'.

The site also features a 'revised disclaimer on every chat to remind users that the AI is not a real person' and 'notification when a user has spent an hour-long session on the platform with additional user flexibility in progress'.

The LADbible Group has contacted Character.AI for further comment.

If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.

Featured Image Credit: Tech Justice Law Project

Topics: Mental Health, Artificial Intelligence, Social Media, Parenting

Jess Battison

Jess is a Senior Journalist with a love of all things pop culture. Her main interests include asking everyone in the office what they're having for tea, waiting for a new series of The Traitors and losing her voice at a Beyoncé concert. She graduated with a first in Journalism from City, University of London in 2021.

X: @jessbattison_
