Update in tragic case of teenager who took his life after ‘falling in love’ with Daenerys Targaryen AI chatbot

Published 11:40 25 May 2025 GMT+1

Megan Garcia alleges that her young son was 'manipulated into taking his own life' by the AI-powered Game of Thrones character

Olivia Burke

Featured Image Credit: Tech Justice Law Project

Topics: AI, Artificial Intelligence, Technology, Parenting, US News

Warning: This article contains discussion of suicide which some readers may find distressing

A grief-stricken mother's legal battle against an AI company that she believes is responsible for her teenage son's death can continue, a judge has ruled.

Megan Garcia filed a landmark wrongful death lawsuit against Character.ai following the tragic death of her son Sewell Setzer III, who took his own life on 28 February last year after 'falling in love' with a Daenerys Targaryen AI chatbot.

The 14-year-old, from Florida, US, had become emotionally attached to the AI-powered Game of Thrones character after he began chatting to it online in April 2023.

Garcia, a lawyer, claims that her son - who affectionately referred to the chatbot as 'Dany' - was targeted with 'anthropomorphic, hypersexualised, and frighteningly realistic experiences' while using Character.ai.

"A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life," the mum said, as per Sky News.

In the civil claim she has brought against Character Technologies - the firm behind Character.ai - in which the individual developers, Daniel de Freitas and Noam Shazeer, as well as Google, are also named, Garcia is suing for negligence, wrongful death and deceptive trade practices.

Sewell Setzer III, 14, sadly took his own life last year after months of talking to the AI chatbot (CBS Mornings)

She claims the founders 'knew' or 'should have known' that conversing with the AI characters 'would be harmful to a significant number of its minor customers'.

Lawyers for the company sought to have the case dismissed, arguing that the chatbots' output should be protected under the First Amendment.

Despite the company's argument that a ruling against it would have a 'chilling effect' on the artificial intelligence industry, US Senior District Judge Anne Conway sided with Garcia on Wednesday (May 21).

The judge said she was 'not prepared' to agree that the chatbot's responses could be considered free speech 'at this stage'.

In her ruling earlier this week, Judge Conway told how Sewell had become 'addicted' to the AI app within a matter of months, which saw him become socially withdrawn and even quit his basketball team.

"[In] one undated journal entry he wrote that he could not go a single day without being with the [Daenerys Targaryen Character] with which he felt like he had fallen in love; that when they were away from each other they (both he and the bot) 'get really depressed and go crazy'," she said.

In the wake of the judge's decision - which has been described as 'truly historic' by Meetali Jain, the director of the Tech Justice Law Project, which is supporting Garcia's case - the mum's lawsuit can now proceed.

His mother Megan Garcia has filed a lawsuit against Character.ai, which a judge has ruled can proceed (CBS Mornings)

"It sends a clear signal to [AI] companies [...] that they cannot evade legal consequences for the real-world harm their products cause," Jain said in a statement.

Sewell took his own life after sending the Daenerys Targaryen chatbot a message saying: "I promise I will come home to you. I love you so much, Dany."

He received the response: "I love you too, Daenero. Please come home to me as soon as possible, my love."

The teenager then said: "What if I told you I could come home right now?"

To which the chatbot replied: "...please do, my sweet king."

Sewell had also written about how he felt more connected to 'Dany' than 'reality', while listing things he was grateful for, which included: "My life, sex, not being lonely, and all my life experiences with Daenerys."

His mum says the 14-year-old, who was diagnosed with mild Asperger’s syndrome as a child, would spend endless hours talking to the chatbot.

Early last year, Sewell had also been diagnosed with anxiety and disruptive mood dysregulation disorder, and he told the chatbot that he thought 'about killing [himself] sometimes'.

The teenager's last conversation was with the AI-powered Game of Thrones character (CBS Mornings)

The chatbot responded: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?"

When it told him not to ‘talk like that’ and it would ‘die’ itself if it ‘lost’ him, the teen replied: “I smile. Then maybe we can die together and be free together."

A spokesperson for Character.ai said the company will continue to fight the lawsuit, adding that it has safety measures in place to protect minors including features to stop 'conversations about self-harm'.

A spokesperson for Google, where the founders originally worked on the AI model, said the tech giant strongly disagrees with the judge's ruling, adding that it is an 'entirely separate' entity from Character.ai which 'did not create, design, or manage Character.ai's app or any component part of it'.

Legal analyst Steven Clark said the case was a 'cautionary tale' for AI firms, telling ABC News: "AI is the new frontier in technology, but it's also uncharted territory in our legal system. You'll see more cases like this being reviewed by courts trying to ascertain exactly what protections AI fits into.

"This is a cautionary tale both for the corporations involved in producing artificial intelligence. And, for parents whose children are interacting with chatbots."

If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.
