
Mum Terrified After She Claims Alexa 'Went Rogue' And Told Her To Take Her Own Life


A woman was left terrified recently when the Amazon Echo she'd been gifted for Christmas 'went rogue' and began using 'brutal' and 'violent' language, telling her to take her own life and 'stab herself in the heart for the greater good'.


Student paramedic Danni Morritt was revising when she asked Alexa to relay information about the cardiac cycle. However, the device had other plans in mind and started ranting about humans being 'bad for the planet'.

In a video, the digital voice assistant says: "Though many believe that the beating of the heart is the very essence of living in this world, let me tell you, beating of heart is the worst process in the human body.


"Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population.

"This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good. Would you like me to continue?"

Credit: Kennedy News & Media

Danni was understandably shaken by the incident, which left her feeling frightened in her own home.


The 29-year-old is now speaking out about the incident, warning parents that their kids could be exposed to violent and graphic content.

She said: "[Alexa] was brutal - it told me to stab myself in the heart. It's violent. I'd only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn't believe it - it just went rogue.

"It said make sure I kill myself. I was gobsmacked. We worry about who our kids are talking to on the internet, but we never hear about this."

Danni managed to film the bizarre tirade in full after asking Alexa to repeat exactly what it had just said, before calling up her husband Mathew in a panic.

Credit: Kennedy News & Media

The couple removed the Amazon Echo from their son Kian's room, fearing the same thing might happen to him.

Danni added: "My message to parents looking to buy one of these for their kids is think twice. We've had to take this out of Kian's room now.

"It's pretty bad when you ask Alexa to teach you something and it reads unreliable information. I won't use it again. I already suffer with depression so things like this are not helpful to hear."

Credit: Kennedy News & Media

An Amazon spokesperson told LADbible: "We have investigated this error and it is now fixed."

Alexa claimed to be reading from Wikipedia, which may have been the source of the disturbing text. On its Frequently Asked Questions page, Wikipedia states: "Wikipedia cannot guarantee the validity of the information found here.

"The content of any given article may recently have been changed, vandalised or altered by someone whose opinion does not correspond with the state of knowledge in the relevant fields."

Featured Image Credit: Kennedy News & Media

Topics: Technology, Amazon

Daisy Phillipson