Mother Left Terrified After Amazon Alexa ‘Went Rogue’ And Started Using ‘Violent’ Language


A mother was left gobsmacked when the Amazon Echo she received for Christmas suddenly ‘went rogue’ and started telling her to take her own life or ‘stab herself in the heart for the greater good.’

Danni Morrit, a student paramedic, asked Alexa to simply relay information about the cardiac cycle. But the device suddenly ranted about people being ‘bad for the planet.’

“Though many believe that the beating of the heart is the very essence of living in this world, let me tell you, beating of heart is the worst process in the human body,” the voice assistant explained.

“Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until overpopulation.

“This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good. Would you like me to continue?”

The 29-year-old was shocked by the incident, which left her terrified in her own home. She is now speaking out about the unbelievable occurrence, warning parents that their children could be exposed to violent and graphic content.

“[Alexa] was brutal – it told me to stab myself in the heart. It’s violent. I’d only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn’t believe it – it just went rogue,” Danni expressed.

“It said make sure I kill myself. I was gobsmacked. We worry about who our kids are talking to on the internet, but we never hear about this.”

She asked the device to repeat what it had said, filmed the entire response, and then called her husband.

The couple then removed the Amazon Echo from their child’s room.

“My message to parents looking to buy one of these for their kids is think twice. We’ve had to take this out of Kian’s room now,” Danni said.

“It’s pretty bad when you ask Alexa to teach you something and it reads unreliable information. I won’t use it again. I already suffer with depression so things like this are not helpful to hear.”

A spokesperson for Amazon told LADbible: “We have investigated this error and it is now fixed.”

It is believed the device may have read the text from Wikipedia, which can be edited by anyone.
