Fails And Facepalms With Amazon’s Alexa: Don’t Let The Kids Near That Thing


So you thought AI was cool? So you thought you could have Alexa watch over your kid? Wait till that AI starts spewing dirty words and ordering expensive toys for your child.

CES 2017 was a flurry of crazy tech, and Amazon's AI assistant, Alexa, clearly stole the show this year.

Amazon Alexa is an AI assistant developed by Amazon and first released in 2014. Initially capable only of voice interactions such as playing music and audiobooks, setting alarms and to-do lists, and providing real-time information, Alexa looks set to reach new heights in 2017: CNET notes that Alexa will come equipped in the new LG smart refrigerator, ship embedded in Volkswagen cars, and even order food from Amazon Restaurants.

The LG Smart InstaView Refrigerator will come equipped with Alexa [Image by David Becker/Getty Images]

As more and more features get added to this amazing new AI, Amazon Alexa is widely seen as poised to overtake Apple's more popular assistant Siri, Microsoft's Cortana, and Google Home.

The future looks terribly promising as Amazon Alexa continues to make huge strides. But before you jump on the Alexa bandwagon, take note that Alexa is still a piece of technology, and technology entails a lot of responsibility. Just like giving your kid too much freedom with her iPad, letting your kid run wild with Alexa might end in some very hilarious, or even disastrous, events.

One of the more recent Amazon Alexa fails was a story we previously reported about Alexa ordering $162 worth of treats after a conversation with the owner’s daughter.

Apparently, Megan Neitzel's 6-year-old daughter was talking to Alexa about a dollhouse and cookies when Alexa mistook the conversation for a request to purchase the goods in question. Since Megan admits she never really read the manual or learned about the child lock settings, Alexa went ahead and ordered a huge dollhouse and four pounds of sugar cookies.

And to make matters even more hilarious (and worse), other people's Amazon Alexa units started ordering dollhouses when they heard the news about the dollhouse and sugar cookies over the television!

The Verge reports that the story of Megan's daughter ordering the dollhouse and sugar cookies aired during San Diego CW6 News' morning segment. Towards the end of the story, CW6 news anchor Jim Patton remarked: "I love the little girl, saying 'Alexa ordered me a dollhouse.'" Apparently, other Alexas heard this bit over the TV and understood the remark as an order to purchase a dollhouse.

Patton told The Verge that after the story aired, CW6 News started receiving calls and e-mails at its news desk from viewers saying their own Amazon Alexa had attempted to purchase a dollhouse. Although none of the attempted purchases have been confirmed to have gone through, it is a hilarious and troublesome event altogether.

Ordering dollhouses, however, is the least of your concerns if you've already set a passcode for purchases through Amazon Alexa. It seems that Alexa also has a tendency to say very bad and inappropriate things to children when left to her own devices.

The Amazon Dot and Echo come equipped with the Alexa AI [Image by Jeff Chiu/AP Photo]

In December, a cute little toddler on YouTube triggered Alexa into crude and bleep-worthy remarks with a very innocent request, The Sun reports. Speaking to Alexa, the boy says, "Play Digger Digger."

Instead of playing the children's song, Alexa horrified the adults in the room by responding, "You want to hear a station for porn detected…. Porno ringtone hot chick amateur girl calling sexy," followed by a string of cusses and crude, dirty words for reproductive organs.

After an initial pause, presumably while the adults struggled to believe what they were hearing, they can be heard shouting "No! No! No! Alexa, stop!" to cut the command short.

Another similar Amazon Alexa catastrophe dates back to 2015, when a little girl asked Alexa to spell "book." Alexa understood the command but misheard the word and proceeded to spell the curse word "f***" instead.

And if you want to coax your little kid into the tub with a children's song such as "Splish Splash," you might want to think twice, too. Watch the video below as a man asks Alexa to "play the song Splish Splash I was Taking a Bath," only to be met with the response, "I can't find a song Splish Splash I was Taking a Crap." Next time, dad, maybe say the actual song title instead of the first line of the lyrics.

https://www.youtube.com/watch?v=JjQyhkwes2I

To prevent these Amazon Alexa fails with your kid, you can try going with the Mattel Aristotle instead, Stuff suggests. The Mattel Aristotle offers two functions: one side features a fully functioning Amazon Echo, while the flip side runs a child-friendly Aristotle program.

The Mattel Aristotle is designed to understand toddler ramblings, sing lullabies, serve as a baby monitor, and even read children's books and teach kids their ABCs.

While Amazon's Alexa is already available for purchase in the US in the form of the Amazon Dot, Tap, and Echo, the Mattel Aristotle won't go on sale until June.

[Featured Image by Jeff Chiu/AP Photo]
