Alexa offered a potentially deadly challenge to a young child

Alexa, Amazon’s voice assistant, recommended that a child complete a challenge that could have seriously hurt her, or worse.

Alexa, Amazon’s famous voice assistant, recently sparked outrage in the United States by offering a 10-year-old girl a “challenge” that could have ended very, very badly.

The case came to light when Kristin Livdahl, the child’s mother, posted a screenshot of Alexa’s activity log. It shows a request made by her daughter, who asked her Amazon Echo to “propose a challenge.” Except that Alexa must have had a grudge against the child: instead of suggesting she hop on one foot or count backwards, it suggested she… play with an electrical outlet.

“This is something I found on the internet,” Alexa replied. “According to ourcommunitynow.com, the challenge is simple: plug in a phone charger only halfway, then touch the exposed contacts with a coin,” reads the voice assistant’s activity log. You don’t have to be an engineer to realize that this is a terrible idea.

Information out of context

To understand how a system that is supposed to assist parents and entertain children could come up with such nonsense, you have to look at the beginning of its answer. When it receives a request of this type, Alexa simply performs a good old web search to find a theoretically relevant result. The problem is that these AI-based systems are anything but foolproof, and they tend to miss much-needed context.

This is precisely what happened in this scenario. Contrary to what one might deduce from its answer, ourcommunitynow.com never suggested attempting this challenge; quite the opposite. According to Bleeping Computer, the content Alexa pulled came from a since-deleted page discussing the dangers of the practice, which apparently emerged on TikTok last year. Alexa missed that context and mistook a warning for a how-to guide…
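The failure mode is easy to reproduce. Here is a minimal, purely illustrative Python sketch of the naive approach described above (Amazon’s real pipeline is not public, and the sample page text below is invented): a toy extractor that returns the single sentence best matching the query, discarding everything around it.

import re

# Hypothetical page text modeled on the incident: a warning article,
# not a how-to.
PAGE_TEXT = (
    "Parents, beware of this dangerous outlet challenge. "
    "The challenge is simple: plug in a phone charger only halfway, "
    "then touch the exposed contacts with a coin. "
    "Never attempt this: it can cause fires and severe electrical burns."
)

def naive_snippet(page: str, query: str) -> str:
    """Return the one sentence that best matches the query keywords."""
    keywords = set(query.lower().split())
    sentences = re.split(r"(?<=[.!?])\s+", page)
    # Score each sentence in isolation; the warnings before and after
    # the best match are simply thrown away.
    return max(sentences,
               key=lambda s: len(keywords & set(re.findall(r"\w+", s.lower()))))

print(naive_snippet(PAGE_TEXT, "propose a challenge"))
# Prints the middle sentence, the dangerous "instructions", without
# either of the warnings that surround it.

With the surrounding sentences stripped away, a page written to warn against the challenge reads exactly like an instruction, which is essentially what happened to Alexa.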

Fortunately, all’s well that ends well, because the young girl had considerably more sense than Alexa. “My daughter says she’s too smart to do something like this anyway,” her mother explains. But other children, especially younger ones, might well have blindly followed the machine’s advice, with all the potentially dramatic consequences that implies.

For its part, Amazon was quick to respond. “As soon as we heard about the situation, we acted quickly to resolve it,” a spokesperson told the BBC. According to the British outlet, an update has since been deployed to “prevent the assistant from recommending such activities in the future.”

Parsing web content: a real headache for AI

These so-called “parsing” techniques are extremely useful when you’re looking for a simple, precise answer. Many of you probably use them every day, for example via Google’s famous quick-answer boxes. However, when it comes to providing a longer, more complex, or context-dependent answer, that’s another story; Amazon is neither the first nor the last company to experience hiccups at this level.

The Verge gives a particularly relevant example unearthed on Twitter. It concerns an American public-service site that explains what to do when someone has an epileptic seizure. The page explicitly says NOT to attempt to immobilize the person or to give them anything to eat. But Google’s snippet box, which is undoubtedly the first information a person in a hurry will see, suggests doing exactly the opposite…

Moral of the story: all these algorithmic recommendations may make our daily lives easier, but caution is in order.
