editor's blog

A Nightmare Bug: Random People Listening In

Amazon has apparently had an Echo hiccup of the sort that would give customers bad dreams. It sent a random conversation to a random contact.

A couple had installed numerous Alexa-enabled devices in the home. At some point, they had a conversation – as couples are wont to do – involving, apparently, hardwood floors. No big deal, right? Except that they then got a call from one of their contacts (who is an employee of the husband) saying that they had been hacked. Why? Because that person had been texted a copy of the conversation, including a link to the original recording.

They say that an Amazon employee confirmed what had happened, so this wasn’t in their imagination.

There But for the Grace…

Hardwood floors aren’t a particularly edgy conversation. But just imagine what could have happened with another conversation and another contact. Needless to say, the couple unplugged all of their Alexa devices.

So what the heck is going on here? From everything I can tell, Amazon isn’t giving out any specifics. It seems to be a glitch in their message-sending feature. I looked at their help page, and it shows a few steps required to send such a message. First you say, “Send a message to <insert your contact here>,” then it confirms the contact if unsure, and then prompts for the actual message. It then sends (without further confirmation).

Based on this, there’s no way a conversation should be sent silently, without someone knowing. The theory is that Alexa misinterpreted part of the conversation as instructions to send a message. If that’s the case, it should have given at least one prompt. Maybe it did and they didn’t hear it.
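The help-page flow described above can be modeled as a tiny state machine. This is purely my own illustrative sketch, not Amazon's implementation; the class and state names are invented. The point it demonstrates is that reaching the "sent" state requires passing through at least two audible prompts, so a misheard conversation would have to survive every step of the dialog.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    AWAIT_CONTACT_CONFIRM = auto()
    AWAIT_MESSAGE = auto()
    SENT = auto()

class MessageFlow:
    """Toy model of the multi-step send-message dialog described above.

    Each transition requires a fresh utterance, and the device speaks a
    prompt at every intermediate step, so a message should never go out
    silently. (Hypothetical names; not Amazon's actual code.)
    """
    def __init__(self):
        self.state = State.IDLE
        self.contact = None
        self.prompts = []  # what the device would say aloud

    def hear(self, utterance: str):
        if self.state is State.IDLE and utterance.startswith("send a message to "):
            self.contact = utterance[len("send a message to "):]
            self.prompts.append(f"To {self.contact}, right?")
            self.state = State.AWAIT_CONTACT_CONFIRM
        elif self.state is State.AWAIT_CONTACT_CONFIRM and utterance == "yes":
            self.prompts.append("What's the message?")
            self.state = State.AWAIT_MESSAGE
        elif self.state is State.AWAIT_MESSAGE:
            # Sends with no further confirmation -- the step the help
            # page describes as the end of the flow.
            self.state = State.SENT
        # Anything else is ignored in the current state.
```

Walking the happy path (`hear("send a message to bob")`, `hear("yes")`, then the message text) produces two spoken prompts before anything is sent; a stray bit of conversation in the idle state does nothing. If Alexa really misparsed background chatter into this flow, each prompt must also have been missed or misheard.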

Constant, Ubiquitous Surveillance?

I’ve heard stories of friends of mine having conversations at home about something unusual (for them), only to have ads for that thing start showing up in their Facebook feeds. Creepy much? Their assumption was that their cellphone was recording everything and sending it to our cumulonimbus overlords. Those overlords are then selling the info to advertisers.

The industry response – including mine to my friends – about such things is that, at least for now, it can’t happen. Why? Because the ongoing audio is processed only locally – until the wake-word is heard, at which point the following audio is sent to the cloud for processing. If that’s true, then full, non-command conversations should never go anywhere.
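The local-gating claim boils down to a simple filter: discard audio on-device until a keyword spotter fires, and only then forward frames to the cloud. Here is a minimal sketch of that claim, with stand-in functions (`detect_wake_word`, `send_to_cloud`) that are assumptions of mine, not any vendor's API.

```python
def stream_audio(frames, detect_wake_word, send_to_cloud):
    """Toy sketch of wake-word gating: frames are dropped locally until
    the on-device spotter fires; only later frames reach the cloud.

    `detect_wake_word` and `send_to_cloud` are hypothetical stand-ins
    for an on-device keyword spotter and a cloud upload call.
    """
    listening = False
    uploaded = []
    for frame in frames:
        if not listening:
            if detect_wake_word(frame):
                listening = True  # open the session to the cloud
            # otherwise the frame is discarded on-device
        else:
            uploaded.append(send_to_cloud(frame))
            # (a real device would also close the session on silence)
    return uploaded
```

Under this model, everything before the wake word never leaves the device, which is why full, non-command conversations shouldn't show up anywhere. The incident suggests the interesting failures live in what happens *after* the gate opens.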

Another defense has been that sending full audio all the time would chew up way too much bandwidth. That may be true, but it may also be a temporary situation. Will our communications infrastructure eventually have pipes fat enough to radio everyone's conversations into the cloud for dissection and action?

That’s not what’s alleged to have happened in this case, but, for people already nervous about this, it might as well be the same thing.

Horrible Optics, at Best

Regardless of the technical facts, this is the kind of glitch that will freeze the hearts of users – current and potential. Amazon claims that the bug was extremely rare, and that they’re working on fixing it. I’m assuming that the log files that they examined contain enough information to point to a root cause. If so, then, yeah – they can fix it. It’s a good use of over-the-air updates!

I don’t know if the big tech companies realize it, but there’s growing unease with the amount of power they and their technology wield. Our devices no longer let us control things the way we used to; the companies control them for us, presuming to know more about what we want than we do ourselves. And much of it amounts to controlling our behavior: getting us to do what is in their best interests and making it harder to learn what alternatives exist. How much of that is fear vs. reality? Not clear. But it’s also not clear how far it could go, short of debacles like this one drawing scrutiny from lawmakers.

With Europe’s GDPR taking effect, everyone is thinking about privacy right now. My inbox is chock-full of emails from groups I never signed up with, asking if I want to stay on the list. So it’s a bad week for a rather dramatic violation of privacy.

3 thoughts on “A Nightmare Bug: Random People Listening In”

  1. I agree that the privacy violation was quite acute in this case of the couple whose many Alexa devices sent their conversation, unintentionally, to a contact. So far, the only ads that show up in my Facebook feed or in Google-powered ads match keywords I have typed into Google or Amazon products. Before Amazon came out with its Alexa device, my niece named her first child Alexa – who knew it would become such a popular name?

  2. No matter how suspicious it has seemed that Amazon is encouraging us to put listening devices in every room of our homes, the company has always said that its Echo assistants are not listening in on or recording conversations. Over and over again, company spokespeople have promised that they only start recording if someone says the wake word: “Alexa”.

