Smart speakers like the Amazon Echo and Google Home are growing rapidly in popularity — especially in the United States — as more people use them for everyday tasks: turning lights on and off, running smart home routines, getting traffic and weather updates, or playing upbeat tunes while cooking dinner. But while these devices are incredibly handy for answering questions and performing tasks, many people are concerned about the privacy risks they present.
To give consumers better privacy protections, California Assemblyman Jordan Cunningham (R-San Luis Obispo) has introduced a bill titled “The Future of Eavesdropping Act” (AB 1395). So what exactly does the bill entail? The text indicates that it would “prohibit a smart speaker device, as defined, or a specified manufacturer of that device, from saving or storing recordings of verbal commands or requests given to the device, or verbal conversations heard by the device, regardless of whether the device was triggered using a key term or phrase.”
The bill doesn’t go into great detail; in particular, it makes no distinction between recordings stored on-device and recordings stored on, for example, Google or Amazon servers. That distinction matters, because voice assistants like Alexa store voice recordings in the cloud so that they can adapt to a user’s speaking style (or differentiate between two or more people in a household).
In the case of a device like an Amazon Echo, a voice recording is captured once the wake word — usually “Alexa” — is spoken aloud. As Amazon explains, “When you speak to Alexa, a recording of what you asked Alexa is sent to Amazon’s cloud so we can process and respond to your request.”
Customers who are sensitive about their privacy can go into the Alexa app and delete these recordings, but Amazon adds this warning: “Deleting voice recordings may degrade your Alexa experience. If you delete voice recordings, we will also remove Home Screen Cards in the Alexa app related to those voice recordings.”
The company goes on to state, “We use your requests to Alexa to train our speech recognition and natural language understanding systems. The more data we use to train these systems, the better Alexa works, and training Alexa with voice recordings from a diverse range of customers helps ensure Alexa works well for everyone.”
With this in mind, AB 1395 would almost certainly interfere with the performance of today’s voice assistants, and it would likely see significant pushback from the tech giants that rely on such data. Amazon in particular has come under fire for lapses in its handling of Alexa voice data: in one case, a recording of a woman’s conversation was sent, without her knowledge, to someone on her Echo’s contact list.
In Germany, 1,700 private voice recordings and other Alexa data were sent to an Amazon customer by accident. The customer had requested a copy of his own personal data stored by Amazon and received the aforementioned recordings in error.
More recently, Google came under fire when it was learned that the Nest Secure had been equipped with a microphone since its launch, a capability Google never disclosed to customers (or anyone else, for that matter). The microphone only came to light when Google enabled Google Assistant functionality on the device.
“The on-device microphone was never intended to be a secret and should have been listed in the tech specs. That was an error on our part,” said a Google spokesperson when questioned about the microphone.
There are legitimate concerns about customer privacy with respect to devices that rely on Alexa or the Google Assistant, but it remains to be seen whether bills like AB 1395 are the right way to go about changing things for the better.