Does Siri or Amazon Echo Undermine Your Privacy?



If you have misgivings about permitting the internet-connected devices in your home to listen in on what you and your roommates and family are doing, then you might feel like you’re alone. Plenty of people have enabled Siri to listen in on their conversations from an iPhone, and lots of people have an Amazon Echo sitting in their living room or kitchen. Few people seem concerned. After all, if you browse a list of common privacy myths, you’ll likely notice that they all involve people believing that they’re safer from surveillance or more protected from privacy invasions than they really are.

So it may not surprise you to learn that Stacey Gray, a legal and policy fellow at the Future of Privacy Forum, reports for Re/Code that many people don’t understand the ways in which always-on, microphone-equipped devices collect information. For an illustration of the problem at hand, Gray points to the Amazon Echo, which can be activated by the spoken command “Alexa”; Mattel’s Hello Barbie; and Apple’s Siri, which can be activated by the command “Hey, Siri” on an iPhone or iPad.

As consumers grow accustomed to voice as a useful way to interact with these and other devices, they also bump into questions about how and when these devices are listening in on their conversations. Some are designed to stay on at all times, like security cameras or baby monitors, and Gray notes that others “use the microphone like an electronic ‘on button,’ allowing the detection of a spoken ‘wake phrase’ that triggers the device to activate and begin transmitting data.” In either case, these devices don’t necessarily pose privacy threats dangerous enough to make people think twice about adopting them, but there are privacy implications to be aware of.

As noted in Gray’s new study on the privacy implications of such devices, published by the Future of Privacy Forum (PDF), each category of device presents different privacy implications, influenced by factors such as whether the collected data is stored locally (which is increasingly rare) or whether it’s transmitted from the device to a third party or to external cloud storage. Another important factor is whether the device is used for voice recognition (the biometric identification of an individual by the characteristics of his or her voice) or for speech recognition (the translation of voice input into text).

Gray reports that while the benefits of voice command are indisputable, and recent advancements in the technology vast, voice data “still has unique social and legal significance.” She reports that “when devices respond to our voices with human-like personalities, people interact with them differently, asking more intimate and personal questions.” Apple and other companies have found that smartphone-based personal assistants need to be able to respond appropriately to sensitive questions about rape, abuse, and mental health, and several companies have run into sensitive issues surrounding the recording of children’s conversations.

Further complicating the privacy implications is the fact that a person can be identified by the unique characteristics of his or her voice. Gray notes that “while biometric identification could be used to enable security controls over devices, it also implicates a range of federal and state laws and regulations, along with state biometric statutes like Illinois’s Biometric Information Privacy Act.” While most consumer devices aren’t designed with that purpose or capability, manufacturers still need to be aware of the regulations around biometric authentication, potentially including the applicability of two-party consent statutes, which are state laws that require consent from all parties before a conversation in a private setting can be recorded.

The privacy debates generated by microphone-enabled devices and services — from Google’s Chrome browser to Samsung’s smart TVs to Mattel’s Hello Barbie — have highlighted the tension between apprehension over data collection and the push toward the more convenient and capable interfaces enabled by speech activation and voice recognition. The specific privacy implications vary based on whether internet-connected devices are manually activated, speech activated, or always on, and on the social and legal context in which the devices are being used.

But when it comes to the devices in your home, expectations for privacy are generally higher than elsewhere — though it remains to be seen whether courts will be able to reconcile that expectation with the third-party doctrine, which holds that there is no reasonable expectation of privacy in information shared with the outside world. There are plenty of questions that still need to be answered about microphone-equipped devices, and most consumers don’t fully understand what they’re getting themselves into when they buy a new device. Gray notes that manufacturers need to emphasize user awareness, consent-based features, and control over the device. But consumers would also do well to keep a few key questions in mind when gauging the privacy implications of a new microphone-equipped device.

Find out whether data processing and storage happen locally or externally, and whether the device ships with speech recognition or other audio recording functionality enabled by default. Check whether the device has a hardware on/off switch that can disable the microphone, and whether it provides visual cues when it’s recording or transmitting information. Finally, you should be able to confirm that the permitted uses of your voice data are limited enough to prevent misuse.