Voice assistants are clearly here to stay. In the years since Siri first landed on the iPhone, we’ve all gotten used to the idea of asking a piece of artificial intelligence for help. We love talking to Siri, asking questions of Google Assistant, and conversing with Cortana. More and more of us talk to Alexa on Amazon Echo devices. We look to these voice assistants to run quick searches, and we use them to control the other gadgets in our homes. We also ask them for help in finding the content we’re looking for. Other times, we talk to them when we just need a laugh.
Over the years, many reviewers and users have debated which smart assistant is best. They’ve argued about which one is the smartest, which one is the best listener, or which one has the most useful array of app integrations and abilities. And they’ve debated the merits not only of the voice assistants that are accessible on smartphones, but also the capabilities of the assistants in newer gadgets, like the Google Home and Amazon Echo.
But if you value your privacy, you may have another question about Siri and Google Assistant and Alexa. Are there reasons we shouldn’t be so ready to talk to these voice assistants and let them listen in on the conversations we have at home, in the car, at the office, and everywhere we eat, shop, commute, relax, play, and otherwise live our lives? It turns out you may want to think twice about what you’re telling Siri or Alexa or Google Assistant. You may be pretty creeped out when you find out why.
1. Voice assistants are always listening
In Siri’s early days, the voice assistant was only listening when you pressed the home button on your iPhone and purposefully asked for her help. But that’s not the case anymore. On the most recent iPhones, you can activate Siri simply by saying, “Hey, Siri.” Which, as you might have guessed, means that Siri is constantly listening and waiting for you to say those magic words.
The same thing is true of home speakers, like Google Home or Amazon Echo, that are sitting on your shelf and waiting for you to ask for help. It’s helpful to be able to summon the help of a voice assistant without pressing a button. But it also means that voice assistant is listening to everything that’s going on within close range of the device. That may not matter if you’re hanging out at home by yourself, just watching Netflix. But if you’re having a conversation about highly personal matters, or discussing confidential details, do you really want your phone or your smart speaker listening in?
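To see why “always listening” doesn’t necessarily mean “always uploading,” here’s a minimal sketch of how a wake-word loop generally works. This is a toy illustration, not any vendor’s actual code: the wake phrase, chunk format, and one-request-per-wake-word behavior are all simplifying assumptions.

```python
# Conceptual sketch (not Apple's or Amazon's real code): an always-on
# loop checks each bit of audio locally for the wake phrase, and only
# the audio captured AFTER a match gets sent off-device.

WAKE_WORD = "hey siri"  # hypothetical wake phrase for this sketch

def process_audio(chunks):
    """Scan a stream of (pretend) audio chunks and return only the
    chunks that would be uploaded for full speech processing."""
    uploaded = []
    listening = False
    for chunk in chunks:
        if listening:
            uploaded.append(chunk)   # this part is streamed to the cloud
            listening = False        # assume one request per wake word
        elif WAKE_WORD in chunk.lower():
            listening = True         # wake word matched locally
        # everything else is heard, but (ideally) discarded on-device
    return uploaded

stream = [
    "background tv noise",
    "Hey Siri",
    "what is the weather today",
    "private conversation",
]
print(process_audio(stream))  # only the post-wake-word request
```

The privacy question, of course, is how much you trust that the “discarded” branch really discards, since the microphone hears everything either way.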
2. Your conversations aren’t as private as you think
As The Cheat Sheet recently reported, voice assistants like Siri and Alexa do undermine your privacy, even if most people don’t seem concerned. If you browse through common privacy myths, you’ll notice they all involve people believing they’re safer from surveillance or more protected from privacy invasions than they really are. So it makes sense that many people don’t understand the ways in which always-on microphone-equipped devices — think an iPhone with “Hey, Siri” on or the Amazon Echo — are collecting information.
The privacy implications are determined by factors like whether the collected data is stored locally (which is pretty rare) or whether it’s transmitted from the device to a third party or to external cloud storage. Another important factor? Whether the device is used for voice recognition or for speech recognition. Voice recognition involves the biometric identification of an individual by the characteristics of their voice. But speech recognition refers to the translation of voice input into text. Do you know exactly what the AI you’re talking to is doing?
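The distinction between the two can be sketched in a few lines. This is a toy illustration with made-up data and no real machine learning; the function names, sample format, and “voiceprint” labels are all hypothetical.

```python
# Illustrative sketch (toy data, no real ML): SPEECH recognition turns
# audio into text, while VOICE recognition matches the audio to a
# specific speaker. A device may quietly do both.

def speech_recognition(audio_sample):
    """Translate what was said into text (the request itself)."""
    return audio_sample["transcript"]

def voice_recognition(audio_sample, known_voiceprints):
    """Match the sample's biometric voiceprint to a known speaker."""
    return known_voiceprints.get(audio_sample["voiceprint"], "unknown")

sample = {"transcript": "play my workout playlist", "voiceprint": "vp-042"}
household = {"vp-042": "Alex", "vp-117": "Sam"}  # hypothetical voiceprints

print(speech_recognition(sample))            # WHAT was asked
print(voice_recognition(sample, household))  # WHO asked it
```

The privacy stakes differ accordingly: a transcript reveals what you asked, but a voiceprint ties that request to you personally.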
3. It’s not just your conversations that your voice assistant can hear
While we’re running through unsettling scenarios in our heads, why not think about all the things that aren’t conversations that your voice assistant can hear? Sure, it hears when you’re asking it for the weather or requesting a report on the morning’s traffic. But it also hears the conversations you’re having with other members of your household, or whoever is within close proximity of your phone when you carry it around during the day. It picks up on what’s going on in the background, too, which is particularly unsettling when you think about devices like the Amazon Echo or Google Home.
The audio your voice assistant hears could reveal how many people live in your house. It could give someone insight into your schedule and the hours when you’re usually at home. The audio could reveal whether you have any pets at home. It could also reveal patterns in the music you listen to, the TV shows you like, the movies you watch, and the games you play. You might think none of this information is actually recorded or extrapolated. But we live in a world where every piece of information you share with Google is used to show you ads and sell you products. It seems pretty unlikely that tech companies will pass up the chance to take note of all the things they can learn about you.
4. Your conversations are recorded — and probably stored
So your voice assistant is always listening. And it’s sending at least part of what you say to an external server or to cloud storage. That’s a necessary step, because the software that can figure out what you’re asking and then get you the information you’re asking for isn’t located on your device. But what happens to your data after it’s sent to that server? Chances are good that it’s stored and probably associated with your account. Dan Price reports for MakeUseOf that both Google Assistant (on Google Home) and Amazon Alexa (on the Echo family of devices) save the audio recordings of your requests and “log them against your user account.”
You can actually log in to your account and hear all of the requests that you’ve made to your voice assistant of choice. Price warns, “What if someone gains unauthorized access? There could be a lot of personal information saved there.” You can delete your history of voice requests. But you can’t do anything about the aggregated data that companies use to improve the assistant. That data is something you should worry about not only with Google or Amazon, but also with Apple. (Cupertino keeps Siri requests tagged with a device identifier for six months and then stores the raw audio for another 18 months after that.)
5. We speak intimately with voice assistants
When a piece of software sounds like a human, we tend to talk with it like it is one. That works out great for everyone involved when you ask Siri to tell you a joke or ask Google Assistant to entertain you with a story. But as you’ll notice if you peruse our list of things you shouldn’t search on Google, we’re already too apt to share personal information with software like search engines. The personal information that search engines and advertisers collect lets them make creepy and even damaging assumptions about who we are. That’s even easier to do when you’re talking to a voice assistant — a piece of software that’s designed to at least try to have a coherent and productive conversation with you.
It’s probably not a great idea to ask a voice assistant about your problems or your medical issues — at least if you don’t want Apple, Google, or Amazon to know about them. Apple and other companies have found their voice assistants need to respond to sensitive questions about rape, abuse, and mental health. It’s of the utmost importance to ask for help when you need it, but it’s also a good idea to safeguard your privacy in less urgent situations.
6. You can be identified by the unique characteristics of your voice
You’ve probably realized by now that when you have a conversation with a voice assistant, your speech is recorded and stored, at least temporarily, on the servers of a major tech company. You may not be creeped out by that prospect, especially because we all give up some personal information in exchange for the apps and services we use every day, but this may change your mind. It turns out you can be identified by the unique characteristics of your voice. Aviva Rutkin reports for New Scientist that a passphrase as short as “Hey Siri” is “a powerful way to verify that you are who you say you are.”
Software that can recognize people’s voices, and distinguish one voice from another, is already being used in criminal investigations. And banks use the technology to determine whether people calling their help lines are real customers or scam artists. Your voice offers insight into who you are and what you’re doing. And when analyzed by the right software, a recording of your voice can also reveal your height and weight, and even your demographic background. As Rutkin notes, “Having devices in the home that recognize voices does raise security concerns, especially if they understand what you’re saying.”
7. It’s tough to get real answers about how your device is collecting information
As with any gadget purchase, it’s always a good idea to do your research about the device on which you’re using a voice assistant. But the questions you’d need to ask to gauge the privacy implications of a new microphone-equipped device aren’t as straightforward as questions about how loud a speaker is or how large a screen is. In fact, some of these questions are pretty tough to find straight answers for.
You should find out whether data processing and storage happens locally or externally, and find out whether the device arrives with speech recognition or other audio recording functionality pre-enabled. You should also find out whether the device has a hardware on/off switch that can disable the microphone, and determine whether the device provides visual cues about when it’s recording or transmitting information. Additionally, you should be able to confirm that the use of your voice data is limited enough to prevent misuse.