Just when you thought owning a Google Assistant was less risky than owning Alexa, this happens.
A third-party language expert hired by Google leaked audio data recorded with the Google Assistant.
How did this happen?
Google uses hundreds of human contractors to review a sample of the conversations recorded by Google Assistant.
In this particular case, the contractor worked on the Dutch language. Google may not care what you say, but it does care how you say it. To teach the Assistant to understand speech better, Google relies on humans to convey the nuances of a particular language. A speech model built for English may need to behave differently from one built for Dutch. Bring in a human to “translate” for the machine, and the results should improve over time.
Why does this matter? It involves your privacy.
Your Google Assistant activates when you use a wake phrase (“OK, Google” or “Hey, Google”). Conversations with your Google Assistant are supposed to be recorded only after you use the wake phrase.
However, Google Assistant has been known to mishear things and activate without its wake phrase being spoken. When this happens, the conversation is still recorded. Of the more than a thousand recordings it reviewed, the Dutch publication VRT NWS found that 153 should never have been recorded – the wake phrase was never spoken. Do the math and that’s roughly a 15% error rate. Not good.
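That error rate is simple arithmetic on the figures VRT NWS reported. A minimal sketch, assuming exactly 1,000 recordings (the publication only said “more than a thousand,” so the true rate is slightly lower):

```python
# Rough estimate of the false-activation rate from the VRT NWS figures.
total_recordings = 1000  # assumed lower bound; VRT NWS said "more than a thousand"
accidental = 153         # recordings made without the wake phrase being spoken

error_rate = accidental / total_recordings
print(f"{error_rate:.1%}")  # → 15.3%
```

With a larger true total, the rate drops a bit, but it stays in the neighborhood of one accidental recording out of every seven.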
This leak, of course, violates Google’s data and security policies. But the whistleblower was drawing attention to an important issue. The problem isn’t that the Assistant was recording the audio – that’s stated in the terms and conditions. The problem is that you cannot completely control when the Assistant records something… and you can’t control whether a human will hear it.
VRT NWS says the recordings have included:
- Bedroom conversations
- Chats between parents and children
- Professional phone calls with lots of private information
- Medical questions
- Pornographic searches
- A woman facing physical violence
You can actually hear some of these recordings in the video at the bottom of their article. In some cases, the reporter finds the person who was speaking and plays their recording back to them.
The moral implications are enormous. Security experts warn that these recordings could surface anywhere – a recording of your own voice could even be used against you in court. And if Google or its contractors overhear domestic abuse, should they report it to the police or let the incident go?
This is a modern day “trolley problem.” If Google does nothing, they could be allowing crimes to continue unabated. If Google intervenes, owning an Assistant would be like inviting a police officer to be your roommate. And if Google intervenes selectively, what are its boundaries? And who decides them?
How is Google responding?
Today, Google posted: “Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”
What protection do you have?
As Google points out, “You can turn off storing audio data to your Google account completely, or choose to auto-delete data after every 3 months or 18 months.”
If you’re worried about privacy, it might be best to hold off on buying an Assistant until this issue is solved. (Though it may never be truly safe.)
Not an isolated problem
If you have an Amazon Alexa, this issue is nothing new. Human employees have shared recordings from that device as well.
Remember that any device with a camera or microphone could be exploited to record you, and anything you type into a keyboard could be logged as well. It’s best to use a browser and a search engine that defend your privacy, use a VPN, and think twice before granting permissions to any app you download.
Google and Facebook make their money by marketing your data, so you may want to consider alternatives whenever possible. But we get it, the convenience is a big draw. Stay safe out there, internet friends.