We’ve written extensively about security concerns in recent weeks, but this might be the most shocking development yet. Reddit user FallenMyst posted on Tuesday that he had recently started a new job with a company called Walk N’Talk Technologies.
At his job, he is asked to listen to seemingly random bits of audio and provide feedback on how well they match the text that the device produced when translating the audio, and you won't believe where the audio comes from.
Here’s a more in-depth description from one of his comments:
“I’m given an audio file (sound bite) and the corresponding text based translation (how the phone translated the speech). My job is to listen to the file, compare it to the text and provide feedback on how correctly the sound bite was interpreted by the phone. If the text and speech are a perfect match, I just move on. However, if the phone either translated something incorrectly due to a heavy accent or loud background noise, I note that in my evaluation.”
Here’s where things get interesting — after working his way through several of these sound bites, FallenMyst realized that much of what he was listening to were commands that mobile phone owners were giving to their personal assistants such as Siri and Cortana.
“I heard everything from kiddos asking innocent things like ‘Siri, do you like me?’ to some guy asking Galaxy to [do something inappropriate and physically impossible]. I wish I was kidding,” writes FallenMyst.
Many commenters noted that you can find your entire log of recorded voice commands by heading to this page within your Google account. Unless you turn the setting off, everything you say to your device can be recorded and, apparently, has a chance of winding up in the hands of a third party.