
Thousands of Amazon employees hear what you say to Alexa

Updated Nov 15th, 2022 6:26AM EST
Image: Amazon Echo


Amazon has thousands of employees across the globe who actively listen to Alexa recordings from Echo devices as part of a broad effort to improve the reliability of the product, according to a new bombshell report from Bloomberg. The revelation has naturally caused a significant amount of blowback from privacy advocates, primarily because Amazon hasn’t exactly been upfront about what it’s doing.

As the Bloomberg report lays out, Amazon describes the process in its privacy materials as follows: “We use your requests to Alexa to train our speech recognition and natural language understanding systems.”

Conveniently, the language above doesn’t at all suggest that Amazon employees are listening to consumer voice recordings. And though consumers certainly have the ability to opt out of this speech recognition training program, why would the average consumer think to toggle it off when the description itself is seemingly innocuous? Indeed, one could just as reasonably assume, rather naively, that Amazon simply uses AI to analyze voice commands and improve overall reliability.

It’s worth noting that Amazon employees can’t trace a specific recording back to any specific user, but some privacy advocates would be quick to argue that that’s beside the point. Further, Amazon, in a statement on the matter, revealed that only a small percentage of user recordings get analyzed in the first place.

The report reads in part:

Occasionally the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help. The teams use internal chat rooms to share files when they need help parsing a muddled word—or come across an amusing recording.

Sometimes they hear recordings they find upsetting, or possibly criminal. Two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress.

Commenting on the report, Amazon explained that they “only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience.”

Amazon also added that they implement “strict technical and operational safeguards” and that “all information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”

As for how the entire workflow operates, in a broad sense, reviewers listen to a voice recording, transcribe it, and annotate it to indicate whether Alexa’s understanding was accurate or off the mark.

There’s been a lot of debate regarding Bloomberg’s piece, but I think Ryan Mac’s take via Twitter is on the nose.

Some people will respond to this story with, “Well, what did you expect?”

What’s lost on them is that Amazon never explicitly said it does this. Amazon’s privacy policy is vague when it should be explicit and direct with what it does with customer information.

Privacy-wise, this isn’t the first time an Amazon product has made headlines for all the wrong reasons.

This past January, word surfaced that Ring employees were able to access live feeds from the company’s smart doorbell cameras. Incidentally, Amazon acquired Ring for $1 billion in 2018. Even more damning was the revelation that Ring employees could access a particular user’s feed armed only with an email address.

Yoni Heisler, Contributing Writer

Yoni Heisler has been writing about Apple and the tech industry at large for over 15 years. A lifelong Mac user and Apple expert, his writing has appeared in Edible Apple, Network World, MacLife, Macworld UK, and TUAW.

When not analyzing the latest happenings with Apple, Yoni enjoys catching Improv shows in Chicago, playing soccer, and cultivating new TV show addictions.