
Apple won’t let other people listen to your Siri recordings anymore unless you opt in

August 2nd, 2019 at 6:50 AM

A report a few days ago revealed that third parties listen to portions of Siri voice recordings to improve the accuracy of the service. A former employee of one of these Siri grading companies claimed that humans “regularly hear confidential medical information, drug deals, and recordings of couples having sex,” among other things. The report followed stories about similar practices involving Amazon’s Alexa and Google’s Assistant. Letting actual humans listen to voice recordings helps companies improve the accuracy of their voice assistants.

Apple has now suspended the program to conduct a review, and will only resume it once Siri users actually give their consent to have their Siri interactions graded.

“We are committed to delivering a great Siri experience while protecting user privacy,” an Apple spokesperson told The Verge. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

As The Verge points out, it’s unclear whether Apple will stop saving recordings to its servers while the program is suspended. Apple keeps recordings for six months, at which point it can remove identifying information and store the recording for two years or more, the report notes.

“A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple told The Guardian, which first published details about the Siri grading program. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities, and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Until now, Siri users have not been able to opt out of the Siri grading program, and Apple’s terms of service did not make clear that such recordings could be made available to third parties, The Verge explains.

Chris Smith started writing about gadgets as a hobby, and before he knew it he was sharing his views on tech stuff with readers around the world. Whenever he's not writing about gadgets he miserably fails to stay away from them, although he desperately tries. But that's not necessarily a bad thing.
