Apple got into a bit of hot water earlier this month amid reports that the company was using contractors to review audio recordings of Siri commands from users. While the goal was simply to improve Siri's accuracy and reliability, some contractors revealed that they were able to listen in on confidential conversations when Siri was mistakenly called into action.
Compounding matters was the fact that Apple’s Siri review program didn’t even ask users if they wanted to be a part of it. Clearly, this wasn’t a good look for a company that makes privacy a key selling point for its products.
Once word broke, Apple promptly suspended the program, and it has since made some notable changes to the way Siri review is handled. These changes will go into effect later this fall.
In a statement issued earlier today, Apple notes that it’s making the Siri review process a strictly opt-in affair.
[Users] will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
Additionally, Apple notes that it will take proactive measures to disregard audio recordings in instances where Siri was inadvertently triggered. Further, Apple is seemingly done farming out the Siri review process to third-party contractors and will instead handle everything in-house.
Apple also made a point of highlighting that audio recordings from Siri cannot be traced back to any individual user.
Siri uses a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number — a process that we believe is unique among the digital assistants in use today. For further protection, after six months, the device’s data is disassociated from the random identifier.
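To make that pseudonymization idea a bit more concrete, here's a minimal Swift sketch of how a scheme like the one Apple describes could work: each audio sample is keyed to a random per-device identifier rather than an Apple ID or phone number, and that link is severed once a retention window passes. The types, names, and six-month cleanup routine here are illustrative assumptions on my part, not Apple's actual implementation.

```swift
import Foundation

// Toy model of a pseudonymized Siri review record: the only link back to a
// device is a random identifier, never an account or phone number.
struct SiriRequestRecord {
    let deviceIdentifier: UUID?   // random per-device ID; nil once disassociated
    let audioSampleID: UUID
    let receivedAt: Date
}

struct ReviewStore {
    private(set) var records: [SiriRequestRecord] = []

    // Hypothetical six-month retention window, per Apple's statement.
    let retentionInterval: TimeInterval = 60 * 60 * 24 * 30 * 6

    // Store a new sample keyed only to the random device identifier.
    mutating func ingest(audioSampleID: UUID, from deviceIdentifier: UUID) {
        records.append(SiriRequestRecord(deviceIdentifier: deviceIdentifier,
                                         audioSampleID: audioSampleID,
                                         receivedAt: Date()))
    }

    // After the retention window, strip the random identifier so the sample
    // can no longer be associated with the device that produced it.
    mutating func disassociateExpiredRecords(now: Date = Date()) {
        records = records.map { (record: SiriRequestRecord) -> SiriRequestRecord in
            guard now.timeIntervalSince(record.receivedAt) > retentionInterval else {
                return record
            }
            return SiriRequestRecord(deviceIdentifier: nil,
                                     audioSampleID: record.audioSampleID,
                                     receivedAt: record.receivedAt)
        }
    }
}
```

The key design point, as Apple frames it, is that the identity-bearing piece (your Apple ID) never touches the review pipeline at all, so "disassociation" only has to remove a random token rather than scrub personal data.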
While I wouldn’t say that the saga involving the Siri review process was much ado about nothing, it’s worth noting that human oversight is irreplaceable if you want to improve the performance of any voice-activated personal assistant. Indeed, it’s why companies like Google, Amazon, and Facebook have all employed similar schemes. All told, the biggest flub from Apple here is that it wasn’t transparent enough with users about what it was doing behind the scenes. With the Siri grading process now an opt-in affair, this will hopefully be the last we hear of this issue.