The Guardian last week ran an exposé revealing that contractors working to improve the reliability of Siri sometimes listen to recordings that should by all accounts remain private, including medical information and even couples having sex.

As many other companies do with their own virtual assistant software, Apple employs these contractors to gauge the efficacy of Siri and to determine whether the voice assistant was activated purposefully. The latter point is essentially the crux of the problem, with one anonymous contractor detailing how confidential conversations can be heard when Siri is inadvertently called into action.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the contractor explained. “These recordings are accompanied by user data showing location, contact details, and app data.”

When the story first broke, Apple explained that such conversations cannot be traced back to a specific Apple ID and that contractors have access to less than 1% of all Siri queries.

Since then, Apple has taken a stricter approach, suspending the Siri review program altogether.

As first reported by TechCrunch, Apple says it will review its Siri grading process entirely. The company added that it plans to roll out a future iOS update that will let users opt out of the review program.

“We are committed to delivering a great Siri experience while protecting user privacy,” Apple said in a statement. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

What’s interesting about this whole saga is that one anonymous source prompted Apple to essentially do a 180 on its Siri review protocol, which is somewhat striking given that every other company with voice assistant software employs the same techniques. And given that, according to Apple, no recorded audio can be traced back to an individual user, the company’s response here is arguably disproportionate. At the same time, Apple’s reaction makes sense given that the company has long sought to position privacy as a feature.