Amazon has a pretty straightforward response to the news that emerged last week about its facial recognition system incorrectly matching photos of 28 members of Congress to mugshots.

Those results came as a result of testing by the ACLU, which took 25,000 publicly available police mugshots and then asked Rekognition to compare those images to photos of all 535 members of Congress. When the results came in, 28 lawmakers were positively ID’d as matching the faces of arrestees.

Amazon’s response to using the facial recognition tech that way? Basically — “You’re doing it wrong.”

In a blog post, Matt Wood, Amazon’s general manager for deep learning and artificial intelligence, explains that the ACLU used the tech’s default setting of an 80 percent confidence level. Amazon suggests setting the confidence threshold instead at 99 percent (“As we recommend in our documentation”), at which point the misidentification rate, Wood writes, drops to zero. “This illustrates how important it is for those using the technology for public safety issues to pick appropriate confidence levels, so they have few (if any) false positives.”
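The effect of that threshold can be sketched with a toy example. The match scores and the filtering helper below are hypothetical, made up purely for illustration; a real Rekognition call returns a similarity score per candidate match, which callers filter in essentially the same way:

```python
# Toy sketch of how a confidence threshold filters face-match candidates.
# The candidate scores here are invented for illustration only.

def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the confidence threshold."""
    return [c for c in candidates if c["confidence"] >= threshold]

# Hypothetical scores for one probe photo compared against a mugshot set.
candidates = [
    {"mugshot_id": "A-1042", "confidence": 83.1},  # weak match
    {"mugshot_id": "B-2210", "confidence": 91.7},  # moderate match
    {"mugshot_id": "C-0937", "confidence": 99.4},  # strong match
]

# At the 80 percent default, all three are reported as "matches".
print(len(filter_matches(candidates, 80)))  # 3
# At the recommended 99 percent threshold, only the strongest survives.
print(len(filter_matches(candidates, 99)))  # 1
```

The point of Amazon's recommendation is visible here: raising the threshold trades away borderline hits to suppress false positives, which is why the ACLU's 80 percent run produced matches that a 99 percent run would have discarded.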

He goes on to point out that in public safety and law enforcement scenarios, Rekognition is “almost exclusively used” to help narrow the field, to then allow humans to come in and quickly review options using their judgement — as opposed to the system making fully autonomous decisions.

“A final word about the misinterpreted ACLU results. When there are new technological advances, we all have to clearly understand what’s real and what’s not,” Wood writes. “There’s a difference between using machine learning to identify a food object and using machine learning to determine whether a face match should warrant considering any law enforcement action. The latter is serious business and requires much higher confidence levels.

“We continue to recommend that customers do not use less than 99% confidence levels for law enforcement matches, and then to only use the matches as one input across others that make sense for each agency. But, machine learning is a very valuable tool to help law enforcement agencies, and while being concerned it’s applied correctly, we should not throw away the oven because the temperature could be set wrong and burn the pizza.”

Amazon’s point about not using the default configuration is certainly well taken and worth stressing again. That doesn’t erase the very real possibility, though, that a law enforcement agency somewhere could use it at its default setting to question or detain innocent people for crimes they had no part in.

It’s why the ACLU has come out with a statement of its own in response to Wood’s post, in which the civil rights organization essentially calls for a federally imposed moratorium on use of Rekognition.

Jacob Snow, technology and civil liberties attorney at the ACLU Foundation of Northern California, says that “Amazon should take steps to fix the damage its ill-advised face surveillance product may have already caused and to prevent further harm. Amazon should respond to members of Congress. It should disclose every government agency that has already purchased this technology. And it should heed the calls of organizations and its own customers, employees, and shareholders and stop selling face surveillance to the government.”