Amazon has come under fire before for Rekognition, the facial recognition technology the company has allowed police to use, which has led to concerns that it is essentially supporting the surveillance state. The system has also drawn scrutiny for other reasons, such as the possibility that it misidentifies minorities at higher rates.
Now, leading artificial intelligence researchers from across industry and academia — including employees of Amazon rivals like Google, Microsoft and Facebook — have published an open letter via Medium scolding Amazon for selling the technology to police. And, of course, the letter asks the company to stop.
Citing a statement from Amazon vice president Michael Punke that the company supports legislation to help ensure its products aren’t used to infringe on civil liberties, the letter goes on to “call on Amazon to stop selling Rekognition to law enforcement as such legislation and safeguards are not in place.”
The letter appears to have been sparked partly by Amazon’s reaction to research from Massachusetts Institute of Technology researcher Joy Buolamwini. Her testing found that software from companies like Amazon — including software made available to police — produced higher error rates when trying to determine the gender of dark-skinned women than of lighter-skinned men. According to an Associated Press report, her research also covered software from Microsoft and IBM, both of which moved to fix the problems she identified.
Amazon, however, “responded by criticizing her research methods.” From the AI researchers’ open letter:
There are currently no laws in place to audit Rekognition’s use, Amazon has not disclosed who the customers are, nor what the error rates are across different intersectional demographics. How can we then ensure that this tool is not improperly being used as [Amazon Web Services GM for deep learning and AI Matthew Wood] states?
What can be relied on, the letter continues, are audits by independent researchers like Buolamwini “with concrete numbers and clearly designed, explained, and presented experimentation, that demonstrates the types of biases that exist in these products. This critical work rightly raises the alarm on using such immature technologies in high stakes scenarios without a public debate and legislation in place to ensure that civil rights are not infringed.”
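For a concrete sense of what “error rates across different intersectional demographics” means here, the sketch below — which is not from the letter and uses invented data — shows how an independent audit in the spirit of Buolamwini’s work might break a gender classifier’s error rate down by skin type and gender:

```python
# Illustrative only: a minimal sketch of an intersectional error-rate audit,
# in the spirit of Buolamwini's methodology. All records here are invented.
from collections import defaultdict

# Each record: (skin_type, actual_gender, predicted_gender). Hypothetical data.
results = [
    ("darker",  "female", "male"),
    ("darker",  "female", "female"),
    ("darker",  "male",   "male"),
    ("lighter", "male",   "male"),
    ("lighter", "female", "female"),
    ("lighter", "male",   "male"),
]

# Tally totals and misclassifications per intersectional group
# (skin type x gender), rather than one overall accuracy number.
totals = defaultdict(int)
errors = defaultdict(int)
for skin, actual, predicted in results:
    group = (skin, actual)
    totals[group] += 1
    if predicted != actual:
        errors[group] += 1

# Report the error rate for each group.
for group in sorted(totals):
    rate = errors[group] / totals[group]
    print(f"{group[0]:>7} {group[1]:>6}: {rate:.0%} error "
          f"({errors[group]}/{totals[group]})")
```

The letter’s point is that numbers like these, paired with clearly documented experimental design, are currently the only public window into how Rekognition performs across different groups.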