Facial recognition technology has come a long way in recent years, and today many smartphone users rely on face unlock features to keep their devices secure. Amazon’s facial scanning software, called Rekognition, won’t unlock your phone, but it is currently being used by some law enforcement agencies to identify suspects by matching photos against mugshot databases.
Last year, the ACLU ran the Rekognition software on photos of members of Congress and found that it falsely matched 28 of them to mugshots of people who had been arrested. These false positives exposed serious flaws in Amazon’s software and highlighted a racial bias in the program, which disproportionately tagged people of color. Now, a year later, the ACLU has run its test again … and it looks like almost nothing has changed.
In the latest test, the Rekognition software was run on photos of 120 lawmakers from California, and it once again produced dozens of false matches. Twenty-six of them were falsely matched to mugshots in a criminal database, roughly one out of every five officials run through the system.
The test aimed to highlight the risks of using such inaccurate technology to identify potential criminals. Police forces that rely on the software to quickly identify individuals with criminal records could easily misidentify a person or incorrectly conclude that someone is wanted under another name.
“The software clearly is not ready for use in a law enforcement capacity,” Phil Ting, a California Assemblyman who was one of the lawmakers falsely matched to a mugshot, told the L.A. Times. “These mistakes, we can kind of chuckle at it, but if you get arrested and it’s on your record, it can be hard to get housing, get a job. It has real impacts.”
Amazon, for its part, claims that the ACLU is “misusing” the software, which can be configured so that it only returns results when it is at least 99 percent confident in a match. The company offered the following statement to Gizmodo:
The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines. As we’ve said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking. We continue to advocate for federal legislation of facial recognition technology to ensure responsible use, and we’ve shared our specific suggestions for this both privately with policy makers and on our blog.
That’s great and all, but if the software is only meant to be used with that specific setting in place, why is it even possible to change it? And how likely is it that every law enforcement agency using the software actually follows that recommendation?
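For context, the confidence cutoff Amazon refers to is not a built-in safeguard but a parameter the caller passes on every request. Here is a minimal sketch using the boto3 SDK’s compare_faces call (the bucket and image names are hypothetical placeholders):

```python
import boto3

# A minimal sketch of a Rekognition face comparison using the boto3 SDK.
# The bucket and object names below are hypothetical placeholders.
rekognition = boto3.client("rekognition", region_name="us-west-2")

response = rekognition.compare_faces(
    SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "probe-photo.jpg"}},
    TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "mugshot.jpg"}},
    # The caller chooses this cutoff on every request; Amazon recommends 99
    # for law enforcement use, but the service's default is lower.
    SimilarityThreshold=99,
)

# Faces scoring below the threshold are simply left out of FaceMatches,
# so lowering the number produces more (and weaker) "matches".
for match in response["FaceMatches"]:
    print(f"Similarity: {match['Similarity']:.1f}%")
```

Nothing in the API enforces the 99 percent recommendation; an agency that lowers the threshold, or leaves it at the default, simply gets looser matches back.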