Amazon stops police use of facial recognition technology
The e-commerce giant will pull the software because of its history of incorrectly tagging Black and Latino people
Amid nationwide protests against police brutality, Amazon has announced a moratorium on the use of its facial recognition technology.
The software has long been criticized due to its history of incorrectly tagging Black and Latinx people.
A 2018 test showed that the technology wrongly identified 28 members of Congress as people who had been charged with a crime. The American Civil Liberties Union (ACLU) called the software “flawed, biased, and dangerous.”
Amazon sold the technology to law enforcement agencies as a tool to identify suspects from photos and videos. However, the company said it hopes “this one-year moratorium might give Congress enough time to implement appropriate rules” to regulate the technology.
Amazon Rekognition uses machine learning to compare an image with hundreds of thousands of others in a database. Critics warn that the software can lead to cases of mistaken identity.
Nicole Ozer, technology and civil liberties director of the ACLU, told NPR that the moratorium shows that Amazon finally recognizes the dangers that the software poses to Black and brown communities.
“Face recognition technology gives governments the unprecedented power to spy on us wherever we go,” Ozer said. “It fuels police abuse. This surveillance technology must be stopped.”
The technology has been shown to perform far better on lighter-skinned people: the misidentification rate for white men is only 7%, while the rate for darker-skinned women is 35%.
Earlier this week, IBM also announced that it will stop providing its facial recognition technology to police departments. The company wrote a letter to Congress advocating for police reform, responsible use of technology, and broadening skills and educational opportunities.
In the letter, the company wrote, “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency.”