Groups have questions about how facial recognition technology impacts Black people

Several organizations are voicing concerns about the use of facial recognition technology and about its accuracy and fairness.

A recently released Pew Research Center study shows Black Americans are skeptical about using facial recognition technology. 

And they’re not the only ones.

Organizations such as the American Civil Liberties Union (ACLU), the Brookings Institution, and Harvard University question the technology’s accuracy and its negative impact on Black people and other communities of color.

A video surveillance camera hangs from the side of a building on May 14, 2019 in San Francisco. (Photo by Justin Sullivan/Getty Images)

Simply put, the technology, with varying degrees of accuracy, has difficulty identifying faces, especially Black and brown ones, under less-than-ideal circumstances.

“This technology is dangerous when it works and dangerous when it doesn’t,” Nate Wessler, the deputy project director of the Speech, Privacy, and Technology Project at the ACLU, told theGrio. 

Consider the data.

The Gender Shades Project looked at more than 1,270 images representing different genders and skin types. According to its report, the project found that the technology it studied produced higher facial recognition error rates for women with darker skin.

The Center for Strategic and International Studies notes the technology does very well in ideal conditions like good lighting and picture clarity. The center said that absent ideal conditions, though, the error rate in one instance jumped from 0.1% to 9.3%. 

Then there’s this, from Detroit Police Chief James Craig, whose officers arrested and held a Black man, Robert Williams, for 30 hours based on faulty facial recognition technology. In a statement to the Detroit Free Press, Williams said he was coming home from work when police arrested him outside his home and in front of his family. 

Prosecutors dropped the case, and Williams has since sued Detroit police.

“If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify,” Vice News quoted Craig as saying in a public meeting.

Craig had previously told the Detroit Free Press the arrest was the result of “shoddy investigative work” and said he would personally apologize.

Wessler’s organization, the ACLU, has been one of the more active groups advocating against the use of facial recognition technology. Here’s his conversation with theGrio, edited for brevity and clarity.

Why is the ACLU concerned about the use of facial recognition technology?

We know that this technology isn’t perfect. It’s machine learning algorithms that make a computerized guess about whether one photo matches some other set of photos. This technology is not designed, even in the best of circumstances, to give a perfectly accurate match. But we also know that this technology fails more often, and sometimes markedly more often, when used on people of color and people with darker skin. Even if we were talking about technology that somehow became 100 percent accurate all the time, that wouldn’t satisfy our concerns because it raises the specter of perfect government surveillance that we have never known in this country, and that, in fact, our democracy has never seen.
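To make that “computerized guess” concrete: most face recognition systems reduce each photo to a vector of numbers, often called an embedding, and declare two photos a match when the vectors are similar enough to clear a threshold. The sketch below is purely illustrative; the embedding size, threshold, and numbers are hypothetical and are not drawn from any system mentioned in this article.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face embeddings; closer to 1.0 means more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 128-number embeddings that a face-encoding model might produce.
rng = np.random.default_rng(seed=0)
probe = rng.normal(size=128)                         # photo of the unknown person
candidate = probe + rng.normal(scale=0.3, size=128)  # a photo already in a database

THRESHOLD = 0.8  # arbitrary cutoff; real systems tune this, trading missed matches for false ones
score = cosine_similarity(probe, candidate)
print(f"similarity={score:.2f} ->", "possible match" if score >= THRESHOLD else "no match")
```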

Robert Williams has sued Detroit Police after a false facial recognition match led to him being wrongfully identified and subsequently arrested as a shoplifting suspect. (ACLU)

It raises the possibility that police can hook up facial recognition technology to a network of surveillance cameras across our cities and identify and follow us as we go about our daily business, figuring out instantaneously who each of us is, where we’re going, where we’ve been. And that’s a kind of government surveillance we have just never lived with in this country, we’ve never accepted in this country, and we’re really concerned about it.

Why is facial recognition technology especially problematic among people of color?

All of the accuracy tests of this technology have shown racial disparities in false-match rates. In other words, this technology results in false identification of darker-skinned people, particularly Black people, much more than it does with white people. There are a couple of reasons for that. These are machine-learning algorithms that learn how to distinguish and match faces based on processing huge sets of training data, which are composed of tons of pairs of two different photos of the same person. But those training data sets have historically been very disproportionately composed of white people, and particularly white men, which means that these algorithms got relatively good at identifying white men and matching photos of white men, but relatively bad at identifying and matching photos of people of color and of women. This technology also has difficulties because of the color contrast settings in modern digital camera technology that are optimized for lighter-skinned faces.
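The disparity Wessler describes is what the published audits measure: the same system’s false-match rate, computed separately for each demographic group. As a rough illustration of how that reporting works (the counts below are invented for the example, not figures from Gender Shades or any vendor test):

```python
# Illustrative only: report a false-match rate per group, the way accuracy
# audits do. These counts are invented; they are not real audit results.
comparisons = {
    # group: (non-matching photo pairs tested, pairs wrongly declared a match)
    "lighter-skinned men":  (10_000, 8),
    "darker-skinned women": (10_000, 90),
}

for group, (pairs_tested, false_matches) in comparisons.items():
    rate = false_matches / pairs_tested
    print(f"{group}: false-match rate = {rate:.2%}")

# A system that looks accurate "on average" can still be roughly ten times
# worse for one group, which is the pattern the audits keep finding.
```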

On top of that, you have the problem of this being used by a policing system that has disproportionately targeted communities of color. If they feed in an image of an unknown Black suspect [and] match it against a database that is disproportionately made up of Black people’s faces, then the algorithm … is likely to identify a possible match, and that match is likely to be false. The likelihood of a false match is increased because of the overrepresentation and the technological problems. So you have technical problems and racism and policing problems … you match them together, and you get a recipe for really damaging effects.
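The compounding effect Wessler points to can be shown with basic probability: searching one probe photo against a large database gives the system many chances to return a false hit. The numbers below are hypothetical, chosen only to illustrate the arithmetic; they do not describe Detroit’s or any other agency’s system.

```python
# Hypothetical numbers, for illustration of the arithmetic only.
per_comparison_false_match = 0.0001  # 0.01% chance a single comparison is a false hit
database_size = 500_000              # photos the probe image is searched against

# Probability that at least one of the N comparisons produces a false match:
# P = 1 - (1 - p)^N
p_at_least_one = 1 - (1 - per_comparison_false_match) ** database_size
print(f"Chance the search returns at least one false match: {p_at_least_one:.1%}")

# With inputs like these, the search nearly always surfaces some false candidate,
# and if the database over-represents one community, that community absorbs
# most of the resulting false hits.
```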

There’s concern about how police might deploy the technology during a protest, correct?

That’s tremendously concerning to us. The right to protest is protected by our constitution for a reason. It’s because you cannot have a functioning democracy if people are chilled from going out in the streets and making their voices heard. You know sometimes that is just the last, most effective option available to people to force our elected leaders to listen to us. And in moments of great threat to our democratic system and a serious, regressive turn in some of our policies and lawmaking in the country, the right to protest is all the more important. And you know the prospect of police being able to instantaneously create a perfect list of everyone who attended a protest and then do who knows what with it, is just incredibly chilling, and I think particularly chilling to members of communities that are already over-policed.

