The Problem of Racial Profiling and Facial Recognition

In theory, biometrics should be free of the idiocy of racism. A fingerprint cannot tell you how much melanin someone’s skin has. Ditto an iris, a voice pattern, etc. Unfortunately, there is the problem of facial recognition.

As Joy Buolamwini of MIT and Timnit Gebru of Microsoft found in their research on three commercial facial-analysis systems: “Darker-skinned females are the most misclassified group (with error rates of up to 34.7%). The maximum error rate for lighter-skinned males is 0.8%.”

Their research has, correctly, rattled some of the biggest companies working on facial recognition software.[1]

How did this happen?


These systems are so good at identifying lighter-skinned males because the data sets used in their training contained a large number of light-skinned males, a result of the programmers’ own biases. This could have been unintentional: The development teams may see and understand the world as being a place primarily of light-skinned males because that is all they are exposed to. Further, owing to the very well-documented lack of diversity in technology – from studying to teaching to industry – there was likely no one else in the organization who would quickly notice the problem. Whether intentional or not, this bias is now putting people at even greater risk of discrimination because of the color of their skin. The risk, in this case, is the destruction of their lives.
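The mechanism here is simple enough to audit: if a training set over-represents one group, the model will be better at recognizing that group. A minimal sketch of such an audit, assuming a hypothetical list of demographic labels attached to training images (the label names are illustrative, not from any real data set):

```python
from collections import Counter

def composition(labels):
    """Return the fraction of training examples carrying each demographic label.

    A heavily skewed result is an early warning that per-group error
    rates will likely be skewed as well.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

# Hypothetical training set: 80% lighter-skinned, 20% darker-skinned subjects.
labels = ["lighter"] * 80 + ["darker"] * 20
print(composition(labels))  # {'lighter': 0.8, 'darker': 0.2}
```

A check like this costs almost nothing to run before training, which is part of why the omission is hard to excuse.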

Law enforcement agencies see facial recognition systems as a powerful tool to fight crime. To that end, they are deploying them in areas they deem “high risk” – which are disproportionately areas where darker-skinned people live.

In the words of Rep. Elijah Cummings (D-MD), “If you’re black, you’re more likely to be subjected to this technology and the technology is more likely to be wrong. That’s a hell of a combination.”

This is why Brian Brackeen, CEO of facial recognition software developer Kairos, says these systems should not be used by the police at all.

“To be truly effective, the algorithms powering facial recognition software require a massive amount of information,” he writes. “The more images of people of color it sees, the more likely it is to properly identify them. The problem is, existing software has not been exposed to enough images of people of color to be confidently relied upon to identify them. And misidentification could lead to wrongful conviction, or far worse.”

The United States has the highest number of people in prison (2,193,798) of any nation in the world. The next highest is China (1,548,498), whose population is four times that of the U.S. In the U.S., blacks make up 12.3% of the general population but 33% of the prison population. Whites, on the other hand, make up 64% of the general population and only 20% of the prison population. Some would argue this is more a result of poverty than racism. In 2016, 8.8 percent of the white population lived at or below the poverty line ($24,563 for a family of four), compared with 22 percent of the black population, according to the U.S. Census Bureau.

However, the persistence of that poverty cannot be separated from race.[2] For example, in 2017 the median household net worth of whites in the Greater Boston Area was $247,500; for blacks it was $8.[3] This is in large part because of differences in the rate of homeownership. According to a study by the Federal Reserve, close to 80% of whites in the Boston area own a home, compared to only one-third of blacks.

This difference is the result of explicit government and financial institution policies which began in 1934 and continue to this day. When the Federal Housing Administration was founded in 1934, it refused to “insure mortgages in and near African-American neighborhoods — a policy known as ‘redlining.’ At the same time, the FHA was subsidizing builders who were mass-producing entire subdivisions for whites — with the requirement that none of the homes be sold to African-Americans.”[4] In 1968 Congress passed the Fair Housing Act to stop this. Unfortunately, since then redlining has been effectively privatized. In 2018 a study by the Center for Investigative Reporting found that, “in 61 metro areas across the U.S., people of color were more likely to be denied conventional mortgage loans than whites, even when controlling for applicants’ income, loan amount, and neighborhood.”

Facial recognition technology’s false-acceptance and false-rejection rates across race and gender are so high that it is difficult to see how any responsible company could sell such a flawed system to a law enforcement agency.
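These two failure modes are measurable, and measuring them per demographic group is exactly how disparities like the ones Buolamwini and Gebru reported come to light. A minimal sketch, assuming hypothetical verification trials recorded as (group, whether the pair really is the same person, whether the system accepted the match) – all names here are illustrative:

```python
from collections import defaultdict

def error_rates_by_group(trials):
    """Compute false-accept and false-reject rates per demographic group.

    trials: iterable of (group, same_person: bool, accepted: bool).
    False accept  = system accepted an impostor pair.
    False reject  = system rejected a genuine pair.
    """
    counts = defaultdict(lambda: {"fa": 0, "impostor": 0, "fr": 0, "genuine": 0})
    for group, same_person, accepted in trials:
        c = counts[group]
        if same_person:
            c["genuine"] += 1
            if not accepted:
                c["fr"] += 1
        else:
            c["impostor"] += 1
            if accepted:
                c["fa"] += 1
    return {
        g: {
            "false_accept_rate": c["fa"] / c["impostor"] if c["impostor"] else 0.0,
            "false_reject_rate": c["fr"] / c["genuine"] if c["genuine"] else 0.0,
        }
        for g, c in counts.items()
    }

# Hypothetical trials for one group: 2 false rejects in 100 genuine pairs,
# 1 false accept in 100 impostor pairs.
trials = ([("A", True, True)] * 98 + [("A", True, False)] * 2 +
          [("A", False, False)] * 99 + [("A", False, True)] * 1)
print(error_rates_by_group(trials))
```

Running this breakdown per group, rather than reporting a single aggregate accuracy figure, is what exposes the kind of gap the Gender Shades study found.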

A recent Gizmodo headline sums up the issues all too well: “Study Finds Predictive Policing No More Racist Than Regular Policing.” The tech industry cannot settle for creating things that merely allow us to make the same mistakes more efficiently.[5]

1. When was the last time you heard a tech company call for government regulation? When was the first?
2. See Stamped from the Beginning: The Definitive History of Racist Ideas in America by Ibram X. Kendi, Nation Books, 2017; and “The Case for Reparations” by Ta-Nehisi Coates, The Atlantic, June 2014
3. The discrepancy is so lopsided The Boston Globe published an article with the headline “That was no typo: The median net worth of black Bostonians really is $8.”
4. See The Color of Law by Richard Rothstein, Liveright, 2017
5. See Facial Recognition Systems Won’t Stop School Shootings



