I recently spoke at the federal biometric conference in Tampa. For that reason, a story out of Detroit stood out. Rep. Rashida Tlaib (D-Mich.) called upon Detroit Chief of Police James Craig to hire only Black people as analysts to run the city's facial recognition program. Not only would such hiring violate anti-discrimination laws, but it rests on a racially questionable assumption. The Police Chief called the suggestion "racist."
During a demonstration of the technology, Tlaib declared, "Analysts need to be African-Americans, not people that are not. It's true, I think non-African-Americans think African-Americans all look the same!" She defended the statement by noting that people confuse Reps. John Lewis (D-Ga.) and Elijah Cummings (D-Md.).
Craig showed better judgment, responding: "I trust people who are trained, regardless of race, regardless of gender."
It is true that early testing of FRT found a higher error rate for African-American females. However, the top-performing programs are now reaching above 99 percent accuracy. An MIT study documenting the error rate shook up the industry and resulted in better algorithms and training protocols.
The comments also reflected a fundamental misunderstanding of FRT. The point of FRT is that the algorithm makes the identification. Training concerns issues like lighting, framing, and operation. It is a contradiction in terms to have the operator make the identification.
The racial stereotypes and dated information reflected in Tlaib's comments are embarrassing. The error rate found by the MIT study is discussed in my forthcoming two-part study on privacy and biometrics. It is important for the industry to remain focused on this issue and to continue to address any racial disparities in error rates. However, great strides have occurred in the technology.
