Facial recognition research from Joy Buolamwini, a Ph.D. candidate at the MIT Media Lab, is the subject of the documentary "Coded Bias" produced and directed by Shalini Kantayya. (Courtesy of Independent Lens)

Recent research conducted by Scientific American online supported fears that facial recognition technology (FRT) can worsen racial inequities in policing. The research found that law enforcement agencies that use automated facial recognition disproportionately arrest Black people. 

The report’s authors stated that they believe these results stem from factors that include the “lack of Black faces in the algorithms’ training data sets, a belief that these programs are infallible and a tendency of officers’ own biases to magnify these issues.”

FRT was again cast in a negative light after the arrest of a 61-year-old grandfather, who is now suing Sunglass Hut’s parent company after the store’s facial recognition technology mistakenly identified him as a robber. Harvey Eugene Murphy Jr. was subsequently held in jail, where he says he was sexually assaulted, according to a lawsuit.

The robbery occurred at a Sunglass Hut store in Houston, where two gun-wielding bandits stole thousands of dollars in cash and merchandise. Houston police identified Murphy as a suspect – even though he lived in California. They arrested Murphy when he returned to Texas to renew his driver’s license. His lawsuit claims that, while in jail, he was sexually assaulted by three men in a bathroom, causing him to suffer lifelong injuries.

The Harris County District Attorney’s office later determined Murphy was not involved in the robbery – but the damage was already done while he was in jail, his lawyers said in a news release.

“This is precisely the kind of situation we’ve been warning about for years; that these systems, whatever their theoretical reliability, are in practice so finicky, and so consequential, that they cannot be fixed,” Os Keyes, an Ada Lovelace Fellow and Ph.D. candidate at the University of Washington, told Vice News.

“The only thing I’d push back on is Murphy’s lawyer’s claim that it could happen to anyone; these systems are attractive precisely because they promise to automate and speed up ‘business as usual,’ which includes laundering existing police biases against people who are already in the system, minority groups, and anyone else who doesn’t fit. This outcome is as inevitable as it is horrifying and should be taken as a sign to restrict and reconfigure policing in general as well as FRT in particular.”

Scientific American researchers noted that the algorithms used by law enforcement “are typically developed by companies like Amazon, Clearview AI and Microsoft, which build their systems for different environments.” They argued that, despite massive improvements in deep-learning techniques, federal testing shows that most facial recognition algorithms perform poorly at identifying people besides white men.

In 2023, the Federal Trade Commission prohibited Rite Aid from using FRT after the company wrongly accused individuals of shoplifting. CBS News noted that, in one incident, an 11-year-old girl was stopped and searched by a Rite Aid employee based on a false match. Also last year, the Detroit Police Department was sued by a woman whom their technology misidentified as a carjacking suspect. Eight months pregnant at the time, Porcha Woodruff was jailed after police incorrectly identified her using FRT.

The FTC acknowledged that people of color are often misidentified when using FRT. “Disproportionate representation of white males in training images produces skewed algorithms because Black people are overrepresented in mugshot databases and other image repositories commonly used by law enforcement,” Scientific American researchers determined. “Consequently, AI is more likely to mark Black faces as criminal, leading to the targeting and arresting of innocent Black people.

“We believe that the companies that make these products need to take staff and image diversity into account. However, this does not remove law enforcement’s responsibility. Police forces must critically examine their methods if we want to keep this technology from worsening racial disparities and leading to rights violations.”

Stacy M. Brown is a senior writer for The Washington Informer and the senior national correspondent for the Black Press of America. Stacy has more than 25 years of journalism experience and has authored...
