“Big Brother is watching you” refers to government surveillance of people through listening devices and cameras. In a society dependent on technology, the watchful eyes of government and corporations have multiplied many times over.
Facial recognition is a form of surveillance examined in the documentary “Coded Bias,” which premieres on PBS Monday, March 22, as part of the series “Independent Lens.” (Check local listings.) The film, produced and directed by Shalini Kantayya, premiered at the 2020 Sundance Film Festival.
“Coded Bias” uncovers disturbing findings from research conducted by Joy Buolamwini, a Ph.D. candidate at the MIT Media Lab. While studying computer vision technology at the lab, Buolamwini made a startling discovery: the software that was supposed to detect her face was flawed. The facial recognition technology used in many security systems does not see dark-skinned faces accurately, and it also fails to recognize female faces.
“It did not work that well until I put on the white mask,” said Buolamwini as she demonstrated in the film. “When I put on the white mask, detected. When I took off the white mask, not so much.”
Perplexed, Buolamwini examined the data behind facial recognition systems and found it was based largely on white males, the people who dominate research in the tech field. Time and time again, she found, facial recognition technology makes mistakes in critical instances. In China, the U.K. and many U.S. cities, facial recognition cameras are used in pedestrian areas and as a form of entry into buildings. This is how the FBI and other law enforcement agencies identified those involved in the Jan. 6, 2021, insurrection.
The documentary shows how, simply by walking down a street, a person can be stopped by the police if their face appears with a red square around it. That square labels them a “target” and deems them suspicious. The designation happens so quickly because information about them is already in a database.
“We now have a silver bullet algorithm,” said Cathy O’Neil, an American mathematician and the author of the blog “mathbabe.org.” “We have to constantly monitor every process for bias.”
Silkie Carlo, director of Big Brother Watch in the UK, monitored a trial of facial recognition technology used by the UK police. She found that 98 percent of the matches were wrong, incorrectly flagging faces as dangerous or criminal. That means anyone targeted may end up in the UK police database incorrectly; the database works like fingerprints kept on file. Carlo got the interest of Baroness Jenny Jones, a member of the UK Parliament, and the two of them confronted the police after a pedestrian was “profiled” by facial recognition and given a fine. In a surprise revealed later in the documentary, Jones learned that she herself was in a “dangerous” file.
Buolamwini’s research resonates globally but has gained particular traction in the U.S. The landlord of a Brooklyn high-rise, a predominantly Black-occupied apartment building that already has security cameras at the entrance and in the halls, filed an application to change its security system from a key fob to facial recognition. Residents see the proposed system as a form of harassment, and Buolamwini worked with them to lodge a formal complaint.
“There is this old saw in science fiction that says the future is already here. It’s just not evenly distributed,” said Virginia Eubanks, Ph.D., author of “Automating Inequality.” “Rich people get the fancy tools first, then it goes last to the poor. But in fact, what I found is the absolute opposite. The most punitive, the most invasive, most surveillance-focused tools that we have go into poor and working communities first.”
Artificial intelligence (AI) influences everything we do. It shapes what you see in your social media feeds and the results you get when searching the web.
Buolamwini formed the Algorithmic Justice League (ajl.org) to research and campaign against the ways algorithms can be harmful. The mostly female group has been examining bias in the human resources profession, where résumé screening and online application systems can be deeply biased. Buolamwini’s research in this area was picked up in a New York Times article showing that Amazon’s hiring system overwhelmingly turned down women applicants.
The on-camera interviews in the documentary repeatedly underscore that AI is built on research data formulated largely by white males. Buolamwini and the Algorithmic Justice League are determined to bring forward new research to change the system.
“Your view of the world is being governed by artificial intelligence,” said Buolamwini. “Our faces may be the final frontier of privacy.”