In February 2019, Nijeer Parks was accused of shoplifting candy and trying to hit a police officer with a car at a Hampton Inn in Woodbridge, N.J. He had been identified by police using facial recognition software, even though he was 30 miles away at the time of the incident.
Mr. Parks spent 10 days in jail and paid around $5,000 to defend himself. In November 2019, the case was dismissed for lack of evidence.
Mr. Parks, 33, is now suing the police, the prosecutor and the city of Woodbridge for false arrest, false imprisonment and violation of his civil rights.
He is the third person known to be falsely arrested based on a bad facial recognition match. In all three cases, the people mistakenly identified by the technology have been Black men.
Facial recognition technology is known to have flaws. In 2019, a federal study of more than 100 facial recognition algorithms found that they falsely matched Black and Asian faces far more often than white ones. Two other Black men — Robert Williams and Michael Oliver, both of whom live in the Detroit, Mich., area — were also arrested for crimes they did not commit based on bad facial recognition matches. Like Mr. Parks, Mr. Oliver filed a lawsuit against the city over the wrongful arrest.
Nathan Freed Wessler, an attorney with the American Civil Liberties Union who believes that police should stop using face recognition technology, said the three cases demonstrate “how this technology disproportionately harms the Black community.”
Americans like to rely on science, statistics and technology as supposedly neutral arbiters that will make decisions better than people do. But because these systems are designed by people who are racist, or who live in a racist society, the systems inherit that racism, even when their designers do not identify as racist. All of them are part and parcel of a society that engenders inequality in every part of life. Another recent example is the algorithm that allowed Stanford hospital executives to be vaccinated before the front-line workers in its hospitals. When you default to an algorithm, or numbers, or science, or technology as the sole arbiter of decisions, you aren't creating a better way to run society. You are simply finding a way to avoid blame for the inequalities that inevitably result.