Racism in Facial Recognition Information Session
The CPC hosted an information session with Isedua Oribhabor about racism and the dangers of facial recognition surveillance.
On January 9, 2020, Robert Williams, a black man from the suburbs of Detroit, was arrested in his driveway in front of his wife and children for a crime he did not commit. Months earlier, another man had stolen thousands of dollars’ worth of watches from a store in Midtown Detroit. Facial recognition had been employed by the police department for two years without residents’ knowledge, and police used it in an attempt to find the perpetrator of the theft. It told them Williams was their man. He spent 18 hours in custody before being interrogated, during which he disputed the evidence against him. He paid his $1,000 bond and was released, but had to retain an attorney to ultimately clear himself of the wrongful arrest.
The evidence is overwhelming: government surveillance disproportionately targets communities of color, compounding the already well-documented racial bias in policing. Surveillance by way of facial recognition technology is no different.
New surveillance technologies often rely on historical policing data for their algorithms, making them just as biased as past policing. In May 2016, ProPublica reported that a computer program labeled black defendants far more likely to reoffend than white defendants. Biased data is not the only problem: programs also tend to have the biases of their creators built into them, and facial recognition is especially prone to this. A 2019 federal study found that African American and Asian individuals are up to 100 times more likely to be misidentified by facial recognition than white individuals.
After the death of Freddie Gray at the hands of the Baltimore Police Department, activists took to the streets to protest the brutality. Baltimore Police responded by using facial recognition to identify protesters who were exercising a Constitutionally protected right. It is likely the same was done in cities across the country this summer in the wake of the deaths of George Floyd and Breonna Taylor.
Because of the problems outlined above with facial recognition technology, many cities, including several right here in the Bay Area, have banned its use by law enforcement. Fight for the Future maintains a complete list of those cities and the dates their bans were passed.
Portland, OR’s ban is unique because it not only prohibits the technology’s use by the government but also prevents its use by private corporations.
Other cities, such as Baltimore, MD and Minneapolis, MN, are also considering bans. There is no reason every city in Santa Clara County shouldn’t consider one as well.
In the wake of the deaths of Ahmaud Arbery, Breonna Taylor, and George Floyd, several tech companies that develop facial recognition software ceased their efforts, at least one of them for good. Among those companies are Amazon, Microsoft, and IBM. IBM’s statement was by far the most passionate, declaring that the company strongly opposes the use of the technology for “mass surveillance, racial profiling, violations of basic human rights and freedoms” and calling on Congress to pass strong police reform. To back up its statement, IBM halted development and sales of its facial recognition technology indefinitely.
Microsoft has been calling for regulation of facial recognition technology since 2018, and in June it committed to not selling the technology to law enforcement until stronger regulation is in place.
Amazon merely committed to not selling to law enforcement for one year.
In June of 2020, Senators Ed Markey and Jeff Merkley and Representatives Ayanna Pressley and Pramila Jayapal proposed the Facial Recognition and Biometric Technology Moratorium Act, a bill that would completely ban the use of facial recognition surveillance by law enforcement in the United States and has since gained the support of Senators Warren, Wyden, and Sanders.
Contact your legislators today and express your strong support for the Facial Recognition and Biometric Technology Moratorium Act (H.R. 7235)!