According to a report today from The Intercept and the Investigative Fund, IBM has been developing video surveillance software for law enforcement that can search people by skin color and ethnicity.
Post-9/11, IBM Bets on Video Surveillance
According to the report, IBM didn’t initially intend its video analytics software to be used for surveillance. After the 9/11 attacks, however, the company saw high demand from law enforcement agencies and police departments for advanced video analytics that could be integrated into public cameras to search for and find terrorist suspects.
According to The Intercept, in 2012 the NYPD gave IBM “secret access” to its city-wide video surveillance system, which the company used to develop new surveillance features, such as searching camera footage for images of people by hair color, facial hair and even skin tone.
The NYPD said that it used the skin tone feature only for testing purposes and didn’t deploy it to its camera system because it didn’t want the public to think it was racially profiling. The NYPD also said that giving IBM access to its camera system was required for the collaboration to work.
However, civil liberties advocates argued that New Yorkers should have been made aware that the real-time collection of physical data was being handed over to a private firm. This revelation comes at a time when New York City Mayor Bill de Blasio and the NYPD are fighting against a new city council bill that would require more transparency from the NYPD.
No Longer Just a Counter-Terrorism Tool
Although the original intent was to integrate IBM’s surveillance technology into the NYPD’s camera system to find terrorist suspects, the technology may have been used for everyday crimes too. The NYPD denied this, however, and Peter Donald, the department’s spokesperson, said that he’s not aware of any case where IBM’s technology was used in an arrest or prosecution.
Meanwhile, the campus police at California State University, Northridge, which also adopted IBM’s technology, admitted that they had been using it to track everyday criminals and even student protesters.
IBM’s Tech Tracks Ethnicity
Donald noted that sometime in 2016 or early 2017, IBM announced to the NYPD that it had developed version 2.0 of its video surveillance technology, which could now also track people by ethnicity and provide tags such as “Asian,” “Black,” or “White.” The NYPD said it “explicitly rejected that product” because of this feature.
However, Rick Kjeldsen, a former IBM researcher who worked on this software, said that NYPD’s statement is misleading because neither IBM nor any other company would have worked on such a feature if the NYPD hadn’t shown interest in it.
Racial Profiling Concerns
Civil liberties advocates are worried that IBM’s technology could be used for mass racial profiling. Rachel Levinson-Waldman, senior counsel at the Brennan Center’s Liberty and National Security Program, said:
“Whether or not the perpetrator is Muslim, the presumption is often that he or she is. It’s easy to imagine law enforcement jumping to a conclusion about the ethnic and religious identity of a suspect, hastily going to the database of stored videos and combing through it for anyone who meets that physical description and then calling people in for questioning on that basis.”
Clare Garvie, a law fellow at Georgetown Law’s Center on Privacy and Technology, also said that any form of real-time location tracking should raise Fourth Amendment issues. The Supreme Court has already ruled twice that warrantless location tracking is unconstitutional, and tracking where people are via city-wide camera surveillance systems could fall under those rulings.
According to Garvie, any form of “identity-based surveillance” may also violate the First Amendment because it could compromise people’s right to anonymous public speech and association.
Kjeldsen also told The Intercept that IBM’s development of this surveillance technology for the NYPD without the public’s consent sets a dangerous precedent.
“Are there certain activities that are nobody’s business no matter what? Are there certain places on the boundaries of public spaces that have an expectation of privacy? And then, how do we build tools to enforce that? That’s where we need the conversation. That’s exactly why knowledge of this should become more widely available — so that we can figure that out,” he said.