Microsoft Refused to Sell Its Facial Recognition Tech to California Police


Microsoft refused a contract under which it would have installed its facial recognition technology in the cars and body cameras of California police officers, Reuters reported this week. The company reportedly worried that the California law enforcement agency, which wasn't named, would use the technology in a way that would violate human rights.

Facial Recognition & Human Rights Abuse

Unlike some other companies in the technology industry, Microsoft worried that its surveillance technology would lead to women and minorities being disproportionately held for questioning, because the police database the facial recognition AI would be trained on contained mostly faces of white males. In general, the more data used to train a machine learning system, the more accurate it becomes, though even then 100% accuracy is not guaranteed. Accuracy degrades much further when the training data lacks sufficient examples of the people the system is meant to recognize.
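The effect can be sketched with a toy example (entirely hypothetical data, not Microsoft's system): faces are reduced to embedding vectors and matched to the nearest known identity. When one group is well represented in the training set and another is not, the underrepresented group's faces are more likely to be matched to the wrong identity.

```python
# Toy sketch: a 1-nearest-neighbor "face matcher" over 2-D embeddings.
# Group A identities have several training samples each; group B has one.
import math

def nearest_neighbor(train, query):
    """Return the identity label of the closest training embedding."""
    return min(train, key=lambda p: math.dist(p[0], query))[1]

train = [
    ((0.0, 0.0), "A1"), ((0.1, 0.0), "A1"), ((0.0, 0.1), "A1"),
    ((1.0, 1.0), "A2"), ((1.1, 1.0), "A2"), ((1.0, 1.1), "A2"),
    ((5.0, 5.0), "B1"),  # single, unrepresentative sample for group B
]

# Group A queries land near their dense clusters and match correctly;
# the group B query drifts (lighting, pose) and is misattributed to A2.
print(nearest_neighbor(train, (0.05, 0.05)))  # A1 (correct)
print(nearest_neighbor(train, (1.05, 1.05)))  # A2 (correct)
print(nearest_neighbor(train, (2.8, 2.8)))    # A2 (wrong; truth is B1)
```

In a real deployment the same failure mode means the sparsely represented group is both misidentified more often and, as the article notes, disproportionately stopped for questioning.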

Speaking at a Stanford University conference on “human-centered artificial intelligence,” Microsoft president Brad Smith, who’s been leading the company in a more pro-human rights direction over the past few years, said that the California police wanted to run a face scan of everyone they stopped. The company declined the contract and told the police representatives that this technology is not the answer.

At the conference, Smith added that Microsoft also rejected an offer to install its facial recognition systems in the capital city of an unnamed country that the nonprofit Freedom House had deemed not free. According to Reuters, Smith said the technology would have suppressed freedom of assembly there.

Smith also said at the conference that companies need a commitment to human rights, which he believes is increasingly critical as rapid technological advances allow governments to conduct blanket surveillance, deploy autonomous killer robots and take other steps that may be difficult, if not impossible, to reverse.