UK Police's Facial Recognition Systems Are Wrong Up To 98% Of The Time
Big Brother Watch (BBW), a UK civil rights organization that “works to roll back the surveillance state,” released a report revealing that the UK Metropolitan Police’s experimental facial recognition system is wrong 98% of the time, making it virtually useless.
AI And False Positive Rates
Over the past few years, we’ve seen all sorts of companies claim that their artificial intelligence (AI) systems achieve incredibly high accuracy, whether for facial recognition, content filtering, stopping malware, or what have you. What they usually don’t say is that they test those systems against pre-selected data points to achieve such high success rates.
For instance, the developer of a content filtering AI system may claim that it can identify a high percentage of terrorist content on the web, but only when it already knows that the content being analyzed is terrorist content. When the system has to pick terrorist content out of many other kinds of content, it may not only achieve a low success rate, but it may also misidentify benign content as terrorist content, thus producing a high false positive rate, too.
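To make that distinction concrete, here is a minimal sketch in Python; the counts are hypothetical and not drawn from any real system. It shows how a classifier can post high recall when tested only against known positives, yet still produce alerts that are almost all wrong once it runs against mostly benign real-world data:

```python
# Hypothetical counts for an illustrative content classifier.
true_positives = 94       # known-positive items correctly flagged
false_negatives = 6       # known-positive items the system missed
false_positives = 4_900   # benign items wrongly flagged in deployment
true_negatives = 95_000   # benign items correctly passed in deployment

# Tested only against pre-selected positives, the system looks great:
recall = true_positives / (true_positives + false_negatives)

# Its raw false positive rate sounds tolerable in isolation:
false_positive_rate = false_positives / (false_positives + true_negatives)

# But because real traffic is overwhelmingly benign, almost every alert is wrong:
wrong_alert_share = false_positives / (false_positives + true_positives)

print(f"recall: {recall:.0%}")                          # 94%
print(f"false positive rate: {false_positive_rate:.0%}")  # ~5%
print(f"alerts that are wrong: {wrong_alert_share:.0%}")  # ~98%
```

The gap comes from the base rate: when genuine positives are rare, even a modest false positive rate swamps the true matches.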
Police's Automated Facial Recognition System
This is basically what happened with the Met Police’s facial recognition system, too. Through 50 freedom of information requests, BBW discovered that, on average, a staggering 95% of the facial recognition systems’ “matches” wrongly identified innocent people.
The Met Police’s facial recognition system had the worst track record, with only 2% matching accuracy: 98% of its matches wrongly identified people. The system correctly recognized only two people, neither of whom was a wanted criminal. One had been incorrectly placed on the watch list, and the other was on a mental health-related watch list. The force made no arrests using the automated facial recognition system.
The South Wales Police’s facial recognition system wrongly identified 91% of its matches. The system led to 15 arrests, or 0.005% of the total matches. Thirty-one innocent people who were wrongly identified were asked to prove their identity, and thus their innocence. The biometric data of all 2,451 people whose faces were automatically analyzed was stored for 12 months, under a policy that came into place with the Investigatory Powers Act (the so-called Snooper’s Charter) and that is likely to be unlawful. The majority of the people whose faces were automatically scanned were also not notified that the police system had “matched” them as targets.
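As a sanity check, the headline percentages reduce to simple arithmetic over match counts. A hedged sketch follows; the counts below are our assumptions, chosen only to be consistent with the rates quoted above, and BBW’s report remains the authoritative source for the exact figures:

```python
def share_of_matches_wrong(true_matches: int, false_matches: int) -> float:
    """Fraction of a system's 'matches' that pointed at the wrong person."""
    return false_matches / (true_matches + false_matches)

# Assumed counts, consistent with the percentages reported above.
print(f"Met Police:         {share_of_matches_wrong(2, 98):.0%} wrong")      # 98%
print(f"South Wales Police: {share_of_matches_wrong(234, 2_451):.0%} wrong")  # 91%
```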
BBW Calls On UK Authorities To Stop Using Automated Facial Recognition Systems
Currently, there is no legislation in the UK that regulates the use of facial recognition systems through CCTV cameras by the police, nor is there any independent oversight for the police’s use of these systems.
BBW believes that the live automated facial recognition cameras that enable biometric identification checkpoints violate the Human Rights Act 1998. Article 8 of the Act protects the right to private life and requires that any interference with that right be both necessary and proportionate.
BBW claimed that the live automated facial recognition cameras fail both of these tests:
It is plainly disproportionate to deploy a technology by which the face of every passer-by is analysed, mapped and their identity checked. Furthermore, a facial recognition match can result in an individual being stopped in the street by the police and asked to prove their identity and thus their innocence.
BBW also warned that surveillance technologies are developing so quickly that it has become difficult to track and legislate for them. Ultimately, the British people will need to decide whether they want to live in a world where they are continuously watched, intrusively surveilled, and biometrically tracked, and consider how that may affect their fundamental rights.
BBW called on the UK authorities to stop using these automated and highly inaccurate facial recognition systems until proper regulations and oversight are in place.
Christopher1: If these facial recognition mechanisms are that inaccurate, it might be time to say to the police, "Sorry, but you cannot use these anymore until they are more accurate in real-life situations!" I always thought that facial recognition was a big waste of time, because shadows are known to 'confuse' facial recognition systems.
HEXiT: They shouldn't be using them at all. The Snooper's Charter is an illegal surveillance bill that is way too vague in its scope. As for this AI, I wouldn't be surprised if they still insist on using it and try to justify its use at every opportunity, as long as it doesn't cost more than hiring real police to do real policing.
ElectrO_90: Remember, this is The Emperor's New Clothes. You can sell anyone anything, especially if the government is paying. The government always loses money on deals, and someone fleeced the UK Government by selling them a magic bean.
soonpaomeng: It's time to look at Artificial General Intelligence (AGI); without it, no Natural Language Understanding will succeed. The failures above are typical of deep learning AI, so it's time to go AGI @pao_meng. One UK adviser appears not to understand AGI, so the police and government are heading in the wrong direction. Please convey this message to UK intel.