Security

London Police’s Facial Recognition System Is Wrong 81% of the Time


Facial recognition technology is a subject that has made a lot of people angry, especially since it started to be deployed at borders and in airports. Critics have voiced concerns not only about privacy but also about how unreliable the systems actually are.

According to a new report commissioned by Scotland Yard, the facial recognition system currently used by the United Kingdom’s Metropolitan Police has mistakenly flagged innocent people as suspects in four out of five cases.

The report, obtained by Sky News and The Guardian, raises serious concerns about how the technology is being used and “calls for the facial recognition programme to be halted.”

The technology was used for the first time in 2016, during the Notting Hill Carnival. Since then, it has been trialled at 10 locations, including Leicester Square, Westfield Stratford, and Whitehall during the 2017 Remembrance Sunday commemorations.

Professor Pete Fussey and Dr Daragh Murray dug through the data from all these locations and found that out of 42 suspect matches, only 8 were correct. That is a staggering 81% error rate.
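As a quick sanity check on the arithmetic, the 81% figure follows directly from the two numbers in the report (the variable names below are illustrative, not from the report itself):

```python
# Figures from Fussey and Murray's review of the Met's trials
total_matches = 42    # suspect matches flagged by the system
correct_matches = 8   # matches later verified as correct

false_matches = total_matches - correct_matches  # 34 wrongful matches
error_rate = false_matches / total_matches       # fraction of matches that were wrong

print(f"{false_matches} of {total_matches} matches were wrong ({error_rate:.0%} error rate)")
# prints: 34 of 42 matches were wrong (81% error rate)
```

In other words, 34 of the 42 people flagged were innocent, which rounds to the 81% error rate cited in the report.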

“Our report conducted a detailed, academic, legal analysis of the documentation the Met Police used as a basis for the face recognition trials,” Prof. Fussey said. “There are some shortcomings and if [the Metropolitan Police] was taken to court there is a good chance that would be successfully challenged.”

The Metropolitan Police’s deputy assistant commissioner, Duncan Bell, said the following about the results of the report:

“We are extremely disappointed with the negative and unbalanced tone of this report… We have a legal basis for this pilot period and have taken legal advice throughout. We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer.”

That is not the only problem with the system: whenever live facial recognition is used, everyone within range of the cameras is placed under overt surveillance. The report’s co-authors considered this an operational problem, even though the Metropolitan Police notified the public about the trials via signs and tweets.

However, the researchers concluded that this process did not truly gain “meaningful consent”. In addition, a BBC documentary detailed an incident in which a man who refused to take part in a facial recognition trial was fined for it.

“Treating LFR camera avoidance as suspicious behaviour undermines the premise of informed consent,” Prof. Fussey added. “The arrest of LFR camera-avoiding individuals for more minor offences than those used to justify the test deployments raises clear issues regarding the extension of police powers and of ‘surveillance creep’.”

Big Brother Watch, an anti-surveillance campaign group, is currently challenging the Metropolitan Police’s use of the technology: “The only question now is when is the Met finally going to commit to stop using facial recognition.”
