Don’t be surprised if you’re arrested next time you visit the UK.
Facial recognition technology trialed by the Metropolitan Police is reportedly 81 percent inaccurate.
According to a study by the University of Essex, four out of five people the system flags as wanted suspects are in fact innocent.
The researchers also conclude the system would likely be found unlawful if challenged in court.
To compile an independent report on the London police service’s testing, Peter Fussey and Daragh Murray were granted what the university called “unprecedented” access to six of the 10 trials, conducted between June 2018 and February 2019.
The pair joined officers in LFR control rooms and on the ground; they also attended briefing and debriefing sessions and planning meetings.
“This report was based on detailed engagement with the Metropolitan Police’s processes and practices surrounding the use of live facial recognition technology,” co-author Fussey said in a statement.
“It is appropriate that issues such as those relating to the use of LFR are subject to scrutiny, and the results of that scrutiny made public,” he added.
The researchers’ main concerns all seem pretty legitimate.
They argue that the Metropolitan Police failed to obtain “explicit legal authorization” in domestic law for the use of LFR, or to take into account factors such as the technology’s intrusive nature and its use of biometric processing.
Plus, there was “insufficient pre-test planning and conceptualization,” which led to “a number of issues” regarding consent, public legitimacy, and trust.
Across the six trials evaluated, the LFR technology made 42 matches, only eight of which the report’s authors can say with absolute certainty were correct.
That doesn’t instill a lot of confidence in me—someone living and working in the UK.
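For anyone curious where the headline figure comes from, a minimal sketch of the arithmetic is below, assuming the 81 percent rate is measured over the matches the system generated (not over everyone scanned):

```latex
% Error rate across the 42 matches from the six evaluated trials,
% assuming only the 8 verified matches count as correct.
\[
  \frac{42 - 8}{42} = \frac{34}{42} \approx 0.81 \quad \text{(roughly an 81 percent error rate)}
\]
```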
“Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police’s systems from the outset,” Murray said, “and was not an integral part of the process.”
In light of their findings, Murray and Fussey are calling for all live trials of LFR to be discontinued until these concerns are addressed.
The Metropolitan Police did not immediately respond to Geek’s request for comment.