A British police agency is defending its use of facial recognition technology at several events, including the June 2017 Champions League soccer final in Cardiff, Wales, saying that despite the system having a 92-percent false positive rate, “no one” has ever been arrested due to such an error.
New data about the South Wales Police’s use of the technology, obtained through a public records request, shows that of the 2,470 alerts from the facial recognition system, 2,297 were false positives. In other words, roughly nine out of 10 times, the system erroneously flagged someone as being suspicious or worthy of arrest.
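A quick back-of-the-envelope check of the reported figures bears out that ratio (the variable names here are illustrative, not from any official release):

```python
# Figures from the South Wales Police data obtained via public records request
total_alerts = 2470
false_positives = 2297

# Share of alerts that were false positives
false_positive_rate = false_positives / total_alerts
print(f"{false_positive_rate:.1%}")  # prints "93.0%"

# Remaining alerts that flagged an actual match
true_positives = total_alerts - false_positives
print(true_positives)  # prints 173
```

The raw figures work out to roughly 93 percent, which is consistent with the approximately 92-percent rate cited above.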
In a public statement, the SWP said that it has arrested “over 450” people as a result of its facial recognition efforts over the last nine months.
“Of course, no facial recognition system is 100 percent accurate under all conditions. Technical issues are normal to all face recognition systems, which means false positives will continue to be a common problem for the foreseeable future,” the police wrote. “However, since we introduced the facial recognition technology, no individual has been arrested where a false positive alert has led to an intervention and no members of the public have complained.”
The agency added that it is “very cognizant of concerns about privacy, and we have built in checks and balances into our methodology to make sure our approach is justified and balanced.”
However, Big Brother Watch, a London-based advocacy group, is unsatisfied with the police’s response:
Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool.
Big Brother Watch has more to come on this, soon. Watch this space! https://t.co/9i058Y5pP6
— Big Brother Watch (@bbw1984) May 4, 2018