A privacy group has slammed the police’s use of facial recognition systems at public events in the UK as “dangerous and inaccurate”.
An investigation by non-profit Big Brother Watch found that different forces had tested the technology at football matches, festivals and parades.
Cameras scanned faces at events including London’s Notting Hill carnival and last year’s Champions League final in Cardiff, matching them against existing police photographs, such as mugshots, in an effort to identify wanted criminals.
Big Brother Watch said that the Metropolitan Police’s technology wrongly flagged innocent people in 98 per cent of cases – 102 false matches in all – against just two correct identifications, neither of which was of a wanted criminal.
Across all UK forces, the false-match figure was a “staggering” 95 per cent, Big Brother Watch said.
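For readers who want to check the arithmetic, the short sketch below shows how a false-match rate like the 98 per cent figure falls out of the reported counts. It is illustrative only: the formula – false matches divided by all matches the system flagged – is an assumption about how Big Brother Watch calculated its rates, and false_match_rate is a hypothetical helper, not anything taken from the report itself.

    # Illustrative reconstruction of the false-match arithmetic.
    # Assumes rate = false matches / all matches flagged (an assumption,
    # not a method confirmed by the Big Brother Watch report).
    def false_match_rate(false_matches: int, correct_matches: int) -> float:
        return false_matches / (false_matches + correct_matches)

    # Metropolitan Police: 102 wrong identifications, 2 correct ones.
    print(f"{false_match_rate(102, 2):.0%}")  # -> 98%

    # South Wales Police figures reported below: 2,451 false alarms
    # out of 2,685 matches.
    print(f"{false_match_rate(2451, 2685 - 2451):.0%}")  # -> 91%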
“Facial recognition has long been feared as a feature of a future authoritarian society, with its potential to turn CCTV cameras into identity checkpoints, creating a world where citizens are intensively watched and tracked,” it said.
“However, facial recognition is now a reality in the UK – despite the lack of any legal basis or parliamentary scrutiny, and despite the significant concerns raised by rights and race equality groups.
“This new technology poses an unprecedented threat to citizens’ privacy and civil liberties, and could fundamentally undermine the rights we enjoy in public spaces.”
South Wales Police, whose system made 2,685 ‘matches’ between May 2017 and March 2018 – 2,451 of them false alarms, a false-match rate of roughly 91 per cent – said the technology had improved over time.
“When we first deployed and we were learning how to use it… some of the digital images we used weren’t of sufficient quality,” said deputy chief constable Richard Lewis.
“Because of the poor quality, it was identifying people wrongly. They weren’t able to get the detail from the picture.”
On the practical use of the tech, he continued: “The operator in the van is [sometimes] able to see that the person identified in the picture is clearly not the same person, and it’s literally disregarded at that point.
“On a much smaller number of occasions, officers went and spoke to the individual… realised it wasn’t them, and offered them the opportunity to come and see the van.
“At no time was anybody arrested wrongly, nobody’s liberty was taken away from them.”
Information Commissioner Elizabeth Denham outlined her own concerns in a blog post.
“There may be significant public safety benefits from using facial recognition technology — to enable the police to apprehend offenders and prevent crimes from occurring,” she said.
“But how FRT is used in public spaces can be particularly intrusive. It’s a real step change in the way law-abiding people are monitored as they go about their daily lives.
“There is a lack of transparency about its use and there is a real risk that the public safety benefits derived from the use of FRT will not be gained if public trust is not addressed.
“I have been deeply concerned about the absence of national level co-ordination in assessing the privacy risks and a comprehensive governance framework to oversee FRT deployment.”
Denham welcomed plans to establish an oversight panel for facial recognition technology, on which she will sit alongside the biometrics commissioner Alastair MacGregor QC and the surveillance camera commissioner Tony Porter. She also welcomed the appointment of a National Police Chiefs’ Council lead for the governance of the technology’s use in public spaces.