A legal challenge ‘first’ to a UK police force’s use of automated facial recognition (AFR) technology is set to become a critical nationwide test of the state’s power to deploy radical biometric surveillance methods.
The decision by South Wales Police not to contest a case brought by a Cardiff resident clears the way for a High Court battle – one likely to have a significant impact on privacy law in the UK.
Ed Bridges – represented by human rights organisation Liberty – had threatened to take the force to court if it did not immediately end its use of AFR technology in public spaces.
Three UK forces have used facial recognition in that arena since June 2015 – South Wales Police, the Metropolitan Police Service and Leicestershire Police. The Welsh force has been at the forefront of its deployment, using it on at least 22 occasions.
Mr Bridges’s face has likely been mapped and his image stored at least twice. He believes he was scanned as a passer-by on a busy shopping street in Cardiff in the days before Christmas, and then again while peacefully protesting outside the Cardiff Arms Fair in March 2018.
He said: “This dystopian style of policing has no place in Cardiff or anywhere else and I am delighted this legal challenge will go ahead.
“Without warning the police has used this invasive technology on peaceful protesters and thousands of people going about their daily business, providing no explanation of how it works and no opportunity for us to consent.
“The police’s indiscriminate use of facial recognition technology on our streets makes our privacy rights worthless and will force us all to alter our behaviour – it needs to be challenged and it needs to stop.”
Mr Bridges will seek to challenge the use of AFR technology in court because it violates the privacy rights of everyone within range of the cameras, has a chilling effect on peaceful protest, discriminates against women and BAME people, and breaches data protection laws.
Members of the public have so far donated more than £3,450 to Mr Bridges’s challenge via crowdfunding site CrowdJustice.
Liberty lawyer Megan Goulding, who is solicitor for Mr Bridges, said: “We are pleased South Wales Police has recognised the importance of this issue and agreed to a judge reviewing its actions.
“The police’s creeping rollout of facial recognition is not authorised by any law, guided by any official policy or scrutinised by any independent body.
“Scanning the faces of thousands of people whenever they see fit and comparing them to shady databases which can contain images sourced from anywhere at all has seriously chilling implications for our freedom.”
SWP Chief Constable Matt Jukes, who says the force welcomes the scrutiny, confirmed it will not seek to prevent the case from taking place.
The force has admitted using AFR technology to target petty criminals, such as ticket touts and pickpockets outside football matches, but it has also used it on peaceful protesters.
On March 27 this year, the force used it at a protest outside the Defence, Procurement, Research, Technology and Exportability Exhibition – the ‘Cardiff Arms Fair’. Mr Bridges attended the protest and he believes he, like many others there, was scanned by the AFR camera opposite the fair’s main entrance.
Protesters were not aware that facial recognition would be deployed, and the police did not provide any information at the time of the event.
A force spokesman told Police Oracle: “South Wales Police has received correspondence relating to the deployment of automated facial recognition technology which we have responded to.
“The force has been very cognisant of concerns surrounding privacy and are confident that our approach is lawful and proportionate.”
In May, the force defended its decision to trial the AFR technology during last year’s Champions League final.
SWP said when it comes to arrests and successful convictions, NEC’s NeoFace Watch software platform proved more than a match for offenders.
AFR ‘Identity’ allows officers to load images of persons of interest and compare them against 500,000 custody images to see if there is a possible match.
AFR ‘Locate’ uses live feeds from CCTV-type cameras – either fixed at specific locations or mounted on top of a vehicle – to locate persons on prescribed watch lists.
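The two modes differ in where the candidate image comes from, but the underlying comparison is broadly the same: reduce each face to a numeric template and flag any watchlist entry whose similarity to the live image exceeds a threshold. A minimal sketch of that idea follows – the embeddings, names and threshold here are illustrative assumptions only, as NeoFace Watch’s actual algorithms are proprietary:

```python
import math

# Toy illustration of watchlist matching. Real systems use face embeddings
# with hundreds of dimensions produced by a trained neural network; the
# three-dimensional vectors below are purely hypothetical.

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(live_embedding, watchlist, threshold=0.9):
    """Return (name, score) for the best match above threshold, else None."""
    best = None
    for name, stored_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, stored_embedding)
        if score >= threshold and (best is None or score > best[1]):
            best = (name, score)
    return best

# Hypothetical watchlist of stored templates.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.3],
    "suspect_b": [0.2, 0.8, 0.5],
}

print(match_against_watchlist([0.88, 0.12, 0.31], watchlist))  # near suspect_a
print(match_against_watchlist([0.0, 0.0, 1.0], watchlist))     # below threshold
```

The threshold is the key operational choice: set it low and more genuine offenders are flagged, but at the cost of more false positives of the kind seen at the Champions League final.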
In the eleven months since it became the first force to introduce the ground-breaking programme, the results include more than 2,000 positive matches, 450 arrests, and offenders facing jail terms of six years and four-and-a-half years respectively for robbery and burglary offences.
While no facial recognition system is 100 per cent accurate under all conditions, no one has been arrested after an incorrect match, the force added.
Mr Jukes pointed to the “reality” of the technology assisting at major sporting events in crowded places that are potential terror targets.
He said: “We need to use technology when we’ve got tens of thousands of people in those crowds to protect everybody, and we are getting some great results from that.
“But we don’t take the use of it lightly and we are being really serious about making sure it is accurate.”
The force found itself on the defensive after deciding to trial the project during a weekend that saw 170,000 football fans in the Welsh capital.
The AFR software wrongly identified 2,297 people as potential offenders as officers patrolled the Champions League final between Real Madrid and Juventus at the Principality Stadium on June 3 last year.
According to data on the force’s website, that represented 92 per cent ‘false positives’ out of 2,470 potential matches – with just 173 providing ‘true positive alerts’.
The force blamed the high number of false positives at the football final on “poor quality images” supplied by agencies, including Uefa and Interpol, as well as the fact it was its first major deployment of the technology.