Cops' facial recognition was 'unlawful,' court rules

Joanna Estrada
August 14, 2020

In a potentially landmark ruling, the U.K. Court of Appeal ruled Tuesday that the use of automatic facial recognition software by the South Wales Police (SWP) violated privacy rights.

With this, the Court of Appeal overturned a September 2019 ruling by the High Court in Cardiff, which had found that SWP's use of facial recognition technology was lawful and complied with both the Human Rights Act and the Data Protection Act 2018.

Ed Bridges, 37, had his face scanned while Christmas shopping in Cardiff in 2017 and then again at a peaceful anti-arms protest outside the city's Motorpoint Arena in 2018.

South Wales Police Chief Constable Matt Jukes said: "The test of our ground-breaking use of this technology by the courts has been a welcome and important step in its development." On the questions of who could be placed on a watchlist and where the technology could be deployed, however, the judges found that "too much discretion is now left to individual police officers".

Mr Bridges said: "This technology is an intrusive and discriminatory mass surveillance tool." The South Wales Police had reportedly failed to verify that the software in use does not contain bias on the basis of race or sex.

"I was particularly encouraged by the approach to these proceedings by South Wales Police who have worked so hard to be transparent and ethical in their approach to use AFR technology in this pilot phase".

Mr Bridges, who the force confirmed was not a person of interest and has never been on a watchlist, crowdfunded his legal action and is supported by civil rights organisation Liberty, which is campaigning for a ban on the technology.


In addition to South Wales Police, facial recognition technology is also being used by the Metropolitan Police Service and Leicestershire Police. "Facial recognition is a threat to our freedom - it needs to be banned", said Liberty lawyer Megan Goulding.

Liberty hailed the victory as "the world's first legal challenge" to police use of facial recognition tech, but it is almost certainly not going to be the last. "We should all be able to use public spaces without being subjected to oppressive surveillance", the group said.

Experts state that the automatic facial recognition technology in question is a "dystopian surveillance tool" which contains biases against people of color and violates United Kingdom citizens' rights.

This Court of Appeal ruling does not outlaw police use of AFR outright, but it means forces cannot lawfully deploy the technology until the deficiencies the court identified in the legal framework are addressed.

"It is time for the Government to recognise the serious dangers of this intrusive technology", says Goulding. "The use of advanced surveillance technologies like live facial recognition demands proper consideration and full parliamentary scrutiny".

Mr Bridges took South Wales Police to the High Court after, he said, his face was scanned twice using the technology.

"The ruling does not diminish the clear need for facial recognition technology in policing", he said.

Kevin Blowe, co-ordinator of the Netpol police-monitoring network, also raised concerns that police "will find other ways to use" the technology.

Surveillance Camera Commissioner Tony Porter responded to the ruling with a blog post calling for "a full review of the legislative landscape", including an update to the Surveillance Camera Code of Practice.

"The technology is only part of the issue; it's the databases that are behind them and how people end up on them and who the police are looking for".

The ruling did not completely ban the use of facial recognition tech in the United Kingdom, but it narrows the scope of what is permissible and clarifies what law enforcement agencies must do to comply with human rights law.

The use of AFR has been controversial for a few years now.
