UK to Implement China-Style Facial Recognition Surveillance System


Civil rights campaigners have warned that Britain is heading towards a “dystopian” future in which Chinese-style facial recognition cameras are used to turn public spaces into “open-air prisons”.

Last week, the Home Office published new proposed guidelines for the use of AI camera surveillance of the British public in order to “enable operators of surveillance camera systems to make legitimate use of available technology in a way that the public would rightly expect and to a standard that maintains public trust and confidence.”

The guidance, which is the first update to the Surveillance Camera Code of Practice in eight years, has been criticised as “bare bones” and a threat to civil liberties by campaigners.

The former Surveillance Camera Commissioner Tony Porter told the BBC: “I don’t think it provides much guidance to law enforcement, I don’t really think it provides a great deal of guidance to the public as to how the technology will be deployed.”

Mr Porter added that he found it odd that the guidance made no mention of how such systems will be implemented by Transport for London, which operates thousands of cameras.

Megan Goulding, a lawyer for the civil rights group Liberty, added: “One year since our case led the court to agree that this technology violates our rights and threatens our liberty, these guidelines fail to properly account for either the court’s findings or the dangers created by this dystopian surveillance tool.

“Facial recognition will not make us safer, it will turn public spaces into open-air prisons and entrench patterns of discrimination that already oppress entire communities.”

The Home Office defended the use of live facial recognition (LFR), saying: “The Government is committed to empowering the police to use new technology to keep the public safe, whilst maintaining public trust, and we are currently consulting on the Surveillance Camera Code.

“In addition, College of Policing have consulted on new guidance for police use of LFR in accordance with the Court of Appeal judgment, which will also be reflected in the update to the code.”

The London-based Privacy International warned that facial recognition could be used to monitor and potentially clamp down on protest movements.

“Unfortunately, police forces do increasingly use a wide range of surveillance capabilities at protests, like facial recognition technology, IMSI catchers and mobile phone data extraction tools. It means that by attending a protest, the police can potentially identify you, track you and monitor you.”

Indeed, Breitbart London reported in February that police forces throughout Britain have been deploying drones to monitor protests.

Facial recognition software has become a key tool of control used by authoritarian regimes, in particular, communist China, which has installed hundreds of millions of cameras over the past decade.

Questions have also been raised as to the efficacy of facial recognition technology in preventing crime, with figures released earlier this year showing that only one arrest was made as a result of scanning 13,000 people’s faces.

Last year the Metropolitan Police used facial recognition in three instances. On the first occasion, some 4,600 faces were scanned and compared to a list of 6,000 wanted people, yet the cameras were unable to identify any suspects.

In another operation, 8,600 people had their faces scanned in Oxford Circus. The cameras alerted the police to eight potential suspects; however, only one actually matched the list of suspects.

The director of Big Brother Watch, Silkie Carlo, said at the time: “The police’s own data shows facial recognition surveillance is dangerously inaccurate and a waste of public money, let alone a serious assault on our civil liberties.”

Follow Kurt Zindulka on Twitter here @KurtZindulka
