Police using ‘dangerous’ facial recognition systems unlawfully in the UK

19th November 2019 / United Kingdom

TruePublica has reported and published many articles covering the government’s mass civilian data architecture and all of its lawbreaking activities (that we know of). State agencies are now using technology systems whose workings they themselves do not understand, and they are unable to prevent breaches of privacy involving critically sensitive information. It is no surprise that facial recognition systems are being operated by the police unlawfully, or that the systems themselves have been found inadequate. It was only two months ago that we reported on the facial recognition epidemic in the UK – a spectacular example of the destruction of civil liberties if ever there was one.

BigBrotherWatch reports that the Information Commissioner’s Office has released a report stating that police use of facial recognition technology may have been unlawful. It has raised concerns that the technology has been used when it has not been ‘strictly necessary’ and called for a regulatory framework to ensure ‘governed, targeted and intelligence-led’ deployment, rather than the current ‘blanket’ use.

Silkie Carlo, Director of Big Brother Watch, said: “Police have been let off the leash, splashing public money around on Orwellian technologies, and regulators trying to clean up the mess is too little too late. This is a society-defining civil liberties issue that requires not just regulators but political leadership.”

ITPRO reports that Information Commissioner Elizabeth Denham said the investigation raised serious concerns about the use of a technology that relies on huge amounts of sensitive personal information.

“It is right that our police forces should explore how new techniques can help keep us safe,” Denham said in a blog post. “But from a regulator’s perspective, I must ensure that everyone working in this developing area stops to take a breath and works to satisfy the full rigour of UK data protection law.

“Moving too quickly to deploy technologies that can be overly invasive in people’s lawful daily lives risks damaging trust not only in the technology but in the fundamental model of policing by consent. We must all work together to protect and enhance that consensus.”


The revelation that facial recognition technology was being used at London’s King’s Cross Station earlier this year raised serious concerns about data privacy and regulation, particularly as private developers were able to install the technology without notice.

The ICO said it found the current combination of laws, codes and practices relating to live facial recognition to be inadequate for driving the ethical and legal approach that’s needed to truly manage the risk that the technology presents. It also said that the technology would increase the likelihood of legal failures without a statutory code and would ultimately undermine public confidence.

The ICO argued that a statutory code of practice was necessary to give the police and the public enough knowledge as to when and how the police can use live facial recognition systems in public spaces.

The Independent reported this story with the headline: Police may have used ‘dangerous’ facial recognition unlawfully in UK

