“Dangerous and inaccurate” police facial recognition exposed

23rd May 2018 / United Kingdom

Big Brother Watch’s latest report, released today, reveals:

  • South Wales Police store photos of all innocent people incorrectly matched by facial recognition for a year, without their knowledge, resulting in a biometric database of over 2,400 innocent people
  • Home Office spent £2.6m funding South Wales Police’s use of the technology, although it is “almost entirely inaccurate”
  • Metropolitan Police’s facial recognition matches are 98% inaccurate, misidentifying 95 people at last year’s Notting Hill Carnival as criminals – yet the force is planning 7 more deployments this year
  • South Wales Police’s matches are 91% inaccurate – yet the force plans to target the Biggest Weekend and a Rolling Stones concert next

Big Brother Watch is taking the report to Parliament today to launch a campaign calling for police to stop using the controversial technology, branded by the group as “dangerous and inaccurate”.

Big Brother Watch’s campaign, calling on UK public authorities to immediately stop using automated facial recognition software with surveillance cameras, is backed by David Lammy MP and 15 rights and race equality groups including Article 19, Football Supporters Federation, Index on Censorship, Liberty, Netpol, Police Action Lawyers Group, the Race Equality Foundation, and Runnymede Trust.

Shadow Home Secretary Diane Abbott MP and Shadow Policing Minister Louise Haigh MP will speak at the report launch event in Parliament today at 1600.

Over the past two years, police have begun using automated facial recognition in city centres and at political demonstrations, sporting events and festivals. Particular controversy was caused when the Metropolitan Police targeted Notting Hill Carnival with the technology two years in a row, with rights groups expressing concern that comparable facial recognition tools are more likely to misidentify black people.

Big Brother Watch’s report found that the police’s use of the technology is “lawless” and could breach the right to privacy protected by the Human Rights Act.

Silkie Carlo, director of Big Brother Watch, said:

“Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified – or misidentified – everywhere they go.

“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.

“It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms.

“This has wasted millions in public money and the cost to our civil liberties is too high. It must be dropped.”
