Met Police Plan for Controversial Facial Recognition Technology at Notting Hill Carnival
TruePublica Editor: Many will no doubt read this article and conclude that the police are simply using the latest technology to apprehend criminals. Truthfully, I think most rational people would agree with that idea. But consider that facial recognition technology was also trialled at Notting Hill Carnival in 2016, where it failed to identify anyone, and that research has shown facial recognition software can carry racial accuracy biases. In addition, in March 2017 the US Government Accountability Office found that facial recognition algorithms used by the FBI are inaccurate almost 15 per cent of the time and are more likely to misidentify women and black people. One last fact to consider is that Britain is already the world’s most endemic surveillance state. There is no oversight of facial recognition – it is excluded from the remits of the Surveillance Camera Commissioner and the Biometrics Commissioner. Your image will be taken, possibly cross-referenced incorrectly, and stored – what then? The technology is simply not reliable enough to be used in this way.
By Liberty: Civil liberties and race relations groups have demanded the Metropolitan Police Service abandon plans to deploy cameras equipped with facial recognition technology at this month’s Notting Hill Carnival.
The coalition – which includes Liberty, Privacy International, StopWatch and Black Lives Matter – has written to the Met, warning that scanning the faces of thousands of attendees and capturing their images has no basis in law, could lead to discriminatory policing, and represents a gross violation of carnival-goers’ privacy.
No law, no oversight
The police intend to monitor crowds at the Notting Hill Carnival using cameras equipped with facial recognition technology.
The biometric software scans the faces of passers-by, creating maps of facial characteristics that are as uniquely identifying as fingerprints. These scans are then measured and compared against images on a database whose contents and origin the Metropolitan Police has not disclosed.
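To make that mechanism concrete: systems of this kind typically reduce each detected face to a fixed-length numeric vector (the "map" of facial characteristics) and flag a match when a newly captured vector falls within a distance threshold of an entry in the watchlist database. The sketch below illustrates only that matching step; the 128-dimension vectors, the random stand-in data and the 0.6 threshold are assumptions for illustration, not details of the Met's actual system.

```python
import numpy as np

def match_face(probe_embedding: np.ndarray,
               watchlist_embeddings: np.ndarray,
               threshold: float = 0.6):
    """Illustrative matching step: return (index, distance) of the closest
    watchlist entry if it lies within `threshold` (Euclidean distance),
    otherwise None. The threshold trades false matches against misses."""
    distances = np.linalg.norm(watchlist_embeddings - probe_embedding, axis=1)
    best = int(np.argmin(distances))
    return (best, float(distances[best])) if distances[best] <= threshold else None

# Hypothetical data: random 128-dimensional vectors stand in for real face
# embeddings, which a deployed system would produce with a deep neural network.
rng = np.random.default_rng(0)
watchlist = rng.normal(size=(1000, 128))                  # stored facial "maps"
probe = watchlist[42] + rng.normal(scale=0.03, size=128)  # noisy re-capture of entry 42
print(match_face(probe, watchlist))                       # -> (42, small distance)
```

Two of the concerns raised in the letter map directly onto this sketch: the contents of the watchlist are exactly the undisclosed database the groups object to, and the choice of threshold is where accuracy problems surface – set it loosely and innocent carnival-goers are wrongly flagged, with the research cited below suggesting those errors do not fall evenly across demographic groups.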
In today’s letter, the concerned groups urge Commissioner Cressida Dick to scrap the plan, highlighting:
- The use of automated facial recognition in public spaces is a gross breach of the right to privacy under the Human Rights Act.
- Facial recognition technology is not governed by any law, and has never been debated or scrutinised by Parliament. This raises an urgent question as to the lawfulness of deploying it in public spaces.
- There is no independent oversight of the Met’s use of facial recognition.
- There is an intolerable lack of transparency around its use. It is not known how long captured images of innocent people are stored. Nor is it known which databases they are matched against, whether they are linked to social media accounts, or if the images are shared with anyone else.
- Research has shown that facial recognition algorithms can carry racial biases – which could inadvertently lead to discriminatory policing. The FBI’s software misidentifies people almost 15 per cent of the time – and is more likely to fail with black people and women. If the Met’s software has similar flaws, the risks of using it at Notting Hill Carnival are unacceptable.
Martha Spurrier, Director of Liberty, said:
“There are no laws, no rules and no oversight for facial recognition technology – not to mention the serious concerns about its accuracy. It is a shady enterprise neither our MPs nor the public have consented to or know enough about.
“There are significant doubts as to whether deploying this technology in public spaces can ever be lawful – especially without proper Parliamentary debate. The Met must urgently abandon its plans so that the thousands of people hoping to enjoy the carnival weekend know their police force will protect their human rights.”