
London cops urged to scrap use of 'biased' facial recognition at Notting Hill Carnival

Year-on-year deployment isn’t really a trial, say privacy groups

London's Metropolitan Police have been urged to back down on plans to once again use facial recognition software at next weekend's Notting Hill Carnival.

Privacy groups including Big Brother Watch, Liberty and Privacy International have written to police commissioner Cressida Dick (PDF) calling for a U-turn on the use of the tech.

Automated facial recognition technology will snap party-goers' faces and run them against a database. The aim is to alert cops to anyone banned from the festival or wanted by police, presumably so officers can take immediate action.
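
In broad terms, systems of this kind reduce each face to a numeric template and compare it against the templates of people on a watchlist. The Met has not published details of its software, so the Python sketch below is purely illustrative: the embeddings, watchlist names and similarity threshold are all hypothetical, and a real deployment would generate templates with a trained face recognition model rather than random vectors.

```python
import numpy as np

# Hypothetical watchlist: name -> face template (embedding vector).
# In a real system these would come from a trained face recognition
# model; random vectors stand in for them here.
rng = np.random.default_rng(seed=42)
WATCHLIST = {name: rng.standard_normal(128) for name in ("suspect_a", "suspect_b")}

# The threshold is an assumption; real systems tune it to trade
# false alarms against missed matches.
MATCH_THRESHOLD = 0.6

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(probe: np.ndarray) -> str | None:
    """Return the best watchlist match above the threshold, or None."""
    best_name, best_score = None, MATCH_THRESHOLD
    for name, template in WATCHLIST.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A camera frame would be converted to an embedding by the same model;
# a random probe here almost certainly returns None -- echoing the 2016
# trial, in which the system identified no one.
print(check_against_watchlist(rng.standard_normal(128)))
```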

The tech was first tested at the festival – where relations between police and revellers are often strained – last year, but it failed to identify anyone.

Despite this, the Met said at the time that it had "resulted in positive learning" and is describing this year's use as a continuation of the trial.

It also said in a statement sent to The Register that there would be a public consultation after the trial is complete.

But the privacy groups' letter said that they "reject the notion that year-on-year deployment constitutes a 'trial' – for which, by the Force's own admission, there are 'no timescales'".

The letter also points to reports of inherent biases in the system – research has previously shown that matching accuracies are lower on certain cohorts of people: women, black people, and those aged 18-30.

"Notting Hill Carnival is an event that specifically celebrates the British African Caribbean community," the letter states. "We reject the proposition that it is appropriate to test biometric surveillance on this community."

The letter's signatories – which include Black Lives Matter UK and the chief executive of the Race Equality Foundation – say that there is an "unacceptable" risk that the use of the tech could lead to discriminatory policing at the event.

The groups also criticise what they describe as a lack of transparency from the force on what kind of software it uses, how long images are stored for and when they are deleted.

The Met's statement about the use of the tech didn't offer any more specifics on data retention than to say: "The faces of members of the public in the background of a positive identification will feature.

"These images will be retained for the purposes of analysis of this project only and will not be speculatively searched or disseminated for any purpose."

There has been continued unease about the way police and government store images generated through the use of facial recognition technology. A February review showed that more than 19 million mugshots are held on file by the government.

In an interview with The Reg in May, the UK's surveillance camera commissioner Tony Porter said that the key issue over the next three years "is going to be on big data, integration of data and how video surveillance as a philosophy and as a thing will change".

He said that he would be working with civil rights groups and manufacturers to help inform the government of how regulations around links between video surveillance and automated facial recognition should work.

Meanwhile, the Home Office has this month put a £4.6m contract for facial recognition software out to tender; the deal will run for an initial term of 60 months.

According to the tender announcement, a company is sought to provide "a combination of biometric algorithm software and associated components that provide the specialised capability to match a biometric facial image to a known identity held as an encoded facial image". ®
