
ICO to probe facial recog amid concerns UK cops can't shake their love for unregulated creepy tech

Plus: Boffins find kit struggles in low light, crowds

The UK's data protection watchdog is investigating cops' use of facial recognition technology amid growing concerns about efficacy and ethics.

The tech is used by a number of forces in the UK, often during high-profile public events like Notting Hill Carnival or sports matches – but with limited success: An FoI request reported in May found a 98 per cent false positive rate when used by the Metropolitan Police.

Information Commissioner Elizabeth Denham raised concerns at the time about how intrusive the technology is, saying its use in public spaces was "a real step change in the way law-abiding people are monitored as they go about their daily lives".

Her office's probe – first reported by The Telegraph and confirmed to The Register – aims to assess how law enforcement uses the automated facial recognition (AFR) tech.

As well as looking at forces' success in using facial recognition, the investigation will consider the fact that there is currently no national coordination or governance framework for its use.

This has been a particular bone of contention among critics, and is the basis of legal challenges against forces in South Wales and London.

Many hoped to see some concrete policy proposals on facial biometrics and AFR in the government's long-awaited biometrics strategy that was published in June – but that simply kicked the can down the road, offering a vague pledge to update codes of practice on AFR "before it is widely adopted".

How the government is defining "widely adopted" is unclear – the UK's largest force, the London Met, will have deployed it 10 times by the end of this year and South Wales Police's use of the tech continues apace.

Neither force will countenance curtailing its use – despite acknowledging it might not result in many arrests – and both have described the deployments as trials, a characterisation opponents reject because the systems are being run on members of the public in real-world situations, rather than in a controlled environment.

The news of the probe comes after the ICO slammed the Metropolitan Police's Gangs Violence Matrix, which was found to have broken a number of data protection principles.

Although the force escaped without a monetary penalty, the ICO didn't sugar-coat its criticisms of the database, which could bode well for a thorough investigation of AFR.

AFR struggles in crowds, low light

Meanwhile, academics at Cardiff University's Crime and Security Research Institute last week published a report (PDF) into South Wales Police's use of AFR.

Although it didn't outright condemn the deployments, the report concluded there needed to be "considerable investment" and changes to police procedures if the technology was to generate consistent results.

The main assessment was of the technology itself – its false positive rate, the way the hardware and software worked and how officers interacted with it.

The academics found that, when used in real time to scan against a watch list of 600-800 people, South Wales Police's kit returned a true positive rate of just 3 per cent between June and October 2017.

When the algorithm was updated in October, 26 per cent of matches between October 2017 and March 2018 were true positives.

However, this Locate system "was observed struggling with large crowds as it froze, lagged, and crashed when frames were full of people", which the academics described as "disappointing".

It also struggled in low-light conditions; when daylight faded, the cameras compensated by increasing ISO levels, which increased noise and made it harder for the system to detect and analyse faces.

Meanwhile, in 68 per cent of the cases where AFR was used to identify unknown people by comparing their images against the 450,000-plus photos in the police custody database, the submitted image wasn't of good enough quality for the system to work.

Of the images that were good enough, 73 per cent generated a possible suspect – a match that has to be signed off by an operator – under the new algorithm, up from 20 per cent with the older one.
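To make that sequence of steps concrete, here is a minimal, purely illustrative sketch of the identification workflow the report describes – the function names and thresholds are hypothetical, not details of South Wales Police's actual pipeline.

```python
# Illustrative only: hypothetical names and thresholds, not SWP's real system.

def good_enough(probe_quality: float, threshold: float = 0.5) -> bool:
    """Quality gate - roughly 68 per cent of probe images failed this stage."""
    return probe_quality >= threshold

def identify(probe_quality: float, match_scores: list[float],
             match_threshold: float = 0.8) -> str:
    """Quality-check a probe image, match it against custody photos,
    and flag any possible suspect for a human operator to sign off."""
    if not good_enough(probe_quality):
        return "image not good enough for the system to work"
    if not any(score >= match_threshold for score in match_scores):
        return "no possible suspect generated"
    return "possible suspect generated - awaiting operator sign-off"

print(identify(0.3, [0.9]))         # poor-quality probe image
print(identify(0.7, [0.85, 0.6]))   # usable image with one candidate match
```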

The report also noted that hardware configuration was an important element of overall performance, as the system "can be quite demanding in terms of its processing power requirements".

Indeed, over the course of deployments, the operators had to lower settings for "faces per frame" and "frames per second" in order to improve performance by reducing the load on the hardware.

At the Champions League, it was set to 10 faces per frame and 30 frames per second – 300 faces per second – but this was eventually reduced to 5 faces per frame and 10 frames per second – 50 faces per second.

"These continuous revisions to the frame and faces rates highlight the severity of the system's hardware performance problems," the authors said.

"Although the reduced load on the system improved performance, up to 50 faces per second is dramatically lower than the initial rate of up to 300 commissioned from [vendor] NEC."

The report also emphasised the importance of officers being in the loop, as they can reject possible matches suggested by the algorithm, saying the technology would be better described as "assisted facial recognition".

Overall, there was evidence that AFR could contribute to policing – but the report noted there would have to be significant effort and investment to make it work consistently. ®
