Peers question experts over UK police use of AI, facial recognition tech

Academics, senior officers sat at uncomfortable table and asked 'what's going on 'ere, then?'

Members of the House of Lords are looking at the role technologies such as AI and facial recognition have in modern policing methods and law enforcement.

The Justice and Home Affairs Committee announced in May that it planned to probe the matter and has already heard one round of oral evidence in June from legal experts as it familiarises itself with the subject.

Today it was the turn of Professor Michael Wooldridge – Head of Computer Science at the University of Oxford – to be quizzed by peers.

He echoed many of the concerns previously raised and cited the example of a computer system that advises custody officers whether someone should be detained in a police cell or released based on a host of data including their criminal history.

His concerns lay not necessarily with the technology, but that it could lead to some officers “abdicating [their] responsibility” by becoming over-reliant on it.

Wooldridge warned that more needed to be understood about AI and what it can – and can’t – do. And he urged peers not to “humanise it”.

“It’s not like human intelligence,” he said. He then went further, adding that the technology is “brittle” and can fail in unexpected ways.

AI can mislabel or misidentify pictures, for example, he added. While this may not pose a problem on social media, it could have grave consequences in the criminal justice system in areas such as facial recognition.

At this early stage the committee is simply gathering evidence, but from the outset the complexity of what it faces is clear.

Speaking in June as the enquiry began hearing oral evidence [PDF], Professor Carole McCartney, Professor of Law and Criminal Justice at Northumbria University, explained that “one of the big criticisms of technologies... is their lack of scientific basis” and the importance of proper scrutiny.

“The House of Lords did a report recently on the fact that the underlying fundamental research is often lacking, and there is nothing new here; very often a lot of these technologies will be based on very sketchy scientific foundations and that is dangerous,” she said.

She illustrated her point with the example of automatic facial recognition technology employed by South Wales Police.

“Essentially, they put the technology out into the wild and just keep an eye on it, and they call it ‘a trial’. That is not how scientific trials work,” she said.

Asked whether the benefits of technology outweigh the pitfalls, Professor McCartney said: “It depends.”

“One of the difficulties in this area is that there is no silver bullet,” she added. “There is no technology that will come along and solve domestic violence or enable us to predict burglaries successfully 100 per cent of the time.”

In a separate briefing last month, the committee also heard from members of the West Midlands Police, who gave their take on how technology can aid law enforcement and criminal detection. Speaking behind closed doors, officers provided an update on projects under the National Data Analytics Solution, a programme exploring how advanced analytics could be used by law enforcement.

Last week, the role of facial recognition technology was put into the spotlight in the US after the House Committee on the Judiciary heard evidence about how it is being used by law enforcement agencies.

Dr Cedric Alexander, a former member of President Barack Obama's Task Force on 21st Century Policing, underlined the minefield facing lawmakers, explaining that facial recognition technology can promote justice and "even save lives" – but not if it means sacrificing constitutional rights.

The hearing coincided with calls from civil rights campaigners in the US for retailers to stop using facial-recognition technology, amid privacy concerns and fears that it could lead to people being wrongly arrested. ®
