
UK police fail to meet 'legal and ethical standards' in use of facial recognition


A team from the University of Cambridge’s Minderoo Centre for Technology and Democracy created the new audit tool to evaluate “compliance with the law and national guidance” around issues such as privacy, equality, and freedom of expression and assembly.

Based on the findings, published in a new report, the experts are joining calls for a ban on police use of facial recognition in public spaces.

“There is a lack of robust redress mechanisms for individuals and communities harmed by police deployments of the technology,” said the report’s lead author Evani Radiya-Dixit, a visiting fellow at Cambridge’s Minderoo Centre.

“To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology.”

Researchers constructed the audit tool based on current legal guidelines – including the UK’s Data Protection Act and Equality Act – as well as outcomes from UK court cases and feedback from civil society organisations and the Information Commissioner’s Office.
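The audit is published as a written rubric rather than as software, but a minimal sketch can make its structure concrete. The hypothetical Python below shows one way a pass/fail checklist of legal and ethical criteria might be encoded; the criterion names, categories, and example deployment are illustrative assumptions, not data or code from the report.

```python
# Hypothetical sketch only: the Minderoo audit is a written rubric,
# not software. Criterion names and the example deployment below are
# illustrative assumptions, not taken from the report.
from dataclasses import dataclass, field


@dataclass
class Criterion:
    name: str          # a question drawn from law or national guidance
    category: str      # e.g. privacy, equality, accountability, oversight
    met: bool = False  # whether the deployment satisfies the criterion


@dataclass
class DeploymentAudit:
    deployment: str
    criteria: list[Criterion] = field(default_factory=list)

    def failures(self) -> list[Criterion]:
        return [c for c in self.criteria if not c.met]

    def passes(self) -> bool:
        # The report frames its standards as a minimum, so a single
        # unmet criterion is enough for the deployment to fail overall.
        return not self.failures()


# Illustrative example, loosely modelled on issues the report raises.
audit = DeploymentAudit(
    deployment="Live FRT trial (hypothetical)",
    criteria=[
        Criterion("Demographic outcome data published", "accountability"),
        Criterion("Independent ethics committee oversight", "oversight"),
        Criterion("Reliable human-in-the-loop review", "oversight", met=True),
        Criterion("Lawful retention of watch-list images", "privacy"),
    ],
)

if not audit.passes():
    for c in audit.failures():
        print(f"FAIL [{c.category}] {c.name}")
```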

They applied their ethical and legal standards to three uses of facial recognition technology (FRT) by UK police. One was the Bridges court case, in which a Cardiff-based civil liberties campaigner challenged South Wales Police’s use of automated FRT to live-scan crowds and compare faces to those on a criminal “watch list”.

The researchers also tested the Metropolitan Police’s trials of similar live FRT use, and a further example from South Wales Police in which officers used FRT apps on their smartphones to scan crowds in order to identify “wanted individuals in real time”.

In all three cases, they found that important information about police use of FRT is “kept from view”, with scant demographic data published on arrests or other outcomes – making it difficult, the researchers say, to evaluate whether the tools “perpetuate racial profiling”.

In addition to lack of transparency, the researchers found little in the way of accountability – with no clear recourse for people or communities negatively affected by police use, or misuse, of the tech. “Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology,” said Radiya-Dixit.

Some of the FRT uses lacked regular oversight from an independent ethics committee or indeed the public, say the researchers, and did not do enough to ensure there was a reliable “human in the loop” when scanning untold numbers of faces among crowds of thousands while hunting for criminals.

In the South Wales Police’s smartphone app trial, even the “watch list” included images of people innocent under UK law – those previously arrested but not convicted – despite the fact that retention of such images is unlawful.

“We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition," said Radiya-Dixit.

Prof Gina Neff, Executive Director at the Minderoo Centre for Technology and Democracy, said: “Over the last few years, police forces around the world, including in England and Wales, have deployed facial recognition technologies. Our goal was to assess whether these deployments used known practices for the safe and ethical use of these technologies.” 

“Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police,” Neff said.

Officers are increasingly under-resourced and overburdened, write the researchers, and FRT is seen as a fast, efficient and cheap way to track down persons of interest.

At least ten police forces in England and Wales have trialled facial recognition, including its use for operational policing – although different forces apply different standards.

Questions of privacy run deep for policing technology that scans and potentially retains vast numbers of facial images without knowledge or consent. The researchers highlight a possible “chilling effect” if FRT makes the public reluctant to exercise fundamental rights – the right to protest, for example – for fear of potential consequences.

Use of FRT also raises discrimination concerns. The researchers point out that surveillance systems have historically been used to monitor marginalised groups, and recent studies suggest the technology itself contains inherent bias that disproportionately misidentifies women, people of colour, and people with disabilities.

Given regulatory gaps and failures to meet minimum standards set out by the new audit toolkit, the researchers write that they support calls for a “ban on police use of facial recognition in publicly accessible spaces”.

 

Researchers devise an audit tool to test whether police use of facial recognition poses a threat to fundamental human rights, and analyse three deployments of the technology by British forces – with all three failing to meet “minimum ethical and legal standards”.  

Image from the report 'A Sociotechnical Audit: Assessing Police use of Facial Recognition'

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

