New Police Policy Could Allow Innocent People to Be Put on Facial Recognition Watchlists

A mobile police facial recognition facility outside a shopping centre in London, on Feb. 11, 2020. (Kelvin Chan/AP Photo)
Owen Evans
3/23/2022
Updated: 3/23/2022

Civil liberties groups have criticised new UK police guidance that could place innocent people on facial recognition watchlists.

The College of Policing published a guide for officers in England and Wales on Tuesday, which it said was intended to ensure that the use of live facial recognition technology is “legal and ethical.”

Police said facial recognition technology can be used in operations to find “people who are missing and potentially at risk of harm; find people where intelligence suggests that they may pose a threat to themselves or others; and arrest people who are wanted by police or courts.”

But Silkie Carlo, director of the privacy campaign group Big Brother Watch, strongly criticised the guidance.

“We warned about mission creep with this Orwellian surveillance technology and now we see that this new policy specifically allows innocent people to be put on facial recognition watchlists. This includes victims, potential witnesses, people with mental health problems, or possible friends of any of those people. It is an atrocious policy and a hammer blow to privacy and liberty in our country,” she said.

The technology has already been used by a small number of police forces. In December, Welsh police trialled technology designed to identify wanted individuals in real time. Their facial recognition app enables officers to confirm the identity of a wanted suspect on their mobile phones almost instantly, even if the suspect provides false or misleading details.

David Tucker, head of crime at the College of Policing, said: “Guidance issued for police today is clear that live facial recognition should be used in a responsible, transparent, fair, and ethical way and only when other, less intrusive methods would not achieve the same results.

“The technology will help police catch some of the most dangerous offenders including stalkers, terrorists, and others that the public want off our streets. It will be used overtly and unless a critical threat is declared, the public should be notified in advance on force websites or social media about its use,” said Tucker.

“We hope that those with concerns about this technology will be reassured by the careful safeguards we’ve set out as requirements for the police who wish to use it, based on a consistent and clear legal and ethical framework across all police forces,” he added.

PA contributed to this report.