Privacy Concerns as Welsh Police Trial Facial Recognition Tech to Identify Suspects
A mobile police facial recognition facility outside a shopping centre in London, on Feb. 11, 2020. (Kelvin Chan/AP Photo)
Owen Evans

Welsh police are rolling out new technology that will identify wanted individuals in real time.

The system, known as Operator Initiated Facial Recognition, will be used initially by 70 officers from South Wales Police and Gwent Police.

Announced last week, the new facial recognition app will enable officers to confirm the identity of a wanted suspect on their mobile phones almost instantly, even if that suspect provides false or misleading details, they say. The officers will be the first in the UK to develop and use the technology.

South Wales Police and Crime Commissioner Alun Michael said that he closely scrutinises operational decisions on the introduction of technology and subjects each new step to independent oversight "because of the ethical and social concerns that have been expressed over the use of facial recognition technology."

"People want to know that members of the public who have done nothing wrong are not being subjected to inappropriate surveillance and that their privacy will be fully respected and protected," Michael said in a statement. "However, people also want us to keep them safe and to use the technology to apprehend people who have committed serious offences and take them off the streets.

"As a result of our robust systems of scrutiny and challenge, I can provide assurance to the public that we are getting that balance right. We are committed to protecting human rights as well as keeping the public safe," Michael added.


Eerke Boiten, professor of cybersecurity at De Montfort University in Leicester, told The Epoch Times that the police have been clear the images captured by the technology will be deleted. But he added that there also needs to be a wider discussion about data.

He said that deleting the images is a responsible way of running an app like this, in contrast to a 2019 incident in which South Wales Police used live facial recognition at a football match with a watchlist relating to Football Banning Orders.

Facial recognition technology is operated in Sydney, Australia in this file photo. (Ian Waldie/Getty Images)

“There have been various historical stories in the UK about police databases that have been kept on longer than anybody had been expecting,” said Boiten, adding that police have in the past retained DNA from people who had not been charged with any offence.

“I still have some doubts about facial recognition. It is an approximate technology. It is never going to be 100 percent sure that you will catch someone precisely. It’s also a biased technology. Typically a lot of facial recognition depends on the race of the people involved,” he said.

He added that facial recognition can also generate false positives, meaning it can improperly identify people.


“You need surveillance to watch people who are doing something bad or about to do something bad, but the balance is that it’s clearly not right for the police to be surveilling all the time. That will lead to restrictions to civil liberties and to people behaving differently as they are being watched,” said Boiten.

Security and threat management specialist Will Geddes told The Epoch Times he is not entirely opposed to facial recognition, because it can be used "to catch suspects and individuals that could potentially be a risk.”

Geddes has over three decades of experience in advising international corporations, high-net-worth individuals, celebrities, heads of state, and foreign royal families in the specialist security sector.

“The positive side is access control into a building, rather than having to use a conventional card to enter," he said. "Then we have a problem that if I’ve stolen your card, I can pretend to be you. So on the logs in the access control, all it shows is that you have been in the building. But I could have gone in there and nicked a bunch of stuff and left a bomb,” said Geddes.

In terms of security, this creates an inaccurate audit trail of movement, he said. And when it comes to civil liberties, Geddes said that the question is how that data is processed.

“In an office that makes sense. But when it’s out in the street, my biggest concern is, what are they doing with all this data?” he said.


For example, if police are gathering 1,000 people’s faces a day in order to identify two criminals, then what is happening to the other 998 faces?

"Is it throwing out a big net, instead of a fishing rod? Is it pulling information that has been specially requested rather than just pulling all that info then filtering it through the other end at the headquarters, where the data is being processed? Are they being disposed of or are they being stored for future reference like fingerprints?” Geddes said.

He said many companies are fast adopting biometrics, of which facial recognition is one form.

“With iPhones and other devices, you are giving permission to your device to bypass the standard of putting in a pin number. It’s more about simplicity. Banking does the same, using a much more biometric approach to verify the user,” he said.

He added, however, that companies like Apple do have robust privacy policies.

South Wales Police did not respond to The Epoch Times’ request for comment.

Owen Evans is a UK-based journalist covering a wide range of national stories, with a particular interest in civil liberties and free speech.