Police Use of Facial Recognition Technology Could Threaten Charter Rights, Lawyer Tells MPs

A CCTV camera in Pancras Square near Kings Cross Station in London, England, on Aug. 16, 2019. (Dan Kitwood/Getty Images)
Isaac Teo
3/22/2022 | Updated: 3/23/2022

The use of facial recognition technology by law enforcement without proper legal safeguards in place threatens the privacy, free speech, and peaceful assembly rights protected under the Charter, a technology and human rights lawyer says.

Testifying before the Standing Committee on Access to Information, Privacy and Ethics on March 21, Cynthia Khoo said the use of facial recognition by police agencies without strict legal safeguards is likely to damage Canadians’ fundamental freedoms.

“Even if all bias were removed from facial recognition, the technology will still pose an equal or even greater threat to our constitutional and human rights,” said Khoo, founder of Tekhnos Law and a research fellow with the Citizen Lab at the University of Toronto.

“Facial recognition used to identify people in public violates privacy preserved through anonymity in daily life and relies on collecting particularly sensitive biometric data. This would likely induce chilling effects on freedom of expression, such as public protests about injustice.”

Canadian law enforcement has come under fire in recent years over its use of algorithmic policing technologies.

At least 10 police agencies, including the RCMP and the Calgary and Toronto police services, were reported in October 2020 to have used Clearview AI, a U.S.-based facial-recognition company that scraped more than three billion images from the internet for use in law enforcement investigations.

In his investigation, concluded last June, Privacy Commissioner Daniel Therrien said the RCMP broke the Privacy Act when it collected information using Clearview.

“In our view, a government institution simply cannot collect personal information from a third party agent if that third party’s collection was unlawful in the first place,” Therrien said.

In February 2021, the Office of the Privacy Commissioner of Canada found Clearview’s practices to be “mass surveillance and illegal” under federal and provincial private-sector privacy laws.

“Notably, we found there were serious and systemic gaps in the RCMP’s policies and systems to track, identify, assess and control novel collections of personal information through new technologies,” Therrien said.

‘Inscrutable Layers of Mass Surveillance’

Khoo stressed it is crucial that strict legal safeguards be put in place to ensure that the public-private partnerships police establish with commercial vendors do not circumvent Canadians’ constitutional rights to liberty and to protection from “unreasonable search and seizure.”

“Software from companies such as Clearview AI, Amazon Rekognition, and NEC Corporation is typically proprietary, concealed by trade secret laws, and procured on the basis of behind-the-scenes lobbying,” she said.

“This results in secretive public-private surveillance partnerships that strip criminal defendants of their due process rights and subject all of us to inscrutable layers of mass surveillance.”

Khoo added that because facial recognition technology processes photos from mass police datasets, such as mug shots, it risks inheriting the systemic biases embedded in those records, which has led to misidentifications in the past. She recommended launching a judicial inquiry into the use of pre-existing police datasets.

“This is to assess the appropriateness of repurposing previously collected personal data for use with facial recognition and other algorithmic policing technologies,” she said.

Carole Piovesan, a managing partner at Toronto-based law firm INQ Law, where she practices privacy and artificial intelligence risk management, said the problem stems from the lack of a comprehensive framework to guide the use of the technology.

“The issue is that we don’t have comprehensive regulation or, frankly, a comprehensive approach when it comes to the use of facial recognition technology as a technology ‘soup to nuts’—meaning from the collection of that data through to the actual design of the system, through to the use of that system,” Piovesan told the committee.

She said a clear understanding of what safeguards should be in place from the beginning to the end of the data life cycle, including requirements for the collection, storage, assessment, and disclosure of the data, is crucial to protecting the rights of Canadians.

“We have a right to know when aspects of our face, or anything that’s an immutable sensitive data point, is being collected and stored, and potentially used in a way that could be harmful against us,” she said.

Khoo recommended placing a national moratorium on the use of facial recognition technology by law enforcement agencies until safeguards can be studied and put in place.

“The moratorium would give time to look further into the issue, to launch a judicial inquiry, for example, until we can determine if it is appropriate to use facial recognition, under what circumstances, and with what safeguards and then including time to put those safeguards in place,” she said.