Microsoft Provides Australian State Police with Object Recognition Services For Surveillance

Objects and faces can be recognised in a crowd using artificial intelligence and machine learning. (Shutterstock)
Daniel Khmelev
June 9, 2021 | Updated: June 15, 2021

Microsoft has announced it will provide Australia’s New South Wales (NSW) Police Force with its object recognition technology to speed up the state’s analysis of surveillance footage.

Under the state police’s older systems, CCTV footage and other evidence gathered during investigations was stored on local servers and had to be reviewed manually by officers, a time-consuming process.

The new system sends footage to the “cloud”, in this case Microsoft’s own servers, where Azure Computer Vision, a service built on artificial intelligence (AI) and machine learning (ML), identifies objects linked to suspects.
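The article does not detail how the NSW Police pipeline is built, but Azure Computer Vision is a publicly documented image-analysis service. A minimal sketch of object detection with its Python SDK might look like the following; the endpoint, key, and frame URL are placeholders, not details of the police deployment.

```python
# Hypothetical sketch of object detection with Azure Computer Vision.
# The endpoint, key, and image URL below are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "<your-api-key>"  # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Request detected objects and descriptive tags for a single frame.
analysis = client.analyze_image(
    "https://example.com/frame_0001.jpg",  # placeholder frame URL
    visual_features=[VisualFeatureTypes.objects, VisualFeatureTypes.tags],
)

for obj in analysis.objects:
    box = obj.rectangle
    print(f"{obj.object_property} ({obj.confidence:.2f}) "
          f"at x={box.x}, y={box.y}, w={box.w}, h={box.h}")
```

Each detection carries a label, a confidence score, and a bounding box, which is what makes footage searchable by object rather than by eye.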

Gordon Dunsford, Chief Information Technology Officer for NSW Police, said the process accelerated investigations, freeing officers to do more frontline police work.

“Using computer vision, it can search to recognise objects, vehicles, locations, even a backpack someone has on their back or a tie a gentleman is wearing,” Dunsford said. “It’s significantly sped up investigations and has helped police to get a result in a fraction of the time.”
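In practice, a search like the one Dunsford describes reduces to filtering per-frame detections by label. A hypothetical helper, reusing the client from the sketch above, might scan a batch of frame URLs for a target object such as a backpack; the function name, frame list, and confidence threshold are illustrative assumptions rather than NSW Police code.

```python
# Hypothetical helper: scan frames for a target object label such as
# "backpack". The threshold and frame URLs are illustrative assumptions.
def find_object_in_frames(client, frame_urls, target_label, min_confidence=0.6):
    """Return (frame_url, confidence) pairs where the target object appears."""
    hits = []
    for url in frame_urls:
        analysis = client.analyze_image(
            url, visual_features=[VisualFeatureTypes.objects]
        )
        for obj in analysis.objects:
            if (obj.object_property.lower() == target_label.lower()
                    and obj.confidence >= min_confidence):
                hits.append((url, obj.confidence))
    return hits

# Illustrative usage with placeholder frame URLs:
# matches = find_object_in_frames(client, ["https://example.com/f1.jpg"], "backpack")
```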

According to Microsoft, in one case NSW Police collected 14,000 pieces of CCTV footage for a murder and assault investigation, and analysis that would normally require weeks or months was completed in just five hours.

Signage of Microsoft in New York City, United States, on Mar. 13, 2020. (Photo by Jeenah Moon/Getty Images)

“Detectives were able to then within days piece together the time sequence of events, movements and interactions of the person of interest as well as overlay this onto a geospatial platform, visualising the data for detectives and aiding in the preparation of the brief of evidence for Courts,” Microsoft said in a press release.
News of the sale in Australia comes after Microsoft, along with Amazon and IBM, said it would not sell facial recognition technology to police in the United States until strong federal regulation governing its use had been enacted.

Australian Human Rights Commission Recommends Banning Facial Recognition

The Australian Human Rights Commission (AHRC) released its 2021 Human Rights and Technology Final Report last week, recommending that the government ban facial recognition and other biometric technologies until federal and state governments introduce regulatory legislation.

“Australian law should provide stronger, clearer and more targeted human rights protections regarding the development and use of biometric technologies, including facial recognition,” the report stated. “Until these protections are in place, the Commission recommends a moratorium on the use of biometric technologies, including facial recognition, in high-risk areas.”

In particular, the report highlighted risks posed to individuals’ right to privacy, as well as the chance of racial bias, which it said could increase the risk of injustice and human rights infringements.

“This necessarily affects individual privacy and can fuel harmful surveillance. In addition, certain biometric technologies are prone to high error rates, especially for particular racial and other groups,” the report said.

Microsoft said the solution had been designed with “ethics front and centre,” and did not utilise real-time facial recognition technology.

“The solution uses Azure Computer Vision to identify objects, not faces that assist with police cases,” a Microsoft spokesperson told The Epoch Times. “This video data is not live data and is captured under police process with a warrant.”
Correction: A previous edition of this article said Microsoft had offered facial recognition software to the Australian police. This was incorrect, and The Epoch Times apologises for the error.