Home Office ‘Exploring’ Use of AI to Detect and Prevent Rape and Sexual Assault

Policing minister says use of the technology is currently being investigated, but security experts are sceptical of its use in preventing sexual attacks.
A model poses as a complainant waiting to be seen by a doctor at a specialist rape clinic in Kent, England, on Jan. 31, 2007. (Gareth Fuller/PA)
Patricia Devlin
9/26/2023

The Home Office is “actively exploring” using Artificial Intelligence (AI) to “prevent and detect” rape and sexual assault.

Policing minister Chris Philp said the department is working with both government and “operational partners” in a bid to use the technology to aid police in combating serious sex crimes.

Speaking on Monday, Mr. Philp said investigating use of AI was part of the government’s “key priority” of protecting women and girls from violence.

Responding to a written question on the advanced tech’s use in crime fighting, Mr. Philp said: “The Home Office is working across government and with operational partners to develop our understanding of the threats and opportunities presented by artificial intelligence.

“The Home Office is also actively exploring and investigating options to use AI to both prevent and detect crime, including rape and sexual assault.”

The minister gave no further details of how the AI would be used alongside police work.

However, his comments have been met with scepticism from safety and security experts who said the advanced technology is unlikely to protect victims—or catch predators.

Opportunist Attacks

Jacqueline Davis, a former police officer who now heads up UK security consultancy firm Optimal Risk, told The Epoch Times that a high number of sexual assaults are carried out by “opportunists.”

“I find it hard to understand how AI could be used to predict or prevent opportunist attacks like these,” she said.

“These people will take advantage of someone out jogging with earplugs in, someone walking down the street glued to their phone late at night—they target women who they see are vulnerable and not fully in tune with what’s happening around them.

“That could be anywhere, at any place at any time.”

Ms. Davis, a former protection officer for the Royals who also runs safety classes for women, said more focus needs to be placed on helping women protect themselves.

“People say a woman should be able to walk down the road wearing what she likes, doing what she likes. In an ideal world, wouldn’t that be lovely? But the reality is you can’t.

“I’m not victim blaming here, but women really need to be more aware of what’s going on around them.

“I run a course called ‘Stay Safe, Be Aware’ and it is teaching women about being aware of what is around them. I’m not quite sure how AI technology is going to make that any better.”

AI (Artificial Intelligence) security cameras with facial recognition technology are seen at the 14th China International Exhibition on Public Safety and Security at the China International Exhibition Center in Beijing on Oct. 24, 2018. (Nicolas Asfouri/AFP via Getty Images)

Facial Recognition

Last month, the Home Office outlined its ambitions to increase the use of controversial facial recognition technologies within policing and other security agencies to track and find criminals.

The market exploration document outlined ambitions to potentially deploy new biometric systems nationally over the next 12 to 18 months.

It called for submissions from companies offering technologies that “can resolve identity using facial features and landmarks,” including live facial recognition, which involves screening the general public for specific individuals on police watch lists.

In particular, the Home Office highlighted its interest in “novel artificial intelligence technologies” that could process facial data efficiently to identify individuals, as well as software that could be integrated with existing technologies deployed by the department and with CCTV cameras.

Privacy campaigners have previously criticised the technology for being inaccurate and biased.

Facial recognition software has been used by South Wales Police and London’s Metropolitan Police over the past five years across multiple trials in public spaces including shopping centres, during events such as the Notting Hill Carnival and, more recently, during the coronation.

MPs have previously called for a moratorium on the use of facial recognition on the general population until clear laws are established by parliament.