UK Hospital’s Data Deal With Google’s DeepMind a ‘Cautionary Tale’

Google’s DeepMind was given access to every patient’s complete medical history at one of England’s largest hospitals, Royal Free Hospital, which is part of the National Health Service. (BEN STANSALL/AFP/Getty Images)
Simon Veazey
3/23/2017 | Updated: 3/23/2017
Google bought artificial intelligence company DeepMind in 2014.

BIRMINGHAM, England—A deal that secretly shared 1.6 million people’s health data with Google’s DeepMind has been criticized over fears that the company could monopolize medical analysis.

The controversial deal between London’s Royal Free Hospital and the artificial intelligence company owned by Google is “a cautionary tale and a call to attention,” said the authors of a report published in the journal Health and Technology on March 16.

With enough data, artificial intelligence (AI) can learn to spot patterns that human experts miss, from security surveillance to fraud detection. AI companies like DeepMind are racing to develop medical algorithms capable of, for example, delivering precise radiotherapy treatments or assessing eye scans to diagnose potential blindness.

But because AI essentially learns on the job, partnering with health organizations and learning from their data is vital.

“Google, Microsoft, IBM, Apple, and others are all preparing, in their own ways, bids on the future of health and on various aspects of the global health care industry,” said study authors Julia Powles and Hal Hodson.

London-based DeepMind, one of the world’s most high-profile AI companies, was bought by Google in 2014 for $400 million.

In 2015, DeepMind teamed up with the Royal Free Hospital to provide instant warnings of acute kidney injury via a mobile app.

But seven months later, New Scientist magazine revealed that the public hospital had given DeepMind access to sensitive patient records stretching back five years, without the public’s knowledge.

The report by Powles, a University of Cambridge academic, and Hodson, a journalist, sparked an investigation by the U.K. Information Commissioner’s Office, which has yet to publish its findings.


DeepMind and Royal Free have revised their agreement since the controversy erupted and have rejected allegations of wrongdoing. They point to over 1,500 similar agreements between the National Health Service (NHS) in England and third-party organizations as evidence of normal practice.

The current five-year project between DeepMind and the hospital supports an app that gives real-time warnings when test results indicate acute kidney injury (AKI) in hospital patients.

The incidence of AKI could be as high as 1 in 6 in-patients, and the condition is estimated to cost NHS England more than 1 billion pounds ($1.25 billion) a year.
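The article does not describe the app’s detection logic in detail, but warnings of this kind are typically rule-based, comparing a new blood-test result against the patient’s baseline. The sketch below, in Python, is purely illustrative: the function name, thresholds, and messages are assumptions for the sake of example, not DeepMind’s or the NHS’s actual implementation.

    # Illustrative, simplified rule-based AKI alert (not DeepMind's code).
    # Assumption: a new serum creatinine result is compared against the
    # patient's baseline, with staged thresholds loosely modeled on standard
    # creatinine-ratio criteria. Names and thresholds here are hypothetical.

    def aki_alert(current_creatinine, baseline_creatinine):
        """Return a warning string for possible AKI, or None if no alert."""
        ratio = current_creatinine / baseline_creatinine
        if ratio >= 3.0:
            return "Possible AKI stage 3: alert clinical team immediately"
        if ratio >= 2.0:
            return "Possible AKI stage 2: alert clinical team"
        if ratio >= 1.5:
            return "Possible AKI stage 1: flag for review"
        return None

    # Example: 180 umol/L against a 90 umol/L baseline triggers a stage 2 alert.
    print(aki_alert(180, 90))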

But the scale of information shared in the AKI initiative raises concerns, said Subhajit Basu, associate professor of information technology law at the University of Leeds.

“DeepMind has access to every patient’s complete medical history, as well as records of every admission and discharge from the Royal Free hospital, one of the largest in the U.K.,” said Basu, in an email.

He describes three problems with the deal. “First is this sense of entitlement, that DeepMind clearly feels that they can have access to patient medical records without consent,” he said.

The second issue, he said, is secrecy. “It looks like DeepMind/Google got free access to NHS data on the back of persuasive but unproven promises of efficiency and innovation.”

“Third, what NHS did, I think, is far more objectionable. NHS had to exploit a loophole around ‘implied consent,’ which states that they do not require patients’ consent if the data is being used for direct care.”

DeepMind said the data is not used for commercial purposes and that it cannot be linked to Google services or exported from England.

DeepMind said in a joint statement with Royal Free that the latest paper “completely misrepresents the reality of how the NHS uses technology to process data” and “makes a series of significant factual and analytical errors.”

Phil Booth, coordinator at the privacy campaign organization MedConfidential, said, “They are basically refusing to admit that they could have possibly made a mistake.”

“If they admit that they got this wrong, then they are admitting that they are illegally processing the data of 1.6 million people,” said Booth.

DeepMind’s projects with other hospitals have data agreements that are more transparent and restrictive than the initial one with Royal Free. But Powles and Hodson’s report raises concerns that the deals could lead to a monopoly over health analytics in the U.K.—and internationally.

“We do know that DeepMind will keep all algorithms that are developed during the studies,” wrote Powles and Hodson in their report.

Basu agrees. “We now understand that they need NHS data to train their medical AI algorithms. We are seeing monetization of U.K.’s health data.”

“In principle, I am not opposed to the use of innovative technologies for health care, but when it is to the benefit of one private company, without the say of patients, we are starting down a worrying road.”

Basu says we are in the middle of two scientific tidal waves: medical advancement and unprecedented data-processing power.

“When we put these two together, we will get external systems that can monitor and understand our feelings much better than what we can by ourselves. But is that a problem? Once Big Data systems start to know me better than I know myself, authority will shift from humans to algorithms.”

Simon Veazey is a UK-based journalist who has reported for The Epoch Times since 2006 on various beats, from in-depth coverage of British and European politics to web-based writing on breaking news.