Google's AI Brains Need to Work on Their Bedside Manner

Commentary May 16, 2017 at 07:12 PM

(Bloomberg Gadfly) — Britain's National Health Service probably needs a little lie-down after the past week. The WannaCry ransomware attack hit more than 50 hospitals. Surgeries, routine appointments and scans were cancelled as this role model for universal public health care was shown to have dangerously outdated tech defenses. 

Then Sky News revealed that DeepMind, a Google-owned artificial intelligence company, had been given "legally inappropriate" access to the personal medical files of 1.6 million Britons.

While unrelated, both incidents show how supremely difficult it is to digitize the vast healthcare sector. Government agencies are struggling to keep their guard up as more machines such as X-ray and MRI scanners are connected to the internet. That the publicly funded NHS can't match the hackers is no great surprise given its perennial budget crunch.

For similar financial reasons, you can see why it might want private sector help to modernize patient care, as with DeepMind. Unfortunately, it has stepped into a data minefield of a different kind: it effectively bartered information on the patients of three hospitals for DeepMind's assistance in developing health care apps.

The first was a project to help hospital staff identify patients at risk of kidney failure, by sending staff smartphone alerts when blood test results showed warning signs. While such apps may turn out to be a blessing, the NHS wasn't transparent enough about what data was being shared and how it would be used. An investigation by the national data protection regulator is under way, which could bring fines and reforms.

The backlash is unfortunate because the DeepMind and NHS collaboration has great promise. Using self-learning algorithms to analyze huge data sets is an entirely sensible way to improve and better target expensive patient care. The kidney app is just a taste of what organizations like DeepMind and the NHS could do together.

Indeed, this goes beyond health care. While Europeans often get angry about private sector encroachment on public services, AI is one area where we'll need greater collaboration between the two sides. Alongside Google parent Alphabet Inc., companies such as Facebook Inc., IBM Corp. and Baidu Inc. are pouring billions into AI research to enable everything from self-driving cars to better online translation.

They'll need access to big pots of data, some of which are controlled by governments and public sector entities such as the NHS. The biggest problem is trust, which is why incidents like the DeepMind deal are so damaging.

In areas as sensitive as health care, Google and the other Silicon Valley giants must be like Caesar's wife: above suspicion. In truth, that hasn't always been the case.

Tech companies' usual arguments that more safeguards will add cost to AI projects and slow them down simply have no place in health care. The old method of relying on miles-long terms and conditions disclaimers that no one reads won't cut it either. Given that Google and Facebook make their billions from targeted advertising, proper rules of engagement must be created to stop them from casually misusing data shared with public bodies like the NHS. And these rules need to be understood easily by the public.

The NHS has been castigated over the past week for being both too antiquated, because of the WannaCry debacle, and too adventurous, in relation to the DeepMind barter deal. What is not in question is that it needs the private sector, just as the tech giants need access to the big data of the public realm. Both sides will have to tread more carefully. 


