We’ve recently seen much breathless news coverage of the Nightingale Project, Google’s half-secret partnership with Ascension, the second-largest healthcare system in the United States.
The details of the project — which involves sharing the healthcare data of tens of millions of unsuspecting patients — have raised significant concerns.
The concerns have centered on issues that are by now familiar: Many people are uncomfortable with Google knowing about their personal, sensitive, and potentially embarrassing health complaints. We worry that Google employees can read all about us.
The reality is both less and more worrying.
Here’s the good news: Google doesn’t really care about individual patients. The bad news? It uses partnerships like this to build artificial intelligence algorithms that eventually might recommend raising your health insurance premiums.
This controversy raises three crucial issues: what Google is using this data for, whether this use is legal, and what you can do to keep your data from being used in this way.
Why Does Google Want My Healthcare Data?
Despite the implications in some of the news coverage of the recent Nightingale controversy, it’s important to realize that Google is not collecting health data in order to sell you products for your bad back. In fact, provisions in HIPAA (the Health Insurance Portability and Accountability Act) explicitly bar the company from using the data that way.
Instead, the goal of partnerships like this one is more ambitious: to build AI algorithms that can predict the future healthcare needs of the general population. To do that, Google had to overcome a problem. For AIs to learn to make such predictions, they must be trained on vast datasets. Google has the expertise to build AI systems, but it still needs access to a massive quantity of patient data.
Google has tried several ways of getting access to healthcare data before. It has conducted medical studies via its own apps, and its US$2.1 billion acquisition of Fitbit was motivated in part by the healthcare data of Fitbit’s huge user base.
However, Google’s partnership with Ascension (and similar arrangements with several other healthcare providers) gives the company access to its single biggest source of healthcare data yet.
No matter how uncomfortable this kind of data collection makes us feel, it could have significant positive effects on health outcomes. AIs could be used to make reliable predictions of the onset of preventable diseases, for instance, or to conduct large-scale automated reviews of health interventions in order to guide future treatments.
These are noble aims, of course. The problem is that AIs like those being developed by Google are just as likely, if not more likely, to be used for two other purposes: for Google to make money by selling them to other healthcare providers, and for insurance companies to estimate and potentially raise premiums.
This is to say nothing of the risk that the records might be leaked, hacked, or stolen. In short, while Google might have the best of intentions, the widespread dissemination of healthcare data raises significant legal and ethical issues.
Is This Legal?
In the context of these concerns, many analysts have begun to reconsider the legal basis for health information storage and sharing. In the U.S., HIPAA provides the regulatory framework. Under the act, patient records and other medical details can be used “only to help the covered entity carry out its healthcare functions.”
On the surface, this is exactly what the Nightingale Project aims to do. That is why, despite all the op-eds published in the past month, no one has claimed that it is illegal. Whether that makes you feel better about your records being shared is another matter.
Indeed, as Dianne Bourque, an attorney at the legal firm Mintz, told Wired earlier this month, “If you’re shocked that your entire medical record just went to a giant company like Google, it doesn’t make you feel better that it’s reasonable under HIPAA, but it is.”
Worrying about the raw legality of the project doesn’t address the range of issues adequately, either. HIPAA, for all its legal force, seems totally outdated when it comes to the kind of mass data collection now possible.
Patient consent is not required. Many patients find it difficult to access their own records, while commercial operations seem to have no trouble scooping them up and subjecting them to analysis.
It is apparent that Google is aware of these issues: It took a whistle-blower to bring the project into the public realm, and Google has since deployed significant resources on reputation management, reassuring the public that it is not doing what critics allege.
What Can I Do About It?
If the views of patients are out of sync with the current law, perhaps it is time to replace HIPAA. This strategy, however, does not help people whose healthcare data has already been shared with Google.
It’s difficult to recommend a strategy that would keep your personal data from being shared in this way. Sadly, the best you can do is to ensure that you don’t give up any more data than you have to.
You can achieve that in a variety of ways. Audit your online accounts regularly to ensure they have not been compromised. If you access your medical records online, do so over a secure connection, ideally through a reputable virtual private network; good VPNs encrypt your traffic with strong ciphers such as 256-bit AES. Pro tip: Read the fine print on a provider’s logging practices. The less it logs, the better.
Above all, keep the software on your Fitbit (or similar device) up to date to ensure it is not spying on you. Spyware in the Internet of Things was one of the major cybersecurity concerns of the past year, and it likely will be one of the most lucrative illegal sources of healthcare data over the next decade.
In short, while consumers already may have lost the privacy war, and while HIPAA is of no help when it comes to Google collecting your health data, don’t give away any more than you have to.
Anything online is at risk, and any sense of total security is an illusion.