It is almost axiomatic today that the future of health care delivery will be dominated by gadgets and apps that track your vital signs in real time, genomic and phenotypic tests that guide treatment selection, and portable electronic health records that do away with doctors' chicken scratch and integrate seamlessly across thousands of different systems. Of course, we tend to view the future through rose-colored glasses and discount the possibility that things may not work out as we hope (electronic health records, for instance, have not brought the expected cost savings), and there will still be many kinks to work out when it comes to precision medicine.
Nevertheless, the trajectory of our health care system is clear: much of the data that will fuel the big data revolution is already in the cloud (many apps readily use it, even if it is not yet fully interoperable across systems). Whether we get there 5 or 20 years from now, the velocity matters less than the direction.
But amid all the hubbub about these new developments (like the potential Star Trek-ification of future health care diagnostics), some have cried foul over an issue that Americans take very seriously: privacy.
Imagine that, a decade from now, you wear an Apple iWatch throughout the day. It monitors everything from your blood pressure and body temperature to the number of calories you burn. The information is uploaded to your iTunes account so you can track your daily activity; it is shared with your electronic health record (maintained by your local physician, ACO, etc.) so your health care provider knows what you mean when you say "I run every day!"; and somewhere along the line, it ends up in a study of the effects of wearing the Apple iWatch on weight loss. Of course, the researchers are careful to de-identify and aggregate the data, removing zip codes, addresses, Social Security numbers, and other identifying characteristics so that no one (not the government, not your health insurer, nor any marketing agency looking to maximize its outreach) can use the data to identify an individual. Unfortunately, no matter how careful researchers are, someone is bound to make a mistake. Although the risk of re-identification (particularly post-HIPAA) is estimated to be fairly low, it will only grow as more information is digitized.
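At its simplest, the identifier-stripping the researchers perform amounts to a field-level filter over each record. A minimal sketch, using a hypothetical record schema (real de-identification under HIPAA's Safe Harbor provision covers 18 categories of identifiers, not just these four):

```python
# Minimal de-identification sketch. The field names are hypothetical;
# HIPAA's Safe Harbor rule enumerates 18 identifier categories.
DIRECT_IDENTIFIERS = {"name", "address", "zip_code", "ssn"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "Jane Doe",
    "zip_code": "02138",
    "ssn": "000-00-0000",
    "avg_daily_steps": 9400,
    "weight_change_lbs": -3.2,
}
print(deidentify(record))  # only the study variables remain
```

The catch, as the paragraph above notes, is that even records stripped of direct identifiers can sometimes be re-identified by linking the remaining quasi-identifiers against other data sets.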
Re-identification is not the only issue: the security of the records themselves, protecting against data breaches, will become ever more important. As health information moves to the cloud, to a server just as likely to be located domestically as in India or the UK, it arguably becomes harder to secure. Of course, advances such as public key cryptography (where data is locked with one "public" key but can only be unlocked with a separate "private" key), cryptographic hashing (MD5, though now deprecated in favor of stronger functions like SHA-256), and ever-increasing encryption strength help to protect against these threats. The impending digitization of health care data will make these tools ever more important, and federal (as well as state) regulatory standards will have to keep pace.
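The public/private asymmetry described above can be made concrete with a textbook RSA example. This is a toy with tiny primes, purely for intuition; real deployments use keys of 2048 bits or more plus padding schemes:

```python
# Toy RSA sketch (textbook numbers; NOT secure, for intuition only).
p, q = 61, 53
n = p * q                 # public modulus
phi = (p - 1) * (q - 1)   # used to derive the private key
e = 17                    # public exponent: the "public" key is (e, n)
d = pow(e, -1, phi)       # private exponent: the "private" key (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # anyone can lock data with the public key
plaintext = pow(ciphertext, d, n)  # only the private-key holder can unlock it
assert plaintext == message
```

The point of the asymmetry is exactly what the parenthetical says: a cloud server can hold the public key and encrypt incoming health data, while the key that decrypts it never leaves the patient's (or provider's) side.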
But let's assume for a moment that digital security will prevent nearly all data breaches, that personal information will be stripped where necessary, and that federal regulations will help, not hinder, these efforts. There still remains a question privacy advocates should worry about: permission. High-profile lawsuits against companies like Facebook (which used members' profile information to help target ads) occur because companies can be negligent (intentionally or not) about requesting their users' permission to use personal data. For the time being this isn't a major issue (there are exceptions, of course) - health insurers aren't yet scouring Facebook to help determine your premiums. But once again, as information becomes more digitized - particularly health-relevant information, such as how much you run during the week or how many calories you consume - stakeholders like insurers, employers, and the government will certainly have an interest in accessing it (whether to adjust your insurance premiums or to investigate fraudulent disability claims).
The incentive for these stakeholders to seek out personal information about you will put enormous pressure on the developers of tomorrow's gadgets and apps, particularly when it comes to using your data. To maximize both the adoption of these technologies and their benefit to society, companies will need to assure customers that they retain complete discretion over how their information is used. Privacy policies will need to be translated from legalese into easy-to-read language, so customers understand when they are signing their information away to be sold to third parties. One competitor for Qualcomm's Tricorder Prize (a competition to develop a Star Trek-like medical device), the Scanadu, offers what may very well be a gold standard for privacy:
The most recent data will be stored on your phone. But you will help us define the way data is stored for the long term. It is anonymized and encrypted. It is your data. What ever we'll do - it will always be opt-in.
You are the only person that has access to your data. You do however have the ability to share the data with doctors or even your friends and only if you want to.
It doesn't get much simpler than that. A checkbox that asks "yes or no" is all that's necessary (here is where federal regulations can help - the DOJ can issue simple "model" privacy language for developers to build on).
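The "always opt-in" principle maps naturally onto a default-deny consent model: every sharing flag starts as "no" and flips only when the user checks the box. A minimal sketch, with hypothetical field names:

```python
from dataclasses import dataclass

@dataclass
class ConsentSettings:
    # Default-deny: nothing leaves the device unless the user
    # explicitly opts in (field names are hypothetical).
    share_with_physician: bool = False
    share_with_researchers: bool = False
    share_with_third_parties: bool = False

def may_share(settings: ConsentSettings, audience: str) -> bool:
    """Unknown audiences are denied, matching the opt-in default."""
    return getattr(settings, f"share_with_{audience}", False)

settings = ConsentSettings()          # user hasn't checked anything yet
assert not may_share(settings, "third_parties")
settings.share_with_physician = True  # user ticks one checkbox
assert may_share(settings, "physician")
```

The design choice worth noting is the default: opt-in means the zero-configuration state shares nothing, which is exactly the guarantee Scanadu's language promises.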
But what about insurers? Surely, they have a right to know (and risk-adjust accordingly) if you have a genetic abnormality that predisposes you to a particular form of breast cancer. Let's set aside, for a moment, the fact that insurers are currently not permitted to discriminate based on genetics (under the Genetic Information Nondiscrimination Act of 2008). There are other ways for insurers to encourage consumers to share this kind of information without conjuring a "big brother" corporate dystopia. The car insurer Progressive, for example, lets prospective customers attach a device to their car that tracks their driving for a period of time. The information gathered can then be used to reduce their premiums, but the company promises not to use it to raise premiums (say, if you turn out to be a more erratic driver than other, similar drivers). Rather than penalizing people for sharing their risk factors, insurers can reward them - for instance, by helping to decide on the most appropriate and cost-effective course of action if a mutation in the BRCA gene is discovered.
Nevertheless, the onus will remain on the developers leading the big data revolution in health care to ensure that customers understand what they are sharing (if anything) and with whom, and, most importantly, to let customers opt out of sharing altogether. Indeed, the revolution will accelerate if consumers trust that their data is safe and secure, and that they, rather than the big bad insurer, hospital, or generic "corporate giant," are in control of it. The market, along with smart and lean government regulation, will ensure that the companies that protect their customers' privacy are the ones that succeed in the big data era.