Protecting health data has always been important due to the sensitivity of the information kept in medical records.
SkyFlok is designed to help healthcare organizations store and share files securely with their patients. We provide our clients with an incident response plan that helps preserve both data availability and privacy.
Keep your sensitive information private with SkyFlok!
In 1996, the year Congress passed its landmark health privacy law, there was no Apple Watch, no Fitbit, no Facebook support groups or patients tweeting about their medical care. Health data was between you, your doctor, and the health care system. More than two decades later, that law — the Health Insurance Portability and Accountability Act (HIPAA) — is still the key piece of legislation protecting our medical privacy, despite being woefully inadequate for dealing with the health-related data we constantly generate outside of the health care system. Now, there could be an opportunity for a revamp.
Earlier this month, Sen. Marco Rubio (R-FL) announced a data privacy bill that would direct the Federal Trade Commission to write new privacy recommendations that overrule state laws. Similarly, a prominent technology think tank, the Information Technology and Innovation Foundation, has suggested a “grand bargain” of a new federal data law that would not only preempt state laws, but entirely repeal sector-specific federal privacy laws like the Family Educational Rights and Privacy Act (FERPA), the Children’s Online Privacy Protection Act (COPPA) and, of course, HIPAA.
These proposals are in very early stages, but experts agree that having an overarching federal data privacy law makes more sense than the current mix of sector laws and state-level laws. “Data travels freely across state and continental lines, so to have a patchwork of state laws makes no sense,” says Kayte Spector-Bagdady, a bioethicist at the University of Michigan. To avoid getting in trouble, every institution needs to follow the policy of the state with the most restrictive laws. The result is that California, with its tough new data privacy law, is essentially setting the national policy.
HIPAA has played an important role in protecting patients from harm. In some cases, health information can lead to discrimination, such as higher life insurance premiums. In other situations, privacy itself is valued. You wouldn’t want a gossipy doctor or office secretary telling everyone which patient has cancer, and before HIPAA, that was far more common. “It’s hard to overstate the cultural change that HIPAA did bring about, and that’s been good,” says Margaret Riley, a health law expert at the University of Virginia. A return to those looser attitudes would cause real damage, as it often did in the past.
At the same time, the enormous amount of data we generate holds exciting potential for researchers and doctors who can use it to improve care or find new treatments and insights into disease. The key question now is how to balance the competing interests of privacy and data-sharing. “Privacy is very, very important, but if we focus only on the idea of control for the individual, we will hurt research,” says Riley. So what’s the state of health privacy law, and what would an ideal policy look like?
WHAT IS HIPAA?
Officially speaking, HIPAA blocks medical providers like doctors, nurses, and pharmacies from giving third parties “protected health information,” according to Pamela Hepp, co-chair of the Cybersecurity and Data Privacy Group at the firm Buchanan Ingersoll & Rooney. “Protected health information” means personally identifiable information related to medical conditions and treatments.
That may sound broad, but in practice, HIPAA mostly prevents very obvious medical providers from sharing very obvious pieces of medical information. The policy today is, as health policy experts W. Nicholson Price and I. Glenn Cohen argued in a recent Nature Medicine editorial, simultaneously overprotective and underprotective.
On the overprotective side, HIPAA makes it really hard to share data and do research, says Price, a professor of law at the University of Michigan. For example, hospitals can use identifiable patient information to improve the quality of their care and answer questions like, “do patients being treated in the ER at night do as well as patients treated during the day?” But if an outside academic tries to use the same information for a rigorous, controlled study, HIPAA makes that very difficult. “I’ve done some of that with my own institution, and it’s honestly working through the seven stages of hell as you try to figure out exactly what steps you have to take in order to do that,” Riley adds.
Worse, HIPAA is a “four-letter word that medical people often use when they don’t want to do anything,” according to Cohen, a professor of law at Harvard University. Doctors have used it to avoid giving patients their own medical records, which, he says, “is like working with someone to write an autobiography and then being told you can’t take that biography out of the library.”
Perhaps the biggest weakness of HIPAA, and the way that it underprotects us, is that it doesn’t cover the enormous amount of data we generate in daily life that can hold clues to our health — everything from shopping patterns to which Instagram filters someone uses to how many steps someone walks per day to the size of the pants they order on Amazon. “The notion that this data is health care and that data is not health care is just outdated,” says Riley. “I can probably tell you more about an individual’s health with their grocery list than with their patient record at this point.” And none of that is protected.
HIPAA is really about health care data more than health data, experts say, and the law focuses more on the custodians of the data (who has the data) rather than what kind of data it is, creating plenty of loopholes. It won’t let a pharmacist share your oxycodone prescription, but it will let an online shopping service tell another company that you bought a knee brace. Tech companies that work in health but aren’t officially “health care” companies — like Fitbit and Apple and the makers of all those sleep and fertility and fitness apps — have deliberately set up camp just outside HIPAA’s borders, where they can gather health-related information while staying beyond the law’s reach.
If you take an electrocardiogram (EKG) at the doctor, and the doctor puts the results into an electronic health record, that is protected by HIPAA because it’s within the health care system. If you take an EKG with the Apple Watch and don’t share that information with your doctor, that same information is not protected by HIPAA. But if you take an EKG using the new Apple Watch and share it with your doctor and she puts it in her electronic health records, it is protected by HIPAA. You see the confusion.
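The custodian-based logic described above can be sketched as a toy function. This is purely illustrative (the function name, the set of covered entities, and the custodian labels are hypothetical simplifications, not legal categories); the point it demonstrates is that the decision turns entirely on who holds the data, not on what the data is.

```python
# Toy model of HIPAA's custodian-based rule: the same EKG reading is
# protected or not depending on who holds it, not what it contains.
# All names and categories here are hypothetical simplifications.

HIPAA_COVERED_ENTITIES = {"doctor", "hospital", "pharmacy", "health_insurer"}

def is_hipaa_protected(data_type: str, custodian: str) -> bool:
    """Return True if a record is protected under this toy model.

    Note that data_type is deliberately ignored: under the
    custodian-based rule, only who holds the data matters.
    """
    return custodian in HIPAA_COVERED_ENTITIES

# The same EKG under three custodians:
print(is_hipaa_protected("ekg", "doctor"))       # True: in the health record
print(is_hipaa_protected("ekg", "apple_watch"))  # False: consumer device only
print(is_hipaa_protected("ekg", "hospital"))     # True: shared back into the system
```

The asymmetry the article describes falls out immediately: identical physiological data flips between protected and unprotected as it moves between custodians.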
A final weakness is that HIPAA doesn’t actually prevent hospitals from selling your health information if it’s been de-identified. If you strip away information like name and Social Security number and picture, you’re allowed to share the data without HIPAA restrictions, according to Spector-Bagdady. “People are concerned about other people making money and this data going to the pharmaceutical industry even if it doesn’t have their name on it,” she says. “So that ability to strip identifying information and sell doesn’t reflect what the actual general public wants.”
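The de-identification step Spector-Bagdady describes can be sketched as follows. This is a minimal illustration, not the actual rule: the field names are hypothetical, and the real HIPAA Safe Harbor standard enumerates 18 identifier categories (names, Social Security numbers, photos, dates, geographic detail, and more) that must all be removed.

```python
# Sketch of HIPAA-style de-identification: once direct identifiers are
# stripped, the remaining record can be shared without HIPAA restrictions.
# The identifier list below is a hypothetical subset of the real
# Safe Harbor categories.

DIRECT_IDENTIFIERS = {"name", "ssn", "photo", "address", "phone", "email"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "diagnosis": "type 2 diabetes",
    "medication": "metformin",
}
shareable = deidentify(patient)
# shareable retains the clinical content: diagnosis and medication
```

As the article notes, the clinical substance survives this step intact, which is exactly why the public's discomfort persists even when names are gone.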
HOW SHOULD WE THINK ABOUT HEALTH PRIVACY INSTEAD?
First, it’s extremely unlikely that any health privacy overhaul would let tech companies start reading everyone’s electronic health records. “While there is a constituency of people who would love nothing better than for HIPAA to go away, the constituency of people who think health care privacy is extremely important is much stronger,” Cohen tells me. “The idea that somebody would get up on the floor and say ‘Senator X is trying to expose your health care data’ is just too easy an ad to run. I’d be really surprised if a new policy allowed stuff to flow out of electronic health records in a way we don’t see now, to private companies.”
Cohen thinks that, most likely, a lot of the most sensitive stuff that is already protected will stay protected. If new provisions did focus on the type of data (i.e., physiological measures) instead of who has the data (Apple, the doctor), that would cover a lot more people and companies. On the other hand, broader rules would likely mean looser consent and authorization forms than under current HIPAA standards.
Both Cohen and Price say that they favor beefing up laws around the use of health data as opposed to having very restrictive laws on how the data itself can be collected or shared. “I think the potential to collect a lot of health data could make a big difference in health innovation and save a lot of lives,” says Price. “I am in favor of more downstream protections that keep the data from being used in problematic ways.”
Right now, laws like the Americans with Disabilities Act (ADA), the Genetic Information Nondiscrimination Act (GINA), and the Patient Protection and Affordable Care Act (PPACA) prevent various forms of discrimination based on medical data. But, once again, there are loopholes. GINA and the ADA don’t regulate life insurance, and GINA doesn’t protect someone against long-term care insurers. For example, if someone taking a DNA test finds out that they’re likely to have early-onset Alzheimer’s, and that information is shared with a long-term care insurer, the company can use it to change the price of a person’s policy or deny them coverage altogether. Closing these loopholes could remove many of the harms of data collection.
For her part, Spector-Bagdady agrees with the importance of strengthening non-discrimination laws, but adds that there should probably be certain types of information that companies can’t track without explicit consent because they’re so sensitive. “The kinds of data that people are particularly sensitive to are mental health-related data, genetic data, and data having to do with sexual health,” she says. “People are buying fertility apps and putting in when they had sex and what their vaginal mucus was like. People are sharing incredibly personal detailed health information.” This is the type of information most people don’t want companies to know, regardless of whether it’s used for anything.
No health privacy overhaul will be able to please everyone. People have different comfort levels about what kind of data feels personal to them, Spector-Bagdady adds. Some are willing to share everything, and some people are really private. “It’s really hard to craft laws to apply to everybody that is going to protect any given individual, so, inevitably, people are going to be disappointed,” Spector-Bagdady says. Still, a new federal privacy law could provide the opportunity to do better than HIPAA and take into account the connected world we live in today.