Editor’s Note: Marie Cocco is president of Seven Mile Communications, a consultancy specializing in health care communications. She has previously worked in the health insurance industry, in public health and as a journalist specializing in health care. The views expressed here are her own.
Seeking help for substance abuse. Monitoring your glucose levels. Signing up to get therapy through virtual visits. Sharing symptoms to a portal that sets up a doctor visit. Ordering prescriptions online.
There’s an enormous trove of personal health information people now feed or tap into digital monitors, health apps, search engines and other online tools. If the same information were provided in your doctor’s office, your privacy would be safeguarded. If you’ve ever sat in a doctor’s waiting room filling out a multi-page questionnaire about your health status and history, you get the picture. But that’s not how the digitized health world works.
Instead, we have an ecosystem of abuse: the technology companies that have become central to how people access health care or monitor their health operate largely outside the federal law that requires doctors, other medical personnel, hospitals and insurers to protect an individual’s intimate health information.
That means tech companies can – and do – mine your digital data for clues about your health status, accessing information like prescriptions you have purchased and other health services you might have sought, and potentially link this information to your name, address, email address and other personally identifying information. The data can then be used by platforms including Facebook and Google to help advertisers target promotions or other communications to you.
It’s a gaping hole in health privacy protections that stems from the narrow and outdated 1996 law, the Health Insurance Portability and Accountability Act (HIPAA). The law protects interactions between patients, medical professionals and insurers but does not, in most cases, protect patient health data that is recorded on new technologies.
Closing the patient privacy gap can – and should – be a bipartisan priority for Congress. The consequences of digital exposure for those seeking abortions or other reproductive services since Roe v. Wade was overturned have drawn significant concern and attention. These worries are legitimate. But reproductive care is only one area of health services where private patient information is digitally disclosed.
A recent joint investigation by the health news outlet STAT and The Markup, which probes the intersection of institutions and technology, found that trackers from big tech companies were attached to 49 of 50 telehealth sites. The trackers have been feeding information to platforms including Facebook, Google, Snapchat, TikTok and even the professional networking site LinkedIn. On 35 of the 50 digital health sites, the news organizations found trackers that were sending individually identifying information – including names, addresses and phone numbers – to at least one big technology company. The companies gave varied responses to the investigation, with LinkedIn saying it deletes sensitive information and doesn’t add it to individuals’ profiles, but most declaring that advertisers are responsible for ensuring they aren’t sending sensitive information via the digital tools.
Last month, researchers at Duke University’s Sanford School of Public Policy reported that some data brokers are marketing sensitive mental health information, with no clear consumer privacy protections or even a set of best practices to guide the industry.
In the fall, it was revealed that Meta Pixel, a tracking tool from Facebook’s parent company, was installed in the technology systems of 33 of the top 100 hospital systems in the country. The pixel was attached in some cases to password-protected patient portals used for scheduling appointments. In response to a letter from Virginia Sen. Mark Warner, who inquired about the pixel issue, Meta said that when its systems detect and filter out potentially sensitive data coming from a website or app, it contacts the app developer and asks them to evaluate their implementation of the tool.
For years, medical ethicists and health policy experts have warned of the growing patient privacy threat that results from the way advances in digital communications have leapt outside the bounds of the 27-year-old HIPAA law. A handful of states have acted to allow consumers to stanch the flow of their private information online.
But digital information does not respect state borders. Federal action is urgently needed.
Some proposed legislation, notably a measure sponsored by Sens. Bill Cassidy of Louisiana and Jacky Rosen of Nevada, would close this glaring privacy gap. It would prevent entities that collect consumer health information from transferring, selling, sharing or allowing access to that data, including any individually identifiable health information collected on personal health trackers.
It is a welcome first step that would address some of the most troubling practices that have come to light. Still, it would leave consumers’ health information exposed through “digital dust” they are leaving online through activities like searches and health-related purchases, Matthew McCoy, assistant professor of medical ethics at the University of Pennsylvania, told me.
Lawmakers are continuing to consider broader digital privacy protection legislation that extends new safeguards to areas well beyond health care information. But this larger measure has failed to advance for an array of reasons.
Eventually, Congress must take action to protect the privacy of people who are now online for all manner of personal, professional and other reasons. Until then, a narrowly targeted approach that protects health privacy may be politically easier to enact. The explosion of digital health technology, the dramatic increase in its use precipitated by the Covid-19 pandemic and the growing evidence of unrestrained tech industry intrusion all require it.