
The Private Sector Steps In to Protect Online Health Privacy, but Critics Say It Can't Be Trusted

Most people have at least a vague sense that someone somewhere is doing mischief with the data footprints created by their online activities: Maybe their use of an app is allowing that company to build a profile of their habits, or maybe they keep getting followed by creepy ads.

It's more than a feeling. Many companies in the health tech sector, which provides services ranging from mental health counseling to shipping attention-deficit/hyperactivity disorder pills through the mail, have shockingly leaky privacy practices.

A review by the Mozilla Foundation found that 26 of 32 mental health apps had lax safeguards. Analysts from the foundation documented numerous weaknesses in their privacy practices.

Jen Caltrider, the leader of Mozilla's project, said the privacy policies of apps she used to practice drumming were scarcely different from the policies of the mental health apps the foundation reviewed, despite the far greater sensitivity of what the latter record.

"I don't care if someone knows I practice drums twice a week, but I do care if someone knows I visit the therapist twice a week," she said. "This personal data is just another pot of gold to them, to their investors."

The stakes have become increasingly urgent in the public mind. Apps used by women, such as period trackers and other types of fertility-management technology, are now a focus of concern with the potential overturning of Roe v. Wade. Fueled by social media, users are exhorting one another to delete data stored by those apps (a right not always granted to users of health apps) for fear that the information could be used against them.

"I think these big data outfits are looking at a day of reckoning," said U.S. Sen. Ron Wyden (D-Ore.). "They gotta decide: Are they going to protect the privacy of women who do business with them? Or are they basically going to sell out to the highest bidder?"

Countering those fears is a movement to better control information use through legislation and regulation. While nurses, hospitals, and other health care providers abide by privacy protections put in place by the Health Insurance Portability and Accountability Act, or HIPAA, the burgeoning sector of health care apps has skimpier shields for users.

Although some privacy advocates hope the federal government might step in after years of work, time is running out for a congressional solution as the midterm elections in November approach.

Enter the private sector. This year, a group of nonprofits and corporations proposed a self-regulatory project to guard patients' data when it's outside the health care system, an approach that critics compare with the proverbial fox guarding the henhouse.

The project's backers tell a different story. The initiative was developed over two years with two groups: the Center for Democracy and Technology and Executives for Health Innovation. Ultimately, such an effort would be administered by BBB National Programs, a nonprofit once associated with the Better Business Bureau.

Participating companies might hold a range of data, from genomic data to other types of information, and work with apps, wearables, or other products. Those companies would agree to audits, spot checks, and other compliance activities in exchange for a sort of certification or seal of approval. That activity, the drafters maintained, would help patch up the privacy leaks in the current system.

"It's a real mixed bag, for ordinary folks, for health privacy," acknowledged Andy Crawford, senior counsel for privacy and data at the Center for Democracy and Technology. "HIPAA has decent privacy protections," he said. The rest of the ecosystem, however, has gaps.

Still, there is considerable doubt that the private sector proposal will create a viable regulatory system for health data. Many participants, including some of the initiative's most powerful companies and constituents, such as Apple, Google, and 23andMe, dropped out during the gestation process. (A 23andMe spokesperson cited "bandwidth issues" and noted the company's participation in the publication of . The other two companies didn't respond to requests for comment.)

Other participants felt the project's ambitions were slanted toward corporate interests. But that opinion wasn't necessarily universal: one participant, Laura Hoffman, formerly of the American Medical Association, said the for-profit companies were frustrated by "constraints it would put on profitable business practices that exploit both individuals and communities."

Broadly, self-regulatory plans work as a combination of carrot and stick. Membership in the self-regulatory framework "could be a marketing advantage, a competitive advantage," said Mary Engle, executive vice president for BBB National Programs. Consumers might prefer to use apps or products that promise to protect patient privacy.

But if those corporations go astray, touting their privacy practices while not truly protecting users, they can get rapped by the Federal Trade Commission. The agency can go after companies that don't live up to their promises under its authority to police unfair or deceptive trade practices.

But there are a few key problems, said Lucia Savage, a privacy expert with Omada Health, a startup offering digital care for prediabetes and other chronic conditions. Savage previously was chief privacy officer for the U.S. Department of Health and Human Services' Office of the National Coordinator for Health Information Technology. "It is not required that one self-regulate," she said. Companies might opt not to join. And consumers might not know to look for a certification of good practices.

"Companies aren't going to self-regulate. They're just not. It's up to policymakers," said Mozilla's Caltrider. She cited her own experience of emailing the privacy contacts listed by companies in their policies, only to be met with silence, even after three or four emails. One company later claimed the person responsible for monitoring the email address had left and had yet to be replaced. "I think that's telling," she said.

Then there's enforcement: The FTC covers businesses, not nonprofits, Savage said. And nonprofits can behave just as poorly as any rapacious robber baron. This year, a suicide hotline was embroiled in scandal after Politico reported that it had shared with an artificial intelligence company data from conversations between users considering self-harm and an AI-driven chat service. FTC action can be ponderous, and Savage wonders whether consumers are truly better off afterward.

Difficulties can be seen within the proposed self-regulatory framework itself. Some key terms, like "health information," aren't fully defined.

It's easy to say some data, like genomic data, is health data. It's thornier for other types of information. Researchers are repurposing seemingly ordinary data, like the tone of one's voice, as an indicator of one's health. So setting the right definition is likely to be a tricky task for any regulator.

For now, discussions, whether in the private sector or in government, are just that. Some companies are signaling their optimism that Congress might enact comprehensive privacy legislation. "Americans want a national privacy law," Kent Walker, chief legal officer for Google, said at a recent event held by the R Street Institute, a pro-free-market think tank. "We've got Congress very close to passing something."

That could be just the tonic for critics of a self-regulatory approach, depending on the details. But several specifics, such as who should enforce the potential law's provisions, remain unresolved.

The self-regulatory initiative is seeking startup funding, potentially from philanthropies, beyond whatever dues or fees would sustain it. Still, Engle of BBB National Programs said action is urgent: "No one knows when legislation will pass. We can't wait for that. There's so much of this data that's being collected and not being protected."

KHN reporter Victoria Knight contributed to this article.