By Genevieve Morris

Candy Crush, Anyone?

Updated: May 8, 2019

I’m a self-proclaimed policy nerd, as evidenced by my Twitter handle. One of my favorite topics of study during my poli-sci days was behavioral economics. I found it fascinating how people make decisions, how they form opinions, and how those opinions are changed. We are far more susceptible to manipulation than we realize, or than the control freaks among us want to admit. We’ve seen the effects of behavioral economics time and again; I mean, who hasn’t played Candy Crush? (I may or may not be at a level north of 1800 on Candy Crush Saga; insert proud yet ashamed facial expression here.) Needless to say, there are numerous ways in which our behavior can be influenced, and there is no shortage of individuals and companies who have figured out how to use that influence to generate revenue.


Last week, the New York Times’ Privacy Project released an article on the use of personal data in advertising, “These Ads Think They Know You.” The article focuses on new ways that companies narrowly market to individuals, often in ways the individual doesn’t realize. To demonstrate how it works, the NYT built profiles of the people it wanted to reach with its advertising, chose attributes, and then matched all of that data up to individuals. It then created targeted ads that stated why the individual was seeing the ad. It’s fascinating to see how data about people is assembled and then used to serve up ads. Personal example: according to my mom, I have a handbag/purse problem. You would be shocked by the number of ads I see on a daily basis for handbags, and not just any handbags. There’s clearly a profile somewhere for me that indicates my preferences for certain types of bags, i.e., work-appropriate bags that can double as laptop bags. I also see a shocking number of ads for professional attire. I’m guessing Facebook knows I have a shopping problem. The way data is now being used is significantly more pervasive and intrusive in our everyday lives than it was even five years ago. When you couple this type of data with behavioral economics, we’re probably all being manipulated more than we realize. After all, I can only see so many advertisements for bags before I buy a new one (though maybe that’s just my own personal cross to bear).
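To make the mechanics concrete, here’s a minimal sketch of that kind of attribute-based targeting. This is a toy illustration of my own (in Python), not the Times’ experiment or any real ad platform’s system: an advertiser defines a profile of desired attributes, the system matches that profile against individual records, and matched individuals are served an ad that can even state why they were selected.

```python
# Toy illustration of attribute-based ad targeting. This is a
# hypothetical sketch, not any real ad platform's API.
from dataclasses import dataclass, field


@dataclass
class Individual:
    name: str
    attributes: set = field(default_factory=set)  # e.g., {"professional"}


@dataclass
class TargetProfile:
    ad_copy: str
    required_attributes: set

    def matches(self, person: Individual) -> bool:
        # Serve the ad only if the person has every attribute
        # the advertiser asked for (subset check).
        return self.required_attributes <= person.attributes


def serve_ads(people, profile):
    """Show the ad and, as in the NYT experiment, state why it was shown."""
    for person in people:
        if profile.matches(person):
            reasons = ", ".join(sorted(profile.required_attributes))
            print(f"{person.name}: {profile.ad_copy} "
                  f"(You're seeing this because your profile says: {reasons})")


if __name__ == "__main__":
    people = [
        Individual("Reader A", {"professional", "laptop_owner", "frequent_flyer"}),
        Individual("Reader B", {"student", "gamer"}),
    ]
    ad = TargetProfile(
        ad_copy="Work-appropriate bags that double as laptop bags.",
        required_attributes={"professional", "laptop_owner"},
    )
    serve_ads(people, ad)  # only Reader A matches and sees the ad
```

The matching logic itself is trivial; the unsettling part is the data pipeline that fills in those attribute sets, often without the individual ever knowing.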


So what does this have to do with health IT? The NYT article got me thinking about the implications for health data. Up until now, health data has enjoyed a lot of protections. I typically only get health-related advertisements tied to my Google searches, since most of my health data is locked away under HIPAA protections. But we’re moving into a new era. Guidance from OCR (both older guidance and new FAQs released in April) and the CMS and ONC proposed rules have all made clear that third-party applications working on behalf of patients (and that aren’t affiliated with a health system or EHR vendor) are not regulated by HIPAA and therefore don’t have to follow the same rules of the road on how data is used. It’s up to patients to figure out the terms and conditions and be wise with their choices, though if every available app has the same terms and conditions on how it uses data, patients really don’t have an actual choice to make.


But let’s set aside the basic monetization of the individual’s data for a moment. Once health data is out in the world of unregulated data use, how long do you think it will take before that data is mashed up with other data about an individual and used to push folks towards particular behaviors? Certainly, we can all see the benefit of pushing folks towards healthier behaviors (think smoking cessation). But the bigger concern here is pushing folks towards care that they perhaps do not need and that ultimately adds significant cost to the system. We already have a cost problem in our healthcare system. We swing wildly between underuse of healthcare and overuse, each with its own cost implications. When behavioral economics is combined with massive amounts of health data, it won’t be hard to surreptitiously push individuals towards particular tests, medications, procedures, care locations, etc., and it’s hard to imagine that this kind of push will be motivated by anything other than financial gain. The administration is pushing hard for price transparency and value-based payment programs to try to bend the cost curve of healthcare, which is admirable, but it frankly ignores the very real potential for increased cost to the system when its own policies put health data out into the unfettered, unregulated big-data world.


I could not be a bigger supporter of getting patients access to their own health information, and to health information for those they care for. As someone who will be the caregiver for her parents, who live in a different state, I want access to their medical information so I can help them make better decisions about their health. But we shouldn’t have to sacrifice the data protections that shield individuals from manipulative advertising tactics in order to get patients easy access to their data. The Trusted Exchange Framework and Common Agreement 2 (TEFCA 2) attempts to rein in some of this behavior by holding all parties participating in TEFCA to the same rules (regardless of their HIPAA status) and by limiting secondary use of data requested on behalf of an individual to that purpose alone. These are steps in the right direction that HHS and the FTC (which regulates apps) should try to emulate. We are dangerously close to creating a free-for-all for health data that not only threatens patient privacy but will also likely contribute to rising costs in our healthcare system. We need to have open, honest conversations about these issues and figure out how we simultaneously get patients better access to (and control over) their data while protecting that data from being used to subtly manipulate them.
