Old School Privacy is Dead, But Don’t Go Privacy Crazy
When I have the occasion to drive the kids to school, our music selections range almost as widely as our breakfast choices—some Christian, some country and some 80s, to which I alone know the lyrics. Recently, a particularly funny, somewhat concerning country song, “Redneck Crazy” by Tyler Farr, caught my attention. The song includes the following line, “You done broke the wrong heart baby ... drove me redneck crazy.”
I find these lyrics useful in two discrete circumstances. First, my daughters are 17, and I find great joy in potential male suitors seeing me as “redneck crazy.” I plan to download the song and play it on an endless loop whenever they’re around. Trust me on this. If a 17-year-old boy observes a somewhat unstable dad endlessly playing this song, it’s as good as cleaning the shotgun at the kitchen table.
The second circumstance is perhaps more relevant for this audience. It strikes me that this is how I see many people react in response to those who would dare to collect and use their personal data: “You done used the wrong data baby, drove me PRIVACY crazy.” And like the 17-year-old boys at my door, I’m a bit wide-eyed looking at the “privacy crazies” and wondering if I should step into the room, or run fleeing, leaving the innovators at the threshold. But we who can occasionally go privacy crazy need to face something: Privacy is dead.
There is nothing left to debate. Our old-school privacy, as we’ve known it for decades, is dead and buried.
But, there’s good news. Even great news. Some things change for the better. I never watch “live” television anymore and no one in our generation can likely tell you the names of the nightly news anchors. But I watch Netflix and ESPN3 (mobile version) and HBOgo, and I love it. Same with the VCR. I haven’t “video cassette-recorded” for a decade. But boy do I love our whole-house DVR. IU basketball on the DVR is amazing.
These were great changes. And that’s what I believe to be true about privacy. It’s changing for the better.
If your notion of privacy is defined by your personal control over all of the data about you, well, you’re privacy crazy and I have tragic news: That privacy is lost. And no amount of EU directives, HIPAAs or other central, user control-based regulations will change this fact. These types of regulations, which default to all “use” of data being impermissible unless authorized by the individual, are trying to protect a version of privacy that no one really wants—not if we have to go back to using VCRs and flip phones. That brand of privacy isn’t just culturally undesirable and impossible to maintain; the regulations that attempt to enable it are unethical.
The central control regulations force individuals to exercise every precaution (which we don’t, because we feel like we should be able to trust reputable companies), scrutinize every privacy notice (no one reads privacy notices, not even those among us who draft them) and then only provide our consent either through our actions or inactions in circumstances we deem wise given the precautions (we haven’t taken) and the content of the notices (we haven’t read)—and failing to make an intelligent decision, we forfeit our privacy. That’s as absurd as requiring my teens to use a cassette tape in the VCR to record the latest American Idol episode. To use my kids’ lingo, that’s sketch.
And the answer isn’t clearer, layered, surveyed, animated or chip-embedded notices. Society and culture have moved on, and as privacy experts we had better at least catch up to the tail end. Stop the notice-requiring legislation and regulations. We don’t read the notices, and you can’t make us. So don’t pass more regulations that force us to read notices and give consents we won’t understand.
But, the good news is that, like TV and VCRs, our parents’ brand of privacy is being replaced by a better, more sustainable and meaningful privacy. This new and improved privacy is privacy that really matters. It’s a privacy that doesn’t depend on security breach fear to get internal buy-in and budget because its value is not defined by the latest data security breach. This is privacy that enables innovation. Privacy that, if it’s violated, actually results in real harm to us, harm we can demonstrate.
This new generation of privacy is just beginning to get defined. And none too soon. With the “Internet of Things,” Big Data and constant innovation, the old privacy never had a chance. I don’t read notices on the apps I download; I can’t imagine trying to read them on the grocery aisles I walk down, the cars I drive, the doors I open, the thermostat I set or the pills I swallow. These innovations will bring wonderful convenience, efficiency and quality to our lives, and these uses will not harm our privacy, because we will define appropriate uses of the data.
And while we may not think any protections exist, the marketplace is a powerful force. Bad ideas get sniffed out, panned and abandoned. It happens all the time. We simply need to understand and harness the market forces and translate them into reasonable use standards. Society, in its common culture and through its market actions, understands appropriate use—just as it understands the difference between annoying data collection by marketers and serious data collection by the NSA.
But now, as we start down this new privacy road, we need the deep privacy experts to step through the door, face the old-school privacy crazy, and take the innovators by the hand. These privacy experts among us need to help the innovators understand their obligations relative to real privacy and data security, show them how to use data responsibly and enable them to innovate with that in mind. We also need to step into the policy fray and help define appropriate and responsible use. Help define real harm. THIS is going to be fun. This is a worthwhile pursuit. This can help save lives and drive economies.
It won’t be all country music and 80s rock, however. We will make mistakes.
That’s the caution we need to bear in mind. As we define what we really care about in privacy—appropriate use—and the harms that really should drive our policies, we need to make sure we get it right. To do this, we need full inclusion of all of the stakeholders—not just those who scream the loudest. In healthcare, that means including the patients and patient advocates, the innovators, researchers, healthcare providers, industry folks and those who care about and understand privacy. We don’t just have cassette tapes and network news anchors to lose. We have economies, innovation, real new-generation privacy and, ultimately, our way of life at stake. This debate is real, and this opportunity is why we’re in this profession.
About the Author
Stan Crosley is director of the Indiana University Center for Law, Ethics and Applied Research (CLEAR) in Health Information, counsel to Drinker Biddle & Reath as well as a principal in Crosley Law Offices, LLC. He is the former chief privacy officer for Eli Lilly and Company, where he initiated Lilly’s global privacy program in 1998. The program received the 2007 Innovation Award from the IAPP. Crosley also co-founded and served as chair of the International Pharmaceutical Privacy Consortium and was a member of the IOM Medical Research and Privacy Committee. He serves on the boards of Indiana Health Informatics Technology, Inc., the IAPP and The Privacy Projects. Crosley is a member of the Brookings Institution’s Active Surveillance Implementation Council, which is providing guidance to the U.S. Food and Drug Administration (FDA) on issues surrounding medical product surveillance and the FDA’s Sentinel project. He is a member of the board of Shepherd Community Center, dedicated to breaking the cycle of poverty on Indianapolis’s east side, and is active in his church and community. You can find him on Twitter @crozlaw.