By Sam Pfeifle
There are few privacy principles more deeply ingrained than the ideas of notice and choice for consumers. People should be told when their data is being collected, and that data should be constrained to the uses to which those consumers consented.
Everyone knows that.
However, said Viktor Mayer-Schönberger from the IAPP Data Protection Congress keynote stage here in Brussels, “The naked truth is that informational self-determination has turned into a formality devoid of meaning and import.”
Rather than protecting the consumer, argued the co-author of Big Data and professor of Internet governance and regulation at the Oxford Internet Institute, notice-and-consent mechanisms have simply become terms consumers can either accept “or remain outside modern society.”
“It is nothing more than another hoop we all go through,” he said, “when we want services online and offline.”
Once he came to this realization, he said, after years of studying data protection laws, “I started to doubt.” Perhaps the radical changes in storage capabilities, making it unnecessary to ever discard any data, combined with the much-improved capacity for collecting data, really did represent the death of privacy, once and for all.
Further, by keeping the ineffective notice-and-consent regime in place, he said, we’re also limiting the value of data and hampering the good that could be done with Big Data applications. Previously, “data’s value never was exhausted by using it for the primary purposes for which it was collected,” Mayer-Schönberger said. “It always had latent value far beyond primary use, but it was too expensive, so we rarely bothered. It made economic sense to throw it away.”
Now, because of plummeting costs, the economics have completely changed.
“Data can, and ought, to be reused,” he argued, “unless we desire the resource wastage that we currently work so hard against with recycling in the physical world.”
However, that’s often not possible because of constraints created by specifying purpose at the time of consent. And the data that might be most valuable for social good, in fields such as health and education, is nominally the most closely protected, “while failing to protect the privacy of the subjects in the first place.”
Yes, said Mayer-Schönberger, he almost gave up on privacy.
“We don’t need to give up on privacy,” though, he realized. “Rather, what we need is a new protection mechanism. A paradigm adjustment to ensure privacy in the age of Big Data.
“What would that mechanism be?” he asked rhetorically. “My colleague Fred Cate (C. Ben Dutton Professor of Law at Maurer School of Law, Indiana University) has devised an amazing plan. Rather than focusing on notice and choice, we should focus on the use of personal data. It makes intuitive sense.”
Mayer-Schönberger referred the audience to a new whitepaper released on the Oxford Internet Institute’s website, which he co-authored, titled “Data Protection Principles for the 21st Century: Revising the 1980 OECD Guidelines” and released just in time for the Data Protection Congress.
“It’s not that the data is problematic,” he said, “but how it’s being used, especially in the context of complex data analysis. Protection for the consumer should not depend on the ability to comprehend what’s going on with her data and ability to take action.”
The idea is to hold data users accountable, whether or not they have persuaded a consumer to provide consent by clicking a button. The burden of policing misuse would shift from the consumer, who today must complain about it, to the user of the data and to regulatory bodies, which would monitor that use.
“That would require assessments of risks and harms,” he said, as well as devising safeguards and ensuring they are implemented. Further, data users should be legally liable for the risk assessment and the implementation of the safeguards.
Then, there would be enforcement by agencies provided with larger budgets and greater powers, “rather than hope the individuals would enforce their rights, which we know they rarely do.”
In return for that new regulatory burden, and its cost, data users would be permitted to reuse data for novel purposes without having to ask for re-consent every time. Rather than accountability being tacked onto privacy thinking, “make it the core mechanism of protecting information privacy in the coming decades,” Mayer-Schönberger argued.
Is this practical? The paper’s authors think so. Certainly, they understand it’s a concept that might initially seem foreign, especially to professionals who have worked in a world of notice and consent for many years.
“I confess that I initially was uneasy with these ideas,” he said. “I couldn’t permit myself to think beyond conventional mechanisms of notice and consent.
“But now I find it utterly appealing.”