Opinion

Consent and Personal Control Are Not Things of the Past

Note from the Editor:

The authors present this post in response to the arguments of Viktor Mayer-Schönberger in "Data Protection Principles for the 21st Century," as reported by Sam Pfeifle from the IAPP Data Protection Congress, on December 12, 2013, in the article “Forget Notice and Choice, Let’s Regulate Use.”

We will be releasing a white paper early in the new year challenging the view that data subjects' consent and personal control over their data are things of the past—they are not. In fact, in the wake of Edward Snowden's revelations, we are witnessing the opposite: a resurgence of interest in strengthening personal privacy.

There is no question that the field of Big Data and data analytics is growing exponentially, which in turn is leading to new challenges with respect to data privacy. At the same time, strong solutions have been proposed and are being deployed in Big Data analytics contexts. That these solutions are not yet widely used only means that we need to redouble our efforts to transition to best practices rather than abandon ship. One solution involves applying strong de-identification measures at the earliest opportunity to remove the harm of having personal identities linked with the data, thereby enabling extensive data analytics to be performed on non-personally identifying data. This can be done at the point of collection or at the first use of the collected data. Other solutions may also be pursued in the form of encrypting personal identifiers or aggregating datasets.
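To illustrate what de-identification at the point of collection can look like in practice, here is a minimal sketch of one common approach: replacing a direct identifier with a keyed-hash pseudonym before the record ever reaches the analytics environment. The field names and key handling are hypothetical, invented for this example; real deployments would use proper key management and address quasi-identifiers (age, postal code, etc.) as well.

```python
import hmac
import hashlib

# Hypothetical example: the secret key would be held outside the
# analytics environment, so analysts cannot reverse the pseudonyms.
SECRET_KEY = b"example-key-kept-outside-analytics"

def pseudonymize(identifier: str) -> str:
    """Return a stable, non-reversible token for a personal identifier
    using HMAC-SHA256 keyed with a separately held secret."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# A record as it might arrive at the point of collection
# (field names are invented for this sketch).
record = {"patient_id": "A-12345", "age_band": "40-49", "diagnosis": "E11"}

# Replace the direct identifier before the record enters analytics.
record["patient_id"] = pseudonymize(record["patient_id"])
```

Because the same input always maps to the same token, longitudinal analysis across records remains possible without exposing the underlying identity—one way the positive-sum outcome described above can be achieved.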

To suggest that Big Data’s entry into the world of personal data must inevitably lead to the obliteration of Fair Information Practices—which form the basis of virtually all privacy laws around the world, and which will be further strengthened in the forthcoming EU Data Protection Regulation—is sadly misguided. Yes, Big Data will lead to invaluable findings, but this need not happen at the expense of privacy. Privacy by Design rejects such dated zero-sum thinking in favour of doubly-enabling positive-sum solutions.

We must also not overlook public sentiment. To argue that the public would readily accept surrendering all control of their personal information to private sector companies and to the government would be a colossal misread of the public's views. There is no evidence that legislators and the public are prepared today to cast aside their existing privacy interests. In fact, there is growing intolerance of data breaches and privacy infractions (with specific reference to unacceptable Big Data pursuits). We need changes that will increase public trust—eroding personal control is unlikely to be one of them.

Applying the appropriate tools and methods, given the context of data collection and use, makes eminently good sense. This means that we need a toolbox containing many tools. Controls at the point of collection and controls at the point of use may be suitable at different times, and for different reasons. Both are valuable measures, but shrinking the toolbox limits the capacity of privacy professionals to address complex data analytics situations.

Our paper will argue that privacy does not impede innovation—quite the contrary, it breeds innovation and creativity! New methods will be discovered whereby the value derived from Big Data and data analytics will be achieved with privacy embedded directly into the design process. Privacy by ReDesign may also be used on existing datasets to de-identify personal identifiers before submitting the data for use in analytics.

In our upcoming paper, we will point to recent developments that are serving to strengthen Fair Information Practices and privacy interests—not only the Snowden revelations, but also the growth of the Personal Data Ecosystem, the determinations of the European Parliament's LIBE Committee, and the forthcoming EU Data Protection Regulation. All of this will strengthen the resolve to pursue Fair Information Practices, not the reverse. Never before have so many demanded that their right to privacy, and in turn their freedoms, be respected.

About the Author

Ann Cavoukian is Ontario's Information and Privacy Commissioner, appointed in 1997 and currently serving her third term. She joined the Office of the Information and Privacy Commissioner in 1987, during its start-up phase, as its first Director of Compliance. In 1990, she was appointed Assistant Commissioner. She is perhaps best known for her mantra of “Privacy by Design.” 

Alexander Dix was elected as Commissioner for Data Protection and Freedom of Information by the Berlin State Parliament (Germany) in June 2005. Previously he had been Commissioner in the State of Brandenburg for seven years. 

Khaled El Emam is an Associate Professor at the University of Ottawa, Faculty of Medicine, a senior investigator at the Children’s Hospital of Eastern Ontario Research Institute, and a Canada Research Chair in Electronic Health Information at the University of Ottawa.

Comments

  • January 09, 2014
    Richard Beaumont
    replied:

    I agree that whilst there is room for looking at all aspects of privacy control, now is not the time to abandon notice and consent.
    Instead much could be done to improve it.  In particular, what is needed is pressure to make notice more meaningful, and understandable to the average consumer. Notices need to move away from being solely about protection for the data controller, towards informing the data subject.
    Perhaps what is needed, in fact, is a legal challenge to long, legalistic privacy policies on the basis that the average consumer cannot understand them, and therefore is unable to give valid consent.

  • January 09, 2014
    Malcolm Crompton
    replied:

    As is so often the case, the answer lies somewhere in between.

    I don’t think anybody is suggesting that notice and choice be done away with. 

    But we have to face some realities.  We are human. 

    Each of us can only make a finite number of decisions in a day.  As one of my colleagues said to me years ago, “I can only make a finite number of decisions in a day.  What matters is that I make those decisions on things that matter”.

    This hard biological fact is what has been abused in the current digital world in which we live.  Notice and choice is being used so often, usually on the basis of a Privacy Policy or T&Cs that in effect say ‘if you want to use this service, you have no choice but to agree to us collecting as much information as we can, keeping it forever, using it for anything we like, and selling or sharing it with anybody.  Forever’.  The result is meaningless.

    The solution to that isn’t to add to the customer burden with even more notice and choice. 

    Nor is the solution to take away notice and choice. 

    What is needed is a world in which defaults are appropriately set, the organisation, not the individual, does most of the work, the organisation acts much more in the privacy interests of the individual, and the organisation, not the regulator, meets the bulk of the costs of compliance.

    Then the individual will be able to exercise choice where it matters, which is something we can all draw upon.

    Outside of privacy, there are any number of analogies. But two will do for the moment:

    1. A motor vehicle with automatic gear change is a system where most of the time, the vehicle makes the gear change decision which means the driver can focus on things that matter more to the driver, such as what music station to listen to.  But at any time, the driver can override auto and can select the gear that matters for the moment.  Note, though, that most of the time most drivers don’t do this.  But for example, only the driver knows when the vehicle needs to be put into Reverse instead of Forward - it will be a long time before vehicles get that one right.

    2. Corporations law in most parts of the world requires financial information to be managed and disclosed to a standard that must be assessed regularly, against well-developed standards, by an accredited third party.  In other words, an industrial-grade audit process.  And very importantly, the cost of this process is a cost of doing business for the corporation.  The regulator is only a backstop when this goes wrong.  By contrast, when it comes to personal information, the regulator (DPA) is in effect the first-line third-party compliance body, and will never be resourced sufficiently to do the task, just as the corporations regulators aren’t.

    We need to work on a framework that works and makes a difference.

  • January 10, 2014
    Trent
    replied:

    The risk is that the ‘appropriate standard’ will be determined by the data collectors and sellers, who will keep hoovering up data while asserting (correctly) that they comply with the standard that they defined, and that they are open and transparent about how they meet that standard. And hey, we get ads about things we might want to buy!

    Continuing the car analogy, it might be like the algorithm for gear-shifting being determined by car and gas companies in a way that ensures I use up a little more gas and put more wear on my engine than I really need to, while they tell us that they’re doing us a favour.

    Manual transmission is great if you want full control. It’s hard to exercise that control properly if the rpm dial is hidden on a dashboard of thousands of dials, just like it’s hard to give meaningful consent to a 40,000 word privacy policy.
