
Privacy Perspectives | Big Data’s Thirst Is Driving Change in Minimization Philosophy

The recent National Security Agency (NSA) revelations demonstrate a broader trend: a retreat from minimization at the point of collection and a move toward minimization at the point of use. If you trust the collector not to break the rules, a collect-first, minimize-later privacy model shouldn’t raise privacy concerns, but recent reporting by The Washington Post has shown what happens when the collector becomes distrusted.

First, let’s not pretend there aren’t plenty of tools for guidance on when to apply minimization. The Fair Information Practices from the U.S. Department of Health, Education and Welfare (1973) require collection limitations; the Privacy Guidelines from the Organisation for Economic Co-operation and Development (1980) likewise require collection limitations; Article 6 of the EU Data Protection Directive 95/46/EC (1995) addresses data minimization by specifying that personal data must only be “collected for specified, explicit and legitimate purposes”; and the White House's Consumer Privacy Bill of Rights (2012) includes a right to focused collection, which places "reasonable limits on the personal data that companies collect and retain." Clearly, data minimization at the time of collection is broadly recognized as a fundamental privacy principle.

So what has changed?

Why, after decades of focus on collection as the point of minimization, has there been interest in applying minimization at the time of use? Two words: Big Data.

The Big Data business model threatens data minimization at the time of collection: reaping the benefits of Big Data requires amassing a lot of data. When traditional minimization at the time of collection collides with the desire to benefit from large stores of data, privacy principles must be balanced against other societal values. Modern privacy scholars such as Omer Tene and Jules Polonetsky have called for "a risk matrix, taking into account the value of different uses of data against the potential risks to individual autonomy and privacy," because minimizing information collection isn't always a practical approach in the age of Big Data.

While this focus on balancing values is important, the role of trust needs further development. How can we build systems that engender trust? How can we train people so that we can have faith they will not abuse the systems they have access to? The more I think about it, the more I think Bruce Schneier has it right: We need to focus on “enabling the trust that society needs to thrive.”
