With Big Data and Privacy, What Should the Regulators Know?

In the third and final of a series of meetings called for by the White House as part of its Big Data and privacy initiative, privacy experts, academics, industry representatives and government regulators convened to hash out the benefits and challenges posed by the Big Data ecosystem. Hosted by the White House Office of Science and Technology Policy, the UC Berkeley School of Information and the Berkeley Center for Law and Technology, the day featured panels covering privacy values; the challenges of health and education; algorithms and transparency; and privacy governance.

Ubiquitous data collection, highly advanced data analytics and rapidly evolving technologies are challenging not only traditional social norms but also outdated privacy laws. Panelists discussed the imbalance between data collectors and data subjects; the importance of a Big Data balancing test that weighs benefits against harms; how privacy-protecting systems cannot work in isolation; European approaches to Big Data challenges; the need for abstract, technologically neutral legislation; and the potential role that market forces can play in a more transparent environment.

Georgetown University Law School Prof. David Vladeck, a former Federal Trade Commission (FTC) official, pressed panelists in the day’s closing session to discuss what privacy regulators need to know and consider when it comes to Big Data and privacy.

MIT Media Lab’s Cam Kerry, formerly of the Department of Commerce and one of the authors of the Consumer Privacy Bill of Rights, backed a holistic approach to Big Data, one with a dynamic framework that goes beyond notice and choice. He said solving this issue should not be left to government alone and that multi-stakeholder processes will help, but he warned that the current marketplace isn’t working for the privacy of consumers.

“In the Big Data world, there is a real disparity between those who collect data and those who supply it,” Kerry said. “I think there is a disproportion there. We need to empower consumers” and “drive the market to do more things for consumers.” He said technology can solve many of the problems, but the U.S. should lead on driving a market for privacy-protecting technology.

FTC Commissioner Julie Brill agreed that consumers are often at a disadvantage. Consumer-facing companies, such as LinkedIn, rely heavily on the trust of their consumers, but non-consumer-facing businesses, including many data brokers, face less market pressure to protect the privacy of individuals. They also generally do not have to provide consumers with notice, choice or methods of redress to improve data accuracy.

“In the Big Data world, in analytics for example, it’s harder to have a statement about what you’re going to do. You’re not interfacing with the consumer.” For Brill, this means that unfairness will be a more “fruitful” litmus test in the Big Data context. She noted that in two FTC settlements—Eli Lilly and Sears—more information was collected about consumers than was necessary. Calling it a balancing test similar to the EU’s concept of “legitimate interest,” Brill said weighing benefits against harms will “be more important to regulators” moving forward.

LinkedIn Chief Privacy Officer Ericka Rottenberg said existing privacy frameworks already protect the consumer and warned that legislation could stymie innovation. “One-sized legislation will fail,” she said, adding that the FTC already has the appropriate tools to “go after the bad guys.”

Data collection, made more powerful by technology and algorithms, brings benefits for organizations and individuals but also challenges privacy-protecting concepts such as de-identification. Microsoft’s Cynthia Dwork warned that regulators and businesses should not “look at systems to protect privacy in isolation.” She said two data sets that are each innocuous on their own may, when brought together or analyzed over time, create new privacy challenges.

“What’s the harm in learning that I buy bread?” she asked. “There is no harm in learning that, but if you notice that, over time, I’m no longer buying bread, you may conclude that maybe I have diabetes … What’s going on here is a failure of privacy mechanisms; they’re not composing effectively.”
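Dwork’s composition point can be sketched with a small, purely hypothetical example (the data and names below are illustrative, not from the workshop): two releases that each look harmless can, when linked or compared over time, support a sensitive inference.

```python
# Hypothetical illustration of Dwork's point that privacy protections
# "compose" poorly: each release is innocuous alone, but comparing them
# over time supports a sensitive inference.

march_purchases = {"alice": {"bread", "milk"}, "bob": {"bread", "eggs"}}  # release 1
april_purchases = {"alice": {"milk"}, "bob": {"bread", "eggs"}}           # release 2

for person, march_items in march_purchases.items():
    # Items present in March but missing in April reveal a behavioral change
    # that neither snapshot exposes on its own.
    dropped = march_items - april_purchases.get(person, set())
    if "bread" in dropped:
        print(f"{person} stopped buying bread -- an observer might infer a health change")
```

The same logic applies to joining two separately de-identified data sets on shared attributes, which is why Dwork cautioned against evaluating any single privacy-protecting system in isolation.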

Offering a European perspective, Rainer Stentzel of the German Federal Ministry of the Interior pointed out that neither the current EU Directive nor the proposed Data Protection Regulation would likely cover the now-famous Target pregnancy-prediction case. Looking at the big picture, Stentzel said, “we need to ask, what is the whole thing about? We call it data protection in the EU.” But there is something behind that label, he suggested. “What is the danger?” he asked. “What is the risk?” A risk-based approach may be part of the answer, he noted.

The “tyranny of the algorithm” was also a major topic over the course of the day. Should businesses be more transparent about their algorithms? Should there be open-source communities gauging whether discrimination or unfairness is built into the algorithms? What role should legislation play, if any?

The FTC’s Brill continues to back a multipronged approach to Big Data, noting that algorithmists, consumer privacy boards, best-practice standards, codes of conduct, data minimization and de-identification all have a role to play in weaving a privacy-enhancing tapestry. She discussed categories of sensitive data—including health, finance, children and eligibility—noting that each one, in the U.S., has a corresponding law. “But now,” she said, “those laws are in silos. My concern is that data doesn’t understand silos.”

“I think we should strive to make the questions we’re asking as abstract as possible,” said Stanford Prof. Mitchell Stevens. “Primarily, this should be a normative discussion. Don’t get caught in Julie’s tapestry. Let’s start with a national conversation first about what that tapestry should be.”
