
The Privacy Advisor | Perspective: What DPAs need to know


During the session "Data Protection and Defining Personal Information" at the annual Conference of Data Protection and Privacy Commissioners in Mexico City last November, one panelist asserted that privacy regulators need a better toolkit. Specifically, Prof. Charles Raab of the University of Edinburgh said regulators need to better understand probability theory, statistics and risk analysis. The Privacy Advisor caught up with Prof. Raab recently to find out why he feels this is important.

There has been much concern in recent years that data protection authorities (DPAs) or privacy commissioners do not have sufficient knowledge of information and communication technologies (ICTs). This lack is said to leave DPAs prey to the pleading of special interests who are concerned to portray technological development as innocuous in terms of its effects on privacy and other social values that DPAs are supposed to protect. Many DPAs cannot find the resources to hire technologically knowledgeable staff at levels high enough, and in the numbers that might be required, to make a difference. Of course, other governing institutions that make or carry out public policy are at the same disadvantage: how many legislatures and executives are up to speed on ICTs? Especially at a time when Privacy by Design (PbD), and related tools such as privacy-enhancing technologies (PETs) and privacy impact assessment (PIA), are to the fore as privacy protection instruments, how can DPAs and others adequately assess the impact of the latest technologies and information processes so that they can take appropriate regulatory action?

These concerns are valid, and they highlight crucial issues for privacy protection. But it is also important—as part of technological understanding, but not only for that reason—that DPAs have a grasp of probability theory, statistics and risk analysis. This is because so much of the debate about technology, privacy and security (both information security and national security) revolves around evaluations of the likelihood of events and broader phenomena, so that laws, technical solutions and regulatory activity can be at least commensurate with the threats, and at best anticipate them. For different reasons—competing interests included—proponents and critics of ICTs and their applications promote light or dark scenarios, playing up or playing down the benefits and the dangers of new developments. Regulators have to make up their own minds about this, and they are handicapped without an understanding of the likelihood and severity of privacy risks. How far should they entertain worst-case scenarios? How far should they espouse glowing visions of the technical boon? Should they develop their regulatory strategy around applications of the precautionary principle, or should they instead wait for things to happen and then respond resiliently?
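To make that contrast concrete, here is a minimal sketch in Python. The two scenarios and all figures are invented for illustration and are not drawn from Prof. Raab's remarks; the point is only that the same data rank differently under an expected-value framing and a precautionary, worst-case framing.

```python
# Toy illustration with invented figures: two privacy-risk scenarios
# ranked under an expected-value view and a worst-case view.

scenarios = [
    # (description, annual probability, severity of harm on a 1-10 scale)
    ("routine marketing-data leak", 0.30, 2),
    ("large-scale health-record breach", 0.02, 9),
]

for name, prob, severity in scenarios:
    expected_harm = prob * severity  # expected-value view
    worst_case = severity            # precautionary view ignores probability
    print(f"{name}: expected harm {expected_harm:.2f}, worst case {worst_case}")

# The frequent-but-mild leak scores 0.60 against 0.18 for the
# rare-but-severe breach on expected harm, yet the severe breach
# (9 vs. 2) dominates on a worst-case view. The chosen framing,
# not the data alone, drives the regulatory ranking.
```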

PbD and PIA are predicated on an appreciation of the probabilities and magnitudes of the consequences of using certain technologies, and they are founded on a basis that includes, at its very centre, the assessment of risk. This requires both conceptual understanding and numeracy, and a socio-technical perspective on the ICTs and systems to which DPAs are supposed to apply laws, codes and other instruments of regulation. The latter include raising the level of public understanding; education about risk questions is at least implicit in this. How likely are data breaches? How likely are we to suffer harm from them? How large are the dangers of putting huge quantities of personal data on social networking sites? How many crimes would go undetected without the creation of yet more interoperable databases of personal information, and is that a risk we can live with? Finer discriminations are needed than to say, too simply, that "x poses (or does not pose) a threat to privacy (or to national security)," or that "you are (or are not) at risk through this form of data processing." If they had the ability and inclination to do so, DPAs would be in a good position to offer guidance on these questions. They could demand evidence and sound reasoning—ideally, scrutinised publicly—from interested parties when claims are made or denied about the privacy-friendliness or the necessity and proportionality of new ICTs, information systems and applications. Many DPAs may already act in something like this way, because assessments of necessity and proportionality are central to many regulatory judgments made daily, and to the developing jurisprudence on privacy. But how well equipped are DPAs to get their minds around the risk issues and to analyse them in a nuanced and sophisticated way? Arguably, they are as little equipped for this as they are to understand how technologies work and what they are capable of, let alone what their social and privacy consequences might be. How can they do better?
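As a small worked example of the numeracy at issue, the sketch below uses an assumed per-year breach probability (the 5 percent figure is hypothetical, not from the article) to show how even a modest annual risk compounds over a retention period, which is the kind of question a PIA invites a regulator to ask.

```python
# Toy calculation with an assumed per-year breach probability: a modest
# annual risk compounds quickly over a data-retention period.

p_per_year = 0.05  # assumed chance of at least one breach in a given year

for years in (1, 5, 10, 20):
    p_at_least_one = 1 - (1 - p_per_year) ** years
    print(f"{years:>2} years: P(at least one breach) = {p_at_least_one:.2f}")

# Prints roughly 0.05, 0.23, 0.40 and 0.64: the compounding a regulator
# must weigh when judging whether a retention period or a database
# consolidation is necessary and proportionate.
```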

This note is not the place to develop these points in any depth, or to explore the complex issues of improving regulatory policy and practice to which they give rise. But the IAPP might be well placed to take up these matters, along with academics and others, from its own vantage point. It could provide the means and locus for focusing attention and deliberation on what, precisely, DPAs—not uniquely—need to know about risk, and how they might acquire and incorporate the necessary knowledge and understanding within their own structures. Many DPAs are already taking stock of their roles as they enter a new era of global information flows and patterns, new regulatory challenges and new legislation—for instance, the new European Union regulatory framework that will supersede the 1995 European Data Protection Directive. This would be an opportunity for the IAPP to foster and disseminate greater learning about crucial questions of risk amongst privacy professionals and the broader constituencies and publics with a stake in privacy, including DPAs and chief privacy officers. Without this, we might be left only with yet more of the deadly antinomy of scare stories and complacent whitewash about ICTs and the corporate or governmental life in which they play a large part. This is not a happy prospect for DPAs or anyone else.
