
By David Hoffman, CIPP

Having just attended the 30th annual International Data Protection Authorities Conference in Strasbourg, France, I am struck by how data protection narratives have changed notably in some ways, yet in other ways echo the refrains heard at the last conference held in France, in 2001.

What is the same?

In October 2001 in Paris, much of the conversation revolved around regulators' distrust of how corporations and technology companies collect, store, and process personal data. At that time, Intel and Microsoft were the two most discussed companies. The conference came on the heels of issues with the release of Intel's Processor Serial Number feature in the Pentium III processor and Microsoft's plans for launching its Passport offering. The conference was only weeks after the tragic events of September 11, and Richard Purcell from Microsoft and I were two of only a handful of U.S. representatives at the event. We spoke on a panel about our efforts to design privacy into our respective companies' product development processes. In 2001, the regulators were welcoming and respectful, but also skeptical about whether Microsoft or Intel would actually meet their expectations for processing personal data.

Fast forward to 2008. The discussion was strikingly similar, with Facebook and Google replacing Microsoft and Intel as the focus of concern. Regulators expressed grave misgivings about whether individuals can properly exercise choice and control over the collection, processing, and storage of information relating to them by the applications these companies run. Facebook, in particular, was the center of attention, with an entire panel dedicated to children's use of social networking Web sites. Both Google and Facebook have highly respected representatives working on data privacy. Chris Kelly, Facebook's CPO, and Mozelle Thompson, former FTC Commissioner and now outside consultant to Facebook, did a commendable job expressing a desire to work cooperatively with regulators to show how their privacy settings provide users with control of their information and how their privacy policy provides transparency. These efforts were similar to recent statements from Google, where the company's privacy representatives, primarily North American privacy counsel Jane Horvath (former chief privacy officer at the U.S. Department of Justice) and European Privacy Counsel Peter Fleischer (former European Counsel to Microsoft), made conciliatory statements in the press about working "constructively" and "concretely" with the regulators. Both Facebook and Google have invested tremendous resources in providing detailed privacy policies and statements on their Web sites and in explaining to regulators and advocacy groups how their applications process data.

However, as in 2001, regulators are left with the key question of the standards to which these companies (and the rest of industry) should be held. European data protection laws are notable in that they demand proportionality in the processing of personal data. This standard of proportionality requires a balancing of the interests of the individual to whom the data relates against the interests of the organization processing the data. Determining what particular benchmarks to use in this balancing has not progressed significantly since 2001. The individual DPAs maintain strikingly different opinions on what constitutes proportionate data processing, which is completely reasonable given the diversity of cultural, historic, and economic conditions in these countries. Further, corporations have made little progress in proposing reasonable interpretations of what proportionate data processing means. In 2001, most corporate efforts focused on providing "transparency," which often meant making certain a privacy policy was posted on the Web to describe how the company processes personal data. These privacy policies remain the chief discussion point on how companies process personal data.

What has changed?

The aforementioned discussion about "transparency," while qualitatively the same as in 2001, is quantitatively different today. In 2001, a privacy policy could provide a moderately effective description of how a Web site collected and processed personal data. Today, however, the sheer amount of data processed, exchanged, collected, posted, and sent to third parties makes a thoughtful articulation in a posted privacy policy impracticable; even a detailed examination would not leave the individual with a real understanding. Google and Facebook face this challenge acutely, as both companies offer destination Internet applications where individuals can manage much of their lives. These applications provide tremendous functionality and efficiency for obtaining information, analyzing that information, organizing the results of that analysis, and communicating with others.

This efficiency and functionality, combined with the great difficulty of describing how the information is processed (e.g., how targeted advertisements are served to an individual, or how the content of messages sent between two individuals will be used), throws into stark relief that the "transparency" requirement of 2001 is still necessary but sadly insufficient. Transparency alone cannot provide reasonable privacy protection for individuals, nor give them the trust to use technology in new and innovative ways.

This recognition of the limited value of "transparency" meant that the most-used phrase at the conference for creating trust became "accountability." While accountability has been a central theme in the APEC Privacy Principles, it is not clear that everyone using the term would define it the same way. Generally, accountability asks how regulators, NGOs, and individuals can be confident that companies are processing data in a way that provides a reasonable level of privacy. This accountability discussion is a useful development, but unfortunately no practical solutions for accountability materialized. Microsoft's Chief Privacy Officer Peter Cullen discussed a need to move to a "Globally Certified Company," but was not able to provide specifics due to the limited time granted for remarks.

The concept of global certification, though, does reveal another element of the conference that has changed dramatically: the geographical breadth of participation. In Paris in 2001, there was a focus on international issues, but they were predominantly issues relating to HIV data in the developing world. Today, active newcomers from all hemispheres (including representatives from Mexico, Israel, Hong Kong, China, and elsewhere) added to the discussion. We now find ourselves in a global conversation about privacy, one that reflects the worldwide nature of how individuals travel, purchase goods and services, conduct business, and communicate in 2008.

So, where do we go from here?


A global triangle of trust

Further definition of what we mean by "accountability" is critical, and we will have to make certain it fits hand in glove with "assurance." What assurance brings to the table is the concept of making certain that the rules to which an entity is held accountable actually have the end effect of providing a reasonable level of privacy for the individual. It is one thing to come to global agreement on basic fair information practices or principles for the processing of data (e.g., the HEW guidelines, the OECD Guidelines, the EU 95/46 Data Protection Directive, the APEC Privacy Principles), but it is a much greater task to provide specificity for how to handle particular types of data. For example: What level of access should be provided to credit risk databases? How long should IP addresses be retained by Internet search engines? Should data aggregators be able to do anything they want with information they find on the Internet or digitize from public records? What type of choice and consent should hospital patients have about whether RFID tags are used to track them in the hospital?

Assurance has become a hot discussion topic among security professionals. In the security arena, the "assurance" discussion centers on how to make certain technology products provide reasonable levels of integrity and protection against malicious attack. That discussion has proven difficult because governments are not necessarily in the best position to tell technology companies how to design their products without risking slowed innovation and curbed economic growth. The same concerns apply to privacy and data protection. The situation is highlighted by the fact that we currently have neither an effective model nor the right players to create an environment where "accountability" is achieved through a reasonable level of "assurance" against commonly accepted best practices for processing personal data. As noted above, cultural, historic, and economic differences make arriving at such common global best practices challenging, but that is not the only issue.

For accountability and assurance to work together properly, government, industry, and NGOs will have to work together to create a Triangle of Trust. Each leg of this triangle should have clear roles and responsibilities. Industry holds the best knowledge about technology development, new business models for processing personal data, and opportunities to process data while providing reasonable privacy. However, as the current meltdown of the financial services industry shows us, industry is not well positioned to fully regulate itself, and it is susceptible to a race to the bottom driven by those players who make money by not reasonably protecting the privacy of individuals. That is where NGOs can play an important role: coordinating the key members of industry in each area (e.g., those who can best contribute to a discussion on social networking best practices) and then taking the resulting best practices to government.

Many of the existing standards-setting organizations do not function well for this type of collaborative and flexible best-practice setting. Current international standards bodies such as ISO and ITU are not well positioned to develop such flexible, policy-oriented guidance, as their strengths lie in the complex technical and process standards that enable global interoperability. Privacy-specific NGOs, however, have shown an ability to do this work; the development of Codes of Conduct is an example of how this has worked well in certain industries. The NGO could then take an industry best practice to governments and regulators for comment and eventual ratification.

Once an industry best practice is ratified, NGOs can operate dispute resolution authorities to help ease the burden on scarce governmental resources, while allowing industry to quickly remedy issues as they happen. NGOs can also play critical roles in educating industry privacy professionals. The International Association of Privacy Professionals is the best example of this, as the IAPP serves as the central place for educating and certifying those individuals who work to provide protection for personal data.

With the IAPP and other NGOs playing these roles, government regulators could focus their resources on strategic enforcement priorities, which should be robust, predictable, and globally harmonized (similar to the manner in which the U.S. Federal Trade Commission (FTC) has conducted its privacy and information security actions over the past few years). This backstop of robust, predictable, and harmonized enforcement will require legislation that shares certain key components, particularly some version of the proportionality requirement of the EU 95/46 Directive or the unfairness doctrine of Section 5 of the U.S. FTC Act. This Triangle of Trust would provide companies with clear guidelines for proportionate data processing and a mechanism to engage productively with regulators. The time is now for industry to work with NGOs to create a model with which regulators can work. Hopefully, the end result of all of these efforts will be a policy environment where individuals can use technology while trusting that their privacy will be protected.

David Hoffman, CIPP, is director of security policy and global privacy officer for Intel Corporation, based in Munich. David leads a team that oversees privacy compliance and privacy and security policy issues associated with Intel's technologies. He is on the board of directors for the International Association of Privacy Professionals, and has been a member of TRUSTe's Board of Directors since 2000. David also serves on the U.S. Department of Homeland Security's Data Privacy and Integrity Advisory Committee. He is a former member of the Federal Trade Commission Advisory Committee on Online Access and Security.
