Privacy, Transparency and Google’s Blurred Glass
By Jonathan I. Ezor
No matter the context or jurisdiction, one concept underlies every view of best practices in data privacy: transparency. The mandate to disclose what personal information is collected, how it is used, and with whom and for what purpose it is shared is essential to enable informed consent to the collection, along with the other user rights that constitute privacy best practices. That disclosure may be to governmental agencies as well as to the users themselves, but the ultimate goal is that users will know what information from and about them is being collected and used by the organizations with which they interact.
One recent description of best practices, the Consumer Privacy Bill of Rights issued by the Obama administration, defines transparency this way: “Consumers have a right to easily understandable and accessible information about privacy and security practices.” It continues: “At times and in places that are most useful to enabling consumers to gain a meaningful understanding of privacy risks and the ability to exercise individual control, companies should provide clear descriptions of what personal data they collect, why they need the data, how they will use it, when they will delete the data or de-identify it from consumers, and whether and for what purposes they may share personal data with third parties.”
Other lists of privacy ideals and requirements, including the Federal Trade Commission’s Fair Information Practice Principles as well as the European Union’s Data Protection Directive, include and highlight transparency through disclosure to users. Transparency of data collection is also a key part of the best practices long recommended by advocacy groups including the Electronic Privacy Information Center, the Electronic Frontier Foundation and the Future of Privacy Forum.
Transparency has become a keyword for companies’ and service providers’ discussions of their own data practices. Notably, Google, whose business goes far beyond its original focus on online search, publishes a semi-annual transparency report disclosing the number and disposition of government requests for information from Google, along with other statistics such as content removal requests. Twitter has recently begun publishing its own transparency report, following Google’s lead.
As one illustration of the challenges of transparency even with the best intentions, consider Google. While the company publicly promotes its dedication to privacy, offers a user Dashboard for profile management and has received praise from advocacy groups such as the Electronic Frontier Foundation, Google users still have no clear way to determine all the ways Google is using their personal information.
The challenge comes from the sheer diversity of Google’s operations and its frequent acquisitions. It is almost impossible to discover the entire range of products, services and brands Google owns and controls, and through which it is collecting user information. The company’s products page lists categories including web, mobile, media, geo, home and office, social, specialized search and innovation, and includes products such as Orkut, Blogger and Picasa, whose names might not immediately identify them as Google services. A link to mobile on that page leads to another page that briefly describes the Android operating system, owned and licensed by Google and running on devices from many manufacturers, as well as the Google-branded Nexus devices, and provides further links to Google apps running on both Android and Apple’s iOS operating system.
The example of Google demonstrates that transparency must be maintained through ongoing review of, and revisions to, an organization’s disclosures and procedures. “Privacy by design” must include embedding privacy awareness and reviews in all business practices, especially in the area of mergers and acquisitions. Otherwise, even a company dedicated to and intending to promote best practices in privacy may fail to provide enough information for consumers to make informed decisions about whether and how to share their personal information with the company, leading to potential reputational and legal risks.
Jonathan Ezor is assistant professor of law and director at the Center for Innovation in Business, Law and Technology at Touro College’s Jacob D. Fuchsberg Law Center. He can be reached at firstname.lastname@example.org.