Despite the uproar over the Edward Snowden leaks, government data collection and use is nothing new, Scott Charney, corporate vice president of Microsoft's Trustworthy Computing Group, told a sold-out crowd in his keynote address at the IAPP's Global Privacy Summit last week. After all, governments are essentially just big service providers; they want to exploit the Internet and they want access to data.
Knowing this, it’s essential to set yourself up for success with iron-clad and transparent policies that clearly tell users what you’ll do with their data now and who may gain access to it in the future. Equally important? Tell them in ways that are fair.
The Snowden revelations changed things because, when people’s suspicions about those with access to their data are confirmed, it affects the way they react, Charney said. That presents society with the conundrum we find ourselves in now, in which both the government and Internet users are conflicted over their desires. Sure, the government wants to protect security, public safety and privacy, but there is often an inherent tension among those goals. And users want to be protected, but they don’t want to be spied upon.
That’s why Microsoft has built protections into its systems that aim to be completely transparent to users—and without caveats. It’s a policy that has served the company well, he said. When the Snowden disclosures broke and firms around the globe panicked, Microsoft remained calm, Charney said. There was no reason to panic. Microsoft bases its conduct on three principles: security, privacy and transparency. That means employing a “defense-not-offense” model for security.
“There are no back doors in our products,” Charney said. Period. And Microsoft has never received a request for bulk data.
“We’ve had these principles in place for 10 years,” Charney said. “If you act in such a way, you know your conduct will be okay with your customers. So we haven’t worried about the Snowden disclosures. If we got a request for bulk data, we would fight it in the courts. We turn over data in countries with jurisdiction over us with a warrant. We have rules in place about when we’ll turn over data and when we won’t. We would fight bulk requests because we don’t feel that’s the right place to go.”
To communicate with customers, Microsoft publishes transparency reports on law enforcement requests twice a year and manages a Government Security Program, through which it allows government customers from many countries to see its source code.
“Transparency is critically important,” Charney said. “You have to be able to tell customers what you can commit to them and what you can’t.”
Being transparent also means using modes of communicating to users that make sense, he said. That means, in this age of Big Data, re-examining how we apply the Fair Information Practice Principles—particularly those that apply to collection, notice and choice. Sure, it’s essential—now, especially—to notify users that you’re collecting their data and to inform them of how you’ll use it. But current models for doing so aren’t helping the user in meaningful ways. In fact, they are kind of screwing them.
“Folks who are still using notice-and-choice models have their head in the sand,” he said. “By going with notice and choice, you put this huge burden on the individual.”
And that’s not fair because it assumes there’s a relationship between the data subject and the data collector.
“But that’s not true at all, especially with the Internet of Things,” he said. While it’s a better idea to move toward a data-use model, it’s important to remember that the notice given at the time of collection may not match what the data is ultimately used for. Now, it’s more like the company is saying to the data subject, ‘Here’s what we’re going to do with the data … that we can think of.’ “But what about the greater technology of tomorrow?” Charney said. That new technology may create myriad ways for the data to be repurposed.
It’s time to start thinking about different models, Charney said, and he presented a simple process that Microsoft has adopted that focuses on the actors, objectives, actions and impacts associated with a product or service.
Take airport security, for example. The actor is the U.S. government. The objective is to keep people safe. The action is to deploy metal detectors. Unfortunately, the impact is that liquid explosives aren’t detected. So the government deploys x-ray machines. But it turns out those expose travelers to too much radiation. So, it settles on backscatter machines, which accomplish the objective and have the most satisfactory impact.
Similarly, he argued, companies should use the same methodology in deploying new products. “It’s a workable model that allows us to assess risk,” he said, and it gives him confidence moving into the great unknown of new technologies and their associated privacy questions.
“We always have challenges,” he said. “I’m optimistic. We’ve had hard problems before. We’ve always been able to solve them.”
Read More by Angelique Carson: