


In his discussion of Universal People Sensors in Portsmouth, NH, earlier this summer, Alex “Sandy” Pentland made the case for the benefits of Big Data. He said it’s not about privacy, it’s about using data as an asset. For Pentland, the question is, “Who controls the data, and can you be secure in sharing it for particular purposes?”

As co-leader of the World Economic Forum Big Data and Personal Data Initiatives and a professor at MIT, Pentland presented a slew of examples bolstering the need to collect vast sets of information in order to track and better learn about our world. (My colleague Emily Leach wrote a great piece on some of Pentland’s work here.) Such data can help drivers travel more safely, better connect those suffering from mental health issues with others and track infectious diseases.

“If I can watch your phones on a minute-by-minute basis—how you move, who you call, where you go—I can do a really good job telling if you’ve been infected with the flu. And, if I can watch the people around you, I can do a tremendously good job watching the flu move from person to person. Well, that’s George Orwell to the cubed power, it’s the most invasive thing you can imagine. But now, here’s the trade-off: There are no protections today against pandemics.”

So, he queried, do we spy on people or let 200 million people die?

Okay, so now the dichotomy is set. Privacy versus security. Personal liberty versus the public good. I’ve written about it before on this blog. So here we go.

In a response to a paper published by Paul M. Schwartz, University of Colorado Law Prof. Paul Ohm makes the case for the “underwhelming benefits of Big Data.” He argues, “Too many commentators have too often overstated the benefits of Big Data, inflating studies and praising the merely trivial.” Ohm notes there are benefits, but adds that “some Big Data projects will also lead to bad outcomes, like invasions of privacy and hard-to-detect invidious discriminations.” With it, governments can spy on citizens and bad actors “can prey on their victims.”

Pentland, however, offers an alternative: the New Deal on Data. For him, we should not be talking about privacy, “we should be talking about how we can get public goods to save our kids, to make our life better” while keeping business thriving, “but in the interests of the consumer.” A win-win-win, he says.

So how the heck do we do that?

Well, this paradigm includes promoting and instilling personal data ownership rights, trust networks and personal data storage capabilities. In the video below, he explains the openPDS system, which ties contractual agreements into code, allowing for automatic auditing and giving users more control over their personal data, control that is enforceable through those very contracts. These “trust networks” become not only technical solutions but legal ones, similar to the SWIFT interbank network, he argues, because contract law is interoperable among the patchwork of global jurisdictions. Users can choose the types of personal data they want to share, and with whom, and can take that data back if they so choose.

“Most of the time,” Pentland states, “you don’t need to share data, you need to share answers.”
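The idea of sharing answers rather than raw data can be sketched in a few lines of code. The sketch below is a toy illustration of the principle, not the actual openPDS API: all class and method names are hypothetical, and the "query" stands in for the vetted, contract-approved computations Pentland describes.

```python
class PersonalDataStore:
    """Toy personal data store: raw records stay inside; only
    user-approved queries may run, and only their answers leave."""

    def __init__(self, records):
        self._records = records   # raw data never leaves this object
        self._allowed = {}        # query name -> approved function

    def allow(self, name, fn):
        """User (or their trust-network contract) approves a named query."""
        self._allowed[name] = fn

    def revoke(self, name):
        """User withdraws consent; the query can no longer be answered."""
        self._allowed.pop(name, None)

    def answer(self, name):
        """Return an aggregate answer, never the underlying records."""
        if name not in self._allowed:
            raise PermissionError(f"query '{name}' not authorized")
        return self._allowed[name](self._records)


# Example: a health service wants a coarse flu-risk signal,
# not your minute-by-minute call log (data below is made up).
calls_per_day = [3, 5, 2, 0, 1, 0, 0]
pds = PersonalDataStore(calls_per_day)
pds.allow("low_activity_flag", lambda r: sum(r[-3:]) < 2)

print(pds.answer("low_activity_flag"))  # the answer is shared; the data stays home
```

The design choice mirrors the quote: the consumer of the data gets a yes/no answer to an approved question, while the raw records remain under the user's control and consent can be revoked at any time.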

Yet systems implemented for the public good aren’t always well designed. The AMBER Alert system made headlines this week after many Californians received a rude and unexpected awakening from their mobile phones. Last Monday, California residents statewide who had not opted out of the Wireless Emergency Alerts system awoke to “a 10-second spurt of high-pitched noise and buzzing” followed by the text, “Boulevard, CA AMBER Alert. UPDATE: LIC/6WCU986 (CA) Blue Nissan Versa 4 door.” There was no context as to what the message meant. Unfortunately, two children from San Diego went missing (and as of Slate’s posting, they still are). The news prompted Slate columnist Justin Peters to opine that “the AMBER Alert system doesn’t work.” Though the system is predicated on protecting children and enlists the public’s aid in keeping them safe, Peters and some others argue that it is not effective in actually finding and saving imperiled children.

He explains further,

“The California alert I mentioned above reportedly went out to people across the entire state, not just in the areas where the kidnapper was likely to be. If you’re regularly bombarded with text messages about a kidnapping that happened hundreds of miles away, you’re eventually going to start disregarding all of the alerts you receive. That’s not an effective way to raise public awareness.”

As an alternative, why not give residents more choice in the matter? Make the system opt-in. Social media moves fast. Tweet out the AMBER Alerts. This gives users some choice in spreading the message as well.

Also, how will we feel good about trust networks in light of government access to private data? Just yesterday, Lavabit and Silent Circle—both encrypted e-mail service providers—decided to shut down rather than give user data to the U.S. government.

Not all is bad, though. This week we also learned that the National Institutes of Health, in an unprecedented decision, gave two descendants of Henrietta Lacks control over which biomedical scientists get access to the full genome derived from her cells.

Hopefully this is part of a bigger trend in which individuals gain more control over their personal data, one that promotes the public good in a Big Data world.
