Inside 1to1:Privacy

Privacy and the Bottom Line

September 1, 2007

By Don Peppers and Martha Rogers, Ph.D.

Any number of studies over the past decade have taken big-picture looks at consumers' attitudes toward online privacy, and specifically what incentives would prompt them to surrender personal information in the online environment. Few, however, have attempted to put a price tag on privacy.

A recent Carnegie Mellon University study did just that, and its conclusion could prove quite interesting to companies struggling with privacy/trust issues in the online-selling space. In a tightly controlled lab study, the researchers found that people will pay more for goods if they're secure in the knowledge that their privacy will be protected.

The researchers determined, not surprisingly, that people are more likely to make purchases from sites with "good" privacy policies, as determined by an online privacy-policy widget. As for quantifying the premium that people will pay for privacy, the study subjects showed a willingness to pay around 60 cents more on a $15 item, roughly what sales tax would add in many states.

According to Lorrie Cranor, an associate research professor and director of the Usable Privacy and Security Lab at Carnegie Mellon University, it's relatively well-established that consumers will readily give up their privacy for a small reward of some sort and that privacy policies are "pretty useless in communicating what they do." What she and her Carnegie Mellon team sought to do, then, was create a set of conditions in which the subjects would have little difficulty learning that their privacy is protected.

"One of the theories about why consumers will readily give up privacy is that it's too difficult to sort through all the policies," Cranor explained. "So if we made it easy for them to take a privacy policy into account, would they do it?" For the study, the Carnegie Mellon team found 72 individuals -- eliminating candidates who "perceived little or no privacy risk when shopping online," according to the final report -- and charged them with purchasing batteries and an adult toy. Participants were given $45 to make the two purchases, with each item costing around $15. They got to keep the items and any leftover money, a decision that would seem to incentivize them to buy from the cheapest sites. They were required to expose their actual personal information as part of the exercise.

There was one other "hook." Some subjects used Privacy Finder, a privacy-policy evaluation tool developed by Cranor and her students, which automatically evaluates a site's privacy policy and presents its verdict on the search-results page. Those using Privacy Finder bought from "high privacy"-rated sites for 50 percent of the battery purchases and 33 percent of the adult-toy purchases.

Additionally, the Carnegie Mellon team rigged it so that the more expensive sites would have better privacy ratings. "We tried to make it as real as possible. There were real stores, real money, real credit cards, real information at risk. The only artificial thing about it was that we told people what to buy," Cranor said.

Fred Cate, a professor at the Indiana University School of Law-Bloomington and director of the Indiana University Center for Applied Cybersecurity Research, disagrees with Cranor's contentions about the study's lack of artificiality. While he praises the study as "very well-presented" and says he hopes to see more research on the topic, he also dismisses its central conclusion -- that people will pay extra for privacy -- as "fundamentally useless."

His major concern is with the way the study was structured, most notably that the subjects weren't paying for the items out of their own pockets. "We're all less careful with somebody else's money. If I say, 'Here's 50 bucks, go spend it on the Web,' you'll be less price-sensitive than you'll be with your own money. It makes [the study] almost a theoretical exercise."

Cate also takes issue with the exclusion of people who don't worry about privacy when shopping online, saying that such a decision skews the results. "That's awfully subjective, saying that person X is or isn't privacy-sensitive enough."

Finally, he believes that asking individuals to shop for items they don't care about is a fatal flaw. "If you're shopping for something that you're not really interested in, you'll be a lot more tolerant about interferences," he said. "For this [study] to be valid, [the subjects] would have to be searching for something they actually want. People are less worried about their privacy when there's something they're trying really hard to find."

Asked about Cate's comments, Cranor strikes a conciliatory chord. "I understand where Fred is coming from. There are always artificial elements -- an experimenter watching you, you're in a lab," she said. "But we minimized any incentive not to behave naturally. We didn't know for sure that people would act the way they did." It should also be noted that the subjects were, in effect, spending their own money: the funds weren't coming from their pockets, but they were headed there, since any amount they didn't spend was theirs to keep.

Either way, Cranor goes to the mat defending what the Carnegie Mellon team was trying to accomplish. "The notion that there may be ways to get people to pay for privacy is very, very important," she said. "There have been lots of questions as to whether privacy really makes a difference for companies' bottom lines. What we learned suggests that privacy can have an impact, that companies that distinguish themselves on the basis of privacy practices can see a return down the road."

You can reach Don Peppers and Martha Rogers at dpeppers@1to1.com or rogers@1to1.com.