Privacy Engineering

From Privacy to Trust Professionals

Note from the Editor:

This is the first in a series of posts by Westerman exploring the role of trust in the marketplace. Future posts will delve into consumer perceptions of company behavior, the role of design in fostering trust, and best practices for trust design.

Businesses should stop focusing on privacy and start focusing on trust, which creates value and revenue. Privacy professionals should become trust professionals, involved in overall product creation. That is how to create trusted experiences. Privacy and trust are two sides of the same coin, but they lie at opposite ends of the emotional spectrum.


Privacy is a negative, generating feelings of distrust and anxiety. Its goal is control; its best outcome is to remain neutral.


Privacy is a cloak we wear to keep the “bad guys” out so they cannot hurt us. We want to control our identity and how others see us. We want to hide anything that might embarrass us. When it comes to maintaining privacy, everyone is on the defensive.


Trust is positive, generating feelings of freedom and possibility. It is about removing the cloak.

We don’t worry. We don’t need to hide. We have no secrets. We let down our guard. When customers trust a company, they feel empowered by the relationship. They believe that the company has their best interests at heart, which frees them from feeling defensive, suspicious or wary. With trust, there is no goal, simply the freedom to enjoy.

In the middle of this emotional spectrum lies neutrality: low emotion, logic. Muted emotions such as liking rather than loving, wanting rather than coveting, confidence rather than trust may provide a sense of reassurance but not excitement. Such experiences do not move people to buy products. Rather, low-affect confidence simply leads to product commodification. For example, customers have equal confidence in the quality of the major TV manufacturers, so the choice comes down to cost and features. People feel little emotional investment in such products, unlike Apple’s, which generate strong positive emotions: love, not just like.

Apple’s success demonstrates that positive emotion is key not only to a successful customer experience but to a company's bottom line. Positive emotions motivate people to buy; negative emotions create a barrier or brake on purchase decisions.


Privacy costs companies money in more ways than one. It requires a focus on regulation and compliance. Privacy policies are about reducing risk and fighting fires, which costs the company money in lawyers’ fees and privacy statements rather than in innovation or solutions that generate new income streams.

Trust makes money. Positive emotional engagement leads to strong brand identification and loyalty, which in turn generates revenue. Loyalty means customers are less price-sensitive. They don’t consider the product a commodity; rather, they want that brand. Positive emotions also create buzz, a powerful sales tool: customers sell your products for you.

Businesses should stop focusing on privacy and start focusing on trust, which creates value and revenue. Privacy professionals should be trust professionals and should be involved in overall product creation. That is how to create trusted experiences.


Where there is opportunity, there is also risk. In leveraging customers’ emotional engagement, companies must carefully maintain that positive connection. If people feel their trust has been breached, they will flip from love to hate, not merely from like to dislike. The key to successful brand building is earning and keeping people’s trust.

Stay tuned for the next blog posts, in which we will explore what consumers trust and distrust in company “behavior,” how design can fail at earning customer trust and best practices for developing trust through good design.

+++

The Digital Trust Initiative is an independent effort to study digital design and privacy policy in digital technology. The work of the initiative is funded, in part, by a variety of partners: Yahoo!, Create With Context, AOL, The Future of Privacy Forum, Verizon and Visa. The views expressed in this article and the conclusions drawn do not reflect the views of these partners. Further, the partners have not independently verified the results of the study, nor do they make any representation as to the accuracy or value of any statements made herein.

About the Author

Ilana Westerman is CEO of Create with Context, Inc., a leading digital strategy consulting firm. For the past 15 years, she has championed the role of people plus context as key drivers behind the design of innovative technology solutions, helping ensure that digital products and services align with human needs, goals and desires.

Westerman began working on digital innovation in the mid-1990s, including award-winning work on the IBM Nagano Olympics web presence. Then, as one of the early members of Yahoo!, she helped build the Yahoo! User Experience team, leading R&D teams for key Yahoo! properties. She now serves as CEO at Create with Context which, under her leadership, has seen significant growth since its inception in 2005.

Comments

  • March 26, 2013
    Jason
    replied:

While it’s easy to say, trust is a hard thing to build. Trust is earned over time through deeds and actions, not just words. Not to dismiss it; I think it’s extremely important, but it’s not easy, and it certainly encompasses much more than a company’s relationship to customer data.

  • March 26, 2013
    Malcolm Crompton, CIPP
    replied:

This is a very useful article. Thank you. May I make one suggestion: privacy has been designated as a pure negative. Yes, that might be part of the story, but by no means all of it. The simplest example: the description here doesn’t embrace the Warren & Brandeis conception of privacy as “the right to be let alone.” The article would be much stronger if it recognised that trusted relationships include respect and discretion.

  • June 09, 2013
    IAPP Member
    replied:

Case in point: when we discover that the government has been spying on us “in order to protect us,” or using its IRS power to thwart the free thought of free people, trust evaporates. I am not saying that the NSA and trusted parts of our government do not need to watch for terrorists, but when they cancel our privacy “without notice” or prior approval, trust erodes. When companies mine our data and skirt the privacy laws, trust is harmed in the market. We need transparency in government and business to regain trust and maintain it. We have to explain what we are doing with data, and why, before people will trust us with it. Then we have to really protect it and be honest about whether we can guarantee that it is safe. If we have to show and tell what we do with data, it will make us think twice about how we use it and whether we can really protect it. Then we might be trusted.
