Acxiom, MasterCard CPOs Talk Transparency, De-identification, FTC Consent Orders
By Jedidiah Bracy, CIPP/US, CIPP/E
What do you get when you put chief privacy officers (CPOs) from two of the world’s largest Big Data businesses in the same room with an outside privacy counsel and privacy academic? Based on just one of the many compelling panels at this year’s IAPP Privacy Academy, you get conversation as robust as some of Seattle’s finest blends.
Big Data and Transparency
Just a month after Acxiom unveiled its much-anticipated “About the Data” web portal for consumers, Jennifer Barrett-Glasgow, CIPP/US, the company’s CPO—and one of the first-ever CPOs—discussed how to responsibly implement an outward-facing portal for consumer data access. It was a big move for Acxiom—really the first of its kind in the data broker industry, which remains under the Federal Trade Commission’s (FTC) microscope.
Most of the organization’s collected data derives from third parties, she said, noting, “We don’t have direct contact with individuals, and we need to be careful with data access.”
The challenge for Acxiom, she said, was identity verification; the rigor of verification needs to be proportionate to the data a consumer is accessing.
“That’s a subjective decision,” Glasgow pointed out. When considering the verification page, “our tech guys wanted the whole Social Security number (SSN)” from a consumer requesting portal access. “Our focus group said, ‘NO!’ So we compromised,” she said. Instead, consumers wanting to access their data use the last four digits of their SSN.
Glasgow noted that the data Acxiom uses for AboutTheData.com is marketing data; if the information were more sensitive, such as health or financial data, identity verification would be much stricter. “We made (this portal) with the consumer in mind to explain how marketing data is used,” she said.
Why offer consumer access in the first place? It’s not because of consumer demand. InfoLawGroup Partner Boris Segalis, CIPP/US, said companies like Acxiom are offering this type of access because they are under pressure from regulators—particularly the FTC. Segalis argued that it’s not necessarily the consumers who will be engaged with the access tool but investigative journalists and privacy advocates instead.
“They are the noisy consumers,” MasterCard Worldwide Global Information Governance & Privacy Officer JoAnn Stonier noted, and not all consumers are alike. MasterCard’s research found that traditional market demographics—age, gender, race—are not strong indicators of privacy preferences.
But marketing data—like much of the information within the Big Data universe—resides in a complex landscape. If consumers get too much data, said Glasgow, they will get fatigued and walk away.
“If you want to show transparency, you need to figure out how to give consumers data that is actionable and relevant to what it’s used for,” she said, adding, “That begins the education process … We intend to expand over time as we find that consumers can understand that.”
Consumer education was on Stonier’s mind as well. She agreed with Glasgow, applauded Acxiom’s data portal and added that educating consumers and clients about the data being collected is very important. MasterCard does not work directly with consumers—it almost always connects the merchant to the customer’s bank. “We only see your credit card number, so we don’t see your purchase history,” Stonier said, but added, “That being said, we are a Big Data company. We do validate fraud security, and yes, we do anonymize transactions.”
Anonymization and Re-identification Risk Factors
IAPP VP of Research Omer Tene, the session’s moderator, pressed what he called the anonymization issue. He pointed out the potential risk of re-identification: “How do you deal with this risk factor? What do you call de-identification? What’s the acceptable degree of risk, and should we take your word for it?”
Segalis also urged organizations to really think about why they de-identify data. Financial companies have to worry about the Gramm-Leach-Bliley Act and the Fair Credit Reporting Act (FCRA), for instance, but no guidance on de-identification is provided in either statute. “But,” he said, “you should think about what The Wall Street Journal and regulators may think,” noting that EU regulators may believe something is only de-identified if it’s impossible to re-identify the given data set.
Likewise, regulatory mandates—particularly anonymization standards—vary across the globe. Privacy professionals have to factor these variables, such as the probability of re-identification, into their decision-making.
“If you hated math,” Stonier warned, “you’re in the wrong career.”
Further, “With Big Data comes Big Security,” Glasgow said. Organizations tend to overlook that as more data is collected. She said companies should always de-identify but also assess how to do so. Glasgow also stressed the importance of contracts: Companies that share data with third parties should balance technological approaches with enforceable agreements. Sometimes organizations “focus too much on the technology side of solutions and not on the agreement side of solutions,” she said.
For those in the trenches, how the data is obtained and protected is paramount. Stonier said pros need to put rules around who accesses certain data and implement a separation of duties. Your IT teams may not like it, but get in there and get your hands dirty. These controls can lead to an environment where the individual record is protected.
And as for de-identification, don’t make it your only defense. Anonymization is but one tool in an arsenal of data protection.
“Regulators had originally heard that de-identification was a magic bullet,” Stonier said. “It’s one thing to have the internal rules for protecting data, but what most Big Data companies are trying to do is look at how populations are trending, economic indicators, security threats, but not the individual person.” And with that, she said, “You can have really good internal controls, but they’re not easy to explain to a customer.”
And those internal controls, Stonier said, are often not understood by regulators.
Glasgow agreed. “I think we do ourselves a disservice when we talk about anonymization as a magic bullet,” she said. Organizations anonymize data internally, get comfortable and put security controls in place based on that risk assessment. Then the business or the data use changes, and organizations don’t go back and reevaluate whether those controls still work for the new use. “Then it becomes a slippery slope,” she said, adding, “There should be a big wall between your internal use, how you share with third parties and how you turn it loose for public access.”
A Section 6 FTC Investigation
Any organization that’s already been subject to a Section 6 investigation by the FTC knows that once the FTC has made contact, there’s not much choice but to comply. Under Section 6, the FTC can investigate entire industries. And as the panelists knew all too well, the agency’s most recent investigation involves the data broker industry.
“There’s no limit to what they can request to understand how the industry works,” said Segalis. When an organization receives such a request, it tends to be broad, and it doesn’t necessarily mean the company has done anything wrong. “You can resist it, delay it,” but, Segalis noted, “Anything you say can potentially be used against you.” He also said that many of the questions coming from the FTC to the data broker industry concern whether the industry is violating the FCRA.
Yet, Glasgow also said working with the FTC during a Section 6 investigation can be a chance to educate the agency about your business practices. “The data broker community should not be surprised with this FTC inquiry,” she noted.
And in light of such an active FTC, both Glasgow and Stonier urged privacy pros to read all of the agency’s consent decrees. “Even if it’s not your industry,” Glasgow said, “there are examples in those decrees that I encourage you to check out.” Stonier agreed, saying, “Those consent orders are the best case law we have about the government’s thinking.”