By Lauren B. Steinfeld and Maura Johnston
For most organizations, the question is not whether to take measures to protect the privacy and security of confidential information, but how. In strongly hierarchical entities, the approach to protecting information may focus on policies that are issued centrally and designed to be applied uniformly. In higher education and other industries, that model alone is usually not enough.
Many organizations are large and have diverse and distributed units. Senior leadership often empowers local management to act largely independently to foster flexibility, speed and innovation. These units may have their own IT infrastructure; they may have different ways of collecting, using, sharing and protecting data. They may utilize central sources of data or collect their own. They may be selling products, conducting research, or providing services. They may use different operating systems, software, and devices. They may operate in different countries. A one-size-fits-all policy can be a very poor fit for this type of organization.
This article describes a voluntary privacy and security risk assessment program adopted at the University of Pennsylvania (Penn) that, coupled with other strategies, has proven very effective in a complex environment of innovative and often independent units. Other organizations with similar qualities, whether in higher education or not, may also benefit from a voluntary privacy and security assessment program. We hope this article will help privacy professionals in such organizations consider whether elements of Penn's SPIA program will help them address the many challenges in privacy and security that exist today.
At Penn, there are multiple strategies in place aimed at protecting confidential data. For example, there are top-down and locally developed policies; technology controls; internal and external audits; training programs; and communications forums that work together to increase our defenses against a wide range of threats to such information. And yet, even with the many established and effective controls in place, challenges remain. Because our organization is vibrant, large, diverse, and ever-changing, we continually have new people, new systems, new technologies, and new data, as well as new rules and strategies to stay on top of.
Penn's Privacy, Security, and IT Audit offices began to work together in 2005 to address our privacy and security challenges in a different way, drawing from existing models, such as the Federal Privacy Impact Assessment and Virginia Tech's STAR model, and tailoring them to our own environment. Our goals were to reach deeper into the organization with a flexible program, voluntarily adopted by units, to raise awareness of privacy risks and solutions, to engage a broader population in understanding and action, and to help individual units build their own roadmaps for change.
The SPIA Program in a Nutshell
We created SPIA, shorthand for the Security and Privacy Impact Assessment.
The SPIA program is a voluntary program that individual units sign up for on a yearly cohort basis. SPIA is essentially a people-driven process that raises awareness deep within an organization about what confidential data exists and what systems store such data. It establishes a common vocabulary and common standards for assessing risks to data in systems; fosters discussion among IT staff and members of the academic and administrative communities; and prompts remediation of major risk areas.
In outline, the program consists of the following major steps:
Step 1. Develop an approach to the project, including selection of a team. It is recommended that the team include IT staff as well as operational staff with the most knowledge of the relevant data.
Step 2. Inventory confidential data in databases and applications, collecting high-level information about how much of what type of data is involved and where it is kept.
Step 3. Using the high-level information gathered in Step 2, schedule detailed risk assessments for these applications and databases over three years, creating a prioritization list so that the systems with the most risk are assessed earliest.
Step 4. For each database or application, conduct the detailed risk assessments as scheduled, using the following methodology:
- Develop a risk score (probability of threat multiplied by consequence of threat) for each of seven major threat areas, considering which safeguards, listed alongside each threat, are currently implemented. The seven threat areas are: information compromised by an external hacker or malicious software; information intercepted in transit by unauthorized persons; information mistakenly disclosed to unauthorized persons; information knowingly or recklessly misused by staff, faculty, vendors, or temporary workforce members; physical theft; community objections regarding privacy practices; and inadequate business continuity procedures leaving information unavailable to support operations.
- If the risk score is too high, consider implementing additional safeguards from the list accompanying each threat
- Develop specific improvement plans for that application or database, drawing from the full set of safeguards not currently implemented and selecting those the unit believes are appropriate to adopt in the future
Step 5. Summarize findings in an executive-level annual report. This report describes, among other things, the greatest concerns and successes, as well as improvement plans with timelines, estimated resource requirements, and expected risk-reduction outcomes.
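To make the mechanics of Steps 3 and 4 concrete, the sketch below computes risk scores (probability of threat multiplied by consequence of threat) for a few hypothetical systems and orders them so the riskiest are assessed first. The 1-to-5 rating scale, the system names, and the function names are illustrative assumptions, not part of Penn's actual SPIA tool.

```python
# Illustrative sketch of SPIA-style risk scoring (Step 4) and
# prioritization (Step 3). The threat areas come from the article;
# the 1-5 scales and example systems are assumed for illustration.

THREAT_AREAS = [
    "external hacker or malicious software",
    "interception in transit",
    "mistaken disclosure",
    "knowing or reckless misuse by insiders",
    "physical theft",
    "community objections to privacy practices",
    "inadequate business continuity",
]

def risk_score(probability: int, consequence: int) -> int:
    """Risk score = probability of threat x consequence of threat.

    Both ratings use an assumed 1 (low) to 5 (high) scale.
    """
    if not (1 <= probability <= 5 and 1 <= consequence <= 5):
        raise ValueError("ratings must be between 1 and 5")
    return probability * consequence

def prioritize(systems: dict) -> list:
    """Order systems so the riskiest are scheduled for assessment first."""
    scored = {name: risk_score(p, c) for name, (p, c) in systems.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical inventory: system name -> (probability, consequence)
inventory = {
    "alumni-database": (4, 5),
    "course-registration": (2, 4),
    "departmental-file-share": (3, 3),
}

for name, score in prioritize(inventory):
    print(f"{name}: {score}")
```

In practice, a unit would score each system against all seven threat areas rather than a single pair of ratings, but the multiplication and ranking logic is the same.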
Success Factors in SPIA
The SPIA program has been received very well by participating units, and its popularity has grown since its inception in 2006, prompting more units to join each new cohort. Many have asked why so many units have signed up to participate. We believe there are several reasons the SPIA program has been attractive, and we encourage organizations considering a distributed assessment program to think about the following:
1. Participation in SPIA is voluntary. While mandates often spur action and can easily gain the attention of senior leaders and keepers of the budget, we have found that the voluntary nature of SPIA has indeed spurred many to action as well. We describe to each cohort that "SPIA is being done 'for you,' not 'to you.'" SPIA is a resource and a methodology for units that are concerned about privacy and security risks and are looking for support and structure to better understand what to do about those risks. The voluntary nature and supportive aspects of the SPIA program continue to attract schools and centers to the program.
2. SPIA safeguards are voluntary. Many individuals who thought that SPIA might require implementation of the roughly 70 safeguards listed in the detailed assessment tool were heartened to learn that SPIA merely prompts consideration of those safeguards and does not mandate a single one. It is true that other policies do mandate implementation of many of the safeguards, but SPIA itself does not. SPIA also describes other risk-mitigation strategies that are not mandatory across the board but are reasonable to consider, and often to adopt, to address a particular area of risk. SPIA recognizes that some units will decide to implement strategies that others will not, and that each of these decisions may be perfectly sensible and appropriate.
3. There is little written reporting to SPIA central coordinators. While significant inventorying and assessing are part of the SPIA program for participating units, SPIA deliberately does not ask for central reporting of the detailed work produced. Instead, information submitted to the coordinating offices is kept to a minimum: a summary of the unit's approach (for quality assurance purposes), the annual executive summary, and informal status updates along the way. This aspect of SPIA, keeping local units largely in control of their own analysis, has been appealing.
4. Senior leadership is engaged in understanding privacy and security risks. The SPIA annual executive summary is signed by each school's IT director and senior business administrator. This element of the program is helpful to both IT and operational staff who may otherwise have trouble bringing attention to privacy and security risks at a senior level in their organization. It is also helpful to senior leadership by providing a consistent and organized presentation of risks, strategies to address the risks, budget impacts and timelines. SPIA helps organizational leaders plan for the future.
5. SPIA prompts communications at all levels and across organizations. The most significant accomplishment arising from SPIA, according to almost all participants to date, is that SPIA prompts conversations, and these conversations connect individuals throughout the organization with privacy and security issues. The inventory step cannot be completed without reaching out to people at all levels to determine what systems and data they are using. The executive summary engages senior management. And the entire process closely connects central privacy and security personnel with IT and operational staff in the local units. There is a consensus that the relationships built and the awareness created at all levels are the most successful outcomes of SPIA.
We continue to feel the very real pressures of privacy and security threats, and we recognize that no single program, or even a combination of many, keeps our data completely safe. The SPIA program was an experiment for us, and one that we continue to revise and refine. While not a perfect solution, it has helped build a larger and stronger community of people knowledgeable about privacy and security and empowered to raise questions and try to implement solutions. And in an environment as dynamic as ours, and as dynamic as the ones Privacy Advisor readers are operating in, it can help move us all forward to our goal of better protection of confidential information.
The SPIA tool and instructions can be viewed at Penn's Privacy Web site, www.upenn.edu/privacy (click on "SPIA").
Lauren B. Steinfeld serves as chief privacy officer and institutional compliance officer at the University of Pennsylvania. In her position, Ms. Steinfeld works on privacy issues involving medical information, student records, electronic data, Social Security numbers, and other personal information. Prior to her work at Penn, Ms. Steinfeld worked at the Office of Management and Budget as the associate chief counselor for privacy and prior to that as attorney advisor to Federal Trade Commissioner Mozelle Thompson.
Maura Johnston has a leadership role in the privacy program at Penn, with a focus on raising awareness about privacy risks and choices, identifying evolving areas of risk, and mitigating privacy risks. She has extensive experience in law and health care, having worked as director of healthcare financing systems for the New Jersey Department of Health and Senior Services, as a special assistant U.S. attorney, and as a deputy attorney general in the litigation section of the Pennsylvania Office of Attorney General. Maura has an MBA from the Wharton School and a J.D. from the University of Pennsylvania Law School.