Though public records stand among the pillars of an open society, the information economy is challenging traditional notions of what should be considered readily accessible to the general community. As information brokers collect and store greater amounts of information, public agencies implement e-government initiatives, and telecommunications systems improve the ease with which information flows across the globe, should the line around what constitutes public data be reassessed?
A number of recent incidents may be exposing fault lines that could call for a different mode of thinking.
What’s public? What’s private?
In early March, a New Jersey appellate court ruled that an individual’s right to obtain documents under the state’s Open Public Records Act (OPRA) “trumped” the right to privacy of home addresses. In the case, the plaintiff requested a list of addresses of senior citizens who had signed up to receive mailings from Union County, NJ, so that she could express her views on the content of those mailings. To protect the recipients, Union County officials redacted the addresses. The plaintiff argued that she had a right to the addresses under the OPRA, and the state appellate court agreed. As a result, citizens who provide personal information to the mailing list are now subject to the OPRA and may have their data shared with third parties upon request.
One local official fears the ruling sets a dangerous precedent for companies that want to mine data on the cheap. Also, some are concerned that the precedent could “put a chill on people seeking” government services once citizens realize their names and addresses could be shared with third parties.
Similar red flags have been raised about whether emergency call transcripts should be available to the public. George Washington University Law School Prof. Daniel Solove points out in a Concurring Opinions piece that 911 calls often involve the disclosure of personal medical information. “Doctors and nurses are under a duty of confidentiality,” opines Solove, “so why not 911 call centers, especially when people are revealing medical information?”
Though 911 call centers are not regulated under HIPAA, Solove argues the public release of 911 calls without the consent of the caller “violates people’s constitutional right to information privacy.”
Deborah Peel of Patient Privacy Rights argues in a GovInfoSecurity report that public release of emergency calls is a violation of HIPAA because 911 operators “are in effect working on behalf of hospitals and emergency centers as part of the patient’s treatment team.”
Some argue that public release of 911 calls promotes transparency and provides citizens with a window into the effectiveness of call centers. Yet, Solove has said this can be done without violating the privacy of callers. “I would think,” writes Solove, “that if 911 operators didn’t handle the call well, most people would consent to disclosure so the 911 center could be held accountable.”
In light of the recently publicized 911 call of actress Demi Moore, California Assemblywoman Norma Torres (D-Pomona) plans to introduce legislation to bar the disclosure of such calls. Missouri, Pennsylvania, Rhode Island and Wyoming all have laws that keep 911 calls private, and lawmakers in Alabama, Ohio and Wisconsin are considering similar rules.
The pros and cons of e-Government
Data mined from public agencies is also being used for identity theft. Media reports show that the Social Security Administration’s Death Master File (DMF) has come under recent scrutiny from citizens and lawmakers because of alleged inaccuracies within the list and the ease with which identity thieves can access the public data.
The government-backed database lists more than 87 million deceased Americans, including their names, addresses and Social Security numbers. Intended to assist medical research and help government, financial, investigative and credit reporting organizations fight identity fraud and verify a citizen’s death, the DMF can be purchased and distributed online. In some cases, information obtained from the file under the Freedom of Information Act is sold to private companies and posted on the Internet. The information is also commonly used for genealogy research.
Identity thieves have used data gleaned from the DMF to file bogus tax returns, and the practice has become widespread enough to gain the attention of those in public policy.
According to Fox 6 Now, the National Consumers League (NCL) has expressed concern over the availability of DMF data. NCL Vice President of Public Policy John Breyault said, “While NCL generally supports transparency of government data…In this case, we believe the risk that DMF data can be used for nefarious purposes outweighs the benefit.”
SSA Inspector General Patrick P. O’Carroll, Jr., testified in February that as many as 1,000 living individuals are mistakenly added to the DMF each month. Additionally, a 2008 Social Security audit revealed that as many as 20,000 living Americans’ Social Security numbers were publicly disclosed through the DMF. Though O’Carroll suggested that public disclosure be limited, doing so would take an act of Congress, he said.
In response to these issues, LifeHealthPro reports Rep. Sam Johnson (R-TX) has introduced the Keeping IDs Safe Act of 2011. Pending in the House Ways and Means Committee, the legislation would limit the public disclosure of DMF data.
Court records are also public data, serving legal scholars, investigative journalists and others. As New York University Prof. Helen Nissenbaum writes in her book Privacy in Context, “With few exceptions, court records of both civil and criminal cases are also part of the larger class of public records and contain a great deal of personal information.” In addition to plaintiffs and defendants, court records can also reveal personal information about jurors and jury member pools and can include Social Security numbers, financial, medical and other “exquisitely personal” information.
These records--once only on paper--were concealed in what some call “practical obscurity.” Because the documents were housed in specific locales, accessing them required physical effort, which imposed a natural limit on their reach. In digital form, court documents are not only easily accessible, Nissenbaum writes, they “can be rapidly retrieved, searched and reassembled in novel ways not previously imagined.” A simple query on a search engine by a potential employer, for example, could return compromising information that does not appropriately reflect a job applicant’s current situation.
Nissenbaum also notes that the combination of public record digitization and e-Government initiatives, which together make public records easier to access across government agencies, allows “interested parties, from journalists and information brokers to identity thieves and stalkers,” to “avail themselves of these services.”
A shifting privacy paradigm?
According to a recent article by Alexis Madrigal in The Atlantic, NYU’s Nissenbaum has “played a vital role in reshaping the way our country’s regulators think about consumer data.” The concept of context--or what she refers to as “context-relative informational norms”--appears 87 times in the Federal Trade Commission’s recently released privacy report.
Rather than treat public and private as a strict binary, Nissenbaum argues for recognizing “contextual integrity” alongside a “reasonable expectation of privacy.” Society’s norms and values change over time, and expectations of privacy shift with the social situation.
For example, new technologies such as facial recognition are challenging our notions of privacy while in public. When entering the public sphere, we not only let ourselves be identified, we can also identify other people. It’s a two-way street, and while not everyone knows everyone else, facial recognition can potentially collect, store and identify individuals in the public sphere and connect those images with a vast database. In that context, our expectation of privacy most likely does not match the flow of information from images of our faces into a database. Yet the expectation changes when the same technology is employed in a casino to help troubled gamblers who have opted in to a gambling-prevention program.
As Madrigal writes, “Nissenbaum puts the context--or social situation--back into the equation.” What we decide to share with our friends, we might not share with our boss. What we share with a doctor, we might not share publicly, and so on. “Furthermore,” he adds, “these differences in information sharing are not bad or good: they are just the norms…Perhaps most importantly, Nissenbaum’s paradigm lays out ways in which sharing can be a good thing.” Using facial recognition to prevent an addiction to gambling could be one such example.
Rethinking traditional notions of what is private has also made its way to the U.S. Supreme Court.
In United States v. Jones, the justices countered the government’s argument that “citizens have no privacy interests in their public movements,” holding that the warrantless placement of a GPS monitoring device was essentially trespassing on private property. Yet, as Kashmir Hill points out in a Forbes report, what if a suspected criminal used a GPS navigation device in his own vehicle? Under the third-party doctrine, privacy is lost once information is shared with a third party, along with any Fourth Amendment protections against illegal searches and seizures.
Justice Sonia Sotomayor’s concurring opinion in the case, however, suggests a potential reconsideration of the third-party doctrine.
Noting that “an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties,” Sotomayor queried whether the doctrine is “ill suited to the digital age” since so much information is now shared with third parties during “the course of carrying out mundane tasks.”
As an American Criminal Law Review blog post points out, “It’s unclear whether other justices on the court share” Sotomayor’s concurring opinion, “but this will probably affect arguments in future information privacy cases before the court.”