Making the Case for Online Obscurity and Less Anonymity…Wait, huh?
Privacy has a problem.
That’s what Prof. Woodrow Hartzog told participants at our Navigate event earlier this summer. What does he mean by this? Is privacy dead? Well, no. Not necessarily. “The problem with privacy,” he said, “is that it doesn’t really mean anything; it has ceased to be an effective term to guide policy because it can mean so many different things.”
Besides such a protean definition, our traditional view of the public/private dichotomy is also part of the problem. Did you post “that comment” on Facebook last night? Oh, you did? Well, then, it’s not private. Or should it be? Why does it have to be either/or?
Think about that last plane ride you were on, he intoned. Or that dinner out the other night. How many faces can you remember? How many conversations did you hear? Unless you have a photographic memory, those faces and conversations have dissolved away.
Or to put it another way: if a photo captures my image in a crowd, that’s not a real invasion of my privacy because I’m standing in a public space. But if a drone mounted with a camera follows me everywhere I go while in public…well…that’s a problem. It’s easy to show how your privacy can be invaded, even in a public space.
Think about the Alabama-based investigative reporter who posted some “candid confessions” on her personal blog. Perhaps indulgent, or funny, or just plain truthful. You be the judge of that. But the point is, she thought only her friends were going to see it. Instead, she got fired from her job and ended up on the Today Show. “If I could go back and do it over, I would have never written the post,” she wrote. “My job was worth 100 times more than any of this media circus.”
That semi-private space she thought she had was an illusion. Is it fair to say that since she posted it online—in public—that she had no expectation of privacy?
Prof. Hartzog promotes the use of online obscurity, or “the context in which information has a low probability of being found or understood by people you don’t want to find or understand it.” He gave a few examples.
- Limit searchability. If content isn’t indexed by Google or another search engine, it’s far harder to find.
- Use privacy settings and/or encryption for better access control.
- Use name variance or pseudonyms so you’re not identified by your real name.
- Control the clarity of your message by couching posts in terms only your friends or family will understand.
Hartzog said the law and public policy have been terrible on these issues. Instead of the public/private dichotomy the law tends to rely on, we should think in terms of a continuum that allows a more nuanced approach. We should promote the use of confidentiality agreements, whereby information is exchanged on the promise that it will be kept confidential. And designers should consider letting users create their own obscurity as a deliberate design strategy.
Trolling and hate speech run rampant on the Internet, often hiding behind the mask of anonymity—a mask fashioned by people who know quite well how to manipulate the obscurity the web provides. We have posts on this blog on the subject here and here. Christopher Wolf notes that “the harm hate speech can cause requires a careful look at circumstances when privacy needs to give way to reducing the increasing instances of online hate,” adding that “when it comes to hate speech, privacy may have to take a backseat.”
Lindy West surely agrees. For women around the world in particular, anonymity enables rape threats and other hateful commentary. She writes this about trolls:
“Cumulatively, the sheer volume of hate that we're expected to shoulder, in silence, every day, is wearing a lot of people out and shutting down rational discourse. Female bloggers are being hounded off the internet. Teenage girls are being hounded off the earth. There's no good solution, but we have to do what we can to stop these people—unmask them, shame them, mock them, cement their status as social pariahs—for our own sanity and for those whose armor isn't so thick (upgrade yo greaves, son).
Unmasking trolls, as we've seen, can produce some tangible and satisfying results. And I don't mean just in a punitive way, I mean in a changing-the-larger-culture kind of way. People need to understand and internalize that online harassment, violent hate speech, rape threats, slut-shaming little girls until they hang themselves, and so on, are express violations of the social contract. They will not be tolerated and they will result in real-life consequences. That's a long way off, and probably a bit of a pipe dream, but it might be our only hope for cleaning up this shitshow.”
Strong words, but words worth thinking about. Hartzog warns of the uselessness of the public/private dichotomy and offers ideas for bringing public policy into the 21st century. Yet it’s also clear that our attempts to carve out spheres of online privacy must be tempered by the reality that masked hate speech thrives on the very obscurity he advocates.
Privacy certainly has a problem.
About the Author
As editor of the Privacy Perspectives, Jedidiah Bracy moderates the many views, angles and, well, perspectives that inform information privacy and all its adjacent professions.
In addition to editing the Privacy Perspectives, Bracy facilitates the vetting, writing, editing and curation for the Daily Dashboard, the IAPP Canada Dashboard Digest, the IAPP Europe Data Protection Digest and the IAPP ANZ Dashboard Digest. He writes feature articles for The Privacy Advisor on information privacy law, data protection and the privacy profession.
When not mulling over the current state of information privacy in the digital age, Bracy enjoys watching international soccer, listening to his music library and tasting a finely wrought craft beer. You can follow him @jedbracy.