Privacy Advisor

Workers Using Workarounds Put Brands at Risk

October 22, 2013

By David Houlding, CIPP/US

User behavior is a major and growing source of privacy risk. Users now have so many tools available (personal devices, apps, social media, websites, texting, USB keys, personal email) that when solutions are hard to use, security is cumbersome or IT departments get in the way, they can easily turn to these alternatives as workarounds, driving both noncompliance with privacy policy and additional risk.

Beyond vulnerabilities in hardware or software exploited by sophisticated external hackers, the inadvertent side effects of user actions are what lead to much, if not most, of today's privacy and security risk. The increasing sophistication of mobile devices and apps, as well as wearables and the Internet of Things, is sure to exacerbate these risks going forward, as these technologies collect more types and greater quantities of sensitive personal information in near real time. While we continually strive to provide bulletproof hardware and software free of vulnerabilities, a user installing a file transfer app and using it to move unsecured sensitive personal information, or texting personal information in the clear to a coworker, can drive noncompliance with privacy policies and introduce major privacy risks, including breaches.

We can see the extent, drivers and types of user behavior causing these noncompliance issues and risks in the 2013 report "Workarounds in Healthcare, a Risky Trend," which found that 52 percent of healthcare workers globally use risky workarounds that are out of compliance with policy, either every day or sometimes. A separate study of U.S. federal agencies, including HHS and the VA, found that 66 percent of users say security protocols are burdensome and that, as a result, 31 percent report using workarounds at least once a week.

Yet another study shows that 66 percent of nurses use personal smartphones for clinical communication, yet 95 percent say their IT departments don't support this usage for fear of security risks. Although this research is healthcare-focused, the results point to general drivers of noncompliant, risky behavior: poor usability, cumbersome security, and IT departments that are too slow or overly restrictive. These drivers apply not only to healthcare but to all types of industries. In many cases, workers performing workarounds are under time and cost-reduction pressure and are often unaware of the additional privacy and security risks their workarounds create.

Interestingly, these workarounds can happen on both corporate and personal devices, and even in thin-client models such as Virtual Desktop Infrastructure, as long as the user can, for example, install apps, access the web or text.

Many workarounds involve a backend or server component that is often cloud-based. In the case of Bring Your Own Device (BYOD), this can escalate into a Bring Your Own Cloud (BYOC) problem, where sensitive personal information is at increased risk of loss of confidentiality or breach, and where trans-border data flow can arise because the clouds behind workarounds sit in locales with different privacy regulations or data protection laws. This "side channel" flow of sensitive data can also create a data integrity problem, since data moving through these side channels often never updates the master record. In healthcare, this may take the form of one worker texting a patient update to a coworker rather than communicating through the official electronic health record system, which can leave the patient record incomplete, inaccurate or out of date. In the best case, this leads to suboptimal healthcare; in the worst case, a patient safety issue.

Clearly, there is no simple technological solution to this problem; a holistic, multi-layered approach is needed. On the administrative side, controls such as policy that comprehends new technologies and trends, together with effective privacy and security training, are key. On the technical side, we need Privacy Enhancing Technologies (PETs) that shine a light on privacy risks and influence user behavior, enabling users to achieve their goals while guiding them toward lower-risk alternatives for getting their jobs done.

Several examples of this type of technology already exist, including endpoint data loss prevention (DLP), which can detect when a user attempts an action that is out of compliance with policy, such as copying unencrypted sensitive personal information to a USB drive, then block the noncompliant action while notifying the user in the "teachable moment" to educate them on safer alternatives and minimize the chance of recurrence.
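The core of such an endpoint DLP check can be sketched in a few lines. This is a minimal illustration only, assuming a single hypothetical detector for U.S. Social Security numbers; a real DLP product uses many detectors plus contextual analysis, and the function and message below are invented for this example.

```python
import re

# Hypothetical pattern for U.S. Social Security numbers (illustrative only;
# real DLP engines combine many detectors with contextual analysis).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def check_copy_to_usb(file_text):
    """Return (allowed, message) for a proposed copy to removable media.

    Blocks the action and returns a 'teachable moment' message when the
    content appears to contain unencrypted sensitive personal information.
    """
    if SSN_PATTERN.search(file_text):
        return (False,
                "Blocked: this file appears to contain Social Security "
                "numbers. Encrypt it or use the approved secure transfer "
                "tool instead.")
    return (True, "Copy permitted.")

allowed, message = check_copy_to_usb("Patient SSN: 123-45-6789")
# allowed is False; the user sees the teachable-moment message
```

The key design point is that the check runs before the copy completes, so the noncompliant action is prevented rather than merely logged after the fact.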

Some mobile device management (MDM) solutions include DLP-like capabilities with similar benefits and present one way of reducing risk on BYOD personal devices. Examples from the smartphone and tablet space include apps that scan other apps, and in particular the permissions granted to them, then show the user the apps that are riskiest from a privacy standpoint given those permissions, enabling an informed decision on which apps to keep or uninstall. Another type of app scans the device to determine which ad networks are installed and active, and which apps bundle them, similarly shining a light on privacy risk and enabling the user to make an informed decision on which apps to keep or uninstall.
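A simple version of the permission-based ranking can be sketched as follows. The risk weights, app names and permission strings here are assumptions for illustration; real scanners use far richer models combining permissions, bundled ad libraries and observed behavior.

```python
# Hypothetical risk weights per permission (illustrative assumptions only).
PERMISSION_RISK = {
    "READ_CONTACTS": 3,
    "ACCESS_FINE_LOCATION": 3,
    "READ_SMS": 4,
    "CAMERA": 2,
    "INTERNET": 1,
}

def rank_apps_by_risk(apps):
    """Score each app by summing the weights of its granted permissions,
    then sort riskiest-first so the user can review the top offenders.

    `apps` maps an app name to the list of permissions granted to it.
    """
    scored = {name: sum(PERMISSION_RISK.get(p, 0) for p in perms)
              for name, perms in apps.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_apps_by_risk({
    "flashlight": ["INTERNET", "READ_CONTACTS", "ACCESS_FINE_LOCATION"],
    "notes": ["INTERNET"],
})
# "flashlight" (score 7) ranks above "notes" (score 1)
```

Presenting the riskiest apps first is what turns a raw permission list into an informed keep-or-uninstall decision for the user.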

Several security vendors provide browser plugins that cross-check search results against reputation databases and flag each result as safe or dangerous, influencing which URLs users click before they click a malicious one and suffer a drive-by download.
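The reputation lookup at the heart of such a plugin reduces to a domain check before the click. This sketch uses a toy in-memory blocklist with made-up domains; real plugins query large, continuously updated reputation databases over the network.

```python
from urllib.parse import urlparse

# Toy in-memory reputation list (hypothetical domains for illustration).
KNOWN_BAD_DOMAINS = {"malware-example.test", "phish-example.test"}

def classify_url(url):
    """Flag a search-result URL as 'dangerous' or 'safe' before the user
    clicks it, based on the reputation of the URL's domain."""
    domain = urlparse(url).netloc.lower()
    return "dangerous" if domain in KNOWN_BAD_DOMAINS else "safe"

classify_url("http://malware-example.test/free-download")  # "dangerous"
classify_url("https://example.com/docs")                   # "safe"
```

Because the verdict is rendered alongside the search results, the user's behavior is influenced at the decision point, before the risky click happens.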

Clearly, many privacy risks remain hidden from users, especially in workarounds that have not been properly vetted by privacy and security teams. This presents opportunities for further privacy-enhancing technologies that expose these risks to users and influence them to make better decisions that both meet their goals and minimize privacy risk, without slowing them down. This will become increasingly urgent as users are further empowered with new technologies and alternatives that can be used in many ways, often out of compliance with privacy policy and with major hidden privacy risks. Successfully addressing this paves the way for embracing new technologies and maximizing their value and benefits, while minimizing privacy risks and breaches of personal information.

David Houlding, CIPP/US, is Senior Privacy Research Architect at Intel. His Twitter handle is @DavidHoulding.