
An absolute certainty on which everybody seems to agree is that legislating takes longer than programming.

This is not just a bland statement based on guesswork and half-baked common sense. According to a comprehensive survey of app developers carried out in 2013, the average timeframe for developing a mobile app is 18 weeks, or just over four months. Tellingly, when that figure was published, it caused a bit of an uproar amongst developers, who were quick to point out that creating an app did not really need to take that long. On top of that, since the beginning of the 21st century, new methods of software development—such as the highly successful Agile approach—have enabled even more dynamic and evolutionary ways of developing new technologies.

However you look at it, it is difficult to imagine a law being devised, crafted and passed at the same speed at which software developers and engineers do their work.

Some will see that as a weakness; some will see it as a strength. We should see it as a reality. Its most immediate consequence is that lawmakers will never be able to catch up with technological developments in the sense of regulating for every iteration and version of a product or service. Legal solutions aimed at specific technologies will, at best, be a temporary patch. So, for example, in California and a few other U.S. jurisdictions, people's privacy is currently protected through a ban on skin-implanted RFID chips, but what about the myriad other devices—from our phones to our cars to our running shoes—which are equally "implanted" in our lives and can serve the same tracking purpose as an RFID tag? Similarly, in Europe, Internet cookies are caught by the consent requirements that apply to the storage of information on users' devices, but what about digital fingerprinting or IPv6?

You get the point, and let us hope that policy-makers do too.

Given the speed of technological development, not even the fastest-paced legislative process will ever be able to address every privacy implication raised by the overwhelming creativity and output of those involved in that development.

But whilst technology is always changing, there is something that has not really changed that much for thousands of years: human behaviour.

Quite a bit has transpired between our reliance on cave paintings and the creation of the iPhone. Much has been achieved in the field of medicine since the Hippocratic Oath was first written in ancient Greece. Equally, it took us a while to figure out how to go from inventing the wheel to getting a 400-tonne aircraft to fly 500 people between continents at nearly 1,000 km/h in one go. But in all those cases, there is something that has lasted many, many generations: our human need to communicate, survive and prosper. Technology may change, but we humans are quite consistent and predictable.

The upshot of this is that when it comes to policy-making, we are more likely to get it right if we focus on behaviour rather than technology. The same technology can almost always be used for virtuous and evil purposes and everything in between. That fact makes it particularly complicated to decide on the most appropriate regulation, and things are even more difficult when a particular technological application is developed for the first time. Is the use of mobile devices' location data beneficial or freaky? Is smart metering an environmental hope or a threat to our privacy? Are Internet cookies helpful tools or data parasites? Is Google Glass good or bad? None of these questions can be answered definitively, because it all depends on the uses made of the information gathered.

Going forward, therefore, we need to steer away as much as possible from trying to protect our privacy by regulating technology. Instead, we must direct our attention to the behaviour that should either be encouraged or prevented, irrespective of the technology in place. Another way of putting this is that laws should be geared toward achieving certain outcomes, such as incentivising compliance, empowering individuals or preventing harm whilst facilitating progress and technological innovation. That way, whatever the rules and the state of technological development, it is clear to all what the direction of travel should be and where the behavioural red lines lie.

1 Comment


  • Jason • Jul 15, 2013
    I think you raise a great point. This also applies from a company/employee perspective: creating policies, procedures, guidelines, etc., that are very specific leads to "loopholes" that were not covered and the need for further detail. If you can get people to understand and align around behavioral expectations, judgment and ethics should be good guides. Of course, then comes the dilemma of "common sense" not being as common as we would like.

    I think it is time well spent to focus on the core elements of what is expected (e.g., notice, consent, etc.) and then use real-world case studies to put these elements into context.