Cutting data use a little slack

December 15th, 2014 / Author: John Berard

It is accepted as an article of faith that there will be no new U.S. privacy legislation in the coming year (or, perhaps, any year).  The betting money is on bits and pieces, but no systematic approach to what continues, at best, to be a fragmented landscape of similar but not-the-same state-based laws. At worst, the rules are in conflict or do not exist at all.

This circumstance makes it difficult for businesses, nearly all of which operate in multiple jurisdictions and are increasingly dependent on the collection, storage and use of customer information.  Even though the U.S. Supreme Court has decided that corporations are people, too, there is little sympathy for commercial relief when each of us real people is under such relentless scrutiny.

The result is paralysis brought on by the competing urges to do something, but not the wrong thing.  This is how we arrived where we are with regard to privacy legislation.  The perfect is most certainly the enemy of the good enough.  The downside has been a public left to absorb the anxiety driven by identity theft, data breaches and tracking.  The upside?  Well, there has been no upside.  But now comes word of a deeper downside.

As one report puts it, lingering concerns about privacy continue to hold up wider adoption of the most insightful sort of behavioral data — that generated by our mobile devices — to help speed disaster response and blunt disease outbreaks, like Ebola.  In the report from MIT, the culprit is found to be too much emphasis on a limited risk of re-identification of anonymized data and unclear harm without considering the social benefits of using mobile and social data.

We have come to understand that there are no perfect ways to anonymize data.  In light of the relentless evolution of technology, there probably never will be.  To quote the MIT report, there will always be some risk that must be balanced against the public good that can be achieved.  The authors suggest a good-enough solution that gives special consideration to cases where data will be used for significant public good or to avoid serious harm to people.

Clearly there will be those who define “significant public good” or “avoid serious harm” in different and, perhaps, contradictory ways.  Cue the arguments. But the potential positive uses of the data we create need to be given a little freedom from the current black-and-white regime. This is not to say that if data are outlawed only criminals will have data, but it is to say that the social benefit of social data should be given a little slack.