The legal angle

September 15, 2009

Since their inception, these fillips have stressed how technology has played havoc with the law, especially with property and privacy law.

Where data is concerned, Western legal systems now distinguish the privileged few, who are allowed to cling to copyright laws made for the past Industrial Age, from the common rabble, whose personal data may be captured for free and exploited for profit.

David Segal's report on Kindle thefts shows that emerging ownership issues are not limited to data (*). Wireless technology lets the company that controls a tethered device determine where it is and, since the user must register, who the user is. Yet Amazon declines to help the original buyer track down whoever steals a Kindle, nor will it deactivate the stolen device.

Amazon's position is not without merit. It will act, but only "if [...] contacted by a police officer bearing a subpoena", a reasonable protection of its users' privacy. Yet when copyright is at risk, it strikes back at once against users whose only offense was to pay for a download that Amazon itself fulfilled before changing its mind. Put on a leash, some dogs drag their masters along. With tethering, the master reigns supreme.

So may I ask what property a Kindle buyer acquires, practically speaking? Books? Hardly, since they can be erased at any time, together with "any digital annotations" made by the reader, a point drawn from Miguel Helft's article about Jeffrey Bezos' apologies (**). A device? Not if anyone can borrow it with neither permission nor sanction. No, the user is simply granted access to a service, cancellable at any time. And for this, the user must further agree to whatever terms are imposed by the so-called privacy policies of a company famous for past egregious violations (1).

Amazon's hypocrisy makes for amazing copy. Nevertheless, companies whose overbearing behavior is apparent to all are less dangerous than those whose actions stay hidden from public scrutiny. So, beyond the fun, these fillips strive toward a better framework of both laws and regulations.

The same constructive perspective inspires the work of law professor Paul Ohm, whose draft on the limits of anonymization I strongly recommend (***). That stripping out personally identifiable information (PII) preserves privacy is a claim I have long derided. If the FTC wanted to buttress its recent doubts on the subject, it could do no better than to read Paul Ohm's well-documented argument, which ranges from the AOL blunder to Narayanan and Shmatikov's research.

Taking stock of this failure of anonymization, Paul Ohm asserts that "data can either be useful or perfectly anonymous but never both". Because current privacy laws, on the contrary, rely on the promise of anonymization, they must be revised. Paul Ohm proceeds to suggest how: go back and reevaluate, domain by domain, the balance between the risks to privacy and the benefits of sharing.

Proposing to assess risks based on such factors as the data handling techniques to be used, how much trust the intended audience deserves and the volume of data to be shared, Paul Ohm warns in advance against the illusion of "mathematical precision". Weighing risks against benefits is indeed fraught with uncertainty.

One of the author's strengths is to combine first-hand knowledge of programming with legal training. This enables him to show the fatal shortcomings of the three alternative solutions: waiting for harm to be done, hoping for a breakthrough technology and creating unenforceable legal sanctions.

Having myself developed a technology which Paul Ohm would characterize as "interactive", i.e. a method which allows "questions about the data without ever releasing the underlying data", I can appreciate why he thinks such advances are no solution. They "require the constant participation of the data administrator. This increases the cost of analysis and reduces the rate of new analysis". The sad truth is, he is right on both counts.
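
To make this "interactive" model concrete, here is a minimal sketch in Python, under my own illustrative assumptions rather than ePrio's actual design: the data administrator keeps the raw records to itself and answers only aggregate questions, so every new analysis requires its participation.

    # Minimal sketch of an "interactive" release model: the administrator
    # holds the raw records and answers questions about them, but never
    # hands the records themselves out. All names are illustrative.
    class DataAdministrator:
        def __init__(self, records):
            self._records = records          # raw data never leaves this object

        def answer(self, predicate):
            """Return an aggregate count, not the matching records."""
            return sum(1 for record in self._records if predicate(record))

    admin = DataAdministrator([
        {"zip": "02139", "age": 34, "diagnosis": "flu"},
        {"zip": "02139", "age": 71, "diagnosis": "asthma"},
    ])

    # Every question must pass through the administrator, which is precisely
    # the cost Paul Ohm points out.
    print(admin.answer(lambda record: record["age"] > 65))   # -> 1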

But Paul Ohm has missed an important point, which I urge him to consider in the next revision of his draft. In most cases, information is shared to feed a pattern recognition process, and such processes fall into two very distinct phases. During the training phase, one tries to characterize a new pattern from a given sample set. In the subsequent classification phase, one tries to find whether a given pattern is present or not in a new sample.
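
For readers unfamiliar with the vocabulary, a toy Python example of my own making, not taken from Paul Ohm's draft, shows why the two phases differ: training needs the whole sample set pooled together, while classification needs only the one new sample at hand.

    # Toy illustration of the two phases. Training pools the labeled samples;
    # classification looks at a single new sample.
    def train(labeled_samples):
        """Training phase: characterize the pattern from the full sample set."""
        positives = [value for value, is_match in labeled_samples if is_match]
        return min(positives)                # simplest possible rule: a threshold

    def classify(threshold, new_sample):
        """Classification phase: decide whether one new sample fits the pattern."""
        return new_sample >= threshold

    threshold = train([(0.2, False), (0.7, True), (0.9, True)])
    print(classify(threshold, 0.8))          # -> True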

Because classification deals with one sample at a time, ePrio technology can deliver total privacy by sending the controller's rules to the user rather than the user's data to the controller. Its usefulness does assume some willingness on the part of users to participate. But don't consumers know that targeted advertising benefits them? Don't patients want better medical diagnostics? As for barring entry to foreign terrorists or checking on ordinary citizens, one can focus on the few intended targets and spare the crowd of innocent bystanders, while acknowledging that recognition errors are inevitable.
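
The reversal can be sketched in a few lines of Python; this is an illustration under my own assumptions, not ePrio's actual protocol. The controller ships its classification rule to the user's device, the rule runs locally against the user's data, and only the verdict, an ad to show or not, a diagnostic hint, or nothing at all, ever leaves the device.

    # Illustrative only: the controller's rule travels to the user;
    # the user's data never travels to the controller.
    def controller_rule(profile):
        """A classification rule shipped by the controller, e.g. an advertiser."""
        return profile["age"] >= 30 and "hiking" in profile["interests"]

    def classify_locally(profile, rule):
        """Runs on the user's device; only the boolean verdict is disclosed."""
        return rule(profile)

    user_profile = {"age": 42, "interests": ["hiking", "jazz"]}   # stays on the device

    show_ad = classify_locally(user_profile, controller_rule)
    print(show_ad)   # -> True, and that single bit is all the controller learns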

In reality, Paul Ohm's critiques apply to training, whose efficiency requires central access to data. But training is best done on relatively small sample sets. If, to ensure its safety and usefulness, a new prescription drug first had to be tried on the whole population, it would defeat the whole purpose of the trial, wouldn't it? In that case, my recommendation is to mandate that the data aggregator bear any associated data risk and obtain the explicit and uncoerced consent of each sample participant. Let her decide if there is any benefit left.

The data aggregator might propose that sample participants waive their rights for a price, as Nielsen does to collect viewer data. The law may set domain-dependent minimum prices and protection measures, as it does with human labor. But otherwise, let the market balance supply and demand. Ethics may forbid compensating patients for their data, as is already the case for their physical participation in medical trials. But society should not allow universities, hospitals and pharmaceutical companies, in the hope of lucrative patents, to take any patient data without prior consent.

In our shared information society, shedding light through the prism of privacy, sharing and money illuminates the legal angle and all its dark recesses.

Philippe Coueignoux

September 2009
Copyright © 2009 ePrio Inc. All rights reserved.