Category Archives: Law: Privacy

Privacy Myopia, Economics of

I was talking with someone after lunch today at the Internet Identity Workshop (#IIW), and mentioned the privacy myopia problem. That drew a blank. So I thought I’d reprint here something I wrote a few years ago, so that I could point him, and others, to it. Obviously, the P3P stuff is dated, but there are other technologies and apps trying to fill the gap.

This is an excerpt from The Death of Privacy?, 52 STAN L. REV. 1461 (2000). If you want the footnotes, you’ll have to download the original…

The economics of privacy myopia

Under current ideas of property in information, consumers are in a poor legal position to complain about the sale of data concerning themselves. The original alienation of personal data may have occurred with the consumer’s acquiescence or explicit consent. Every economic transaction has at least two parties; in most cases, the facts of the transaction belong equally to both. As evidenced by the existence of the direct mail industry, both sides to a transaction generally are free to sell details about the transaction to any interested third party.

There are exceptions to the default rule of joint and several ownership of the facts of a transaction, but they are relatively minor. Sometimes the law creates a special duty of confidentiality binding one of the parties to silence. Examples include fiduciary duties and a lawyer’s duty to keep a client’s confidence. Overall, the number of transactions in which confidentiality is the legal default is relatively small compared to the total number of transactions in the United States.

In theory, the parties to a transaction can always contract for confidentiality. This is unrealistic because consumers suffer from privacy myopia: they will sell their data too often and too cheaply. Modest assumptions about consumer privacy myopia suggest that even Americans who place a high value on information privacy will sell their privacy bit by bit for frequent flyer miles. Explaining this requires a brief detour into stylized microeconomics.

Assume that a representative consumer engages in a large number of transactions. Assume further that the basic consumer-related details of these transactions—consumer identity, item purchased, cost of item, place and time of sale—are of roughly equivalent value across transactions for any consumer and between consumers, and that the marginal value of the data produced by each transaction is low on its own. In other words, assume we are limiting the discussion to ordinary consumer transactions, not extraordinary private ones, such as the purchase of anticancer drugs. Now assume that aggregation adds value: Once a consumer profile reaches a given size, the aggregate value of that consumer profile is greater than the sum of the value of the individual data. Most heroically, assume that once some threshold has been reached the value of additional data to a potential profiler remains linear and does not decline. Finally, assume that data brokers or profile compilers are able to buy consumer data from merchants at low transactions costs, because the parties are repeat players who engage in numerous transactions involving substantial amounts of data. Consumers, however, are unaware of the value of their aggregated data to a profile compiler. With one possible exception, the assumption that the value of consumer data never declines, these all seem to be very tame assumptions.

In an ordinary transaction, a consumer will value a datum at its marginal value in terms of lost privacy. In contrast, a merchant, who is selling it to a profiler, will value it at or near its average value as part of a profile. Because, according to our assumptions, the average value of a single datum is greater than the marginal value of that datum (remember, aggregation adds value), a consumer will always be willing to sell data at a price a merchant is willing to pay.
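The inequality driving this argument can be made concrete with a toy calculation. The figures below are entirely hypothetical, chosen only to illustrate the structure of the claim: because aggregation adds value, the per-datum average value to a profiler exceeds the per-datum marginal value to the consumer, so there is always a price at which both sides willingly trade.

```python
# Illustrative sketch of the privacy-myopia argument with made-up numbers.
# The only point is the inequality: marginal value < price <= average value.

N = 100                 # transactions in the consumer's profile (hypothetical)
marginal_value = 0.10   # consumer's privacy cost of one more datum, in $ (hypothetical)
aggregation_bonus = 40.0  # extra value the complete profile has beyond the sum of its parts

profile_value = N * marginal_value + aggregation_bonus  # value of the profile to the profiler
average_value = profile_value / N                       # per-datum value to the profiler

# Any price strictly between the consumer's marginal valuation and the
# profiler's average valuation makes both sides of the trade better off.
price = (marginal_value + average_value) / 2

consumer_sells = price > marginal_value  # the myopic consumer looks only at the margin
merchant_buys = price < average_value    # the merchant resells the datum at its average value

print(average_value)                     # 0.5 per datum, versus 0.10 at the margin
print(consumer_sells and merchant_buys)  # True: the sale always happens
```

Note that the gap between 0.10 and 0.50 here is the whole story: so long as aggregation adds any value at all, some mutually agreeable price exists, and the data changes hands.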

The ultimate effect of consumer privacy myopia depends upon a number of things. First, it depends on the intrusiveness of the profile. If the profile creates a privacy intrusion that is noticeably greater than disclosing an occasional individual fact—that is, if aggregation not only adds value but aggravation—then privacy myopia is indeed a problem. I suspect that this is, in fact, the case and that many people share my intuition. It is considerably more intrusive to find strangers making assumptions about me, be they true or painfully false, than it is to have my name and address residing in a database restricted to the firms from which I buy. On the other hand, if people who object to being profiled are unusual, and aggregation does not cause harm to most people’s privacy, the main consequence of privacy myopia is greatly reduced. For some, it is only distributional. Consumers who place a low value on their information privacy—people for whom their average valuation is less than the average valuation of a profiler—would have agreed to sell their privacy even if they were aware of the long-run consequences. The only harm to them is that they have not extracted the highest price possible. But consumers who place a high value on information privacy will be more seriously harmed by their information myopia. Had they been aware of the average value of each datum, they might have preferred not to sell.

Unfortunately, if the marginal value to the consumer of a given datum is small, then the value of not disclosing that datum will in most cases be lower than either the cost of negotiating a confidentiality clause (if that option even exists), or the cost of forgoing the entire transaction. Thus, in the ordinary case, absent anything terribly revealing about the datum, privacy clauses are unlikely to appear in standard form contracts, and consumers will accept this.

Furthermore, changing the law to make consumers the default owners of information about their economic activity is unlikely to produce large numbers of confidentiality clauses in the agora. In most cases, all it will do is move some of the consumer surplus from information buyers to information producers or sellers as the standard contracts forms add a term in which the consumer conveys rights to the information in exchange for a frequent flyer mile or two.

In short, if consumers are plausibly myopic about the value of a datum—focusing on its marginal value rather than its average value, which is difficult to measure—but profilers are not and the data are more valuable in aggregate, then there will be substantial over-disclosure of personal data even when consumers care about their informational privacy.

If this stylized story is even somewhat accurate, it has unfortunate implications for many proposals to change the default property rules regarding ownership of personal data in ordinary transactions. The sale will tend to happen even if the consumer has a sole entitlement to the data. It also suggests that European-style data protection rules should have only a limited effectiveness, primarily for highly sensitive personal data. The European Union’s data protection directive allows personal data to be collected for reuse and resale if the data subject agrees; the privacy myopia story suggests that customers will ordinarily agree except when disclosing particularly sensitive personal facts with a high marginal value.

On the other hand, the privacy myopia story suggests several questions for further research. For example, the myopia story suggests that we need to know how difficult it is to measure the value of privacy and, once that value has been calculated, how difficult it is to educate consumers to value data at its average rather than marginal value. Can information provide a corrective lens? Or perhaps consumers already have the ability to value the privacy interest in small amounts of data if they consider the long-term consequences of disclosure.

Consumers sometimes have an interest in disclosure of information. For example, proof of credit-worthiness tends to improve the terms upon which lenders offer credit. The myopia story assumes this feature away. It would be interesting to try to measure the relative importance of privacy and disclosure as intermediate and final goods. If the intermediate good aspect of informational privacy and disclosure substantially outweighed their final good aspect, the focus on blocking disclosure advocated in this article might be misguided. European data-protection rules, which focus on requiring transparency regarding the future uses of gathered data, might be the best strategy.

It would also be useful to know much more about the economics of data profiling. In particular, it would be helpful to know how much data it takes to make a profile valuable—at what point does the whole exceed the sum of the data parts? Additionally, it would be important to know whether profilers regularly suffer from data overload, and to what extent there are diminishing returns to scale for a single subject’s personal data. Furthermore, it could be useful to know whether there might be increasing returns to scale as the number of consumers profiled increases. If there are increasing returns to scale over any relevant part of the curve, the marginal consumer would be worth extra. It might follow that in an efficient market, profilers would be willing to pay more for data about the people who are most concerned about informational privacy.

There has already been considerable work on privacy-enhancing technologies for electronic transactions. There seems to be a need for more research, however, to determine which types of transactions are best suited to using technologies such as information intermediaries. The hardest work will involve finding ways to apply privacy-enhancing technologies to those transactions that are not naturally suited to them.

Perhaps the most promising avenue is to design contracts and technologies that undercut the assumptions in the myopia story. For example, one might seek to lower the transaction costs of modifying standard form contracts, or of specifying restrictions on reuse of disclosed data. The lower the cost of contracting for privacy, the greater the chance that such a cost will be less than the marginal value of the data (note that merely lowering it below average cost fails to solve the underlying problem, because sales will still happen in that price range). If technologies, such as P3P, reduce the marginal transactions costs involved in negotiating the release of personal data to near zero, even privacy myopics will be able to express their privacy preferences in the P3P-compliant part of the marketplace.
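The threshold argument above can be sketched as a simple decision rule, again with hypothetical dollar figures: a myopic consumer will pay to contract for confidentiality only when the cost of doing so falls below the datum's marginal value to her, so lowering the cost merely below the (higher) average value changes nothing.

```python
# Hypothetical sketch of the contracting-cost point. Both valuations are
# made-up figures; the function name is for illustration only.

marginal_value = 0.10  # consumer's per-datum privacy valuation, in $ (hypothetical)
average_value = 0.50   # profiler's per-datum valuation once aggregated (hypothetical)

def negotiates_privacy(contracting_cost: float) -> bool:
    """A myopic consumer buys confidentiality only when it costs less
    than the datum's marginal value to her, not its average value."""
    return contracting_cost < marginal_value

# Cutting the cost below average value is not enough to change behavior...
print(negotiates_privacy(0.30))  # False
# ...but near-zero negotiation costs (the hope behind P3P) would work.
print(negotiates_privacy(0.01))  # True
```

This is why the text stresses that lowering contracting costs below average cost "fails to solve the underlying problem": the relevant threshold is the consumer's own marginal valuation, which is the smaller of the two numbers.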

Posted in Econ & Money, Law: Privacy | 1 Comment

DataBase State (UK)

A quarter of the UK's largest public-sector database projects, including the ID cards register, are fundamentally flawed and violate European data protection laws, according to DataBase State, a report published today. The report also fingers the UK's national DNA database and the Contactpoint index of all children in England as particularly flawed.

Funded by the Joseph Rowntree Reform Trust, the report identifies 46 UK government databases and systems, more than half of which it says fail tests of privacy or effectiveness, and thus could be illegal under European privacy law.

Posted in ID Cards and Identification, Law: Privacy, UK | Comments Off on DataBase State (UK)

Skype Security Considerations

Financial Cryptography: Skype: the gloss is losing its shine has lots of food for thought.

I just wish financialcryptography.com would format its RSS feed in a way my reader could parse better…

Posted in Cryptography, Law: Privacy | 1 Comment

My Privacy Damages: A Whole Dollar

Florida settles lawsuit — and I get $1.

Yes, one whole dollar for the State of Florida illegally selling personal info from my driver's license to marketing firms.

Posted in Law: Privacy | 1 Comment

Abbreviated Survey of US Anonymity Law

Over the weekend, I posted drafts of two chapters I wrote for a forthcoming book on anonymity and privacy around the world. Here's the back story on Anonymity and the Law in the United States.

I had originally agreed to write one piece — Identity Cards and Identity Romanticism — and then the book's editor, the incomparably wonderful Ian Kerr, asked me to write a survey of US law on anonymity. I thought it would be do-able, and I very much wanted to repay Ian for all his many kindnesses over the years.

But it wasn't easy. The problem wasn't so much that the US law in the area is chaotic; I'm used to that. Nor was it mainly that (after I'd agreed) they sent me an outline of the topics they hoped I would cover, a list which went well outside my comfort zone into areas like criminal procedure and juries, because I'm up for learning new things. No, the problem was the @#$@# word limit. I had to compress everything into tiny little spaces. I hated doing that. I found it excruciating, in fact. And it results in generalizations which, while not (I hope) erroneous, are on occasion not as precise as I'd ideally like.

Anonymity and the Law in the United States

This book chapter for “Lessons from the Identity Trail: Anonymity, Privacy and Identity in a Networked Society” (New York: Oxford University Press, 2009) — a forthcoming comparative examination of approaches to the regulation of anonymity edited by Ian Kerr — surveys the patchwork of U.S. laws regulating anonymity and concludes the overall U.S. policy towards anonymity remains primarily situational, largely reactive, and slowly evolving.

Anonymous speech, particularly on political or religious matters, enjoys a privileged position under the U.S. Constitution. Regulation of anonymous speech requires a particularly strong justification to survive judicial review but no form of speech is completely immune from regulation. Anonymity is presumptively disfavored for witnesses, defendants, and jurors during criminal trials; the regulation of anonymity in civil cases is more complex. Plaintiffs demonstrating sufficiently good cause may proceed anonymously; conversely, defendants with legitimate reasons may be able to shield their identities from discovery.

Despite growing public concern about privacy issues, the United States federal government has developed a number of post-9/11 initiatives designed to limit the scope of anonymous behavior and communication. Even so, the background norm that the government should not be able to compel individuals to reveal their identity without real cause retains force. On the other hand, legislatures and regulators seem reluctant to intervene to protect privacy, much less anonymity, from what are seen as market forces. Although the law imposes few if any legal obstacles to the domestic use of privacy-enhancing technology such as encryption, it also requires little more than truth in advertising for most privacy-destroying technologies.

I do think there's some value to a survey like this, especially in a collection where it will appear right next to similar surveys from lawyers in other countries. So I'm not sorry to have done it. But it's a little more of a laundry list than my usual work.

Posted in Law: Privacy | Comments Off on Abbreviated Survey of US Anonymity Law

EFF Fighting the Good Fight on Wiretap Case Immunity

The Electronic Frontier Foundation filed a reply brief yesterday in response to the federal government's and telecoms' motion for dismissal or summary judgment in an ongoing lawsuit against the telecoms for their (alleged) participation in illegal warrantless surveillance. The case is captioned “In re National Security Agency Telecommunications Records Litigation, MDL No. 1791”.

This is the suit that motivated the immunity provisions of the FISA amendments. But they were drafted in a very very odd way that leaves some substantial daylight for challenges. And the great lawyers at EFF have done a first-rate job of running for daylight.

[Disclosure: In addition to serving on EFF's Advisory Board, I had a minor role in assisting the EFF legal team on one of the issues.]

Posted in Civil Liberties, Law: Privacy | Comments Off on EFF Fighting the Good Fight on Wiretap Case Immunity