I was talking with someone after lunch today at the Internet Identity Workshop (#IIW), and mentioned the privacy myopia problem. That drew a blank. So I thought I’d reprint here something I wrote a few years ago, so that I could point him, and others, to it. Obviously, the P3P stuff is dated, but there are other technologies and apps trying to fill the gap.
This is an excerpt from The Death of Privacy?, 52 STAN L. REV. 1461 (2000). If you want the footnotes, you’ll have to download the original…
The economics of privacy myopia
Under current ideas of property in information, consumers are in a poor legal position to complain about the sale of data concerning themselves. The original alienation of personal data may have occurred with the consumer’s acquiescence or explicit consent. Every economic transaction has at least two parties; in most cases, the facts of the transaction belong equally to both. As evidenced by the existence of the direct mail industry, both sides to a transaction generally are free to sell details about the transaction to any interested third party.
There are exceptions to the default rule of joint and several ownership of the facts of a transaction, but they are relatively minor. Sometimes the law creates a special duty of confidentiality binding one of the parties to silence. Examples include fiduciary duties and a lawyer’s duty to keep a client’s confidence. Overall, the number of transactions in which confidentiality is the legal default is relatively small compared to the total number of transactions in the United States.
In theory, the parties to a transaction can always contract for confidentiality. This is unrealistic because consumers suffer from privacy myopia: they will sell their data too often and too cheaply. Modest assumptions about consumer privacy myopia suggest that even Americans who place a high value on information privacy will sell their privacy bit by bit for frequent flyer miles. Explaining this requires a brief detour into stylized microeconomics.
Assume that a representative consumer engages in a large number of transactions. Assume further that the basic consumer-related details of these transactions—consumer identity, item purchased, cost of item, place and time of sale—are of roughly equivalent value across transactions for any consumer and between consumers, and that the marginal value of the data produced by each transaction is low on its own. In other words, assume we are limiting the discussion to ordinary consumer transactions, not extraordinary private ones, such as the purchase of anticancer drugs. Now assume that aggregation adds value: Once a consumer profile reaches a given size, the aggregate value of that consumer profile is greater than the sum of the value of the individual data. Most heroically, assume that once some threshold has been reached the value of additional data to a potential profiler remains linear and does not decline. Finally, assume that data brokers or profile compilers are able to buy consumer data from merchants at low transactions costs, because the parties are repeat players who engage in numerous transactions involving substantial amounts of data. Consumers, however, are unaware of the value of their aggregated data to a profile compiler. With one possible exception, the assumption that the value of consumer data never declines, these all seem to be very tame assumptions.
In an ordinary transaction, a consumer will value a datum at its marginal value in terms of lost privacy. In contrast, a merchant, who is selling it to a profiler, will value it at or near its average value as part of a profile. Because, according to our assumptions, the average value of a single datum is greater than the marginal value of that datum (remember, aggregation adds value), a consumer will always be willing to sell data at a price a merchant is willing to pay.
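The wedge between marginal and average valuation can be made concrete with a minimal numerical sketch. The figures below are invented for illustration only; nothing in the article commits to particular dollar values. The point is structural: whenever the offered price sits between the consumer’s marginal valuation and the profiler’s average valuation, every individual sale goes through, yet in aggregate the consumer parts with far more privacy value than she is paid.

```python
# Toy numbers (invented, not from the article) illustrating why a
# myopic consumer always sells when aggregation adds value.

N = 100                  # transactions that make up a full profile
marginal_cost = 0.05     # consumer's lost-privacy value of one lone datum
profile_value = 50.0     # profiler's value for the complete N-datum profile
average_value = profile_value / N  # per-datum value inside a profile: 0.50

price = 0.10             # per-datum price the merchant/profiler offers

# The myopic consumer compares the price to the datum's MARGINAL value...
consumer_sells = price > marginal_cost      # 0.10 > 0.05 -> sells

# ...while the profiler compares it to the datum's AVERAGE value.
profiler_buys = price < average_value       # 0.10 < 0.50 -> buys

# Each sale is individually rational, but over N transactions the
# consumer surrenders N * average_value of privacy for N * price.
privacy_given_up = N * average_value        # 50.0
compensation = N * price                    # 10.0
```

Any price in the open interval between the marginal and average values produces the same result, which is why the sale "will always" happen under the stated assumptions.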
The ultimate effect of consumer privacy myopia depends upon a number of things. First, it depends on the intrusiveness of the profile. If the profile creates a privacy intrusion that is noticeably greater than disclosing an occasional individual fact—that is, if aggregation not only adds value but also adds aggravation—then privacy myopia is indeed a problem. I suspect that this is, in fact, the case and that many people share my intuition. It is considerably more intrusive to find strangers making assumptions about me, be they true or painfully false, than it is to have my name and address residing in a database restricted to the firms from which I buy. On the other hand, if people who object to being profiled are unusual, and aggregation does not cause harm to most people’s privacy, the main consequence of privacy myopia is greatly reduced. For some, it is only distributional. Consumers who place a low value on their information privacy—people for whom their average valuation is less than the average valuation of a profiler—would have agreed to sell their privacy even if they were aware of the long-run consequences. The only harm to them is that they have not extracted the highest price possible. But consumers who place a high value on information privacy will be more seriously harmed by their information myopia. Had they been aware of the average value of each datum, they might have preferred not to sell.
Unfortunately, if the marginal value to the consumer of a given datum is small, then the value of not disclosing that datum will in most cases be lower than either the cost of negotiating a confidentiality clause (if that option even exists), or the cost of forgoing the entire transaction. Thus, in the ordinary case, absent anything terribly revealing about the datum, privacy clauses are unlikely to appear in standard form contracts, and consumers will accept this.
Furthermore, changing the law to make consumers the default owners of information about their economic activity is unlikely to produce large numbers of confidentiality clauses in the agora. In most cases, all it will do is move some of the consumer surplus from information buyers to information producers or sellers as the standard contracts forms add a term in which the consumer conveys rights to the information in exchange for a frequent flyer mile or two.
In short, if consumers are plausibly myopic about the value of a datum—focusing on its marginal value rather than its average value, which is difficult to measure—but profilers are not and the data are more valuable in aggregate, then there will be substantial over-disclosure of personal data even when consumers care about their informational privacy.
If this stylized story is even somewhat accurate, it has unfortunate implications for many proposals to change the default property rules regarding ownership of personal data in ordinary transactions. The sale will tend to happen even if the consumer has a sole entitlement to the data. It also suggests that European-style data protection rules should have only a limited effectiveness, primarily for highly sensitive personal data. The European Union’s data protection directive allows personal data to be collected for reuse and resale if the data subject agrees; the privacy myopia story suggests that customers will ordinarily agree except when disclosing particularly sensitive personal facts with a high marginal value.
On the other hand, the privacy myopia story suggests several questions for further research. For example, the myopia story suggests that we need to know how difficult it is to measure the value of privacy and, once that value has been calculated, how difficult it is to educate consumers to value data at its average rather than marginal value. Can information provide a corrective lens? Or perhaps consumers already have the ability to value the privacy interest in small amounts of data if they consider the long-term consequences of disclosure.
Consumers sometimes have an interest in disclosure of information. For example, proof of credit-worthiness tends to improve the terms upon which lenders offer credit. The myopia story assumes this feature away. It would be interesting to try to measure the relative importance of privacy and disclosure as intermediate and final goods. If the intermediate good aspect of informational privacy and disclosure substantially outweighed their final good aspect, the focus on blocking disclosure advocated in this article might be misguided. European data-protection rules, which focus on requiring transparency regarding the future uses of gathered data, might be the best strategy.
It would also be useful to know much more about the economics of data profiling. In particular, it would be helpful to know how much data it takes to make a profile valuable—at what point does the whole exceed the sum of the data parts? Additionally, it would be important to know whether profilers regularly suffer from data overload, and to what extent there are diminishing returns to scale for a single subject’s personal data. Furthermore, it could be useful to know whether there might be increasing returns to scale as the number of consumers profiled increases. If there are increasing returns to scale over any relevant part of the curve, the marginal consumer would be worth extra. It might follow that in an efficient market, profilers would be willing to pay more for data about the people who are most concerned about informational privacy.
There has already been considerable work on privacy-enhancing technologies for electronic transactions. There seems to be a need for more research, however, to determine which types of transactions are best suited to using technologies such as information intermediaries. The hardest work will involve finding ways to apply privacy-enhancing technologies to those transactions that are not naturally suited to them.
Perhaps the most promising avenue is to design contracts and technologies that undercut the assumptions in the myopia story. For example, one might seek to lower the transaction costs of modifying standard form contracts, or of specifying restrictions on reuse of disclosed data. The lower the cost of contracting for privacy, the greater the chance that such a cost will be less than the marginal value of the data (note that merely lowering it below average cost fails to solve the underlying problem, because sales will still happen in that price range). If technologies, such as P3P, reduce the marginal transactions costs involved in negotiating the release of personal data to near zero, even privacy myopics will be able to express their privacy preferences in the P3P-compliant part of the marketplace.
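The mechanism P3P-style tools rely on can be sketched in a few lines: the consumer states her privacy preferences once, and software then checks each merchant’s declared policy against them at essentially zero marginal cost per transaction. The field names and values below are invented for illustration; actual P3P policies are XML documents with a much richer vocabulary, and this is a sketch of the matching idea, not of the P3P format itself.

```python
# Hypothetical sketch of automated preference matching in the spirit of
# P3P. Field names and values are invented for illustration.

# The user states acceptable practices once, up front.
user_preferences = {
    "retention": {"no-retention", "stated-purpose"},  # acceptable values
    "resale_to_third_parties": {False},
    "purpose": {"current", "admin"},                  # allowed uses
}

# Each site declares its actual practices in machine-readable form.
site_policy = {
    "retention": "stated-purpose",
    "resale_to_third_parties": False,
    "purpose": "current",
}

def policy_acceptable(policy, prefs):
    """Return True if every declared practice is one the user allows."""
    return all(policy[field] in allowed for field, allowed in prefs.items())

print(policy_acceptable(site_policy, user_preferences))  # prints: True
```

Because the comparison runs automatically on every transaction, the per-transaction cost of "negotiating" privacy falls to near zero, which is exactly the condition under which even a privacy-myopic consumer's stated preferences get enforced.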