New Privacy Paper: Mass Surveillance as Privacy Pollution

I just uploaded a draft of my new paper, Regulating Mass Surveillance as Privacy Pollution: Learning from Environmental Impact Statements, to SSRN. Be the first on your block to read it!

US law has remarkably little to say about mass surveillance in public, a failure that has allowed such surveillance to grow at an alarming rate – a rate that is only set to increase. This article proposes ‘Privacy Impact Notices’ (PINs) — modeled on Environmental Impact Statements — as an initial solution to this problem.

Data collection in public (and in the home via public spaces) resembles an externality imposed on the person whose privacy is reduced involuntarily; it can also be seen as a market failure caused by an information asymmetry. Current doctrinal legal tools available to respond to the deployment of mass surveillance technologies are limited and inadequate. The article proposes that — as a first step towards figuring out how to understand, value, and ultimately regulate this mass-privacy-destroying behavior — we should borrow from the environmental movement and require anyone planning a large-scale public data collection program to file a Privacy Impact Notice (PIN). The PIN proposal is contrasted with the existing, much more limited, federal privacy analysis requirement known as the Privacy Impact Assessment. The bulk of the article then explains how PINs would work and defends the idea against three predictable critiques (the claim that there is a First Amendment right to data collection, the claim that EISs are a poor policy tool not worthy of emulation, and the claim that notice-based regimes are in general worthless). It argues that PINs have applications to surveillance and data collection in online public spaces such as Facebook, Twitter, and other virtual spaces. It also considers what the PIN proposal has to offer toward addressing the now-notorious problem of the NSA’s drift-net surveillance of telephone conversations, emails, and web-based communications.

Modeling mass surveillance disclosure regulations on an updated form of environmental impact statement will help protect everyone’s privacy: mandating disclosure and impact analysis by those proposing to watch us in and through public spaces will enable an informed conversation about privacy in public. Additionally, the need to build consideration of the consequences of surveillance into project planning, as well as the danger of bad publicity arising from excessive surveillance proposals, will act as a counterweight to the adoption of mass data collection projects, just as it did in the environmental context. In the long run, well-crafted disclosure and analysis rules could pave the way for more systematic protection of privacy, as they did in the environmental context. Effective US regulation of mass surveillance will require that we know a great deal about who and what is being recorded and about the costs and benefits of personal information acquisition and use. At present we know relatively little about how to measure these; a privacy equivalent of environmental impact statements will provide not only case studies but also occasions to grow expertise.

I welcome your comments. I really mean that.

And if you are a law review editor, I’ll be sending it out soon…


4 Responses to New Privacy Paper: Mass Surveillance as Privacy Pollution

  1. Zorensen Leverthal says:

    Introduction:
    I’m struck by the extraordinary lengths you go to in order to justify why we need privacy, which suggests a more fundamental problem: why do we need to justify privacy at all? What are the cultural influences that make privacy seem “outdated”?

    First, I would posit that, rather than drawing on anecdotal evidence for this justification, you draw on research in anthropology and evolutionary psychology. This material may be less familiar to casual readers, but as a supplement I think it establishes firmer ground for the argument. In hunter-gatherer society there was no privacy. There was also no growth, no god, and no city. Anonymity and privacy are necessary psychological adaptations to an urbanized, industrial society.

    Second, as for what cultural factors influence this diminution of privacy’s role in society, I would posit an identification of economic growth with the social good. Thus, if technology is “growing” the economy, and technology diminishes privacy, it becomes difficult to see what could be wrong with this. I think this is part of the “myopia” you discuss later in the article.

    Footnote 2 — on “freedom of association.” While we have these doctrines and constitutional safeguards, the degree to which ordinary citizens are actually protected in the exercise of such rights is highly variable. Consider, for example, the historical case of union members exercising their “freedom of association” in the years before the National Labor Relations Act granted unions federal sanction. Specifically, consider US v. Workingmen’s Amalgamated Council (1893), decided just three years after the passage of the Sherman Anti-Trust Act. In Workingmen’s Amalgamated, a federal court used the Sherman Act — ostensibly passed to rein in corporations — to block union organizing (even though unions were not yet federally recognized organizations). We need a culture that appreciates why laws exist.

    Page 4 — “Panopticon” and the “all-seeing eye.” You come back to this later in your paper, and mention Foucault specifically, though I think it is worth saying something explicit about the psychological function of the panopticon. First, the full title of Bentham’s treatise is informative:

    “PANOPTICON; Or, The Inspection-House: Containing The Idea Of A New Principle Of Construction Applicable To Any Sort Of Establishment, In Which Persons Of Any Description Are To Be Kept Under Inspection; And In Particular To Penitentiary-Houses, Prisons, Houses Of Industry, Work-Houses, Poor-Houses, Lazarettos, Manufactories, Hospitals, Mad-Houses, And Schools: With A Plan Of Management”

    As is Foucault’s characterization of how the Panopticon functions (in decidedly non-technological terms):

    Hence the major effect of the Panopticon: to induce in the inmate a state of conscious and permanent visibility that assures the automatic functioning of power. So to arrange things that the surveillance is permanent in its effects, even if it is discontinuous in its action; that the perfection of power should tend to render its actual exercise unnecessary; that this architectural apparatus should be a machine for creating and sustaining a power relation independent of the person who exercises it; in short, that the inmates should be caught up in a power situation of which they are themselves the bearers.

    Page 5 — “ubiquitous sensors.” In relation to the aforementioned quote from Foucault, that “inmates should be caught up in a power situation of which they are themselves the bearers,” consider that many ubiquitous sensing devices are things people choose to purchase and perceive as a convenience or other benefit. This is PR conquering rationality. Consider this news story — with analogs in many countries — in which parents want to track their kids:

    http://www.informationweek.com/uk-kids-get-rfid-chips-in-school-uniforms/d/d-id/1060768?

    Page 8 — surveillance in New York. Some years ago it was widely reported that London was the most surveilled city on earth. Presumably, this was the result of policies set in motion in reaction to IRA bombings. Perhaps London can offer some historical lessons on what to expect here:

    http://news.bbc.co.uk/2/hi/uk_news/6108496.stm

    Page 12 — research and institutional review boards. An interesting example of such social science research done without ethical review is the paper by Pass, Chowdhury, and Torgeson, “A Picture of Search.” The paper analyzes a collection of user search data that AOL briefly made public. That data has since been used for purposes other than social science research; an instructive example of what privacy invasion looks like is the site http://www.aolstalker.com, which uses the same data as the above paper.

    Page 15 — “Privacy Destruction as Market Failure” — Markets may not be an adequate model to explain privacy or its value, which is otherwise the product of evolutionary psychology. Markets are no more an all-encompassing explanation than the Marxist-Hegelian dialectic purports to be; they are a model designed to describe a limited set of social interactions under a number of assumptions (which you discuss). “Perfect markets” is one such assumption, as is the assumption of “individual, benefit-maximizing free agents,” which has difficulty explaining why one might buy a low-end Lexus instead of a high-end Toyota. Again, the role of the PR industry in subverting individual rationality is an important dynamic here.

    Page 19 — the analogy of privacy pollution to air pollution is interesting, but in terms of the specific dynamics you discuss here, another parallel worth exploring might be water rights in places like Boulder, Colorado. Boulder has depleted its underground aquifers and uses a number of means to encourage residents to reduce water usage. One noteworthy exception involves rain barrels. While many people may consider rain barrels a “green” alternative to using municipal water for certain uses, rain barrels are illegal in Boulder because farmers hold prior rights to that water. The residential collection of rainwater in Boulder is technically stealing from farmers:

    http://www.nytimes.com/2009/06/29/us/29rain.html?_r=0

    Pages 21-22 — the paper here really starts to get interesting, specifically the discussion of the value of aggregation. There may be system-theoretic reasons for this, mainly thresholds. If you’re marketing to a network as an arbitrary subset of interconnected elements (e.g., Amazon customers who are also New Yorkers), there will be some threshold at which a disparate collection of data points (individuals) becomes a pattern (a social network). This threshold may not be predictable mathematically (a consequence of Gödel’s incompleteness theorems and Alan Turing’s related “halting problem”), but must be uncovered empirically; a small simulation, sketched below, shows the sort of abrupt transition I have in mind. Stephen Wolfram’s tome “A New Kind of Science” discusses such issues in “empirical mathematics” at length. The marginal value of transactions seems to be a productive avenue of inquiry.
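
    Purely as an illustration of that threshold point, and not anything drawn from the paper, here is a minimal Python sketch of an Erdős–Rényi-style random graph: the size of its largest connected cluster jumps abruptly once the average number of links per node crosses roughly one. The node count, the sweep of average degrees, and all names in the code are my own illustrative choices.

    import random

    def largest_component_fraction(n, p, rng):
        """Build a random graph on n nodes (each possible edge present with
        probability p) and return the fraction of nodes in the largest
        connected component, found with a simple union-find."""
        parent = list(range(n))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        def union(a, b):
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[ra] = rb

        # Add each possible edge independently with probability p.
        for i in range(n):
            for j in range(i + 1, n):
                if rng.random() < p:
                    union(i, j)

        # Count the size of each connected component.
        sizes = {}
        for i in range(n):
            root = find(i)
            sizes[root] = sizes.get(root, 0) + 1
        return max(sizes.values()) / n

    rng = random.Random(0)
    n = 400
    # Sweep the average degree c = p * (n - 1) through the critical region.
    for c in (0.25, 0.5, 0.75, 1.0, 1.25, 1.5, 2.0):
        frac = largest_component_fraction(n, c / (n - 1), rng)
        print(f"avg links per node = {c:.2f} -> largest cluster = {frac:.0%} of nodes")

    Below roughly one link per node, the largest cluster stays a small fraction of the whole; above it, most nodes suddenly belong to a single connected pattern. That is the kind of empirically observed threshold suggested above.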

    Page 23 — in discussing contracts, you might find some value in Solove & Hartzog, “The FTC and the New Common Law of Privacy.” The paper discusses a corpus of FTC rulings that concerns precisely the privacy notices everybody ignores on every website, and posits that this corpus is equivalent to a body of common law, which could be codified in new legislative privacy protections.

    Page 26, footnote 99, Sotomayor’s comment that “the government usurped Jones’ property for the purpose of conducting surveillance on him” — a question I’ve had for some time: what are the various Fifth Amendment implications of surveillance (eminent domain, due process, Miranda rights) and the Third Amendment implications (effectively, quartering soldiers in homes), as opposed to the typical Fourth or First Amendment arguments?

    Page 27 — role of logical inference. Since you bring up drug law, consider hallucinogenic mushrooms and the role of inferential logic in prosecuting offenders. See, for example, Bemis v. State, 652 N.E.2d 89 (Indiana Court of Appeals, 1995). In prosecuting drug offenses involving hallucinogenic mushrooms, lawyers have to jump through a number of legal hoops. The Controlled Substances Act does not specify that mushrooms are illegal; rather, two substances naturally produced by certain mushrooms of the Psilocybe genus are restricted. To prosecute individuals for possession of mushrooms, prosecutors must not only maintain that natural mushrooms are “mixtures” — a term used in the Controlled Substances Act to refer to street dealers who adulterate certain drugs to increase revenue — but also establish that defendants knew that the mushrooms they possessed produce regulated substances. So in arguing the material facts of the case, prosecutors must use inference to establish that defendants hold certain cognitive states. (Incidentally, by this legal reasoning it is illegal to possess a human brain, since the human brain is a “mixture” that produces the scheduled substance DMT.) But I think there are lessons here for cases where law enforcement uses massive data sets to infer cognitive states in people who are not otherwise suspected of any wrongdoing.

  2. nedu says:

    Page 9:

    So long as the private actors are not collecting the data so closely at the direction state that they become state actors…

    I think I know what you meant here, but this just does not scan.

    Anyhow, I’m on my first pass through your paper now. I’m always interested in what you have to say, and perhaps I’ll have some more comments after I’ve finished reading.

    But, on the whole, I am entirely discouraged – even a little bit depressed. For the past several months, I have seen no realistic hope for a better future. It is a grim world we find ourselves in: the trend seems altogether towards ubiquitous, pervasive, unceasing surveillance. I am consoled only by the thought that utter dystopias are most probably as unlikely as any complete utopia.

    • utter dystopias are most probably as unlikely as any complete utopia.

      Er, North Korea?

    • That was a clunky sentence. How’s this for a re-draft:

      Because neither the First nor the Fourth Amendment constrains private actors, so long as the state avoids turning them into state actors (and absent the private actor committing a tort or a crime), the private watchers are able to collect data in ways not available to the state directly.
