Lessons Learned Too Well: The Evolution of Internet Regulation (3)

Read Part One and Part Two.

The search for more effective legislation to fight disruptions to settled expectations led governments, even at an early stage, to experiment with chokepoint regulation, a development that would come to full fruition later. If the end-users in democracies were too difficult to police, then the intermediaries on whom they depended for services – Internet Service Providers (ISPs), credit card companies, domain name registrars, makers of computer and telephone hardware and even software – were far less numerous, far easier to find, and far easier to persuade to comply with rules that end-users, given a choice, might well have balked at. The lesson was not lost on regulators in democracies and despotisms alike. Where once a government might have sought to set a technical standard or influence the marketplace, now it would legislate it. If code was not law enough, then bring on the law to determine the code – or even the hardware.

An early example of this type of legislation – and of its dangers – was the US’s Communications Assistance for Law Enforcement Act (CALEA) of 1994. CALEA was sold to lawmakers as a way to preserve – preserve, not expand – law enforcement wiretapping capabilities by requiring telephone companies to design their networks to be wiretap-ready. Since 1994, however, the FBI has used CALEA to expand its capabilities: turning wireless phones into tracking devices, requiring phone companies to collect specific signaling information for the convenience of the government, and allowing interception of packet communications without privacy protections. In 2005, the Federal Communications Commission granted an FBI petition and expanded CALEA to broadband Internet access and VoIP services, a decision upheld by the D.C. Circuit in 2006.

A similar chokepoint strategy was deployed against “cybersquatters”, the name coined for people who registered domain names that shared identical character strings with trademarks – names a small group of profiteers snapped up and then attempted to ransom to brand managers late to the Internet. There the chokepoint was the domain name system, and the central databases run by registries provided easy leverage. The cybersquatter problem was worldwide, and the solution was not just domestic US legislation but the creation in 1998 of a new, formally private body, the Internet Corporation for Assigned Names and Numbers (ICANN), to take over regulation of domain names. Its first policy was to create a lightweight arbitration-like system to adjudicate domain name disputes, one that ended up righting some wrongs and creating some new ones – in both cases to the advantage of trademark holders, often large firms, some of whom were able to secure victories they could never have won in court, and for only a fraction of the cost.

One thing about ICANN stands out from a legal perspective: the regulations it imposed on domain name registrants – notably that they had to agree that their domain names could be taken away if ICANN’s arbitration-like process so determined – were an important objective of the US Department of Commerce, which settled on ICANN as the domain name system’s manager. But because a domain name is acquired by contract between a registrant and a private company that is two private contracts away from ICANN (and thus three away from the US Government), due process had no traction. Enlisting private parties as de facto regulators proved to be an effective legal workaround.

A larger battle, also with a less-than-happy outcome, raged over file-sharing and copyrights. The copyright industry achieved an early victory by securing passage of the Digital Millennium Copyright Act (DMCA) in 1998. DMCA §1201 created what has come to be known as a “paracopyright” – legal protection for the copy-protection technologies used by copyright holders. This goes beyond traditional copyright in that it prohibits not only copying the work and circumventing copy-protection software, but also creating or trafficking in tools designed to circumvent copy-protection software. Indeed, §1201 applies regardless of whether the copy-protection technology is effective or not.

Equally important, the DMCA created a method – the takedown notice – by which an allegation of copyright violation would suffice in most cases to force ISPs to immediately take content offline – no injunction needed. That provision, and regular copyright law, sufficed to enable the killing of file-sharing giant Napster (which was far from an innocent victim). In no time, however, other less centralized music-sharing systems sprang up to replace it.

By 2000, that is, about a decade ago, the first wave of Internet enthusiasm had already crested. The early heady days of people making use of new technologies and routing happily around legal rules were almost a subject for nostalgia. Even if Internet exceptionalism was still alive, in important ways the unregulated Internet had already been subjected to – often ham-handed – attempts to regulate. The Empire – Law’s Empire – had struck back.

But this was only the beginning.

Governments and industry learned from both their successes and failures, and those lessons shaped a second wave of Internet regulation. While there are things about the second wave that are encouraging, there is even more that is troubling.

It is surely good that Internet regulation is increasingly based on a sound understanding of the technology, thus minimizing pointless and ineffective rules. But as regulatory strategies get more effective, there are collateral consequences.

In the past decade, the copyright police have stepped up their efforts to stamp out file-sharing. In addition to their legislative successes, they have embraced technological solutions, focusing on the chokepoints of ISPs and, especially, hardware and software manufacturers. The model technology may be region coding, in which nearly all commercial DVDs, and both hardware and software DVD players, must be locked to prevent the playing of DVDs sold far away – the fear being the gray market, a form of competition that is legal for almost all other goods. The targets of regulation by technology have expanded to restrict how other home theater devices interconnect, in order to limit home taping. And, in the Orwellian-named “Trusted Computing” initiative, chip-makers are being encouraged (and might some day be required) to place unique identifiers on computer chips that could be invoked by software to identify the machine, perhaps without the knowledge or consent of the user. Intel’s latest generation of chips, Sandy Bridge, includes a unique identifier (Intel calls the feature “Intel Insider”) just waiting for software – not necessarily under the control of the user – to invoke it. The hope, not yet realized, is that this capability will give more content providers the courage to stream top-quality movies online, because they can encrypt them in a way that only a chip with that unique identifier will be able to decrypt. Of course, every internet-connected device already has a unique MAC address, but it is far more feasible to change or mask a MAC address than something hardwired into the CPU.
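To make that contrast concrete, here is a minimal sketch – in Python, using only the standard library – of how readily software can read a MAC address, and how the operating system can present a different one. The interface name and replacement address in the comments are illustrative assumptions, not anything a particular regulator or vendor has specified.

    import uuid

    def mac_address() -> str:
        """Return this host's MAC address as six colon-separated hex octets."""
        node = uuid.getnode()  # 48-bit hardware address as an integer
        # (If Python cannot find a real interface, it substitutes a random
        # number with the multicast bit set -- another reminder of how soft
        # this identifier is.)
        return ":".join(f"{(node >> shift) & 0xFF:02x}"
                        for shift in range(40, -1, -8))

    print(mac_address())

    # Changing the visible MAC is an operating-system operation, e.g. on Linux
    # (run as root; the interface name and new address are only examples):
    #   ip link set dev eth0 down
    #   ip link set dev eth0 address 02:12:34:56:78:9a
    #   ip link set dev eth0 up
    # A serial number burned into the CPU sits below the operating system,
    # so no comparable software command can rewrite it.

The point is the layer, not the particular commands: an identifier managed by the operating system can be changed by whoever controls the operating system, while one baked into the silicon cannot.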

Regulation aimed at closing down cross-border gambling focused on the most vulnerable chokepoint: the credit cards used to move value. We’ve seen mandated location tracking for cell phones, justified as a way to help emergency services locate callers, but with side effects only now coming into focus.

And surprisingly quickly, the liberty-enhancing aspects of the Internet were on the defensive. Where a decade ago it was still reasonable to see the constellation of technologies around the Internet as fundamentally empowering and anti-totalitarian, that optimism is increasingly hard to sustain as regulators in both democratic and totalitarian states have learned how to structure rules that cannot easily be evaded, and – increasingly – how to use Internet-based technologies to achieve levels of regulatory control that would not have been possible previously.

The second wave of regulation has been growing in the past decade, but it has yet to peak. It draws strength from a market-driven shift towards closure and centralization in both hardware (e.g. the iPhone) and software (e.g. Facebook, Twitter), a shift that creates new chokepoints. It may be easier to see how someone will make money off Hulu or even YouTube than off Gnutella or BitTorrent, but it is also far easier to regulate them.

The crest of the second wave, however, is now in sight: the abolition of online anonymity. First-wave Internet regulation could never have achieved the identification of every user and every data packet, but the second wave is both more international and more adept; when law harnesses technology to its ends, it can achieve far more than when it either regulates outside technology (categorization) or regulates against it.

The consequences risk being severe. More than a decade ago the Internet seemed poised to serve libertarian values; a decade ago some of us thought it might, with some pushing, be Habermasian. The future looks rather more grim, threatening to vindicate earlier Foucauldian predictions. The challenge for theorists and activists is to structure the coming era of inescapable tracking and identification so that we encourage a responsibility society, but still have one in which the democracy-enhancing aspects of Internet technology are nurtured – and not one where, as is too common in times of fear and hardship, authorities become empowered at the expense of all of us.

Continued…

This entry was posted in Law: Internet Law, Talks & Conferences.

2 Responses to Lessons Learned Too Well: The Evolution of Internet Regulation (3)

  1. Just me says:

    “[In 2009], law enforcement agents pinged users of just one service provider—Sprint—over eight million times.” United States v. Pineda-Moreno (9th Cir. 2010) (Kozinski, C.J., dissenting) (citing Christopher Soghoian, 8 Million Reasons for Real Surveillance Oversight, Slight Paranoia (Dec. 1, 2009), http://paranoia.dubfire.net/2009/12/8-million-reasons-for-real-surveillance.html).

    I can’t help but point out this great dissent whenever the issue of erosion of privacy comes up. The full dissenting opinion is available at http://www.ca9.uscourts.gov/datastore/opinions/2010/08/12/08-30385.pdf.

  2. Just me says:

    I posted a comment on this earlier today. Doesn’t appear to have come through. Your filter probably thought it was spam because it included a couple of URLs.

    btw…I am enjoying this series of posts.
