Here the government is regulating indirectly by using the structures of real-space code to effect its ends, but this regulation, again, is not seen as regulation. Here the government gets an effect at no political cost. It gets the benefit of what would clearly be an illegal and controversial regulation without even having to admit any regulation exists.

In all three cases, the government is commandeering the power of another modality — another structure of constraint — to effect its own ends[63]. This in itself is not necessarily improper. There are plenty of examples that anyone would consider proper. A requirement that streets be well lit, for instance, is a regulation designed to reduce crime, and no one would think that regulation improper. Nor does all such regulation hide its pedigree. Think again about speed bumps — they are examples of indirect regulation. Like a winding road, they use the code of streets to keep down the speed of a car. But no one is fooled about the source of this regulation; no one believes the bumps are accidental.

Thus, the point is not against indirect regulation generally. The point is instead about transparency. The state has no right to hide its agenda. In a constitutional democracy its regulations should be public. And thus, one issue raised by the practice of indirect regulation is the general issue of publicity. Should the state be permitted to use nontransparent means when transparent means are available?

Where This Leads

After I published an essay in the (then existing) Industry Standard arguing that “code is law,”[64] the following letter was sent to the editor:

Typical for a Harvard Law Professor. . . . Lessig misses the entire forest while dancing among the trees. . . . While his riff on West Coast Code (from Silicon Valley Programmers) vs. East Coast Code (from government lawyers) is very cleverly crafted, it completely avoids the real difference between the two.

The good professor seems to apply the word “regulation” equally to the efforts of private enterprises to control the behavior of their customers through market mechanisms and the efforts of government agencies to control the behavior of all citizens through force of law.

So long as the creators and purveyors of West Coast Code (no matter how selfish, monopolistic, demonic or incompetent they may be) do not carry guns and badges, I will choose them over the enforcers of East Coast Code any time[65].

Whether or not I’ve missed the “real difference” between code and law, the genius in this letter is that its author clearly sees the real similarity. The author (the president of an Internet-related business) understands that “private enterprises” try to “control the behavior of their customers”, and he writes that they use “market mechanisms” to achieve that control. (Technically, I was speaking about architectures to achieve that effect, but never mind. Whether markets or architectures, the point is the same.) He therefore sees that there is “regulation” beyond law. He just has his favorite between the two (corporate executive that he is).

What this author sees is what we all must see to understand how cyberspace is regulated and to see how law might regulate cyberspace. I’ve argued in this chapter that government has a range of tools that it uses to regulate, and cyberspace expands that range. Indirectly, by regulating code writing, the government can achieve regulatory ends, often without suffering the political consequences that the same ends, pursued directly, would yield.

We should worry about this. We should worry about a regime that makes invisible regulation easier; we should worry about a regime that makes it easier to regulate. We should worry about the first because invisibility makes it hard to resist bad regulation; we should worry about the second because we don’t yet — as I argue in Part III — have a sense of the values put at risk by the increasing scope of efficient regulation.

That’s a lot of worries, no doubt. But before we go further with these worries, we should consider in more detail the contexts within which these worries become real.

Chapter 8. The Limits in Open Code

I’ve told a story about how regulation works, and about the increasing regulability of the Internet that we should expect. These are, as I described, changes in the architecture of the Net that will better enable government’s control by making behavior more easily monitored — or at least more traceable. These changes will emerge even if government does nothing. They are the by-product of changes made to enable e-commerce. But they will be cemented if (or when) the government recognizes just how it could make the network its tool.

That was Part I. In this part, I’ve focused upon a different regulability — the kind of regulation that is effected through the architectures of the space within which one lives. As I argued in Chapter 5, there’s nothing new about this modality of regulation: Governments have used architecture to regulate behavior forever. But what is new is its significance. As life moves onto the Net, more of life will be regulated through the self-conscious design of the space within which life happens. That’s not necessarily a bad thing. If there were a code-based way to stop drunk drivers, I’d be all for it. But neither is this pervasive code-based regulation benign. Due to the manner in which it functions, regulation by code can interfere with the ordinary democratic process by which we hold regulators accountable.

The key criticism that I’ve identified so far is transparency. Code-based regulation — especially of people who are not themselves technically expert — risks making regulation invisible. Controls are imposed for particular policy reasons, but people experience these controls as nature. And that experience, I suggested, could weaken democratic resolve.

Now that’s not saying much, at least about us. We are already a pretty apathetic political culture. And there’s nothing about cyberspace to suggest things are going to be different. Indeed, as Castronova observes about virtual worlds: “How strange, then, that one does not find much democracy at all in synthetic worlds. Not a trace, in fact. Not a hint of a shadow of a trace. It’s not there. The typical governance model in synthetic worlds consists of isolated moments of oppressive tyranny embedded in widespread anarchy.”[1]

But if we could put aside our own skepticism about our democracy for a moment, and focus at least upon aspects of the Internet and cyberspace that we all agree matter fundamentally, then I think we will all recognize a point that, once recognized, seems obvious: If code regulates, then in at least some critical contexts, the kind of code that regulates is critically important.

By “kind” I mean to distinguish between two types of code: open and closed. By “open code” I mean code (both software and hardware) whose functionality is transparent at least to one knowledgeable about the technology. By “closed code” I mean code (both software and hardware) whose functionality is opaque. One can guess what closed code is doing; and with enough opportunity to test, one might well reverse engineer it. But from the technology itself, there is no reasonable way to discern what the functionality of the technology is.

63. Michael Froomkin points to the Clipper chip regulations as another example. By using the standards-setting process for government purchases, the federal government could try to achieve a standard for encryption without adhering to the Administrative Procedure Act. “A stroke of bureaucratic genius lay at the heart of the Clipper strategy. Congress had not, and to this date has not, given the executive branch the power to control the private use of encryption. Congress has not even given the executive the power to set up an escrow system for keys. In the absence of any formal authority to prevent the adoption of unescrowed cryptography, Clipper’s proponents hit upon the idea of using the government’s power as a major consumer of cryptographic products to rig the market. If the government could not prevent the public from using nonconforming products, perhaps it could set the standard by purchasing and deploying large numbers of escrowed products”; “It Came from Planet Clipper,” 15, 24, 1–33.

64. See The Industry Standard, available at http://www.lessig.org/content/standard/0,1902,4165,00.html (cached: http://www.webcitation.org/5IwtxT699).

65. See “Legal Eagle” (letter to the editor), The Industry Standard, April 26, 1999 (emphasis added).

1. Castronova, Synthetic Worlds, 207.