We can use the same model to describe the regulation of behavior in cyberspace.[9]
Law regulates behavior in cyberspace. Copyright law, defamation law, and obscenity laws all continue to threaten ex post sanction for the violation of legal rights. How well law regulates, or how efficiently, is a different question: In some cases it does so more efficiently, in some cases less. But whether better or not, law continues to threaten a certain consequence if it is defied. Legislatures enact[10]; prosecutors threaten[11]; courts convict[12].
Norms also regulate behavior in cyberspace. Talk about Democratic politics in the alt.knitting newsgroup, and you open yourself to flaming; “spoof” someone’s identity in a MUD, and you may find yourself “toaded”[13]; talk too much in a discussion list, and you are likely to be placed on a common bozo filter. In each case, a set of understandings constrains behavior, again through the threat of ex post sanctions imposed by a community[14].
Markets regulate behavior in cyberspace. Pricing structures constrain access, and if they do not, busy signals do. (AOL learned this quite dramatically when it shifted from an hourly to a flat-rate pricing plan.[15]) Areas of the Web are beginning to charge for access, as online services have for some time. Advertisers reward popular sites; online services drop low-population forums. These behaviors are all a function of market constraints and market opportunity. They are all, in this sense, regulations of the market.
Finally, an analog for architecture regulates behavior in cyberspace — code. The software and hardware that make cyberspace what it is constitute a set of constraints on how you can behave. The substance of these constraints may vary, but they are experienced as conditions on your access to cyberspace. In some places (online services such as AOL, for instance) you must enter a password before you gain access; in other places you can enter whether identified or not[16]. In some places the transactions you engage in produce traces (the “mouse droppings”) that link those transactions back to you; in other places this link is achieved only if you want it to be[17]. In some places you can choose to speak a language that only the recipient can hear (through encryption)[18]; in other places encryption is not an option[19]. The code or software or architecture or protocols set these features, which are selected by code writers. They constrain some behavior by making other behavior possible or impossible. The code embeds certain values or makes certain values impossible. In this sense, it too is regulation, just as the architectures of real space are regulations.
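The point can be made concrete with a short sketch. Nothing below corresponds to any actual service's software — the flags, names, and access rule are hypothetical — but it shows how a handful of choices by a code writer become the conditions under which users act:

```python
# A hypothetical sketch: three policy choices a code writer might make,
# each of which constrains what users can do in the space.

REQUIRE_PASSWORD = True    # must you identify yourself before entering?
LOG_TRANSACTIONS = True    # do your actions leave "mouse droppings"?
ALLOW_ENCRYPTION = False   # may you speak so that only the recipient can hear?

def enter(user, password=None):
    """Admit a user according to the architecture the code writer chose."""
    if REQUIRE_PASSWORD and password is None:
        # The constraint is felt as a condition of access, not as a sanction.
        raise PermissionError("access denied: this space requires identification")
    if LOG_TRANSACTIONS:
        print(f"trace recorded: {user} entered")  # the link back to you
    return "welcome"
```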
As in real space, then, these four modalities regulate cyberspace. The same balance exists. As William Mitchell puts it (though he omits the constraint of the market):
Architecture, laws, and customs maintain and represent whatever balance has been struck in real space. As we construct and inhabit cyberspace communities, we will have to make and maintain similar bargains — though they will be embodied in software structures and electronic access controls rather than in architectural arrangements[20].
Laws, norms, the market, and architectures interact to build the environment that “Netizens” know. The code writer, as Ethan Katsh puts it, is the “architect”[21].
But how can we “make and maintain” this balance between modalities? What tools do we have to achieve a different construction? How might the mix of real-space values be carried over to the world of cyberspace? How might the mix be changed if change is desired?
On Governments and Ways to Regulate
I’ve described four constraints that I’ve said “regulate” an individual. But these separate constraints obviously don’t simply exist as givens in a social life. They are neither found in nature nor fixed by God. Each can be changed, though the mechanics of changing them is complex. Law can have a significant role in this mechanics, and my aim in this section is to describe that role.
A simple example will suggest the more general point. Say the theft of car radios is a problem — not big in the scale of things, but a frequent and costly enough problem to make more regulation necessary. One response might be to increase the penalty for car radio theft to life in prison, so that the risk faced by thieves was high enough that the crime no longer paid. If radio thieves realized that they exposed themselves to a lifetime in prison each time they stole a radio, it might no longer make sense to them to steal radios. The constraint constituted by the threatened punishment of law would now be enough to stop the behavior we are trying to stop.
But changing the law is not the only possible technique. A second might be to change the radio’s architecture. Imagine that radio manufacturers program radios to work only with a single car — a security code that electronically locks the radio to the car, so that, if the radio is removed, it will no longer work. This is a code constraint on the theft of radios; it makes the radio no longer effective once stolen. It too functions as a constraint on the radio’s theft, and like the threatened punishment of life in prison, it could be effective in stopping the radio-stealing behavior.
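The logic of such a lock fits in a few lines. The pairing scheme below is a hypothetical sketch, not any manufacturer's actual design, but it shows how the constraint is built into the artifact itself rather than into a threat of punishment:

```python
# A hypothetical sketch of a radio electronically locked to one car.
class CarRadio:
    def __init__(self, car_security_code: str):
        # The manufacturer pairs the radio to a single car at installation.
        self._paired_code = car_security_code

    def power_on(self, car_security_code: str) -> str:
        # The constraint operates by itself: without the matching code
        # from the car, the radio simply refuses to work.
        if car_security_code != self._paired_code:
            return "radio disabled"
        return "radio playing"

radio = CarRadio(car_security_code="A1B2C3")
print(radio.power_on("A1B2C3"))   # in the original car: "radio playing"
print(radio.power_on("ZZZZZZ"))   # removed to another car: "radio disabled"
```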
Thus, the same constraint can be achieved through different means, and the different means cost different amounts. The threatened punishment of life in prison may be fiscally more costly than the change in the architecture of radios (depending on how many people actually continue to steal radios and how many are caught). From this fiscal perspective, it may be more efficient to change code than law. Fiscal efficiency may also align with the expressive content of law — a punishment so extreme would be barbaric for a crime so slight. Thus, the values may well track the efficient response. Code would be the best means to regulate.
The costs, however, need not align so well. Take the Supreme Court’s hypothetical example of life in prison for a parking ticket[22]. It is likely that whatever code constraint might match this law constraint, the law constraint would be more efficient (if reducing parking violations were the only aim). There would be very few victims of this law before people conformed their behavior appropriately. But the “efficient result” would conflict with other values. If it is barbaric to incarcerate for life for the theft of a radio, it is all the more barbaric as a penalty for a parking violation. The regulator has a range of means to effect the desired constraint, but the values that these means entail need not align with their efficiency. The efficient answer may well be unjust — that is, it may conflict with values inherent in the norms, or law (constitution), of the society.
Law-talk typically ignores these other regulators and how law can affect their regulation. Many speak as if law must simply take the other three constraints as given and fashion itself to them[23].
I say “as if” because today it takes only a second’s thought to see that this narrowness is absurd. There were times when these other constraints were treated as fixed — when the constraints of norms were said to be immovable by governmental action[24], or the market was thought to be essentially unregulable[25], or the cost of changing real-space code was so high as to make the thought of using it for regulation absurd[26]. But we see now that these constraints are plastic[27]. They are, as law is, changeable, and subject to regulation.
9.
Jay Kesan has offered a related, but more expansive analysis. See Jay P. Kesan and Rajiv C. Shah, "Shaping Code,"
10.
See Michelle Armond, "Regulating Conduct on the Internet: State Internet Regulation and the Dormant Commerce Clause,"
11.
See, for example, the policy of the Minnesota attorney general on the jurisdiction of Minnesota over people transmitting gambling information into the state; available at http://web.archive.org/web/20000816215338/http://www.ag.state.mn.us/home/consumer/consumernews/OnlineScams/memo.html (cached: http://www.webcitation.org/5IwtqoSHt).
12.
See, for example,
14.
Norms are something different — more directly regulating user behavior. See Daniel Benoliel,
15.
See, for example, "AOL Still Suffering but Stock Price Rises,"
16.
USENET postings can be anonymous; see Henry Spencer and David Lawrence, Managing USENET (Sebastopol, Cal.: O'Reilly and Associates, 1998), 366–67.
17.
Web browsers make this information available, both in real time and archived in a cookie file; see http://www.cookiecentral.com/faq.htm (cached: http://www.webcitation.org/5Iwtsr5Vb). They also permit users to turn this tracking feature off.
19.
Encryption, for example, is illegal in some international contexts; see Baker and Hurst,
21.
See Ethan Katsh, "Software Worlds and the First Amendment," 335, 340. "If a comparison to the physical world is necessary, one might say that the software designer is the architect, the builder, and the contractor, as well as the interior decorator."
23.
Interestingly — and again, a reason to see the future of regulation talk located elsewhere — this is not true of architects. An example is the work of John de Monchaux and J. Mark Schuster. In their essay "Five Things to Do" and in the collection that essay introduces,
24.
See, for example, James C. Carter,
25.
See, for example, the discussion of wage fund theory in Hovenkamp,
26.
For a fascinating account of the coming of age of the idea that the natural environment might be tamed to a productive and engineered end, see John M. Barry,
27.
As Roberto Unger puts it, "Modern social thought was born proclaiming that society is made and imagined, that it is a human artifact rather than the expression of an underlying natural order";