That is an example of information stored in “analog” form. The dimmer’s knob provides an analogy to the bulb’s lighting level. If it’s turned halfway, presumably you have about half the total wattage. When you measure or describe how far the knob is turned, you’re actually storing information about the analogy (the knob) rather than about the lighting level. Analog information can be gathered, stored, and reproduced, but it tends to be imprecise—and runs the risk of becoming less precise each time it is transferred.

Now let’s look at an entirely different way of describing how to light the room, a digital rather than analog method of storing and transmitting information. Any kind of information can be converted into numbers using only 0s and 1s. These are called binary numbers—numbers composed entirely of 0s and 1s. Each 0 or 1 is called a bit. Once the information has been converted, it can be fed to and stored in computers as long strings of bits. Those numbers are all that’s meant by “digital information.”
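Here is a minimal sketch of that conversion in Python (the sample text and the formatting calls are illustrative choices, not part of the original example): every character already has an agreed-upon numeric code, and that code can be written out as a string of bits.

    # Each character has a numeric code (its Unicode code point), and
    # that code can be written as a string of eight bits.
    for ch in "Hi":
        number = ord(ch)              # e.g. 'H' -> 72
        bits = format(number, "08b")  # the same number, as eight bits
        print(ch, number, bits)       # H 72 01001000 / i 105 01101001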

Instead of a single 250-watt bulb, let’s say you have eight bulbs, each with a wattage double the one preceding it, from 1 to 128. Each of these bulbs is hooked to its own switch, with the lowest-watt bulb on the right. Such an arrangement can be diagrammed like this:
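    128   64   32   16    8    4    2    1    (bulb wattages)
    [ ]   [ ]  [ ]  [ ]  [ ]  [ ]  [ ]  [ ]   (one switch per bulb)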

By turning these switches on and off, you can adjust the lighting level in 1-watt increments from 0 watts (all switches off) to 255 watts (all switches on). This gives you 256 possibilities. If you want 1 watt of light, you turn on only the rightmost switch, which turns on the 1-watt bulb. If you want 2 watts of light, you turn on only the 2-watt bulb. If you want 3 watts of light, you turn on both the 1-watt and 2-watt bulbs, because 1 plus 2 equals the desired 3 watts. If you want 4 watts of light, you turn on the 4-watt bulb. If you want 5 watts, you turn on just the 4-watt and 1-watt bulbs. If you want 250 watts of light, you turn on all but the 4-watt and 1-watt bulbs.
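The same switch-flipping procedure can be sketched in a few lines of Python (the function name and the walk from the largest bulb down to the smallest are my own illustrative choices):

    # Return the on/off pattern (left to right, 128-watt bulb first)
    # that produces a requested wattage from 0 through 255.
    def switches_for(watts):
        pattern = []
        for bulb in (128, 64, 32, 16, 8, 4, 2, 1):
            if watts >= bulb:         # this bulb fits, so turn it on
                pattern.append("on")
                watts -= bulb
            else:
                pattern.append("off")
        return pattern

    print(switches_for(5))    # ['off', 'off', 'off', 'off', 'off', 'on', 'off', 'on']
    print(switches_for(250))  # everything 'on' except the 4-watt and 1-watt bulbs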

If you have decided the ideal illumination level for dining is 137 watts of light, you turn on the 128-, 8-, and 1-watt bulbs, like this:
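    128   64   32   16    8    4    2    1    (bulb wattages)
     on   off  off  off   on  off  off   on   (128 + 8 + 1 = 137 watts)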

This system makes it easy to record an exact lighting level for later use or to communicate it to others who have the same light-switch setup. Because the way we record binary information is universal—low number to the right, high number to the left, always doubling—you don’t have to write down the values of the bulbs. You simply record the pattern of switches: on, off, off, off, on, off, off, on. With that information a friend can faithfully reproduce the 137 watts of light in your room. In fact, as long as everyone involved double-checks the accuracy of what he does, the message can be passed through a million hands and at the end every person will have the same information and be able to achieve exactly 137 watts of light.

To shorten the notation further, you can record each “off” as 0 and each “on” as 1. This means that instead of writing down “on, off, off, off, on, off, off, on,” meaning turn on the first, the fourth, and the eighth of the eight bulbs counting from the right, and leave the others off, you write the same information as 1, 0, 0, 0, 1, 0, 0, 1, or 10001001, a binary number. In this case the binary number 10001001 stands for decimal 137. You call your friend and say: “I’ve got the perfect lighting level! It’s 10001001. Try it.” Your friend gets it exactly right, by flipping a switch on for each 1 and off for each 0.
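Your friend’s side of the call takes only a few lines of Python (the variable names are mine; the 10001001 message is the one from the example):

    message = "10001001"              # the pattern you read over the phone

    # Reading the message as a base-2 number recovers the wattage.
    print(int(message, 2))            # 137

    # Equivalently, flip a switch on for each 1 and off for each 0:
    bulbs = (128, 64, 32, 16, 8, 4, 2, 1)
    total = sum(bulb for bulb, digit in zip(bulbs, message) if digit == "1")
    print(total)                      # 137 watts, exactly as sent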

This may seem like a complicated way to describe the brightness of a light source, but it is an example of the theory behind binary expression, the basis of all modern computers.

Binary expression made it possible to take advantage of electric circuits to build calculators. This happened during World War II, when a team led by J. Presper Eckert and John Mauchly at the University of Pennsylvania’s Moore School of Electrical Engineering began developing an electronic computational machine, the Electronic Numerical Integrator And Computer, called ENIAC. Its purpose was to speed up the calculations for artillery-aiming tables. ENIAC was more like an electronic calculator than a computer, but instead of representing a number with on and off settings on wheels the way a mechanical calculator did, it used vacuum tube “switches.”

Soldiers assigned by the army to the huge machine wheeled around squeaking grocery carts filled with vacuum tubes. When one burned out, ENIAC shut down and the race began to locate and replace the burned-out tube. One explanation, perhaps somewhat apocryphal, for why the tubes had to be replaced so often was that their heat and light attracted moths, which would fly into the huge machine and cause short circuits. If this is true, it gives new meaning to the term “bugs” for the little glitches that can plague computer hardware or software.

When all the tubes were working, a staff of engineers could set up ENIAC to solve a problem by laboriously plugging in 6,000 cables by hand. To make it perform another function, the staff had to reconfigure the cabling—every time. John von Neumann, a brilliant Hungarian-born American known for many things, including the development of game theory and his contributions to nuclear weaponry, is credited with the leading role in figuring out a way around this problem. He created the paradigm that all digital computers still follow. The “von Neumann architecture,” as it is known today, is based on principles he articulated in 1945—including the principle that a computer could avoid cabling changes by storing instructions in its memory. As soon as this idea was put into practice, the modern computer was born.
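A toy illustration of the stored-program idea may help (this miniature instruction set and memory layout are invented for the sketch; they are not ENIAC’s wiring or von Neumann’s actual design). The point is that the instructions sit in the same memory as the data, so giving the machine a new job means writing new values into memory rather than replugging thousands of cables.

    # Toy stored-program machine: memory holds instructions and data alike.
    memory = [
        ("LOAD", 5),   # 0: copy memory[5] into the accumulator
        ("ADD", 6),    # 1: add memory[6] to the accumulator
        ("STORE", 7),  # 2: write the accumulator into memory[7]
        ("HALT", 0),   # 3: stop
        None,          # 4: unused
        2,             # 5: data
        3,             # 6: data
        0,             # 7: the result will land here
    ]

    accumulator = 0
    pc = 0                        # program counter: address of the next instruction
    while True:
        op, addr = memory[pc]     # fetch the instruction from memory
        pc += 1
        if op == "LOAD":
            accumulator = memory[addr]
        elif op == "ADD":
            accumulator += memory[addr]
        elif op == "STORE":
            memory[addr] = accumulator
        elif op == "HALT":
            break

    print(memory[7])              # 5; a different program is just different memory contents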

Today the brains of most computers are descendants of the microprocessor Paul Allen and I were so knocked out by in the seventies, and personal computers often are rated according to how many bits of information (each bit corresponding to one switch in the lighting example) their microprocessor can process at a time, or how many bytes (a cluster of eight bits) of memory or disk-based storage they have. ENIAC weighed 30 tons and filled a large room. Inside, the computational pulses raced among 1,500 electromechanical relays and flowed through 17,000 vacuum tubes. Switching it on consumed 150,000 watts of power. But ENIAC stored only the equivalent of about 80 characters of information.
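As a quick back-of-the-envelope check on that last figure (assuming, as was traditional, one byte per character of plain text):

    bits_per_byte = 8
    eniac_storage_chars = 80                    # the figure quoted above
    print(eniac_storage_chars * bits_per_byte)  # 640 bits of storage in a 30-ton machine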

By the early 1960s, transistors had supplanted vacuum tubes in consumer electronics. This was more than a decade after the discovery at Bell Labs that a tiny sliver of silicon could do the same job as a vacuum tube. Like vacuum tubes, transistors act as electrical switches, but they require significantly less power to operate and, as a result, generate much less heat and take up far less space. Multiple transistor circuits could be combined onto a single chip, creating an integrated circuit. The computer chips we use today are integrated circuits containing the equivalent of millions of transistors packed onto less than a square inch of silicon.

In a 1977 Scientific American article, Bob Noyce, one of the founders of Intel, compared the $300 microprocessor to ENIAC, the moth-infested mastodon from the dawn of the computer age. The wee microprocessor was not only more powerful, but as Noyce noted, “It is twenty times faster, has a larger memory, is thousands of times more reliable, consumes the power of a lightbulb rather than that of a locomotive, occupies 1/30,000 the volume and costs 1/10,000 as much. It is available by mail order or at your local hobby shop.”

1946: A view inside a part of the ENIAC computer

Of course, the 1977 microprocessor seems like a toy now. And, in fact, many inexpensive toys contain computer chips that are more powerful than the 1970s chips that started the microcomputer revolution. But all of today’s computers, whatever their size or power, manipulate information stored as binary numbers.