No experience in our everyday life prepares us for the implications of a number that doubles a great number of times—exponential improvements. One way to understand it is with a fable.

King Shirham of India was so pleased when one of his ministers invented the game of chess that he asked the man to name any reward.

“Your Majesty,” said the minister, “I ask that you give me one grain of wheat for the first square of the chessboard, two grains for the second square, four grains for the third, and so on, doubling the number of grains each time until all sixty-four squares are accounted for.” The king was moved by the modesty of the request and called for a bag of wheat.

The king asked that the promised grains be counted out onto the chessboard. On the first square of the first row was placed one small grain. On the second square were two specks of wheat. On the third square there were 4, then 8, 16, 32, 64, 128. By square eight at the end of the first row, King Shirham’s supply master had counted out a total of 255 grains.

Intel microprocessors have doubled in transistor count approximately every eighteen months, in accordance with Moore’s Law.

The king probably registered no concern. Maybe a little more wheat was on the board than he had expected, but nothing surprising had happened. Assuming it would take one second to count each grain, the counting so far had taken only about four minutes. If one row was done in four minutes, try to guess how long it would take to count out the wheat for all sixty-four squares of the board. Four hours? Four days? Four years?

By the time the second row was complete, the supply master had worked for about eighteen hours just counting out the cumulative 65,535 grains. By the end of the third of the eight rows, he had spent 194 days counting the 16.8 million grains accumulated through the twenty-fourth square. And there were still forty empty squares to go.

It is safe to say that the king broke his promise to the minister. Completing the final square would have put 18,446,744,073,709,551,615 grains of wheat on the board and required 584 billion years of counting. Current estimates of the age of the earth are around 4.5 billion years. According to most versions of the legend, King Shirham realized at some point in the counting that he had been tricked and had his clever minister beheaded.
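The fable’s arithmetic is easy to verify. A short Python sketch, using the fable’s own assumption of one grain counted per second:

```python
def grains_on_square(n):
    """Grains on square n (1-indexed): the count doubles each square."""
    return 2 ** (n - 1)

def total_through(n):
    """Cumulative grains through square n: a geometric series, 2**n - 1."""
    return 2 ** n - 1

SECONDS_PER_YEAR = 365.25 * 24 * 3600

print(total_through(8))                      # end of first row: 255 grains
print(total_through(16) / 3600)              # end of second row: ~18.2 hours
print(total_through(24) / 86400)             # end of third row: ~194 days
print(total_through(64))                     # 18,446,744,073,709,551,615 grains
print(total_through(64) / SECONDS_PER_YEAR)  # ~584 billion years of counting
```

Each row takes 256 times as long to count as the row before it, which is why the first row seems harmless and the last is absurd.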

Exponential growth, even when explained, seems like a trick.

Moore’s Law is likely to hold for another twenty years. If it does, computers will be more than 10,000 times faster, and a computation that now takes a day will take fewer than ten seconds.
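That projection is a single exponent. A quick check, assuming the eighteen-month doubling period cited above:

```python
years = 20
doubling_period_years = 1.5            # Moore's Law: doubling every 18 months
speedup = 2 ** (years / doubling_period_years)

print(round(speedup))                  # ~10,321 -- "more than 10,000 times faster"

day_in_seconds = 24 * 3600
print(day_in_seconds / speedup)        # ~8.4 seconds -- "fewer than ten seconds"
```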

Laboratories are already operating “ballistic” transistors that have switching times on the order of a femtosecond. That is 1/1,000,000,000,000,000 of a second, which is about 10 million times faster than the transistors in today’s microprocessors. The trick is to reduce the size of the chip circuitry and the current flow so that moving electrons don’t bump into anything, including each other. The next stage is the “single-electron transistor,” in which a single bit of information is represented by a lone electron. This will be the ultimate in low-power computing, at least according to our current understanding of physics. In order to make use of the incredible speed advantages at the molecular level, computers will have to be very small, even microscopic. We already understand the science that would allow us to build these superfast computers. What we need is an engineering breakthrough, and these are often quick in coming.

By the time we have the speed, storing all those bits won’t be a problem. In the spring of 1983, IBM released its PC/XT, the company’s first personal computer with an interior hard disk. The disk served as a built-in storage device and held 10 megabytes, or “megs,” of information, about 10 million characters or 80 million bits. Existing customers who wanted to add these 10 megs to their original computers could, for a price. IBM offered a $3,000 kit, complete with separate power supply, to expand the computer’s storage. That’s $300 per megabyte. Today, thanks to the exponential growth described by Moore’s Law, personal-computer hard drives that can hold 1.2 gigabytes—1.2 billion characters of information—are priced at $250. That’s 21 cents per megabyte! And we look toward an exotic improvement called a holographic memory, which can hold terabytes of characters in less than a cubic inch of volume. With such capability, a holographic memory the size of your fist could hold the contents of the Library of Congress.
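The cost-per-megabyte figures above follow directly from the two price points. A minimal check:

```python
# 1983: IBM PC/XT expansion kit -- $3,000 for 10 megabytes of disk storage.
xt_cost_per_mb = 3000 / 10
print(xt_cost_per_mb)                    # $300 per megabyte

# Mid-1990s: $250 for a 1.2-gigabyte (1,200-megabyte) drive.
modern_cost_per_mb = 250 / 1200
print(round(modern_cost_per_mb, 2))      # ~$0.21 -- 21 cents per megabyte
```

A drop from $300 to 21 cents is a factor of roughly 1,400 in about twelve years, right in line with the doubling pace Moore’s Law describes.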

As communications technology goes digital, it becomes subject to the same exponential improvements that have made today’s $2,000 laptop computer more powerful than a $10 million IBM mainframe computer of twenty years ago.

At some point not far in the future, a single wire running into each home will be able to deliver all of a household’s digital data. The wire will either be fiber, which is what long-distance telephone calls are carried on now, or coaxial cable, which currently brings us cable television signals. If the bits are interpreted as voice calls, the phone will ring. If there are video images, they will show up on the television set. If they are on-line news services, they will arrive as written text and pictures on a computer screen.

That single wire bringing the network will certainly carry much more than phone calls, movies, news. But we can no more imagine what the information highway will carry in twenty-five years than a Stone Age man using a crude knife could have envisioned Ghiberti’s Baptistery doors in Florence. Only when the highway arrives will all its possibilities be understood. However, the last twenty years of experience with digital breakthroughs allow us to understand some of the key principles and possibilities for the future.

3

LESSONS FROM THE COMPUTER INDUSTRY

Success is a lousy teacher. It seduces smart people into thinking they can’t lose. And it’s an unreliable guide to the future. What seems the perfect business plan or latest technology today may soon be as out-of-date as the eight-track tape player, the vacuum-tube television, or the mainframe computer. I’ve watched it happen. Careful observation of many companies over a long period of time can teach you principles that will help with strategies for the years ahead.

Companies investing in the highway will try to avoid repeating the mistakes made in the computer industry over the past twenty years. I think most of these mistakes can be understood by looking at a few critical factors. Among them are negative and positive spirals, the necessity of initiating rather than following trends, the importance of software as opposed to hardware, and the role of compatibility and the positive feedback it can generate.

You can’t count on conventional wisdom. That only makes sense in conventional markets. For the last three decades the market for computer hardware and software has definitely been unconventional. Large established companies that one day had hundreds of millions of dollars in sales and lots of satisfied customers have disappeared in a short time. New companies, such as Apple, Compaq, Lotus, Oracle, Sun, and Microsoft, appeared to go from nothing to a billion dollars of revenue in a flash. These successes were driven, in part, by what I call the “positive spiral.”

When you have a hot product, investors pay attention to you and are willing to put their money into your company. Smart kids think, Hey, everybody’s talking about this company. I’d like to work there. When one smart person comes to a company, soon another does, because talented people like to work with each other. This creates a sense of excitement. Potential partners and customers pay more attention, and the spiral continues, making the next success easier.