The mind can even fabricate memories. On several occasions, Ronald Reagan recalled wartime experiences that were later traced to Hollywood movies. These were apparently honest mistakes. Reagan’s memory simply took certain images from films he had seen and converted them into personal memories. Reagan’s mistake was caught because, as president, his comments were subjected to intense scrutiny, but this sort of invention is far more common than we realize. In one series of experiments, researchers invented scenarios such as being lost in a shopping mall or staying overnight in a hospital with an ear infection. They then asked volunteers to imagine the event over a few days or to write down how they imagined it played out. Days later, the researchers interviewed the subjects and discovered that between 20 and 40 percent believed the imagined scenarios had actually happened.
A more basic problem with the Example Rule is that it is biased, thanks to the way our memories work. Recent, emotional, vivid, or novel events are all more likely to be remembered than others. In most cases, that’s fine because it’s precisely those sorts of events that we actually need to remember.
But the bias in our memory will be reflected in Gut’s judgments using the Example Rule—which explains the paradox of people buying earthquake insurance when the odds of an earthquake are lowest and dropping it as the risk rises. If an earthquake recently shook my city, that memory will be fresh, vivid, and frightening. Gut will shout: Be afraid! Buy insurance! But if I’ve been living in this place for decades and there has never been an earthquake, Gut will only shrug. Not even scientists issuing warnings will rouse Gut because it doesn’t know anything about science. It knows only what the Example Rule says, and the Example Rule says don’t worry about earthquakes if you have to struggle to remember one happening.
“Men on flood plains appear to be very much prisoners of their experience,” researcher Robert Kates wrote in 1962, bemoaning the fact that people erect buildings despite being told that a flood must inevitably come. We saw the same dynamic at work following the terrible tsunami that swept across the Indian Ocean on December 26, 2004. Afterward, we learned that experts had complained about the lack of a warning system. It wouldn’t cost much, the experts had argued, and a tsunami was bound to come. It was a pretty esoteric subject, however, and no one was interested. Many people had never even heard the word tsunami until the day 230,000 lives were taken by one. And when that happened, the whole world started talking about tsunamis. Why was there no warning system in place? Could it happen here? Is our warning system good enough? It was the hot topic for a month or two. But time passed and there were no more tsunamis. Memories faded and so did the concern. For now, at least. A team of scientists has warned that one of the Canary Islands off the coast of Africa is fractured and a big chunk of the island will someday crash into the ocean—causing a mammoth tsunami to race across the Atlantic and ravage the coast from Brazil to Canada. Other scientists dispute these findings, but we can safely assume that, should this occur, interest in this esoteric subject would revive rather abruptly.
Experience is a valuable thing and Gut is right to base intuitions on it, but experience and intuition aren’t enough. “Experience keeps a dear school,” Benjamin Franklin wrote, “but fools will learn in no other.”
Franklin wrote those words in the mid-eighteenth century. From the perspective of a human living in the early twenty-first century, that’s a very long time ago, but in evolutionary terms it might as well have been this morning. The brain inside Franklin’s head was particularly brilliant, but it was still, in its essentials, no different than yours or mine or that of the person who first put seeds in the ground 12,000 years ago—or that of the human who first daubed some paint on a cave wall 40,000 years ago.
As we have seen, the world inhabited by humans changed very little over most of that sweep of time. And then it changed almost beyond description. The first city, Ur, was founded only 4,600 years ago and never got bigger than 65,000 people. Today, half of all humans live in cities—more than 80 percent in some developed countries.
Even more sweeping than the transformation of the physical environment is the change in how we communicate. The first crude writing—with symbols scratched into soft clay—appeared about 5,000 years ago. Gutenberg invented modern printing a mere five and a half centuries ago, and it was at this stage that Ben Franklin published his witticism about the limits of experience.
The first photograph was taken 180 years ago. Radio appeared a century ago, television thirty years later. It was only forty-eight years ago that the first satellite message was relayed—a Christmas greeting from U.S. President Eisenhower.
Then came cable television, fax, VCR, e-mail, cell phones, home video, digital media, twenty-four-hour cable news, and satellite radio. Less than twenty years ago, the rare journalist who knew of the Internet’s existence and wrote about it would put quotation marks around the word and carefully explain the nature of this unfathomable contraption. Today, it is embedded in the daily lives of hundreds of millions of people and occasionally touches the lives of billions more. Google, iPod, Wikipedia, YouTube, Facebook, MySpace: All these words represent globe-spanning information channels with immense and unfolding potential to change societies. And yet, as I write this sentence, only one—Google—has even existed for ten years.
When Saddam Hussein was executed at the end of 2006, official video was released by the Iraqi government. It appeared on television and the Internet minutes later. At the same time, another video clip appeared. Someone had smuggled a cell phone into the execution and recorded the entire hanging, including the taunts of guards and witnesses and the actual moment of execution that had been omitted from the official version. From phone to phone the video spread, and then to the Internet, putting uncensored images of a tightly guarded event in bedrooms, offices, and cafés in every country on earth.
But the really astonishing thing about that incident is that people didn’t find it astonishing. During the Vietnam War, television news reports were filmed, put in a can, driven to an airport, and flown out to be shown days after they were shot—and they provided a startling immediacy unlike anything previously experienced. But when the tsunami of 2004 crashed into the coast of Thailand, tourists e-mailed video clips as soon as they got to high ground—accomplishing instantly and freely what sophisticated television networks could not have done with unlimited time and money just thirty years before. In 2005, when Londoners trapped in the wreckage of trains bombed by terrorists used cell-phone cameras to show the world what they saw almost at the moment they saw it, the talk was almost exclusively of the content of the images, not their delivery. It was simply expected that personal experience would be captured and instantaneously distributed worldwide. In less than three human life spans, we went from a world in which a single expensive, blurry, black-and-white photograph astonished people to one in which cheap color video made instantly available all over the planet does not.
For the advance of humanity, this is a wondrous thing. For the promise it offers each individual to learn and grow, it is magnificent. And yet.
And yet the humans living amid this deluge of information have brains that believe, somewhere in their deepest recesses, that an image of our children is our children, that a piece of fudge shaped like dog poo is dog poo, and that a daydream about winning the lottery makes it more likely we will win the lottery.