In 1983, The Day After, a nightmarish look at life in small-town America before and after a nuclear attack, became the most-talked-about TV drama of the era. In 1984, no fewer than seven novels featuring nuclear war were published. The fear was real and intense. It filled the streets of Europe and America with millions of protesters and filled countless heads with nightmares. “Suppose I survive,” wrote British novelist Martin Amis. “Suppose my eyes aren’t pouring down my face, suppose I am untouched by the hurricane of secondary missiles that all mortar, metal, and glass has abruptly become: suppose all this. I shall be obliged (and it’s the last thing I feel like doing) to retrace that long mile home, through the firestorm, the remains of the thousand-mile-an-hour winds, the warped atoms, the groveling dead. Then—God willing, if I still have the strength, and, of course, if they are still alive—I must find my wife and children and I must kill them.”
And if global incineration weren’t enough to worry about, 1985 also saw an explosion of awareness about the rapid spread of a deadly new virus. There was no treatment for AIDS. Get it and you were certain to die a slow, wasting death. And there was a good chance you would get it, because a breakthrough into the heterosexual population was inevitable. “AIDS has both sexes running scared,” Oprah Winfrey told her audience in 1987. “Research studies now project that one in five heterosexuals could be dead from AIDS at the end of the next three years. That’s by 1990. One in five.” Surgeon General C. Everett Koop called it “the biggest threat to health this nation has ever faced.” A member of the president’s commission on AIDS went one step further, declaring the disease to be “the greatest threat to society, as we know it, ever faced by civilization—more serious than the plagues of past centuries.” We know now that it didn’t work out that way, but at the time there were good reasons to think it would. And to be very, very afraid.
So was the world of 1985 so much safer? Thomas Friedman thought so in 2003, but I think he was the victim of a cognitive illusion. He knew that the Cold War had ended peacefully and that AIDS had not swept through the United States like the Black Death. That knowledge made those outcomes appear far more likely than they did at the time. And it made him feel that the Thomas Friedman of 1985 was much more confident of those outcomes than he really was.
I don’t mean to knock Friedman. The point is simply that even a renowned commentator on global affairs is vulnerable to this illusion. And he’s not alone. In his 2005 book Expert Political Judgment, Philip Tetlock, a University of California psychologist, presented the results of a twenty-year project in which he tracked the predictions of 284 political scientists, economists, journalists, and others whose work involved “commenting or offering advice on political or economic trends.” In all, Tetlock checked the accuracy of 82,361 predictions and found the experts’ record so poor they would have been beaten by random guesses. He also found, just as Baruch Fischhoff had earlier, that when experts were asked after the fact to recall their predictions and how confident they had been, they remembered themselves as more accurate and more certain than they actually were. (Unlike the Israeli students Fischhoff surveyed, however, the experts often got defensive when told this.)
I certainly don’t want to suggest that all scary prognostications are wrong. Horrible things do happen, and it’s sometimes possible—very difficult, but possible—for smart, informed people to foresee them. Each scary prognostication has to be taken on its merits. But anyone rattled by catastrophist writing should also know that many of the horrible and wonderful things that come to pass are not predicted, and that there is a very long history of smart, informed people foreseeing disasters—they tend to focus on the negative side of things, for some reason—that never come to pass.
In 1967—a year we remember for the Summer of Love and Sgt. Pepper’s Lonely Hearts Club Band—Americans got a remarkably precise warning of impending catastrophe. It would strike in 1975, they were told, and the world would never be the same. Famine—1975! by brothers William and Paul Paddock may be thoroughly forgotten today, but it was a best seller in 1967. The brothers had solid credentials: one was an agronomist, the other an experienced foreign service officer. The book is loaded with scientific research, studies, and data from around the world—everything from postwar Mexican wheat production to Russian economic output. And the Paddocks came to a brutal conclusion: As a result of soaring populations, the world was rapidly running out of food. Massive, worldwide starvation was coming, and there was nothing anyone could do to stop it. “Catastrophe is foredoomed,” they wrote. “The famines are inevitable.”
The Paddocks were not cranks; countless experts agreed with them. Harvard biologist George Wald predicted that, absent emergency measures, “civilization will end within fifteen or thirty years.” The loudest alarm was raised by Stanford University biologist Paul Ehrlich. “The battle to feed all of humanity is over,” Ehrlich wrote in The Population Bomb, published in 1968. “In the 1970s and 1980s, hundreds of millions of people will starve to death in spite of any crash programs embarked upon now.”
Like the Paddocks, Ehrlich loaded his book with research, studies, and statistics. He also wrote three scenarios of how the future might unfold, each in heavily dramatic style—a technique that would become common in the catastrophist genre and one that, as we have seen, is very likely to trigger the Rule of Typical Things and lead Gut to believe the predicted events are more likely than reason would suggest. “Even with rationing, a lot of Americans are going to starve unless this climate change reverses,” a frustrated scientist says to his wife in the first scenario. “We’ve seen the trends clearly since the early 1970s, but nobody believed it would happen here, even after the 1976 Latin American famine and the Indian Dissolution. Almost a billion human beings starved to death in the last decade, and we managed to keep the lid on by a combination of good luck and brute force.” That scenario ends with the United States launching a preemptive nuclear strike on the U.S.S.R. In the second scenario, poverty, starvation, and crowded populations allow a virus to emerge from Africa and sweep the world—one-third of the planet’s population dies. In the third scenario, the United States realizes the error of its ways and supports the creation of world bodies that tax rich countries to pay for radical population control measures—one billion people still die of starvation in the 1980s, but population growth slows and humanity survives. Ehrlich writes that this last scenario is probably far too optimistic because “it involves a maturity of outlook and behavior in the United States that seems unlikely to develop in the near future.”
In 1970, Ehrlich celebrated the first Earth Day with an essay that narrowed the range of possibilities considerably: Between 1980 and 1989, roughly four billion people, including 65 million Americans, would starve in what he dubbed the “Great Die-Off.”
The Population Bomb was a huge best seller. Ehrlich became a celebrity, making countless media appearances, including on The Tonight Show with Johnny Carson. Awareness of the threat spread, and mass starvation became a standard theme in popular culture. In the 1973 movie Soylent Green, the swollen populations of the future are fed rations of a mysterious processed food called “Soylent Green”—and as we learn in the memorable final line, “Soylent Green is people!”