The second clue lay in something that looked, on the surface, to be a meaningless quirk. It turned out that people’s ratings of the risks and benefits for the ninety activities and technologies on the list were connected. If people thought the risk posed by something was high, they judged the benefit to be low. The reverse was also true. If they thought the benefit was high, the risk was seen as low. In technical terms, this is an “inverse correlation.” It makes absolutely no sense here because there’s no logical reason that something—say, a new prescription drug—can’t be both high risk and high benefit. It’s also true that something can be low risk and low benefit—sitting on the couch watching Sunday afternoon football comes to mind. So why on earth did people put risk and benefit at opposite ends of a seesaw? It was curious, but it didn’t seem important. In his earliest papers on risk, Slovic mentioned the finding in only a sentence or two.
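For readers who want to see the statistic in action, here is a minimal sketch, using made-up ratings rather than Slovic’s data, of what an inverse correlation between risk and benefit judgments looks like:

```python
# A minimal illustration (hypothetical 1-7 ratings, not Slovic's data) of
# an inverse correlation: as perceived risk goes up, perceived benefit
# goes down, and the correlation coefficient comes out strongly negative.
from statistics import correlation  # available in Python 3.10+

# Hypothetical mean ratings for five items: nuclear power, pesticides,
# aspirin, vaccination, swimming.
perceived_risk = [6.5, 5.8, 2.9, 2.4, 1.8]
perceived_benefit = [2.1, 2.6, 5.5, 6.2, 6.0]

r = correlation(perceived_risk, perceived_benefit)
print(f"Pearson r = {r:.2f}")  # roughly -0.99: a strong inverse link
```

A coefficient near zero would mean the two judgments vary independently, which is what you would expect if people were actually weighing risks and benefits separately. That is what makes the finding a puzzle.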
In the years to come, however, the model of a two-track mind—Head and Gut operating simultaneously—advanced rapidly. A major influence in this development was the work of Robert Zajonc, a Stanford psychologist, who explored what psychologists call affect—which we know simply as feeling or emotion. Zajonc insisted that we delude ourselves when we think that we evaluate evidence and make decisions by calculating rationally. “This is probably seldom the case,” he wrote in 1980. “We buy cars we ‘like,’ choose the jobs and houses we find ‘attractive,’ and then justify those choices by various reasons.”
With this new model, Slovic understood the limitations of his earlier research. Working with Ali Alhakami, a Ph.D. student at the University of Oregon, he also started to realize that the perceived link between risk and benefit he had discovered earlier might have been much more than a quirk. What if people were reacting unconsciously and emotionally at the mention of a risky activity or technology? They hear “nuclear power” and . . . ugh! They have an instantaneous, unconscious reaction. This bad feeling actually happens prior to any conscious thought, and because it comes first, it shapes and colors the thoughts that follow—including responses to the researchers’ questions about risk.
That would explain why people see risk and benefit as if they were sitting at opposite ends of a seesaw. How risky is nuclear power? Nuclear power is a Bad Thing. Risk is also bad. So nuclear power must be very risky. And how beneficial is nuclear power? Nuclear power is Bad, so it must not be very beneficial. When Gut reacts positively to an activity or technology— swimming, say, or aspirin—it tips the seesaw the other way: Aspirin is a Good Thing so it must be low risk and high benefit.
To test this hypothesis, Slovic and Alhakami, along with colleagues Melissa Finucane and Stephen Johnson, devised a simple experiment. Students at the University of Western Australia were divided into two groups. The first group was shown various potential risks—chemical plants, cell phones, air travel—on a computer screen and asked to rate the riskiness of the item on a scale from one to seven. Then they rated the benefits of each. The second group did the same, except that they had only a few seconds to make their decisions.
Other research had shown that time pressure reduces Head’s ability to step in and modify Gut’s judgment. If Slovic’s hypothesis was correct, the seesaw effect between risk and benefit should be stronger in the second group than in the first. And that’s just what they found.
In a second experiment, Slovic and Alhakami had students at the University of Oregon rate the risks and benefits of a technology (different trials used nuclear power, natural gas, and food preservatives). Then they were asked to read a few paragraphs describing some of the benefits of the technology. Finally, they were asked again to rate the risks and benefits of the technology. Not surprisingly, the positive information they read raised students’ ratings of the technology’s benefits in about one-half of the cases. But most of those who raised their estimate of the technology’s benefits also lowered their estimate of the risk—even though they had not read a word about the risk. Later trials in which only risks were discussed had the same effect but in reverse: People who raised their estimate of the technology’s risks in response to the information about risk also lowered their estimate of its benefit.
Various names have been used to capture what’s going on here. Slovic calls it the “affect heuristic.” I prefer to think of it as the Good-Bad Rule. When faced with something, Gut may instantly experience a raw feeling that it is Good or Bad. That feeling then guides the judgments that follow: “Is this thing likely to kill me? It feels good. Good things don’t kill. So, no, don’t worry about it.”
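As a thought experiment (my own toy model, not anything Slovic formalized), the Good-Bad Rule can be sketched as a single affective input driving both judgments in opposite directions:

```python
# A toy model of the Good-Bad Rule (my illustration, not Slovic's
# formalism): one raw feeling drives both judgments, in opposite
# directions, producing the seesaw pattern.
def gut_judgment(affect: float) -> tuple[float, float]:
    """affect runs from -1 (strongly Bad) to +1 (strongly Good).
    Returns (perceived_risk, perceived_benefit) on a 1-7 scale."""
    perceived_benefit = 4 + 3 * affect  # Good things feel beneficial...
    perceived_risk = 4 - 3 * affect     # ...and, therefore, safe.
    return round(perceived_risk, 1), round(perceived_benefit, 1)

print(gut_judgment(-0.8))  # "nuclear power": (6.4, 1.6) -> risky, useless
print(gut_judgment(0.7))   # "aspirin": (1.9, 6.1) -> safe, beneficial
```

Notice that Head never enters the function: the model needs no information about the technology itself, which is exactly why the pattern is a bias rather than an assessment.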
The Good-Bad Rule helps to solve many riddles. In Slovic’s original studies, for example, he found that people consistently underestimated the lethality of all diseases except one: The lethality of cancer was actually overestimated. One reason may be the Example Rule. The media pays much more attention to cancer than to diabetes or asthma, so people can easily recall examples of deaths caused by cancer even if they don’t have personal experience with the disease. But consider how you feel when you read the words diabetes and asthma. Unless you or someone you care about has suffered from these diseases, chances are they don’t spark any emotion. But what about the word cancer? It’s like a shadow slipping over the mind. That shadow is affect—the “faint whisper of emotion,” as Slovic calls it. We use cancer as a metaphor in ordinary language—meaning something black and hidden, eating away at what’s good—precisely because the word stirs feelings. And those feelings shape and color our conscious thoughts about the disease.
The Good-Bad Rule also helps explain our weird relationship with radiation. We fear nuclear weapons, reasonably enough, while nuclear power and nuclear waste also give us the willies. Most experts argue that nuclear power and nuclear waste are not nearly as dangerous as the public thinks they are, but people will not be budged. On the other hand, we pay good money to soak up solar radiation on a tropical beach and few people have the slightest qualms about deliberately exposing themselves to radiation when a doctor orders an X-ray. In fact, Slovic’s surveys confirmed that most laypeople underestimate the (minimal) dangers of X-rays.
Why don’t we worry about suntanning? Habituation may play a role, but the Good-Bad Rule certainly does. Picture this: you, lying on a beach in Mexico. How does that make you feel? Pretty good. And if it is a Good Thing, our feelings tell us, it cannot be all that risky. The same is true of X-rays. They are a medical technology that saves lives. They are a Good Thing, and that feeling eases any worries about the risk they pose.
On the other end of the scale are nuclear weapons. They are a Very Bad Thing—which is a pretty reasonable conclusion given that they are designed to annihilate whole cities in a flash. But Slovic has found that feelings about nuclear power and nuclear waste are almost as negative. When Slovic and some colleagues examined how the people of Nevada felt about a proposal to create a dump site for nuclear waste in that state, they found that people judged the risk of a nuclear waste repository to be at least as great as that of a nuclear plant or even a nuclear weapons testing site. Not even the most ardent anti-nuclear activist would make such an equation. It makes no sense—unless people’s judgments are the product of intensely negative feelings toward all things “nuclear.”