The superficial results were hardly surprising. When Bush supporters were confronted with Bush’s contradictory statements, they rated them as less contradictory than Kerry supporters did. And when the explanation was provided, Bush supporters considered it to be much more satisfactory than did Kerry supporters. When the focus was on John Kerry, the results reversed. There was no difference between Republicans and Democrats when the neutral subject was tested.
All this was predictable. Far more startling, however, was what showed up on the MRI. When people processed information that ran against their strongly held views—information that made their favored candidate look bad—they actually used different parts of the brain than they did when they processed neutral or positive information. It seems confirmation bias really is hardwired in each of us, and that has enormous consequences for how opinions survive and spread.
Someone who forms a belief based on nothing more than the fact that other people around him hold that belief nonetheless has a belief. That belief causes confirmation bias to kick in, and incoming information is screened: If it supports the belief, it is readily accepted; if it goes against the belief, it is ignored, scrutinized carefully, or flatly rejected. Thus, if the information that turns up in newspapers, on television, and in conversation is mixed—and it very often is when risk is involved—confirmation bias will steadily strengthen a belief that originally formed only because it’s what everybody else was saying during a coffee break last week.
That’s on the individual level. What happens when people who share a belief get together to discuss it? Psychologists know the answer to that, and it’s not pretty. They call it group polarization.
It seems reasonable to think that when like-minded people get together to discuss a proposed hazardous waste site or the breast implants they believe are making them sick or some other risk, their views will tend to coalesce around the average within the group. But they won’t. Decades of research have proved that groups usually come to conclusions that are more extreme than the average view of the individuals who make up the group. When opponents of a hazardous waste site gather to talk about it, they will become convinced the site is more dangerous than they originally believed. When a woman who believes breast implants are a threat gets together with women who feel the same way, she and all the women in the meeting are likely to leave believing they had previously underestimated the danger. The dynamic is always the same. It doesn’t matter what the subject under discussion is. It doesn’t matter what the particular views are. When like-minded people get together and talk, their existing views tend to become more extreme.
In part, this strange human foible stems from our tendency to judge ourselves by comparison with others. When we get together in a group of like-minded people, what we share is an opinion that we all believe to be correct and so we compare ourselves with others in the group by asking “How correct am I?” Inevitably, most people in the group will discover that they do not hold the most extreme opinion, which suggests they are less correct than others. And so they become more extreme. Psychologists confirmed this theory when they put people in groups and had them state their views without providing reasons why—and polarization still followed.
A second force behind group polarization is simple numbers. Prior to going to a meeting of people who believe silicone breast implants cause disease, a woman may have read several articles and studies on the subject. But because the people at the meeting greatly outnumber her, they will likely have information she was not aware of. Maybe it’s a study suggesting implants cause a disease she has never heard of, or it’s an article portraying the effects of implant-caused diseases as worse than she knew. Whatever it is, it will lead her to conclude the situation is worse than she had thought. As this information is pooled, the same process happens to everyone else in the meeting, with people becoming convinced that the problem is bigger and scarier than they had thought. Of course, it’s possible that people’s views could be moderated by hearing new information that runs in the opposite direction—an article by a scientist denying that implants cause disease, for example. But remember confirmation bias: Every person in that meeting is prone to accepting information that supports their opinion and ignoring or rejecting information that does not. As a result, the information that is pooled at the meeting is deeply biased, making it ideal for radicalizing opinions. Psychologists have also demonstrated that because this sort of polarization is based on information-sharing alone, it does not require anything like a face-to-face conversation—a fact amply demonstrated every day on countless political blogs.
So Alan convinces Betty, and that persuades Carl, which then settles it for Deborah. Biased screening of information begins and opinions steadily strengthen. Organizations are formed, information exchanged. Views become more extreme. And before you know it, as Cass Sunstein wrote, there are “hundreds, thousands or millions of people” who are convinced they are threatened by some new mortal peril. Sometimes they’re right. It took only a few years for almost everyone to be convinced that AIDS was a major new disease. But they can also be very wrong. As we saw, it wasn’t science that transformed the popular image of silicone breast implants from banal objects to toxic killers.
Reasonable or not, waves of worry can wash over communities, regions, and nations, but they cannot roll on forever. They follow social networks and so they end where those networks end—which helps explain why the panic about silicone breast implants washed across the United States and Canada (which also banned the implants) but caused hardly a ripple in Europe.
The media obviously play a key role in getting waves started and keeping them rolling because groups make their views known through more than conversations and e-mail. Groups also speak through the media, explicitly but also implicitly. Watch any newscast, read any newspaper: Important claims about hazards—heroin is a killer drug, pollution causes cancer, the latest concern is rapidly getting worse—will simply be stated as true, without supporting evidence. Why? Because they are what “everybody knows” is true. They are, in other words, group opinions. And like all group opinions, they exert a powerful influence on the undecided.
The media also respond to rising worry by producing more reports—almost always emotional stories of suffering and loss—about the thing that has people worried. And that causes the Guts of readers and viewers to sit up and take notice. Remember the Example Rule? The easier it is to recall examples of something happening, Gut believes, the more likely it is to happen. Growing concern about silicone breast implants prompted more stories about women with implants and terrible illnesses. Those stories raised the public’s intuitive estimate of how dangerous silicone breast implants are. Concern continued to grow. And that encouraged the media to produce more stories about sick women with implants. More fear, more reporting. More reporting, more fear. Like a microphone held too close to a loudspeaker, modern media and the primal human brain create a feedback loop.