The first results were an almost exact duplicate of the original conformity experiments: When the task was easy and people thought the experiment was “low importance,” one-third abandoned their own judgment and conformed to the group answer. Then came the “easy task/high importance” version. The researchers expected conformity would fall under those conditions, and it did. But it didn’t disappear: Between 13 percent and 16 percent still followed the group.

Things got intriguing when the questions became harder to answer. Among those who thought the test was “low importance,” a minority conformed to the group, just as they did when the questions were easy to answer. But when the test was “high importance,” conformity actually went up. The researchers also found that under those conditions, people became more confident about the accuracy of their group-influenced answers. “Our data suggest,” wrote the researchers, “that so long as the judgments are difficult or ambiguous, and the influencing agents are united and confident, increasing the importance of accuracy will heighten confidence as well as conformity—a dangerous combination.”

Judgments about risk are often difficult and important. If Baron, Vandello, and Brunsman are right, those are precisely the conditions under which people are most likely to conform to the views of the group and feel confident that they are right to do so.

But surely, one might think, an opinion based on nothing more than the uninformed views of others is a fragile thing. We are exposed to new information every day. If the group view is foolish, we will soon come across evidence that will make us doubt our opinions. The blind can’t go on leading the blind for long, can they?

Unfortunately, psychologists have discovered another cognitive bias that suggests that, in some circumstances, the blind can actually lead the blind indefinitely. It’s called confirmation bias, and its operation is both simple and powerful. Once we have formed a view, we embrace information that supports that view while ignoring, rejecting, or harshly scrutinizing information that casts doubt on it. Any belief will do. It makes no difference whether the thought is about trivia or something important. It doesn’t matter if the belief is the product of long and careful consideration or something we believe simply because everybody else in the Internet chat room said so. Once a belief is established, our brains will seek to confirm it.

In one of the earliest studies on confirmation bias, psychologist Peter Wason simply showed people a sequence of three numbers—2, 4, 6—and told them the sequence followed a certain rule. The participants were asked to figure out what that rule was. They could do so by writing down three more numbers and asking if they were in line with the rule. Once you think you’ve figured out the rule, Wason instructed, say so and we will see if you’re right.

It seems so obvious that the rule the numbers are following is “even numbers increasing by two.” So let’s say you were to take the test. What would you say? Obviously, your first step would be to ask: “What about 8, 10, 12? Does that follow the rule?” And you would be told, yes, that follows the rule.

Now you are really suspicious. This is far too easy. So you decide to try another set of numbers. Does “14, 16, 18” follow the rule? It does.

At this point, you want to shout out the answer—the rule is even numbers increasing by two!—but you know there’s got to be a trick here. So you decide to ask about another three numbers: 20, 22, 24. Right, again!

Most people who take this test follow exactly this pattern. Every time they guess, they are told they are right and so, it seems, the evidence that they are right piles up. Naturally, they become absolutely convinced that their initial belief is correct. Just look at all the evidence! And so they stop the test and announce that they have the answer: It is “even numbers increasing by two.”

And they are told that they are wrong. That is not the rule. The correct rule is actually “any three numbers in ascending order.”

Why do people get this wrong? It is very easy to figure out that the rule is not “even numbers increasing by two.” All they have to do is try to disconfirm that the rule is even numbers increasing by two. They could, for example, ask if “5, 7, 9” follows the rule. Do that and the answer would be, yes, it does—which would instantly disconfirm the hypothesis. But most people do not try to disconfirm. They do the opposite, trying to confirm the rule by looking for examples that fit it. That’s a futile strategy. No matter how many examples are piled up, they can never prove that the belief is correct. Confirmation doesn’t work.
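To make the asymmetry concrete, here is a minimal sketch, not from the book, of how the two testing strategies play out. It assumes (as the passage describes) that the experimenter’s hidden rule is “any three numbers in ascending order” and the participant’s hypothesis is “even numbers increasing by two”; the function names are purely illustrative.

```python
def hidden_rule(a, b, c):
    """The experimenter's actual rule: any strictly ascending triple."""
    return a < b < c

def hypothesis(a, b, c):
    """The participant's guess: even numbers increasing by two."""
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Confirmatory queries: triples chosen because they fit the hypothesis.
# Every one earns a "yes," so the hypothesis is never put at risk.
for triple in [(8, 10, 12), (14, 16, 18), (20, 22, 24)]:
    print(triple, "follows the rule?", hidden_rule(*triple))   # True every time

# A disconfirming query: a triple that violates the hypothesis.
# It also earns a "yes," which immediately rules the hypothesis out.
triple = (5, 7, 9)
print(triple, "follows the rule?", hidden_rule(*triple))        # True
print(triple, "fits the hypothesis?", hypothesis(*triple))      # False
```

No string of confirmatory “yes” answers can ever separate the guess from the broader true rule; only a query designed to fail under the guess can do that.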

Unfortunately, seeking to confirm our beliefs comes naturally, while it feels strange and counterintuitive to look for evidence that contradicts our beliefs. Worse still, if we happen to stumble across evidence that runs contrary to our views, we have a strong tendency to belittle or ignore it. In 1979—when capital punishment was a top issue in the United States—American researchers brought together equal numbers of supporters and opponents of the death penalty. The strength of their views was tested. Then they were asked to read a carefully balanced essay that presented evidence that capital punishment deters crime and evidence that it does not. The researchers then retested people’s opinions and discovered that they had only gotten stronger. They had absorbed the evidence that confirmed their views, ignored the rest, and left the experiment even more convinced that they were right and those who disagreed were wrong.

Peter Wason coined the term “confirmation bias,” and countless studies have borne out his discovery—or rather, his demonstration of a tendency thoughtful observers have long noted. Almost four hundred years ago, Sir Francis Bacon wrote that “the human understanding when it has once adopted an opinion (either as being a received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects; in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate....” Wise words proven true every day by countless pundits and bloggers.

The power of confirmation bias should not be underestimated. During the U.S. presidential election of 2004, a team of researchers led by Drew Westen at Emory University brought together thirty committed partisans—half Democrats, half Republicans—and had them lie in magnetic resonance imaging (MRI) machines. While their brains were being scanned, they were shown a series of three statements by or about George W. Bush. The second statement contradicted the first, making Bush look bad. Participants were asked whether the statements were inconsistent and were then asked to rate how inconsistent they were. A third statement then followed that provided an excuse for the apparent contradiction between the statements. Participants were asked if perhaps the statements were not as inconsistent as they first appeared. And finally, they were again asked to rate how inconsistent the first two statements were. The experiment was repeated with John Kerry as the focus and a third time with a neutral subject.