But Homer Simpson isn’t merely skeptical. He is cynical. He denies the very possibility of knowing the difference between true and untrue, between the more accurate and the less. And that’s just wrong. It may take a little effort to prove that the statistic Homer cites is fabricated, but it can be done. The truth is out there, to quote another staple of 1990s television.

Along with truth, cynicism endangers trust. And that can be dangerous. Researchers have found that when the people or institutions handling a risk are trusted, public concern declines: It matters a great deal whether the person telling you not to worry is your family physician or a tobacco company spokesman. Researchers have also shown, as wise people have always known, that trust is difficult to build and easily lost. So trust is vital.

But trust is disappearing fast. In most modern countries, political scientists have found a long-term decline in public trust of various authorities. The danger here is that we will collectively cross the line separating skepticism from cynicism. Where a reasonable respect for expertise is lost, people are left to search for scientific understanding on Google and in Internet chat rooms, and the sneer of the cynic may mutate into unreasoning, paralyzing fear. That end state can be seen in the anti-vaccination movements growing in the United States, Britain, and elsewhere. Fueled by distrust of all authority, anti-vaccination activists rail against the dangers of vaccinating children (some imaginary, some real-but-rare) while ignoring the immense benefits of vaccination—benefits that could be lost if these movements continue to grow.

This same poisonous distrust is on display in John Weingart’s Waste Is a Terrible Thing to Mind, an account of Weingart’s agonizing work as the head of a New Jersey board given the job of finding a site for a low-level radioactive waste disposal facility. Experts agreed that such a facility is not a serious hazard, but no one wanted to hear that. “At the Siting Board’s open houses,” writes Weingart, who is now a political scientist at Rutgers University, “people would invent scenarios and then dare Board members and staff to say they were impossible. A person would ask, ‘What would happen if a plane crashed into a concrete bunker filled with radioactive waste and exploded?’ We would explain that while the plane and its contents might explode, nothing in the disposal facility could. And they would say, ‘But what if explosives had been mistakenly disposed of, and the monitoring devices at the facility had malfunctioned so they weren’t noticed?’ We would head down the road of saying that this was an extremely unlikely set of events. And they would say, ‘Well, it could happen, couldn’t it?’ ”

Fortunately, we have not entirely abandoned trust, and experts can still have great influence on public opinion, particularly when they manage to forge a consensus among themselves. Does HIV cause AIDS? For a long time, there were scientists who said it did not, but the overwhelming majority said it did. The public heard and accepted the majority view. The same scenario is playing out now with climate change—most people in every Western country agree that man-made climate change is real, not because they’ve looked into the science for themselves, but because they know that’s what most scientists think. But as Howard Margolis describes in Dealing with Risk, scientists can also find themselves resoundingly ignored when their views go against strong public feelings. Margolis notes that the American Physical Society—an association of physicists—easily convinced the public that cold fusion didn’t work, but it had no impact when it issued a positive report on the safety of high-level nuclear waste disposal.

So scientific information and the opinions of scientists can certainly play a role in how people judge risks, but—as the continued divisions between expert and lay opinion demonstrate—they aren’t nearly as influential as scientists and officials might like. We remain a species powerfully influenced by the unconscious mind and its tools—particularly the Example Rule, the Good-Bad Rule, and the Rule of Typical Things. We also remain social animals who care about what other people think. And if we aren’t sure whether we should worry about this risk or that, whether other people are worried makes a huge difference.

“Imagine that Alan says that abandoned hazardous waste sites are dangerous, or that Alan initiates protest action because such a site is located nearby,” writes Cass Sunstein in Risk and Reason. “Betty, otherwise skeptical or in equipoise, may go along with Alan; Carl, otherwise an agnostic, may be convinced that if Alan and Betty share the relevant belief, the belief must be true. It will take a confident Deborah to resist the shared judgments of Alan, Betty and Carl. The result of these sets of influences can be social cascades, as hundreds, thousands or millions of people come to accept a certain belief because of what they think other people believe.”

Of course it’s a big leap from someone in a laboratory going along with the group answer on meaningless questions to “hundreds, thousands or millions of people” deciding that something is dangerous simply because that’s what other people think. After all, people in laboratory experiments know their answers don’t really matter. They won’t be punished if they make mistakes, and they won’t be rewarded for doing well. But in the real world, our views do matter. For one thing, we are citizens of democracies in which popular opinion influences how governments respond—or don’t respond—to risks. More concretely, what we think about risks can be critical in our personal lives. Will you support the creation of a hazardous waste site in your region? If it goes ahead, it means jobs and economic growth. But it could also mean a threat to the health of your family. Will you shrug off the threat of pedophiles and hoodlums and let your kids play unsupervised in the neighborhood park? Will you eat foods with trans fats? Each of these questions matters far more than whether line number four is longer than line number five, so it’s hard to believe that on these important matters we would so casually abandon our own judgment and follow the herd.

In fact, there is some evidence that the importance of these questions would change the rate at which we conform to the group view. It would increase it. Substantially.

Psychologists at the University of Iowa—Robert Baron, Joseph Vandello, and Bethany Brunsman—designed an experiment in which they told participants they were testing the accuracy of eyewitness verifications. The testing would involve slides appearing briefly and questions based on what was shown: Was the man wearing glasses? Did you see the man in the police lineup? Three people took the test each round. This was a ruse, of course. Two of the three test-takers were actually part of the real experiment, which was to see if people would go along with the false answers they gave.

Two new variables set this experiment apart from the earlier research on conformity. In some trials, the slides were shown repeatedly or they were left on the screen for a full five seconds. The researchers knew from earlier testing that people almost never make mistakes under those conditions. But in other trials, the slide would flash onscreen only briefly, making the questions much harder to answer. The researchers also gave participants different background information about the purpose of the experiment. Some were told it was merely a pilot study intended to get a general sense of how people perceive things that might be used sometime in the future “to develop a test that lets us know how accurate people are as witnesses.” This was the “low importance” version of the experiment. But others were told the researchers had already developed the “Iowa Witness Identification Test” that “will soon be adopted by police departments and courtrooms in cases involving eyewitness testimony,” the researchers said. “Over the next few weeks, we will be running research participants such as yourselves through the revised I.W.I.T. to establish reliable norms. As a result, we hope that you try your best on this test because establishing accurate norms is crucial. Most people do try hard on this test because they are interested in seeing how good their eyewitness accuracy is compared to others. But, to increase your interest in doing well on this test, we will be awarding prizes of $20 at the end of the experimental testing period to the participants who score the highest in accuracy.” This was the “high importance” condition.