The truth about DDT is that the questions about how to deal with it were, and are, complex. So what does the precautionary principle tell us about this most reviled of chemicals? Well, once typhus and malaria have been removed from the equation, it would probably come down on the side of a ban. But what does “precaution” mean when insect-borne disease is still very much present? WHO estimates that malaria kills one million people a year, and contributes to another two million deaths. Most of the dead are children, and most of those children are African. If DDT is used to fight malaria in Africa, it carries certain risks. And there are risks to not using it. So how do we decide? The precautionary principle is no help.
“Why, then, is the Precautionary Principle widely thought to give guidance?” asks Cass Sunstein. The answer is simple: We pay close attention to some risks while ignoring others, which very often causes the dilemma of choosing between risks to vanish. If we ignore malaria, it seems only prudent to ban DDT. Ignore the potential risks of natural chemicals, or the economic costs, and it becomes much easier to demand bans on synthetic chemicals. Ignore the threat of fire and it seems obvious that the flame-retardant chemicals polluting our blood must be eliminated. And if we don’t know anything about typhoid or cholera, it’s easy to conclude that we should stop treating water with a chemical that produces a known carcinogen. “Many people who are described as risk averse are, in reality, no such thing. They are averse to particular risks, not risks in general,” Sunstein writes. And it’s not just individuals who have blind spots. “Human beings, cultures and nations often single out one or a few social risks as ‘salient,’ and ignore the others.”
But how do people choose which risks to worry about and which to ignore? Our friends, neighbors, and coworkers constantly supply us with judgments that are a major influence. The media provide us with examples—or not—that Gut feeds into the Example Rule to estimate the likelihood of a bad thing happening. Experience and culture color hazards with emotions that Gut runs through the Good-Bad Rule. The mechanism known as habituation causes us to play down the risks of familiar things and play up the novel and unknown. If we connect with others who share our views about risks, group polarization can be expected—causing our views to become more entrenched and extreme.
And of course, for risks involving chemicals and contamination, there is “intuitive toxicology.” We are hardwired to avoid contamination, no matter how small the amounts involved. With the culture having defined chemical to mean man-made chemical, and man-made chemical as dangerous, it is all but inevitable that our worries about chemical pollution will be out of all proportion to the real risks involved. Confirmation bias is also at work. Once we have the feeling that chemical contamination is a serious threat, we will tend to latch onto information that confirms that hunch—while dismissing or ignoring anything that suggests otherwise. This is where the complexity of science comes into play. For controversial chemicals, relevant studies may number in the dozens or the hundreds or the thousands, and they will contradict each other. For anyone with a bias—whether a corporate spokesman, an environmentalist, or simply a layperson with a hunch—there will almost always be evidence to support that bias.
The first step in correcting our mistakes of intuition has to be a healthy respect for the scientific process. Scientists have their biases, too, but the whole point of science is that as evidence accumulates, scientists argue among themselves based on the whole body of evidence, not just bits and pieces. Eventually, the majority tentatively decides in one direction or the other. It’s not a perfect process, by any means; it’s frustratingly slow and it can make mistakes. But it’s vastly better than any other method humans have used to understand reality.
The next step in dealing with risk rationally is to accept that risk is inevitable. In Daniel Krewski’s surveys, he found that about half the Canadian public agreed that a risk-free world is possible. “A majority of the population expects the government or other regulatory agencies to protect them completely from all risk in their daily lives,” he says with more than a hint of amazement in his voice. “Many of us who work in risk management have been trying to get the message out that you cannot guarantee zero risk. It’s an impossible goal.” We often describe something as “unsafe” and we say we want it to be made “safe.” Most often, it’s fine to use that language as shorthand, but bear in mind that it’s not fully accurate. In the risk business, there are only degrees of safety. It is often possible to make something safer, but safe is usually out of the question.
We must also accept that regulating risk is a complicated business. It almost always involves trade-offs—swapping typhoid for carcinogenic traces in our drinking water, for example. And it requires careful consideration of the risks and costs that may not be as obvious as the things we worry about—like more expensive fruits and vegetables leading to an increase in cancer. It also requires evidence. We may not want to wait for conclusive scientific proof—as the precautionary principle suggests—but we must demand much more than speculation.
Rational risk regulation is a slow, careful, and thoughtful examination of the dangers and costs in particular cases. If banning certain synthetic pesticides can be shown to reduce a risk materially at no more cost than a modest proliferation of dandelions, say, it probably makes sense. If there are inexpensive techniques to reduce the amount of chlorine required to treat drinking water effectively, that may be a change that’s called for. Admittedly, this is not exciting stuff. There’s not a lot of passion and drama in it. And while there are always questions of justice and fairness involved—Who bears the risk? Who will shoulder the cost of reducing the risk?—there is not a lot of room for ideology and inflammatory rhetoric.
Unfortunately, there are lots of activists, politicians, and corporations who are not nearly as interested in pursuing rational risk regulation as they are in scaring people. After all, there are donations, votes, and sales to be had. Even more unfortunately, Gut will often side with the alarmists. That’s particularly true in the case of chemicals, thanks to a combination of Gut’s intuitive toxicology and the negative reputation chemicals have in the culture. Lois Swirsky Gold says, “It’s almost an immutable perception. I hear it from people all the time. ‘Yes, I understand that 50 percent of the natural chemicals tested are positive, half the chemicals that are in [the Carcinogenic Potency Project] data base are positive, 70 percent of the chemicals that are naturally occurring in coffee are carcinogens in rodent tests. Yes, I understand all that but still I’m not going to eat that stuff if I don’t have to.’ ”
All this talk of tiny risks adds up to one big distraction, says Bruce Ames. “There are really important things to worry about, and it gets lost in the noise of this constant scare about unimportant things.” By most estimates, more than half of all cancers in the developed world could be prevented with nothing more than lifestyle changes ranging from exercise to weight control and, of course, not smoking. Whatever the precise risk of cancer posed by synthetic chemicals in the environment, it is a housefly next to that elephant.