Fine, many people say. But until more is known, the sensible thing to do is err on the side of caution by banning or restricting suspected chemicals. Better safe than sorry, after all.
This attitude has been enshrined in various laws and regulations as the precautionary principle. There are many definitions of that principle, but one of the most influential comes from Principle 15 of the Rio Declaration on Environment and Development: “Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.” Like many international resolutions, this is full of ambiguity. What qualifies as “serious” damage? What are “cost-effective measures”? And while it may be clear that we don’t need “full scientific certainty,” how much evidence should we have before we act? And this is only one of more than twenty definitions of the precautionary principle floating about in regulations and laws. Many are quite different and some are contradictory on certain points. As a result, there is a vast and growing academic literature grappling with what exactly “precaution” means and how it should be implemented. Politicians and activists like to talk about the precautionary principle as if it were a simple and sensible direction to err on the side of caution. But there’s nothing simple about it.
Nor is it all that sensible. As law professor Cass Sunstein argues in Laws of Fear, the precautionary principle is more a feel-good sentiment than a principle that offers real guidance about regulating risks. Risks are everywhere, he notes, so we often face a risk in acting and a risk in not acting—and in these situations, the precautionary principle is no help.
Consider chlorine. Treat drinking water with it and it creates by-products that have been shown to cause cancer in lab animals in high doses and may increase the cancer risk of people who drink the water. There’s even some epidemiological evidence that suggests the risk is more than hypothetical. So the precautionary principle would suggest we stop putting chlorine in drinking water. But what happens if we do that? “If you take the chlorine out of the drinking water, as was done in South America, you end up with an epidemic of 2,000 cases of cholera,” says Daniel Krewski. And cholera is far from the only threat. There are many water-borne diseases, including typhoid fever, a common killer until the addition of chlorine to drinking water all but wiped it out in the developed world early in the twentieth century. So, presumably, the precautionary principle would say we must treat drinking water with chlorine. “Because risks are on all sides, the Precautionary Principle forbids action, inaction, and everything in between,” writes Sunstein. It is “paralyzing; it forbids the very steps that it requires.”
So should we ban or restrict synthetic chemicals until we have a full understanding of their effects? This attractively simple idea is a lot more complicated than it appears. If pesticides were banned, agricultural yields would decline. Fruits and vegetables would get more expensive and people would buy and eat fewer of them. But cancer scientists believe that fruits and vegetables can reduce the risk of cancer if we eat enough of them, which most people do not do even now. And so banning pesticides in order to reduce exposure to carcinogens could potentially result in more people getting cancer.
Consider also that scientists are at least as ignorant of natural chemicals as they are of the man-made variety. And since there is no reason to assume—contrary to what our culture tells us—that natural is safe and man-made dangerous, that suggests we should worry as much about natural chemicals, or perhaps even more because natural chemicals vastly outnumber their man-made cousins. “The number of naturally occurring chemicals present in the food supply—or generated during the processes of growing, harvesting, storage and preparation—is enormous, probably exceeding one million different chemicals,” notes a 1996 report by the U.S. National Academy of Sciences. Everyone who digs into a delicious meal of all-natural, organically grown produce is swallowing thousands of chemicals whose effects on the human body aren’t fully understood and whose interaction with other chemicals is mysterious. And remember that of the natural chemicals that have been tested, half have been shown to cause cancer in lab animals. If we were to strictly apply the banned-until-proven-safe approach to chemicals, there would be little left to eat.
Partisans—both enviros and industry—prefer to ignore dilemmas like these and cast issues in much simpler terms. In an article entitled “Lessons of History,” the World Wildlife Fund tells readers that when the pesticide DDT “was discovered by Swiss chemist Paul Muller in 1939, it was hailed as a miracle. It could kill a wide range of insect pests but seemed to be harmless to mammals. Crop yields increased and it was also used to control malaria by killing mosquitoes. Muller was awarded the Nobel Prize in 1944. However, in 1962, scientist Rachel Carson noticed that insect and worm-eating birds were dying in areas where DDT had been sprayed. In her book Silent Spring, she issued grave warnings about pesticides and predicted massive destruction of the planet’s ecosystems unless the ‘rain of chemicals’ was halted.” This hardly looks like a dilemma. On the one hand, DDT was used to increase crop yields and control malaria, which is nice but hardly dramatic stuff. On the other hand, it threatened “massive destruction.” It’s not hard to see what the right response is.
Unfortunately, there’s quite a bit wrong with the WWF’s tale (the least of which is saying DDT was discovered in 1939, when it was first synthesized in 1874 and its value as an insecticide revealed in 1935). It doesn’t mention, for example, that the first large-scale use of DDT occurred in October 1943, when typhus—a disease spread by infected mites, fleas, and lice—broke out in newly liberated Naples. Traditional health measures didn’t work, so 1.3 million people were sprayed with the pesticide. At a stroke, the epidemic was wiped out—the first time in history that a typhus outbreak had been stopped in winter. At the end of the war, DDT was widely used to prevent typhus epidemics among haggard prisoners, refugees, and concentration-camp inmates. It’s rather sobering to think that countless Holocaust survivors owe their lives to an insecticide that is reviled today.
As for malaria, DDT did more than “control” it. “DDT was the main product used in global efforts, supported by the [World Health Organization], to eradicate malaria in the 1950s and 1960s,” says a 2005 WHO report. “This campaign resulted in a significant reduction in malaria transmission in many parts of the world, and was probably instrumental in eradicating the disease from Europe and North America.” Estimates vary as to how many lives DDT helped save, but it’s certainly in the millions and probably in the tens of millions.
In recent years, however, anti-environmentalists have constructed an elaborate mythology around the chemical: DDT is perfectly harmless and absolutely effective; DDT single-handedly wiped out malaria in Europe and North America; DDT could do the same in Africa if only eco-imperialists would let Africans use the chemical to save their children. For the most part, this mythology improperly belittles the proven harms DDT inflicted on nonhuman species, particularly birds, and it ignores the abundant evidence—which started to appear as early as 1951—that mosquitoes rapidly develop resistance to the insecticide. In fact, the indiscriminate spraying of DDT on farm fields during the 1950s contributed to the development of mosquitoes’ resistance, so the banning of the pesticide for agricultural use actually helped preserve its value as a malaria-fighting tool.