Heuristics have a dark side, though: they cause us to form unconscious biases towards things we’re familiar with, and to choose the same thing we’ve always done rather than something new that may be more efficient.
They cause us to make logical leaps that take us to false conclusions. For instance, these mental shortcuts underpin our capacity for racism, sexism, and other forms of discrimination.
One such nefarious heuristic is called confirmation bias. It’s the psychological hypothesis that once we begin to believe something, we unconsciously begin seeking out information to reinforce that belief, often in the absence of facts. In fact, our biases can grow to be so strong that facts to the contrary will actually strengthen our wrong beliefs.
In 2005, Emory University professor Drew Westen and his colleagues recruited 15 self-described strong Democrats and 15 strong Republicans for a sophisticated test. They used a functional magnetic resonance imaging (fMRI) machine to study how partisan voters reacted to negative remarks about their party or candidate. Westen and his colleagues found that when these subjects processed “emotionally threatening information” about their preferred candidates, the parts of the brain associated with reasoning shut down and the parts responsible for emotions flared up.[41] Westen’s research indicates that once we grow biased enough, we lose our capacity to change our minds.
Following Westen’s study, social scientists Brendan Nyhan and Jason Reifler conducted a new test,[42] and discovered what they believe is a “backfire effect.”
Nyhan and Reifler provided the subjects with sample articles claiming that President Bush had stated that his tax cuts would create such economic growth that they would increase government revenues. The same articles included corrective statements from the 2003 Economic Report of the President and various other official sources explaining that this was implausible. The researchers then showed the subjects the actual tax revenues as a proportion of GDP, which declined after Bush’s tax cuts were enacted.
The results were fascinating: the conservatives in the study became more inclined to believe that tax cuts increase revenue as a result of reading the correction. Hearing the truth made conservatives more likely to agree with the misperception. The facts backfired.
We already know that things like confirmation bias make us seek out information that we agree with. But it’s also the case that once we’re entrenched in a belief, the facts will not change our minds.
Politics is the area in which scientists have studied the psychological causes of bias the most. It’s easy to get people to self-identify, and universities tend to have more of an interest in political science than in other realms of social studies. But you can also see the results of this kind of bias in areas other than politics: talk to a Red Sox fan about whether the Yankees are the best team in baseball history, and you’ll see strong bias come out. Talk to MacBook owners about the latest version of Windows and you may see the same phenomenon.
We’ve likely evolved this way because it’s safer. Forming a heuristic means survival: watching your caveman friend eat some berries and die doesn’t make you want to conduct a test to see if those berries kill people. It makes you want to not eat those berries anymore, and to tell your friends not to eat those berries either.
Cognitive scientists Hugo Mercier and Dan Sperber took reasoning and turned it on its head. After all, if all the evidence shows that we’re actually pretty bad at using reason to make better choices, then maybe making better choices isn’t reason’s primary function. In their paper “Why do humans reason?,”[43] they argue instead that “reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found.” Mercier and Sperber argue that our minds may have evolved to value persuasion over truth. It’s certainly plausible: human beings are social animals, and persuasion is a form of social power.
The seeds of opinion can be dangerous things. Once we begin to be persuaded of something, we not only seek out confirmation for it, but we also reject facts even in the face of incontrovertible evidence. With confirmation bias and Nyhan and Reifler’s backfire effect in full force, we find ourselves both addicted to more information and vulnerable to misinformation for the sake of our egos.
This MSNBC Is Going Straight to My Amygdala
Neuroscience is the new outer space. It’s a vacuum of promise and fantasy waiting to be filled with science and data. There’s no greater, no more mysterious, no more misunderstood organ in our bodies than our brains. If one weighed the pages of mythology around the brain against those of all the scientific papers ever written about it, the scale would likely tip towards myth.
The fields of psychology and neuroscience are filled with misinformation, disagreement, untested hypotheses, and the occasional consensus-based, verifiable, and repeatably tested theory. And so it’s a struggle for me: on one hand, I’m preaching about information diets, but—in trying to synthesize my own research in the field—I run the risk of accidentally feeding you junk information myself. On the other hand, so much of both fields is applicable to an information diet that it’s impossible not to draw on them.
Banting had an advantage over me. When he wrote his Letter on Corpulence, the Calorie was a unit used to measure the energy consumption of steam engines; science had not yet scratched the surface of what he’d touched on. I’m writing at the dawn of the science in this field; we know some, but not a lot. It’s as scientifically accurate to say, “This MSNBC is going straight to my amygdala,” as it is to say, “This ice cream is going straight to my thighs.” Only now we have more information and more accurate research about how ice cream actually affects your thighs.
Let’s start by acknowledging that our brains are not exactly like the digestive and endocrine systems. Direct comparisons tend to be ridiculous: the rules for how our minds store and process information are different from how our bodies store and process food. Food consumption has immediate effects: drink an extraordinary amount of water, and you may get a fatal case of water intoxication. The same is not true for information; few people have died directly from reading too much PerezHilton.com in a given day.
Cognitive processing does, however, cause physiological changes, much as food does, only not in the same way. Until a few years ago, it was thought that the human brain became fixed at some point during early childhood. Now science has shown that this isn’t the case; our brains constantly adapt and change their physiological structure. According to neuroscientists, every time we learn something, the learning results in a physiological change in the brain.
This phenomenon is called neuroplasticity, and an aphorism summarizing Dr. Donald Hebb’s work sums it up: “neurons that fire together, wire together.” More explicitly, Hebb writes:
“Let us assume that the persistence or repetition of a reverberatory activity (or “trace”) tends to induce lasting cellular changes that add to its stability.… When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”[44]
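Hebb’s postulate is often compressed into a one-line learning rule in modern textbooks. As a rough sketch (a standard formalization, not Hebb’s own notation): if x_A and x_B stand for the activity of cells A and B, and w_AB for the strength of the connection from A to B, then repeated co-activation nudges the weight upward:

Δw_AB = η · x_A · x_B

where η is a small positive learning rate. When A and B fire together, the product is large and the connection strengthens; when either is quiet, the weight stays put. It’s a cartoon of real synapses, but it captures why repetition wires things in.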
[43] http://www.dan.sperber.fr/wp-content/uploads/2009/10/MercierSperberWhydohumansreason.pdf