BOOK: The Republican Brain

And no wonder. If we have strong emotional convictions about something, then these convictions must be thought of as an actual physical part of our brains, residing not in any individual brain cell (or neuron) but rather in the complex connections between them, and the pattern of neural activation that has occurred so many times before, and will occur again. The more we activate a particular series of connections, the more powerful it becomes. It grows more and more a part of us, like the ability to play guitar or juggle a soccer ball.

So to attack that “belief” through logical or reasoned argument, and thereby expect it to vanish and cease to exist in a brain, is really a rather naïve idea. Certainly, it is not the wisest or most effective way of trying to “change brains,” as Berkeley cognitive linguist George Lakoff puts it.

We've inherited an Enlightenment tradition of thinking of beliefs as if they're somehow disembodied, suspended above us in the ether, and all you have to do is float up the right bit of correct information and wrong beliefs will dispel, like bursting a soap bubble. Nothing could be further from the truth. Beliefs are physical. To attack them is like attacking one part of a person's anatomy, almost like pricking his or her skin (or worse). And motivated reasoning might perhaps best be thought of as a defensive mechanism that is triggered by a direct attack upon a belief system, physically embodied in a brain.

I've still only begun to unpack this theory and its implications—and have barely drawn any meaningful distinctions between liberals and conservatives—but it is already apparent why Condorcet's vision fails so badly. Condorcet believed that good arguments, widely disseminated, would win the day. The way the mind works, however, suggests that good arguments will only win the day when people don't have strong emotional commitments that contradict them. Or to employ lingo sometimes used by the psychologists and political scientists working in this realm, it suggests that "cold" reasoning (rational, unemotional) is very different from "hot" reasoning (emotional, motivated).

Consider an example. You can easily correct a wrong belief when the belief is that Mother's Day is May 8, but it's actually May 9. Nobody is going to dispute that—nobody's invested enough to do so (we hope), and moreover, you'd expect most of us to have strong motivations (which psychologists sometimes call "accuracy motivations") to get the date of Mother's Day right, rather than defensive motivations that might lead us to get it wrong. By the same token, in a quintessential example of "cold" and "System 2" reasoning, liberals and conservatives can both solve the same math problem and agree on the answer (again, we hope).

But when good arguments threaten our core belief systems, something very different happens. The whole process gets shunted into a different category: the arguments are likely to automatically provoke a negative, subconscious emotional reaction. Most of us will then come up with a reason to reject them, or, even in the absence of a reason, simply refuse to change our minds.

Even scientists—supposedly the most rational and dispassionate among us and the purveyors of the most objective brand of knowledge—are susceptible to motivated reasoning. When they grow deeply committed to a view, they sometimes cling to it tenaciously and refuse to let go, ignoring or selectively reading the counterevidence. Every scientist can tell you about a completely intransigent colleague, who has clung to the same pet theory for decades.

However, what's unique about science is that it has its origins in a world-changing attempt to weed out and control our lapses of objectivity—what the great 17th-century theorist of scientific method, Francis Bacon, dubbed the “idols of the mind.” That attempt is known as the Scientific Revolution, and revolutionary it was. Gradually, it engineered a series of processes to put checks on human biases, so that even if individual researchers are prone to fall in love with their own theories, peer review and the skepticism of one's colleagues ensure that, eventually, the best ideas emerge. In fact, it is precisely because different scientists have different motivations and commitments—including the incentive to refute and unseat the views of their rivals, and thus garner fame and renown for themselves—that the process is supposed to work, among scientists, over the long term.

Thus when it comes to science, it's not just the famous method that counts, but the norms shared by individuals who are part of the community. In science, it is seen as a virtue to hold your views tentatively, rather than with certainty, and to express them with the requisite caveats and without emotion. It is also seen as admirable to change your mind, based upon the weight of new evidence.

By contrast, for people who have authoritarian personalities or dispositions—predominantly political conservatives, and especially religious ones—seeming uncertain or indecisive may be seen as a sign of weakness.

If even scientists are susceptible to bias, you can imagine how ordinary people fare. When it comes to the dissemination of science—or contested facts in general—across a nonscientific populace, a very different process is often occurring than the scientific one. A vast number of individuals, with widely varying motivations, are responding to the conclusions that science, allegedly, has reached. Or so they've heard.

They've heard through a wide variety of information sources—news outlets with differing politics, friends and neighbors, political elites—and are processing the information through different brains, with very different commitments and beliefs, and different psychological needs and cognitive styles. And ironically, the fact that scientists and other experts usually employ so much nuance, and strive to disclose all remaining sources of uncertainty when they communicate their results, makes the evidence they present highly amenable to selective reading and misinterpretation. Giving ideologues or partisans data that's relevant to their beliefs is a lot like unleashing them in the motivated reasoning equivalent of a candy store. In this context, rather than reaching an agreement or a consensus, you can expect different sides to polarize over the evidence and how to interpret it.

Motivated reasoning thus helps to explain all manner of maddening, logically suspect maneuvers that people make when they're in the middle of arguments so as to avoid changing their minds.

Consider one classic: goalpost shifting. This occurs when someone has made a clear and factually refutable claim, and staked a great deal on it—but once the claim meets its demise, the person demands some additional piece of evidence, or tweaks his or her views in some way so as to avoid having to give them up. That's what the Seekers did when their prophecy failed; that's what vaccine deniers do with each subsequent scientific discrediting of the idea that vaccines cause autism; that's what the hardcore Birthers did when President Obama released his long-form birth certificate; that's what the errant prophet Harold Camping did when his predicted rapture did not commence on May 21, 2011, and the world did not end on October 21, 2011.

In all of these cases, the individuals or groups involved had staked it all on a particular piece of information coming to light, or a particular event occurring. But when the evidence arrived and it contradicted their theories, they didn't change their minds. They physically and emotionally couldn't. Rather, they moved the goalposts.

Note, however, that only those who do not hold the irrational views in question see this behavior as suspect and illogical. The goalpost shifters probably don't perceive what they are doing, or understand why it appears (to the rest of us) to be dishonest. This is also why we tend to perceive hypocrisy in others, not in ourselves.

Indeed, a very important motivated reasoning study documented precisely this: Democrats viewed a Republican presidential candidate as a flip-flopper or hypocrite when he changed positions, and vice versa. Yet each side was more willing to credit its own party's candidate with an honest change of views.

The study in question was conducted by psychologist Drew Westen of Emory University (also the author of the much-noted book The Political Brain) and his colleagues, and it's path-breaking for at least two reasons. First, Westen studied the minds of strong political partisans when they were confronted with information that directly challenged their views during a contested election—Bush v. Kerry, 2004—a time when they were most likely to be highly emotional and biased. Second, Westen's team used functional magnetic resonance imaging (fMRI) to scan the brains of these strong partisans, discovering which parts were active during motivated reasoning.

In Westen's study, strong Democrats and strong Republicans were presented with “contradictions”: Cases in which a person was described as having said one thing, and then done the opposite. In some cases these were politically neutral contradictions—e.g., about Walter Cronkite—but in some cases they were alleged contradictions by the 2004 presidential candidates. Here are some examples, which are fairly close to reality but were actually constructed for the study:

George W. Bush:
“First of all, Ken Lay is a supporter of mine. I love the man. I got to know Ken Lay years ago, and he has given generously to my campaign. When I'm President, I plan to run the government like a CEO runs a country. Ken Lay and Enron are a model of how I'll do that.”

Contradictory:
Mr. Bush now avoids any mention of Ken Lay and is critical of Enron when asked.

John Kerry:
During the 1996 campaign, Kerry told a Boston Globe reporter that the Social Security system should be overhauled. He said Congress should consider raising the retirement age and means testing benefits. "I know it's going to be unpopular," he said. "But we have a generational responsibility to fix this problem."

Contradictory:
This year, on Meet the Press, Kerry pledged that he will never tax or cut benefits to seniors or raise the age for eligibility for Social Security.

Encountering these contradictions, the subjects were then asked to consider whether the “statements and actions are inconsistent with each other,” and to rate how much inconsistency (or, we might say, hypocrisy) they felt they'd seen. The result was predictable, but powerful: Republicans tended to see hypocrisy in Kerry (but not Bush), and Democrats tended to see the opposite. Both groups, though, were much more in agreement about whether they'd seen hypocrisy in politically neutral figures.

This study also provides our first tantalizing piece of evidence that Republicans may be more biased, overall, in defense of their political beliefs or their party. While members of both groups in the study saw more hypocrisy or contradiction in the candidate they opposed, Democrats were more likely to see hypocrisy in their own candidate, Kerry, as well. But Republicans were less likely to see it in Bush. Thus, the authors concluded that Republicans showed “a small but significant tendency to reason to more biased conclusions regarding Bush than Democrats did toward Kerry.”

While all this was happening, the research subjects were also having their brains scanned. Sure enough, the results showed that when engaged in biased political reasoning, partisans were not using parts of the brain associated with “cold,” logical thinking. Rather, they were using a variety of regions associated with emotional processing and psychological defense. Instead of listing all the regions here—there are too many, you'd be drowning in words like “ventral”—let me instead underscore the key conclusion.

Westen captured the activation of what appeared to be emotionally oriented brain circuits when subjects were faced with a logical contradiction that activated their partisan impulses. He did not capture calm, rational deliberation. These people weren't solving math problems. They were committing the mental equivalent of beating their chests.

