This is Your Brain on Politics

It is a truism that our political views affect our attitudes toward various societal issues. If I knew in advance whether you were a conservative or a liberal, for example, I could probably predict with better-than-chance accuracy your views on climate change, teaching evolution in public schools, or genetically modified foods.

But did you know that our political leanings can also alter our perceptions of reality?

This may seem surprising, but it’s true. A growing body of research shows that our political beliefs can influence not only the way our brains interpret and remember scientific information, but even our subjective perceptions of our physical environments.

The Surprising Relationship Between Cognitive Processing and Politics

Take the issue of gun control. A study by Dan Kahan and colleagues found that when presented with scientific results that purportedly tested a politically neutral proposition (the effectiveness of a new skin cream), subjects interpreted the data in line with their pre-tested mathematical reasoning abilities. In this condition, there was no significant difference in performance between Democrats and Republicans with similar reasoning skills. But when the same results were presented as testing a politically charged proposition (the effectiveness of laws banning concealed weapons), subjects’ performance depended heavily on whether they were liberal or conservative: Democrats performed much better than Republicans when the results supported their political views, and vice versa.

John Jost and Erin Hennes obtained similar results studying reactions to climate change science. They found that political leanings predict not only whether a subject will find scientific evidence of climate change persuasive, but also how well the subject will remember that evidence: subjects with conservative leanings were more prone to misremembering the evidence as weaker than it actually was.

Perhaps most surprisingly, Jost and Hennes found that after being reminded of their political views on a hot summer day in New York City, subjects with conservative leanings perceived the ambient temperature to be significantly lower than subjects with more liberal views did. Even our perceptual experience of the outside world, then, can be altered by our political predispositions.

What’s Behind It?

Psychologists attribute these kinds of results to the phenomenon of motivated reasoning. A cognitive strategy for reducing dissonance, motivated reasoning leads people to accept information that confirms their prior beliefs and, conversely, to reject information that contradicts them. Neuroimaging studies have shown that people use different parts of the brain when engaging in motivated reasoning than when engaging in “pure” reasoning. As Jost and Hennes demonstrated, motivated reasoning may even lead our brains to process sensory information (like temperature) differently. And as Kahan’s and other studies show, motivated reasoning is not limited to one political persuasion: both liberals and conservatives engage in it.

What’s at Stake?

The lengths to which our brains will go to keep our worlds consistent with our deeply held beliefs have profound implications not only for science communication and political discourse, but also for the legitimacy of science as an institution. A recent study by Erik Nisbet, Kathryn Cooper, and R. Kelly Garrett suggests that the more we are presented with scientific information that conflicts with our beliefs and prompts us to engage in motivated reasoning, the more we distrust the scientific community and the angrier we become about the issue. In other words, rather than blaming our own fallible brains for improperly rejecting scientific information, we tend to blame the messenger. At some point, this could lead to a total communication breakdown between scientists and policy-motivated educators on the one hand, and their target audiences on the other; when that happens, further educational efforts might only make things worse. One disturbing study found that after being exposed to myth-debunking information about the flu vaccine, people with prior anti-vaccine attitudes were even less likely to want the vaccine than they were before seeing the information.

What to Do?

So what should we do about it? Well, first of all, and contrary to what many believe, it’s becoming clear that educational campaigns alone are often an insufficient, and perhaps even counterproductive, way to move people toward specific beliefs or actions on highly charged issues.

Because motivated reasoning is an emotion-driven form of reasoning, one possible solution is to take politics, and the emotion that goes with it, out of the picture when communicating scientific results; in other words, to make the issue less charged. As Nisbet, Cooper, and Garrett point out, there are many social issues about which science has something to say, including nuclear power and fracking, yet for some reason the media has not made these into politically polarizing issues to the same degree as, say, climate change.

We are all part of the media now, so we can all do our part to de-charge public discourse around issues we feel strongly about. If that seems impossible for a particular issue, another suggestion is to present value-based arguments rather than evidence-based ones. For example, if you’d really like to change some minds in the vaccine debate, try the following. Don’t present scientific evidence to all of your Facebook and Twitter followers with a caption like “It’s science, dummies!” As research has shown, this will only make the people who disagree with you angrier about the issue (really, did we need a study to tell us this?). Instead, try appealing to shared values: talk about concern for children’s health, commitment to community, and so on.

And if you happen to be a conservative, don’t forget to put a sweater on, because it’s cold out there!