Six Strategies for Avoiding the Truth



Are you lying to yourself every day?

Depends: are you a "Bayesian Updater"? Hopefully you are. The term is named after the Reverend Thomas Bayes, whose rule of probability, published in 1763, implies that when you're confronted with facts contradicting your current beliefs, you should change, or update, those beliefs.
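For the mathematically inclined, Bayesian updating has a precise form: your new confidence in a belief is your old confidence, reweighted by how well the belief predicts the evidence you just saw. Here is a minimal sketch in Python; the scenario and all the numbers are illustrative assumptions, not drawn from Bayes or from the study discussed below.

```python
# Bayesian updating: P(H | E) = P(E | H) * P(H) / P(E)
# H is the belief (hypothesis), E is the new evidence.

def update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability of a belief H
    after observing evidence E."""
    evidence = (likelihood_if_true * prior
                + likelihood_if_false * (1 - prior))
    return likelihood_if_true * prior / evidence

# Illustrative numbers: you hold a belief with 70% confidence,
# then encounter evidence that is twice as likely if the belief
# is false (0.6) as if it is true (0.3).
belief = 0.70
belief = update(belief, likelihood_if_true=0.3, likelihood_if_false=0.6)
print(f"Updated belief: {belief:.2f}")  # drops to roughly 0.54
```

A Bayesian Updater, in other words, lets the evidence pull the number down. The strategies below are what people do instead.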

In his new book, Answers for Aristotle, City University of New York philosopher Massimo Pigliucci suggests that if humans are rational, then the Bayesian principle should be our default. Of course, modern research has done a great deal to show how small a role logic actually plays in our decision-making. Even the great philosopher Aristotle, on closer reading, suggests we are more rationalizers than rational.

So if people aren't Bayesian Updaters, what are they? In their study of cognitive dissonance, Northwestern University professor Monica Prasad and her research team identified six alternative strategies. Their work sheds light on how even intelligent, well-informed individuals can cling to a belief in the face of all available evidence to the contrary.

Her findings are based on a study of Republicans who declined to change their stance on the Iraq War even after being confronted with hard evidence that Saddam Hussein, contrary to what Bush had initially argued, was not connected to 9/11.

Here are the six most common responses Prasad identified in her study:

1.  Attitude Bolstering (33%): When told Saddam Hussein had nothing to do with 9/11, this group simply shifted to other justifications for the Iraq War. For example, “There is no doubt in my mind that if we did not deal with Saddam Hussein when we did, it was just a matter of time when we would have to deal with him.” Pigliucci likens this to the famous Groucho Marx line: "These are my principles; if you don't like them, I've got others."

2.  Disputing Rationality (16%): Having trouble justifying your reasoning? Here’s one option: don’t even try. As one subject put it, “Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.”

3.  Inferred Justification (16%): Some respondents worked backwards, suggesting that even if they couldn’t find a reason, surely one had to exist, because why else would we be in Iraq? “...I believe that [the death of innocent people is wrong] also, but there must be a reason why we’re still over there or we wouldn’t be over there still.”

4.  Denial of Belief in the Link (14%): These subjects used a “slippery slope” defense, subtly reinterpreting the original linkage between Hussein and 9/11 as a linkage between Afghanistan and 9/11, as if the malleability of the facts were not a problem.

5.  Counter-arguing (12%): Another common strategy was simply refuting the information. These people responded with their own arguments connecting Saddam and the 9/11 attacks. For example, “I believe he was definitely involved with it because he was pumping money into the terrorist organizations every way he could. And he would even send $25,000 to somebody who committed suicide to kill another person, to their family.”

6.  Selective Exposure (6%): Instead of changing their minds, this group simply disengaged from the issue altogether, saying things like, “I don’t know. I don’t know anything about . . . where and what we’re going after.” and “I’m gonna pass on this one, for now.” This is sort of like covering your ears so you can't hear the question, which means you're no longer responsible for the answer.


Interestingly, even after the subjects were shown a quote in which George W. Bush himself acknowledged that there was no link between 9/11 and Saddam Hussein, only 2% of those surveyed changed their minds.

It should be pointed out that this study is not a condemnation of Republicans. No personal background or political affiliation makes you immune to these fallacies. It’s not an issue of party lines; it’s an issue of being human.

So what drives our cognitive dissonance? One answer might be heuristics: the shortcuts, or rule-of-thumb processes, our emotional brains use to make quick decisions. This primitive thinking system (what Daniel Kahneman calls System 1) is alive and well today, and we use it on a daily basis. Heuristics are a handy way to solve a problem when time and/or energy are in short supply. The trouble starts when we take the shortcut without even knowing it.


The ostrich is famous, however undeservedly, for burying its head in the sand. It turns out that humans have more in common with the ostrich than we might think. Kudos to the Reverend Bayes, who back in the 18th century gave us the benefit of the doubt when it came to rationality. Today, the Bayesians among us know better.

In defense of the good Reverend, perhaps back then he was unfamiliar with the ostrich.



