
What do you do when the evidence says you’re wrong?

We’re really good at interpreting new information through the lens of our prior (or desired) beliefs. The last thing we want to do is change our mind.

Even in the face of new evidence, our brains work overtime to discredit, forget, or otherwise ignore anything that contradicts our prior beliefs. (We even discount expert opinion!)


We’re also more likely to see others as ignoring new evidence than to notice the same tendency in ourselves. We will stop at nothing to avoid changing our minds.

Ben Goldacre, author of Bad Science: Quacks, Hacks, and Big Pharma Flacks, recently wrote:

The classic paper on the last of those strategies is from Lord in 1979: they took two groups of people, one in favour of the death penalty, the other against it, and then presented each with a piece of scientific evidence that supported their pre-existing view, and a piece that challenged it. Murder rates went up, or down, for example, after the abolition of capital punishment in a state, or comparing neighbouring states, and the results were as you might imagine. Each group found extensive methodological holes in the evidence they disagreed with, but ignored the very same holes in the evidence that reinforced their views.

Goldacre concludes:

When presented with unwelcome scientific evidence, it seems, in a desperate bid to retain some consistency in their world view, people would rather conclude that science in general is broken. This is an interesting finding. But I’m not sure it makes me very happy.

Max Bazerman, a Harvard professor and author of one of my all-time favorite books, Judgment in Managerial Decision Making, offers some clues as to why we find it so hard to appropriately weigh new information that goes against our prior beliefs:

The first has to do with the way the human mind is designed to retrieve information from memory. The mere consideration of certain hypotheses makes information that is consistent with these hypotheses selectively accessible (Gilbert, 1991, How Mental Systems Believe). Indeed, research shows that the human tendency to entertain provisional hypotheses as true even makes it possible to implant people with false memories.

We also succumb to the confirmation trap due to how we search for information. Because there are limits to our attention and cognitive processing, we must search for information selectively, searching first where we are most likely to find the most useful information. One consequence is the retrievability bias. Another consequence is that people search selectively for information or give special credence to information that allows them to come to the conclusions they desire to reach (Kunda, 1990, The Case for Motivated Reasoning).

A lot of this boils down to psychology. Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, beliefs, attitudes, or opinions) that are psychologically inconsistent. Dissonance produces an uncomfortable mental state that the mind needs to resolve. In resolving dissonance, our minds tend toward self-justification, which makes it hard to admit mistakes.

If you want to make better decisions, you should read Judgment in Managerial Decision Making.
