Over 400,000 people visited Farnam Street last month to learn how to make better decisions, create new ideas, and avoid stupid errors. With more than 100,000 subscribers to our popular weekly digest, we've become an online intellectual hub. To learn more about what we do, start here.

Mental Model: Confirmation Bias


The confirmation bias is the tendency to seek information that confirms prior conclusions and to ignore evidence to the contrary.

The importance of understanding this source of psychological misjudgment is enormous. Once you become aware of the confirmation bias, you realize that it permeates your decision-making process.

Several biases emerge from the confirmation heuristic: confirmation bias, anchoring, conjunctive and disjunctive events bias, overconfidence, and hindsight bias.


An example of the confirmation bias can be found in a 1960 experiment by Peter Wason.

Imagine that the sequence of three numbers below follows a rule, and that your task is to diagnose that rule. When you write down other sequences of three numbers, your instructor will tell you whether or not each sequence follows the rule.

2 – 4 – 6
What sequences would you write down? How would you know when you had enough evidence to guess the rule? Wason’s study participants tended to offer fairly few sequences, and the sequences tended to be consistent with the rule that they eventually guessed. Commonly proposed rules included “numbers that go up by two” and “the difference between the first two numbers equals the difference between the last two numbers.”

In fact, Wason’s rule was much broader: “any three ascending numbers.” This solution requires participants to accumulate disconfirming, rather than confirming, evidence. Wason concluded that the correct solution necessitates “a willingness to attempt to falsify hypotheses, and thus to test those intuitive ideas that so often carry the feeling of certitude.”
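The logic of Wason’s task can be made concrete with a small sketch (the rule function and the particular test sequences are illustrative, not from the original study). It shows why confirming tests teach you little, while a deliberately disconfirming test falsifies the “numbers go up by two” guess:

```python
def follows_rule(seq):
    """Wason's hidden rule: any three strictly ascending numbers."""
    a, b, c = seq
    return a < b < c

# Confirming strategy: only propose sequences that fit the guessed
# rule "numbers go up by two." Every test passes, so the (wrong)
# guess survives unchallenged.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
assert all(follows_rule(s) for s in confirming_tests)

# Falsifying strategy: propose a sequence the guessed rule says
# should FAIL. (1, 2, 3) does not go up by two, yet the instructor
# says it follows the hidden rule -- the guess is disconfirmed.
assert follows_rule((1, 2, 3))

# Probing the boundary the other way: a descending sequence is
# rejected, consistent with "any three ascending numbers."
assert not follows_rule((6, 4, 2))
```

Only the falsifying tests carry real information here: a sequence that passes despite violating your hypothesis is what forces you to revise it.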

In Judgment in Managerial Decision Making, Harvard Professor Max Bazerman writes:

As teachers we have presented this task hundreds of times in classes. The first volunteer typically guesses “numbers going up by two” and is quickly eliminated. The second volunteer is often just as quick with a wrong answer. Interestingly, at this stage, it is rare that a volunteer will have proposed a sequence that doesn’t conform to the rule. Why? Because people naturally tend to seek information that confirms their expectations and hypotheses, even when disconfirming or falsifying information is more useful.


When we come across information that is consistent with our beliefs, we usually accept it with ease. Our bias is to uncritically accept information unless there is an unavoidable reason to doubt it. Yet when we discover facts that force us to question our beliefs, we pass them through a different filter: “Must I believe this?” In other words, we wonder whether we can dismiss the disconfirming evidence.

This was nicely captured by Dan Gilbert, author of Stumbling on Happiness, in an op-ed for the New York Times. Gilbert wrote, “when our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn’t misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn’t, we subtly tip the scales in our favor.” Gilbert knows what he’s talking about.


There are two main reasons that we fall prey to confirmation bias. The first has to do with the way the human mind retrieves information from memory. Gilbert argued in a 1991 research paper (How Mental Systems Believe) that considering a hypothesis ensures that we retrieve information consistent with that hypothesis. As Bazerman describes it, “the mere consideration of certain hypotheses makes information that is consistent with these hypotheses selectively accessible.” The second way we fall victim to the confirmation bias is in how we search for information. Our minds try to conserve energy, so we selectively search for information where we are most likely to find it. Consequently we fall into the retrievability bias.

Tversky and Kahneman (1983) demonstrated the retrievability bias when they asked participants in their study to estimate the frequency of seven-letter words with an “n” in the sixth position. Participants in the study estimated that such words were less common than seven-letter words ending in the more memorable three-letter “ing” sequence. Careful readers have no doubt noted that all seven-letter words that end in “ing” also have an “n” as their sixth letter. It is impossible for the frequency of seven-letter words that end in “ing” to be larger than the frequency of seven-letter words with an “n” as the sixth letter. Tversky and Kahneman explained this by arguing that “ing” words are more retrievable from memory because of the common “-ing” suffix. On the other hand, it’s mentally intensive to come up with seven-letter words that have an “n” as the sixth letter.


Another consequence of our minds’ desire to expend as little energy as possible is that we give special weight to information that allows us to come to the conclusion we want to reach (Kunda, 1990, The Case for Motivated Reasoning).

In trying to save energy, our minds search for information in a way that almost ensures our interpretation of the evidence is biased.

Warren Buffett offers the following:

What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.

Charlie Munger says this about Charles Darwin and the confirmation bias:

…the great example of Charles Darwin is he avoided confirmation bias. Darwin probably changed my life because I’m a biography nut, and when I found out the way he always paid extra attention to the disconfirming evidence and all these little psychological tricks. I also found out that he wasn’t very smart by the ordinary standards of human acuity, yet there he is buried in Westminster Abbey. That’s not where I’m going, I’ll tell you. And I said, “My God, here’s a guy that, by all objective evidence, is not nearly as smart as I am and he’s in Westminster Abbey? He must have tricks I should learn.” And I started wearing little hair shirts like Darwin to try and train myself out of these subconscious psychological tendencies that cause so many errors. It didn’t work perfectly, as you can tell from listening to this talk, but it would’ve been even worse if I hadn’t done what I did. And you can know these psychological tendencies and avoid being the patsy of all the people that are trying to manipulate you to your disadvantage, like Sam Walton. Sam Walton won’t let a purchasing agent take a handkerchief from a salesman. He knows how powerful the subconscious reciprocation tendency is. That is a profoundly correct way for Sam Walton to behave.

* * *

One way to avoid the confirmation bias is to search for disconfirming evidence.

The Confirmation Bias is a part of the Farnam Street Mental Model List.