
Confirmation Bias: Why You Should Seek Out Disconfirming Evidence

“What the human being is best at doing is
interpreting all new information
so that their prior conclusions remain intact.”

— Warren Buffett

***

The Basics

Confirmation bias is our tendency to cherry-pick information that confirms our pre-existing beliefs or ideas. It is also known as myside bias or confirmatory bias. Two people with opposing views on a topic can see the same evidence and both come away feeling validated by it. Confirmation bias is most pronounced in the case of ingrained, ideological, or emotionally charged views.

Failing to interpret information in an unbiased way can lead to serious misjudgments. By understanding this, we can learn to identify confirmation bias in ourselves and others, and to be cautious of data that seems to immediately support our views.

When we feel as if others ‘cannot see sense’, a grasp of how confirmation bias works can enable us to understand why. Willard V. Quine and J.S. Ullian described the bias in The Web of Belief:

The desire to be right and the desire to have been right are two desires, and the sooner we separate them the better off we are. The desire to be right is the thirst for truth. On all counts, both practical and theoretical, there is nothing but good to be said for it. The desire to have been right, on the other hand, is the pride that goeth before a fall. It stands in the way of our seeing we were wrong, and thus blocks the progress of our knowledge.

Experiments beginning in the 1960s revealed our tendency to confirm existing beliefs rather than question them or seek new ones. Other research has revealed our single-minded determination to defend the ideas we already hold.

Like many mental models, confirmation bias was first identified by the ancient Greeks. In the History of the Peloponnesian War, Thucydides described the tendency:

For it is a habit of humanity to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.

Why we use this cognitive shortcut is understandable. Evaluating evidence (especially when it is complicated or unclear) requires a great deal of mental energy, and our brains prefer shortcuts that save time, particularly when we are under pressure. As many evolutionary scientists have pointed out, our minds are unequipped for the modern world. For most of human history, people encountered very little new information during their lifetimes, and decisions tended to be survival-based. Now we are constantly receiving new information and have to make numerous complex choices each day. To stave off overwhelm, we have a natural tendency to take shortcuts.

In The Case for Motivated Reasoning, Ziva Kunda wrote, “we give special weight to information that allows us to come to the conclusion we want to reach.” Accepting information that confirms our beliefs is easy and requires little mental energy. Contradictory information, in contrast, causes us to shy away, grasping for a reason to discard it.

In The Little Book of Stupidity, Sia Mohajer wrote:

The confirmation bias is so fundamental to your development and your reality that you might not even realize it is happening. We look for evidence that supports our beliefs and opinions about the world but excludes those that run contrary to our own… In an attempt to simplify the world and make it conform to our expectations, we have been blessed with the gift of cognitive biases.

How Confirmation Bias Clouds Our Judgment

“The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects.”
— Francis Bacon

***

The complexity of confirmation bias arises partly from the fact that it is impossible to overcome without awareness of the concept. Even when shown evidence that contradicts a biased view, we may still interpret it in a manner that reinforces our current perspective.

In one Stanford study, half the participants were in favor of capital punishment and the other half were opposed to it. Both groups read details of the same two fictional studies; half were told that one study supported the deterrent effect of capital punishment and the other undermined it, while the rest read the inverse. At the study’s conclusion, most participants stuck to their original views, pointing to the data that supported them and discarding the data that did not.

Confirmation bias clouds our judgment. It gives us a skewed view of information, even of plain numerical figures. Understanding this cannot fail to transform our worldview, or rather, our perspective on it. Lewis Carroll stated “we are what we believe we are,” but it seems that the world is also what we believe it to be.

A poem by Shannon L. Adler illustrates this concept:

Read it with sorrow and you will feel hate.
Read it with anger and you will feel vengeful.
Read it with paranoia and you will feel confusion.
Read it with empathy and you will feel compassion.
Read it with love and you will feel flattery.
Read it with hope and you will feel positive.
Read it with humor and you will feel joy.
Read it without bias and you will feel peace.
Do not read it at all and you will not feel a thing.

Confirmation bias is somewhat linked to our memories (similar to availability bias). We have a penchant for recalling evidence that backs up our beliefs; however neutral the original information was, we fall prey to selective recall. As Leo Tolstoy wrote:

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.

Why We Ignore Contradicting Evidence

“Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the destruction of their original evidential bases.”
— Lee Ross and Craig Anderson

***

Why do we struggle even to acknowledge information that contradicts our views? When first learning about the existence of confirmation bias, many people deny that they are affected. After all, most of us see ourselves as intelligent, rational people. So how can our beliefs persevere even in the face of clear, empirical evidence? Even when something is proven untrue, many entirely sane people continue to find ways to mitigate the resulting cognitive dissonance.

Much of this is the result of our need for cognitive consistency. We are bombarded by information from other people, the media, and our own experience. Our minds must find ways of encoding, storing, and retrieving the data we are exposed to, and one way we do this is by developing cognitive shortcuts and models. These can be useful or unhelpful; confirmation bias is among the less helpful heuristics that result. Because the information we interpret is filtered through our existing beliefs, we are more likely to recall what agrees with them. As a consequence, we tend to see ever more evidence that enforces our worldview: confirmatory data is taken seriously, while disconfirmatory data is treated with skepticism. Our general assimilation of information is subject to deep bias. To constantly re-evaluate our worldview would be exhausting, so we prefer to strengthen it. It can also be difficult to hold multiple ideas in mind at once, making it simpler to focus on just one.

We ignore contradictory evidence because it is so unpalatable to our brains. According to research by Jennifer Lerner and Philip Tetlock, we are motivated to think critically only when held accountable by others. If we expect to have to justify our beliefs, feelings, and behavior to others, we are less likely to be biased toward confirmatory evidence. This is less out of a desire to be accurate and more the result of wanting to avoid negative consequences or derision for being illogical. Ignoring evidence can even be beneficial, such as when we side with the beliefs of others to avoid social alienation.

Examples of Confirmation Bias in Action

Creationists vs Evolutionary Biologists

A prime example of confirmation bias can be seen in the clashes between creationists and evolutionary biologists. The latter use scientific evidence and experimentation to reveal the process of biological evolution over millions of years. The former read the Bible literally and hold that the world is only a few thousand years old. Creationists are skilled at mitigating the cognitive dissonance caused by factual evidence that disproves their ideas. Many consider the non-empirical ‘evidence’ for their beliefs (such as spiritual experiences and the existence of scripture) to be of greater value than the empirical evidence for evolution.

Evolutionary biologists point to fossil records as evidence of evolution occurring over millions of years. Meanwhile, some creationists view the same fossils as planted by a god to test our beliefs, and others claim that fossils are proof of the global flood described in the Bible. Evidence that contradicts these ideas is ignored or reinterpreted to confirm what they already think.

Doomsayers

Take a walk through London on a busy day and you are pretty much guaranteed to see a doomsayer on a street corner ranting about the upcoming apocalypse. Return a while later and you will find them still there, announcing that the end has been postponed.

Leon Festinger explained the phenomenon:

Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting other people to his view.

Music

Confirmation bias in music is interesting because it is actually part of why we enjoy music so much. According to Daniel Levitin, author of This Is Your Brain on Music:

As music unfolds, the brain constantly updates its estimates of when new beats will occur, and takes satisfaction in matching a mental beat with a real-in-the-world one.

Witness the way a group of teenagers will act when someone puts on Wonderwall by Oasis or Creep by Radiohead. Or how their parents react to Starman by Bowie or Alone by Heart. Or even their grandparents to The Way You Look Tonight by Sinatra or Non, Je Ne Regrette Rien by Edith Piaf. The ability to predict each successive beat or syllable is intrinsically pleasurable. This is a case of confirmation bias serving us well: we learn musical patterns and conventions and enjoy seeing them play out.

Homeopathy

The multi-billion-dollar homeopathy industry is an example of mass confirmation bias.

Homeopathy was devised in the late eighteenth century by Samuel Hahnemann. Its best-known modern defense came from Jacques Benveniste, a French immunologist, who became convinced that a solution’s biological effect persisted, and even increased, as it was diluted, due to what he termed ‘water memory.’ His tests were performed without blinding, leaving the results open to experimenter bias. Benveniste was so certain of his hypothesis that he found data to confirm it and ignored data that did not. Other researchers repeated his experiments with appropriate blinding and showed his results to be false. Many of the people who worked with him withdrew from science as a result.

Yet homeopathy’s supporters have only grown in number, clinging to any evidence that supports homeopathy while ignoring evidence that does not.

Scientific Experiments

“One of the biggest problems with the world today is that we have large groups of people who will accept whatever they hear on the grapevine, just because it suits their worldview—not because it is actually true or because they have evidence to support it. The striking thing is that it would not take much effort to establish validity in most of these cases… but people prefer reassurance to research.”
— Neil deGrasse Tyson

In good scientific experiments, researchers seek to falsify their hypotheses, not to confirm them. Unfortunately, this is not always what happens (as the homeopathy episode shows). There are many cases of scientists interpreting data in a biased manner or repeating experiments until they achieve the desired result. Confirmation bias also comes into play when scientists peer-review studies: they tend to give positive reviews to studies that confirm their own views and those accepted by the scientific community.

This is problematic. Flawed research programs can continue past the point where the evidence points to a false hypothesis, and confirmation bias wastes a huge amount of time and funding. We must not take science at face value; we must be aware of the role of biased reporting.

Conclusion

“The eye sees only what the mind is prepared to comprehend.”
— Robertson Davies

***

This article is an opportunity to assess how confirmation bias affects you. Consider looking back over the previous paragraphs and asking:

  • Which parts did I automatically agree with?
  • Which parts did I ignore or skim over without realizing?
  • How did I react to the points which I agreed/disagreed with?
  • Did this post confirm any ideas I already had? Why?
  • What if I thought the opposite of those ideas?

Being cognizant of confirmation bias is not easy, but with practice it is possible to recognize the role it plays in the way we interpret information. You need to seek out disconfirming evidence.

As Rebecca Goldstein wrote in Incompleteness: The Proof and Paradox of Kurt Gödel:

All truths — even those that had seemed so certain as to be immune to the very possibility of revision — are essentially manufactured. Indeed, the very notion of the objectively true is a socially constructed myth. Our knowing minds are not embedded in truth. Rather, the entire notion of truth is embedded in our minds, which are themselves the unwitting lackeys of organizational forms of influence.

To learn more about confirmation bias, read The Little Book of Stupidity or The Black Swan. Be sure to check out our entire latticework of mental models.


Nassim Taleb: How to Not be a Sucker From the Past

"History is useful for the thrill of knowing the past, and for the narrative (indeed), provided it remains a harmless narrative." — Nassim Taleb
“History is useful for the thrill of knowing the past, and for the narrative (indeed), provided it remains a harmless narrative.” — Nassim Taleb

The fact that new information about the past keeps emerging means our road map of history is incomplete. There is a necessary fallibility … if you will.

In The Black Swan, Nassim Taleb writes:

History is useful for the thrill of knowing the past, and for the narrative (indeed), provided it remains a harmless narrative. One should learn under severe caution. History is certainly not a place to theorize or derive general knowledge, nor is it meant to help in the future, without some caution. We can get negative confirmation from history, which is invaluable, but we get plenty of illusions of knowledge along with it.

While I don't entirely hold Taleb's view, I think it's worth reflecting on. As a friend put it to me recently, “when people are looking into the rear view mirror of the past, they can take facts and like a string of pearls draw lines of causal relationships that facilitate their argument while ignoring disconfirming facts that detract from their central argument or point of view.”

Taleb advises us to adopt the empirical skeptic approach of Menodotus which was to “know history without theorizing from it,” and to not draw any large theoretical or scientific claims.

We can learn from history, but our desire for causality can easily lead us down a dangerous rabbit hole when new facts come to light disproving what we held to be true. In trying to reduce the cognitive dissonance, our confirmation bias leads us to reinterpret past events in a way that fits our current beliefs.

History is not static; we only know what we know currently, and what we do know is subject to change. The accepted beliefs about how events played out may change in light of new information, and those new accepted beliefs may change over time as well.

Falsification: How to Destroy Incorrect Ideas

“The human mind is a lot like the human egg,
and the human egg has a shut-off device.
When one sperm gets in, it shuts down so the next one can’t get in.”

— Charlie Munger

***

Sir Karl Popper wrote that the nature of scientific thought is that we can never be sure of anything. The only way to test the validity of any theory is to try to prove it wrong, a process he labeled falsification. And it turns out we’re quite bad at falsification.

When it comes to testing a theory, we don’t instinctively try to find evidence that we’re wrong. It’s much easier and more mentally satisfying to find information that confirms our intuition. This is known as confirmation bias.

In How Children Succeed: Grit, Curiosity, and the Hidden Power of Character, Paul Tough tells the story of the English psychologist Peter Cathcart Wason, who came up with an “ingenious experiment to demonstrate our natural tendency to confirm rather than disprove our own ideas.”

Subjects were told that they would be given a series of three numbers that followed a certain rule known only to the experimenter. Their assignment was to figure out what the rule was, which they could do by offering the experimenter other strings of three numbers and asking him whether or not these new strings met the rule.

The string of numbers the subjects were given was quite simple:

2-4-6

Try it: What’s your first instinct about the rule governing these numbers? And what’s another string you might test with the experimenter in order to find out if your guess is right? If you’re like most people, your first instinct is that the rule is “ascending even numbers” or “numbers increasing by two.” And so you guess something like:

8-10-12

And the experimenter says, “Yes! That string of numbers also meets the rule.” And your confidence rises. To confirm your brilliance, you test one more possibility, just as due diligence, something like:

20-22-24

“Yes!” says the experimenter. Another surge of dopamine. And you proudly make your guess: “The rule is: even numbers, ascending in twos.” “No!” says the experimenter. It turns out that the rule is “any ascending numbers.” So 8-10-12 does fit the rule, it’s true, but so does 1-2-3. Or 4-23-512. The only way to win the game is to guess strings of numbers that would prove your beloved hypothesis wrong—and that is something each of us is constitutionally driven to avoid.

In the study, only one in five people was able to guess the correct rule.

And the reason we’re all so bad at games like this is the tendency toward confirmation bias: It feels much better to find evidence that confirms what you believe to be true than to find evidence that falsifies what you believe to be true. Why go out in search of disappointment?

There is also a video explaining Wason's work.
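
To make the trap concrete, here is a minimal sketch of the 2-4-6 task in Python. The function name, test triples, and structure are illustrative assumptions, not taken from Wason’s experiment; only the hidden rule (“any ascending numbers”) and the 2-4-6 seed come from the text above. Triples chosen to fit the narrower “ascending by twos” hypothesis can only ever return yes; the rule reveals itself only when you try to break your own guess:

    # Hypothetical recreation of Wason's 2-4-6 task; names and numbers
    # are illustrative, not from the original experiment.
    def hidden_rule(triple):
        """The experimenter's secret rule: any strictly ascending numbers."""
        a, b, c = triple
        return a < b < c

    # Confirmation strategy: every test fits our pet hypothesis
    # ("even numbers ascending by two"), so every answer comes back yes.
    for triple in [(8, 10, 12), (20, 22, 24), (100, 102, 104)]:
        print(triple, hidden_rule(triple))  # True, True, True: feels like progress, proves nothing

    # Falsification strategy: tests designed to break the pet hypothesis.
    for triple in [(1, 2, 3), (4, 23, 512), (6, 4, 2)]:
        print(triple, hidden_rule(triple))  # True, True, False: hypothesis refuted

The asymmetry is the whole lesson: a “yes” to 1-2-3 is consistent with the hidden rule but inconsistent with “ascending in twos,” something no number of confirming tests could ever expose.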

Michael Mauboussin: Three Steps to Effective Decisions


Making an important decision is never easy, but making the right decision is even more challenging.

Effective decision-making isn't just about accumulating information and going with what seems to make the most sense. Sometimes, internal biases can impact the way we seek out and process information, polluting the conclusions we reach in the process. It's critical to be conscious of those tendencies and to accumulate the sort of fact-based and unbiased inputs that will result in the highest likelihood that a decision actually leads to the desired outcome.

In this video, Michael Mauboussin, Credit Suisse's Head of Financial Strategies, lays out three steps that can help focus a decision-maker's thinking.

How do we take new information that comes in and integrate it with our point of view?

Typically we don’t really take new information into consideration. The first major barrier is something called the confirmation bias. Once you’ve decided on something and you think this is the right way to think about it, you either blow off new information, or if it’s ambiguous, you interpret it in a way that’s favorable to you. The next problem, and we all have this, is called pseudodiagnostic and subtly diagnostic information. Pseudodiagnostic means information that isn’t very relevant but you think it is. Subtly diagnostic is information that is relevant but you don’t pay attention to it.

So the key in all of this is: we have this torrent of information coming in; how do I sort it in a way that leads me to increase or decrease my probability of a particular event happening?
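
Mauboussin’s question, how incoming information should raise or lower a probability, is what Bayes’ rule formalizes: weight evidence by how much more likely it is if your hypothesis is true than if it is false. Here is a minimal sketch; the probabilities are invented for illustration and are not from the video:

    # Illustrative Bayesian update; all numbers are hypothetical.
    def bayes_update(prior, p_e_given_h, p_e_given_not_h):
        """Posterior probability of a hypothesis after observing evidence."""
        numerator = p_e_given_h * prior
        return numerator / (numerator + p_e_given_not_h * (1 - prior))

    prior = 0.6  # initial belief that some event will happen

    # Diagnostic evidence: far more likely if the hypothesis is true,
    # so the probability should move substantially.
    print(bayes_update(prior, 0.8, 0.2))  # ~0.86

    # Pseudodiagnostic evidence: equally likely either way, so however
    # vivid it feels, it should not move the probability at all.
    print(bayes_update(prior, 0.5, 0.5))  # 0.6

On this view, pseudodiagnostic information has a likelihood ratio near one and deserves no weight, while subtly diagnostic information has a ratio far from one and deserves the attention we tend to deny it.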

The Four Villains of Decision Making

You're probably not as effective at making decisions as you could be.

Don't worry. I'm going to show you how you can make better decisions in work and life.

We’re going to explore Chip and Dan Heath’s new book, Decisive. It’s going to help us make better decisions both as individuals and in groups.

But before we get into that, you should think about a tough decision you're grappling with right now. Having a decision working in your mind as you're reading this post will help make the advice in here real.

Ok, let's dig in.

***

“A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.”
— Daniel Kahneman in Thinking, Fast and Slow

***

We’re quick to jump to conclusions because we give too much weight to the information in front of us and fail to search for new information that might disprove our thoughts.

Nobel Prize-winning psychologist Daniel Kahneman called this tendency “what you see is all there is.” But that’s not the only reason we don’t make good decisions; there are many others.

We’re overconfident. We look for information that fits our beliefs and ignore information that doesn’t. We are overly influenced by authority. We choose the short term over the long term. Once we’ve made a decision, we find it hard to change our minds. In short, our brains are flawed. I could go on.

Knowing about these and other biases isn't enough; it doesn't help us fix the problem. We need a framework for making decisions. In Decisive, the Heaths introduce a four-step process designed to counteract many biases.

In keeping with Kahneman's visual metaphor, the Heaths refer to the tendency to see only what's in front of us as a “spotlight” effect.

And that, in essence, is the core difficulty of decision making. What's in the spotlight will rarely be everything we need to make good decisions, but we won't always remember to shift the light.

Most of us rarely use a process for thinking about things. If we do use one it's likely to be the pros-and-cons list. While better than nothing, this approach is still deeply flawed because it doesn't really account for biases.

The Four Villains of Decision Making

  1. Narrow Framing: “… the tendency to define our choices too narrowly, to see them in binary terms. We ask, ‘Should I break up with my partner or not?’ instead of ‘What are the ways I could make this relationship better?’”
  2. Confirmation Bias: “When people have the opportunity to collect information from the world, they are more likely to select information that supports their preexisting attitudes, beliefs, and actions.” We pretend we want the truth, yet all we really want is reassurance.
  3. Short-term Emotion: “When we've got a difficult decision to make, our feelings churn. We replay the same arguments in our head. We agonize about our circumstances. We change our minds from day to day. If our decision was represented on a spreadsheet, none of the numbers would be changing—there's no new information being added—but it doesn't feel that way in our heads.”
  4. Overconfidence: “People think they know more than they do about how the future will unfold.”

The Heaths came up with a process to help us overcome these villains and make better choices. “We can't deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you'll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

They call this WRAP. “At its core, the WRAP model urges you to switch from ‘auto spotlight’ to manual spotlight.”


All in all, this was a great book. We focus our efforts on analysis: if a decision is wrong, the analysis must have been the problem. Not only does this ignore the fact that you can have bad outcomes from good decisions, it also places your spotlight on the analysis at the cost of the process by which the decision was made. More to come on this …

Read this next: What Matters More in Decisions: Analysis or Process?

 

How you can instantly improve your marriage

Most of us see what we want to see.

If we’re arguing with a spouse, we start seeing all of their faults. After all, it’s not my fault; it’s your fault. Once we’ve labeled someone as, say, selfish, the label becomes self-reinforcing thanks to availability bias and confirmation bias. Our views become so clouded that we can’t appreciate our partner’s positive attributes.

Not only do we search for information that agrees with us, but we fail to notice anything to the contrary. “We see,” writes Aaron Beck in his book Love Is Never Enough, “each other through the bias of negative frames.”

There is a way for couples to fight the tendency to notice only what’s wrong: keep a “marriage diary” with a list of all the things your partner does that you like.

In his book, Beck describes a couple, Karen and Ted, who are having marriage troubles. Beck suggested they keep a marriage diary.

After I proposed to Karen and Ted that each take notes of what the other did that was pleasing during the previous week, Karen reported the following:

Ted was great. I was really upset by some of my clients. They are a real pain. … Anyhow, I told Ted about it. He was very sympathetic. He didn't try to tell me what to do. He said that if he was in my position, he would probably feel frustrated, too. He said that my clients are tough to deal with. I felt a lot better.

Each of Ted’s actions pleased Karen, who remarked, “They were like presents.” Although Ted had done similar things for Karen in the past, they had been erased from her memory because of her negative view of Ted.

The same was also true for Ted.

Psychologist Mark Kane Goldstein has used this method to help husbands and wives keep “track of their partner's pleasant actions.”

Each spouse is given several sheets of graph paper on which to record whatever his or her partner does that is pleasing. The spouse rates these acts on a ten-point scale, indicating degree of satisfaction. Dr. Goldstein found that 70 percent of the couples who tried this simple method reported an improvement in their relationship.

Simply by shifting their focus away from the negative and onto little pleasures, couples became more aware of their satisfaction.

Psychologists call this “considering the opposite.”