Tag: Commitment and Consistency Bias

Mental Model: Commitment and Consistency Bias

“The difficulty lies not in the new ideas,
but in escaping the old ones, which ramify,
for those brought up as most of us have been,
into every corner of our minds.”

— John Maynard Keynes

***

Ben Franklin tells an interesting little story in his autobiography. Facing opposition to being reelected Clerk of the General Assembly, he sought to gain favor with the member so vocally opposing him:

Having heard that he had in his library a certain very scarce and curious book, I wrote a note to him, expressing my desire of perusing that book, and requesting he would do me the favour of lending it to me for a few days. He sent it immediately, and I return'd it in about a week with another note, expressing strongly my sense of the favour.

When we next met in the House, he spoke to me (which he had never done before), and with great civility; and he ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.

This is another instance of the truth of an old maxim I had learned, which says, “He that has once done you a kindness will be more ready to do you another, than he whom you yourself have obliged.”

The man, having lent Franklin a rare and valuable book, sought to stay consistent with his past actions. He wouldn't, of course, lend a book to an unworthy man, would he?

***

Positive Self-Image

Scottish philosopher and economist Adam Smith said in The Theory of Moral Sentiments:

The opinion which we entertain of our own character depends entirely on our judgments concerning our past conduct. It is so disagreeable to think ill of ourselves, that we often purposely turn away our view from those circumstances which might render that judgment unfavorable.

Even when it acts against our best interest, our tendency is to be consistent with our prior commitments, ideas, thoughts, words, and actions. Reinforced by confirmation bias, we rarely seek out evidence that disconfirms what we believe. This, after all, makes it easier to maintain our positive self-image.

Part of the reason this happens is our desire to appear and feel like we’re right. We also want to show people our conviction. This shouldn’t come as a surprise: society values consistency and conviction even when they are wrong.

We associate consistency with intellectual and personal strength, rationality, honesty, and stability. On the other hand, the person who is perceived as inconsistent is also seen as confused, two-faced, even mentally ill in certain extreme circumstances.

A politician who wavers, for example, gets labelled a flip-flopper and can lose an election over it (John Kerry). A CEO who risks everything on a successful bet and holds a conviction no one else shares is hailed as a hero (Elon Musk).

But it’s not just our words and actions that nudge our subconscious, but also how other people see us. There is a profound truth behind Eminem’s lyrics: I am, whatever you say I am. If I wasn't, then why would I say I am?

If you think I’m talented, I become more talented in your eyes — in part because you labelling me as talented filters the way you see me. You start seeing more of my genius and less of my normal-ness, simply by way of staying consistent with your own words.

In his book Outliers, Malcolm Gladwell talks about how teachers simply identifying students as smart not only affected how the teachers saw their work but, more importantly, affected the opportunities teachers gave the students. Smarter students received better opportunities, which, we can reason, gave them better experiences. This in turn made them better. It’s almost a self-fulfilling prophecy.

And the more we invest in our beliefs about ourselves or others (think money, effort, or pain), the more sunk costs we have and the harder it becomes to change our minds. It doesn’t matter if we’re right. It doesn’t matter if the Ikea bookshelf sucks; we’re going to love it.

In Too Much Invested to Quit, psychologist Allan Teger says something similar of the Vietnam War:

The longer the war continued, the more difficult it was to justify the additional investments in terms of the value of possible victory. On the other hand, the longer the war continued, the more difficult it became to write off the tremendous losses without having anything to show for them.

***

As a consequence, there are few rules we abide by more closely than “Don’t make any promises that you can’t keep.” This, generally speaking, is a great rule that holds society together by ensuring that our commitments are, for the most part, real and reliable.

Aside from the benefits of preserving our public image, being consistent is simply easier and leads to a more predictable and consistent life. By being consistent in our habits and with previous decisions, we significantly reduce the need to think and can go on “auto-pilot” for most of our lives.

However beneficial these biases are, they too deserve deeper understanding and caution. Sometimes our drive to appear consistent can lure us into choices we would otherwise consider against our best interests. This is the essence of a harmful bias as opposed to a benign one: we hurt ourselves and others by acting on it.

A Slippery Slope

Part of why commitment can be so dangerous is that it is a slippery slope: a single slip can send you sliding all the way down. Compliance with even tiny requests, which initially appear insignificant, therefore has a good chance of leading to full commitment later.

People whose job it is to persuade us know this.

Among the more blunt techniques on the spectrum are those reported by a used-car sales manager in Robert Cialdini’s book Influence. The dealer knows the power of commitment and that if we comply a little now, we are likely to comply fully later on. His advice to other sellers goes as follows:

“Put 'em on paper. Get the customer's OK on paper. Get the money up front. Control 'em. Control the deal. Ask 'em if they would buy the car right now if the price is right. Pin 'em down.”

This technique will be obvious to most of us. However, there are also more subtle ways to make us comply without us noticing.

A great example of a subtle compliance practitioner is Jo-Ellen Demitrius, the woman currently reputed to be the best consultant in the business of jury selection.

When screening potential jurors before a trial, she asks an artful question:

“If you were the only person who believed in my client's innocence, could you withstand the pressure of the rest of the jury to change your mind?”

It’s unlikely that any self-respecting prospective juror would answer negatively. And, now that the juror has made the implicit promise, it is unlikely that once selected he will give in to the pressure exerted by the rest of the jury.

Innocent questions and requests like this can be a great springboard for initiating a cycle of compliance.

The Lenient Policy

A great case study in compliance is the set of tactics Chinese soldiers used on American prisoners of war during the Korean War. The Chinese were particularly effective at getting Americans to inform on one another. In fact, nearly all American prisoners in the Chinese camps are said to have collaborated with the enemy in one way or another.

This was striking, since such behavior was rarely observed among American prisoners of war during WWII. It raises the question: what trade secrets led to the success of the Chinese?

Unlike the North Koreans, the Chinese did not treat their captives harshly. Instead they pursued what they called a “lenient policy” towards the prisoners, which was, in reality, a clever series of psychological assaults.

In these exploits the Chinese relied heavily on commitment and consistency tactics to secure the compliance they desired. At first the Americans were not very cooperative, as they had been trained to give only name, rank, and serial number, but the Chinese were patient.

They started with seemingly small but frequent requests to repeat statements like “The United States is not perfect” and “In a Communist country, unemployment is not a problem.” Once these requests had been complied with, the demands grew. Someone who had just agreed that the United States was not perfect would be encouraged to expand on his thoughts about specific imperfections. Later he might be asked to write up and read out a list of these imperfections in a discussion group with other prisoners. “After all, it's what you really believe, isn't it?” The Chinese would then broadcast the essay readings not only to the whole camp, but to other camps and even to the American forces in South Korea. Suddenly the soldier would find himself a “collaborator” with the enemy.

The awareness that the essays did not contradict his beliefs could even change his self-image to be consistent with the new “collaborator” label, often resulting in more cooperation with the enemy.

It is not surprising that very few American soldiers were able to avoid such “collaboration” altogether.

Foot in the Door

The small request that grows into bigger requests, as applied by the Chinese to American soldiers, is also called the foot-in-the-door technique. It was first demonstrated by two researchers, Freedman and Fraser, in an experiment in which a fake volunteer worker asked homeowners to allow a public-service billboard to be installed on their front lawns.

To give them a better idea of how it would look, the homeowners were even shown a photograph depicting an attractive house that was almost completely obscured by an ugly sign reading DRIVE CAREFULLY. While the request was, quite understandably, denied by 83 percent of residents, one particular group reacted favorably.

Two weeks earlier, a different “volunteer worker” had come to the residents of this group with a similar but much smaller request: to display a little sign that read BE A SAFE DRIVER. The request was so negligible that nearly all of them complied. Yet the downstream effect of that request turned out to be enormous: 76 percent of this group complied with the bigger, much less reasonable request (the big ugly sign).

At first, even the researchers themselves were baffled by the results and repeated the experiment with similar setups. The effect persisted. Finally, they proposed that the subjects must have distorted their own views about themselves as a result of their initial actions:

What may occur is a change in the person's feelings about getting involved or taking action. Once he has agreed to a request, his attitude may change, he may become, in his own eyes, the kind of person who does this sort of thing, who agrees to requests made by strangers, who takes action on things he believes in, who cooperates with good causes.

The rule is that once someone has moved our self-image to where they want it, we will naturally comply with requests that fit the new self-view. We must therefore be very careful about agreeing to even the smallest requests. Not only can doing so make us comply with larger requests later on, but it can also make us more willing to do favors that are only remotely connected to the earlier ones.

Even Cialdini, someone who knows this bias inside-out, admits to his fear that his behavior will be affected by consistency bias:

It scares me enough that I am rarely willing to sign a petition anymore, even for a position I support. Such an action has the potential to influence not only my future behavior but also my self-image in ways I may not want.

Further, once a person's self-image is altered, all sorts of subtle advantages become available to someone who wants to exploit that new image.

Give It, Take It Away Later

Have you ever been offered a deal that seemed a little too good to be true, only to be disappointed later? You had already made up your mind, had gotten excited, and were ready to pay or sign, until a calculation error was discovered. With the adjusted price, the offer no longer looked all that great.

It is likely that the error was not an accident. This technique, known as low-balling, is often used by compliance professionals in sales. Cialdini, having observed the phenomenon among car dealers, tested its effects on his own students.

In an experiment with colleagues, he asked two groups of students to show up at 7:00 AM for a study on “thinking processes.” The first group was told right away, when called, that the study started at 7:00 AM. Unsurprisingly, only 24 percent wanted to participate.

For the other group, however, the researchers threw a low-ball. The first question was simply whether they wanted to take part in a study of thinking processes; 56 percent replied positively. Only then, to those who had agreed, was the 7:00 AM meeting time revealed.

These students were given the opportunity to opt out, but none of them did. In fact, driven by their commitment, 95 percent of the low-balled students showed up to the Psychology Building at 7:00 AM as they had promised.

Do you recognize the similarities between the experiment and the sales situation?

The script of low-balling tends to be the same:

First, an advantage is offered that induces a favorable decision in the manipulator's direction. Then, after the decision has been made, but before the bargain is sealed, the original advantage is deftly removed (i.e., the price is raised, the time is changed, etc.).

It would seem surprising that anyone would buy under these circumstances, yet many do. Often the self-created justifications provide so many new reasons for the decision that even when the dealer pulls away the original favorable rationale, like a low price, the decision is not changed. We stick with our old decision even in the face of new information!

Of course not everyone complies, but that’s not the point. The effect is strong enough to hold for a good number of buyers, students or anyone else whose rate of compliance we may want to raise.

The Way Out

The first real defense against consistency bias is awareness of the phenomenon and of the harm that a certain rigidity in our decisions can cause us.

Robert Cialdini suggests two approaches to recognizing when consistency biases are unduly creeping into our decision making. The first is to listen to our stomachs. Stomach signs appear when we realize that the request being pushed is something we don’t want to do.

He recalls a time when a beautiful young woman tried to sell him a membership he most certainly did not need, using the tactics described above. He writes:

I remember quite well feeling my stomach tighten as I stammered my agreement. It was a clear call to my brain, “Hey, you're being taken here!” But I couldn't see a way out. I had been cornered by my own words. To decline her offer at that point would have meant facing a pair of distasteful alternatives: If I tried to back out by protesting that I was not actually the man-about-town I had claimed to be during the interview, I would come off a liar; trying to refuse without that protest would make me come off a fool for not wanting to save $1,200. I bought the entertainment package, even though I knew I had been set up. The need to be consistent with what I had already said snared me.

Eventually, though, he came up with the perfect counter-attack for later episodes, one that allowed him to get out of such situations gracefully.

Whenever my stomach tells me I would be a sucker to comply with a request merely because doing so would be consistent with some prior commitment I was tricked into, I relay that message to the requester. I don't try to deny the importance of consistency; I just point out the absurdity of foolish consistency. Whether, in response, the requester shrinks away guiltily or retreats in bewilderment, I am content. I have won; an exploiter has lost.

The second approach concerns the signs felt in our heart, and it is best used when it is not entirely clear whether the initial commitment was wrongheaded.

Imagine you have recognized that your initial assumptions about a particular deal were not correct. The car is not extraordinarily cheap and the experiment is not as fun if you have to wake up at 6 AM to make it. Here it helps to ask one simple question:

“Knowing what I know, if I could go back in time, would I make the same commitment?”

Ask it frequently enough and the answer might surprise you.

***


More Information Doesn’t Mean Better Decisions


Situations in life often permit no delay; and when we cannot determine the course which is certainly best, we must follow the one which is probably the best… This frame of mind freed me also from the repentance and remorse commonly felt by those vacillating individuals who are always seeking as worthwhile things which they later judge to be bad.

— Rene Descartes, Discourse on Method

We tend to think that if we only had more information we'd make better decisions. The world, however, doesn't always work that way.

At a certain point, too much information actually causes us to make worse decisions.

Irrelevant Information

One of the reasons we make worse decisions with more information is that we pursue information that appears relevant but isn't.

The harder the information is to find (that is, the more work we have to do to get it and the more exclusive it is), the more value, psychology tells us, we will place on it, often more than it deserves.

In part this happens because of our bias toward commitment and consistency; we've spent time and effort seeking out that information, so mentally we feel obliged to use it. This nudges us toward decisions we otherwise wouldn't have made.

Another reason we love irrelevant information is that we really lack fundamental understanding.

If we don't understand something, we won't have a firm grasp of the fundamental variables that govern the situation and the tradeoffs involved, so we'll look for new variables. When you're not sure how to weigh one attribute compared with another, you end up searching for a reason.

Often this mountain of new information, even if easily obtainable, is largely irrelevant to the situation. The problem is we don't know it is irrelevant.

The result is that peculiar feeling of inward unrest known as indecision. Fortunately it is too familiar to need description, for to describe it would be impossible. As long as it lasts, with the various objects before the attention, we are said to deliberate; and when finally the original suggestion either prevails and makes the movement take place, or gets definitively quenched by its antagonists, we are said to decide, or to utter our voluntary fiat in favor of one or the other course. The reinforcing and inhibiting ideas meanwhile are termed the reasons or motives by which the decision is brought about.

— William James

Decisions

Decisions are hard to make. In part this is because of conflict and uncertainty. We are uncertain of the consequences of our actions and have difficulty making tradeoffs between attributes. Just as knowledge can make decision making easier, a lack of knowledge compounds the problem.

When faced with two equally attractive alternatives, Slovic (1975, 1990) suggests we make choices based on what's easy to explain and justify. Sounds logical, right? Why flip a coin when I can come up with a reason?

Sometimes we weigh the pros and cons. Subconsciously, when deciding for something we focus on the pros, and when deciding against something we focus on the reasons for rejection. This has the added advantage of giving us a good story to tell, but it causes problems when there are no striking positive or negative aspects to help make the decision.

When we can't find a compelling reason to do something or avoid something, we are left in a state of conflict. So we search for more information.

“Seeking new alternatives usually requires additional time and effort, and may involve the risk of losing the previously available options.” (Tversky and Shafir)

The implications of my cobbled-together theory seem worth considering.

If our current alternatives don't give us a convincing reason to choose one of them, we'll likely seek out additional information (rather than questioning our understanding). When we do seek out additional information, we're really just looking for a compelling rationale for choosing one alternative over another.

The more we look for new rationale to make decisions, the further we are from understanding. The harder we look, the more we'll find. The more we find, the more we'll mis-weigh what we find. The more we mis-weigh, the more likely we are to make a poor decision.

So the next time you find yourself seeking out hard-to-find esoteric information to give yourself an edge in that important decision, think hard about whether you understand the fundamentals of the situation. The more esoteric information you seek the further you move from the likely variables that will govern the outcomes of the situation.


The Half-life of Facts

Facts change all the time. Smoking has gone from doctor recommended to deadly. We used to think the Earth was the center of the universe and that Pluto was a planet. For decades we were convinced that the brontosaurus was a real dinosaur.

Knowledge, like milk, has an expiry date. That's the key message behind Samuel Arbesman's excellent new book The Half-life of Facts: Why Everything We Know Has an Expiration Date.

We're bombarded with studies that seemingly prove this or that. Caffeine is good for you one day and bad for you the next. What we think we know and understand about the world is constantly changing. Nothing is immune. While big ideas are overturned infrequently, little ideas churn regularly.

As scientific knowledge grows, we end up rethinking old knowledge. Arbesman calls this “a churning of knowledge.” But understanding that facts change (and how they change) helps us cope in a world of constant uncertainty. We can never be too sure of what we know.

In introducing this idea, Arbesman writes:

Knowledge is like radioactivity. If you look at a single atom of uranium, whether it’s going to decay — breaking down and unleashing its energy — is highly unpredictable. It might decay in the next second, or you might have to sit and stare at it for thousands, or perhaps even millions, of years before it breaks apart.

But when you take a chunk of uranium, itself made up of trillions upon trillions of atoms, suddenly the unpredictable becomes predictable. We know how uranium atoms work in the aggregate. As a group of atoms, uranium is highly regular. When we combine particles together, a rule of probability known as the law of large numbers takes over, and even the behavior of a tiny piece of uranium becomes understandable. If we are patient enough, half of a chunk of uranium will break down in 704 million years, like clockwork. This number — 704 million years — is a measurable amount of time, and it is known as the half-life of uranium.

It turns out that facts, when viewed as a large body of knowledge, are just as predictable. Facts, in the aggregate, have half-lives: We can measure the amount of time for half of a subject’s knowledge to be overturned. There is science that explores the rates at which new facts are created, new technologies developed, and even how facts spread. How knowledge changes can be understood scientifically.

This is a powerful idea. We don’t have to be at sea in a world of changing knowledge. Instead, we can understand how facts grow and change in the aggregate, just like radioactive materials. This book is a guide to the startling notion that our knowledge — even what each of us has in our head — changes in understandable and systematic ways.

Why does this happen? Why does knowledge churn? In Zen and the Art of Motorcycle Maintenance, Robert Pirsig writes:

If all hypotheses cannot be tested, then the results of any experiment are inconclusive and the entire scientific method falls short of its goal of establishing proven knowledge.

About this Einstein had said, “Evolution has shown that at any given moment out of all conceivable constructions a single one has always proved itself absolutely superior to the rest,” and let it go at that.

… But there it was, the whole history of science, a clear story of continuously new and changing explanations of old facts. The time spans of permanence seemed completely random, he could see no order in them. Some scientific truths seemed to last for centuries, others for less than a year. Scientific truth was not dogma, good for eternity, but a temporal quantitative entity that could be studied like anything else.

A few pages later, Pirsig continues:

The purpose of scientific method is to select a single truth from among many hypothetical truths. That, more than anything else, is what science is all about. But historically science has done exactly the opposite. Through multiplication upon multiplication of facts, information, theories and hypotheses, it is science itself that is leading mankind from single absolute truths to multiple, indeterminate, relative ones.

With that, let's dig into how this looks. Arbesman offers an example:

A few years ago a team of scientists at a hospital in Paris decided to actually measure this (churning of knowledge). They decided to look at fields that they specialized in: cirrhosis and hepatitis, two areas that focus on liver diseases. They took nearly five hundred articles in these fields from more than fifty years and gave them to a battery of experts to examine.

Each expert was charged with saying whether the paper was factual, out-of-date, or disproved, according to more recent findings. Through doing this they were able to create a simple chart (see below) that showed the amount of factual content that had persisted over the previous decades. They found something striking: a clear decay in the number of papers that were still valid.

Furthermore, they got a clear measurement of the half-life of facts in these fields by looking at where the curve crosses 50 percent on this chart: 45 years. Essentially, information is like radioactive material: Medical knowledge about cirrhosis or hepatitis takes about forty-five years for half of it to be disproven or become out-of-date.

[Chart: the half-life of facts, showing the decay in the share of papers still considered valid over time]
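To make the arithmetic behind that 45-year figure concrete, here is a minimal sketch assuming the simple exponential-decay model the radioactivity analogy implies; the function name fraction_still_valid and the sample time points are illustrative choices, not anything from Arbesman's book or the Paris study.

```python
# Minimal sketch of half-life decay, assuming the simple exponential model
# fraction remaining = 0.5 ** (t / half_life). The 45-year half-life is the
# figure reported for the cirrhosis/hepatitis literature above; the function
# name and sample years are illustrative.

def fraction_still_valid(years: float, half_life: float = 45.0) -> float:
    """Fraction of a field's findings still considered valid after `years`."""
    return 0.5 ** (years / half_life)

if __name__ == "__main__":
    for t in (10, 45, 90):
        print(f"after {t:>2} years: {fraction_still_valid(t):.0%} still valid")
    # after 10 years: 86% still valid
    # after 45 years: 50% still valid
    # after 90 years: 25% still valid
```

On this toy model, the point where the curve crosses 50 percent is, by definition, the half-life, which is exactly what the Paris team read off their chart.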

Old knowledge, however, isn't a waste. It's not like we have to start from scratch. “Rather,” writes Arbesman, “the accumulation of knowledge can then lead us to a fuller and more accurate picture of the world around us.”

Isaac Asimov, in a wonderful essay, uses the Earth's curvature to help explain this:

When people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.

When our knowledge in a field is immature, discoveries come easily and often explain the main ideas. “But there are uncountably more discoveries, although far rarer, in the tail of this distribution of discovery. As we delve deeper, whether it's into discovering the diversity of life in the oceans or the shape of the earth, we begin to truly understand the world around us.”

So what we're really dealing with is the long tail of discovery. Our search for what's way out at the end of that tail, while it might not be as important or as Earth-shattering as the blockbuster discoveries, can be just as exciting and surprising. Each new little piece can teach us something about what we thought was possible in the world and help us to asymptotically approach a more complete understanding of our surroundings.

In an interview with The Economist, Arbesman was asked which scientific fields decay the slowest and fastest, and what causes that difference.

Well it depends, because these rates tend to change over time. For example, when medicine transitioned from an art to a science, its half-life was much more rapid than it is now. That said, medicine still has a very short half-life; in fact it is one of the areas where knowledge changes the fastest. One of the slowest is mathematics, because when you prove something in mathematics it is pretty much a settled matter unless someone finds an error in one of your proofs.

One thing we have seen is that the social sciences have a much faster rate of decay than the physical sciences, because in the social sciences there is a lot more “noise” at the experimental level. For instance, in physics, if you want to understand the arc of a parabola, you shoot a cannon 100 times and see where the cannonballs land. And when you do that, you are likely to find a really nice cluster around a single location. But if you are making measurements that have to do with people, things are a lot messier, because people respond to a lot of different things, and that means the effect sizes are going to be smaller.

Arbesman concludes his Economist interview:

I want to show people how knowledge changes. But at the same time I want to say, now that you know how knowledge changes, you have to be on guard, so you are not shocked when your children (are) coming home to tell you that dinosaurs have feathers. You have to look things up more often and recognise that most of the stuff you learned when you were younger is not at the cutting edge. We are coming a lot closer to a true understanding of the world; we know a lot more about the universe than we did even just a few decades ago. It is not the case that just because knowledge is constantly being overturned we do not know anything. But too often, we fail to acknowledge change.

Some fields are starting to recognise this. Medicine, for example, has got really good at encouraging its practitioners to stay current. A lot of medical students are taught that everything they learn is going to be obsolete soon after they graduate. There is even a website called “up to date” that constantly updates medical textbooks. In that sense we could all stand to learn from medicine; we constantly have to make an effort to explore the world anew—even if that means just looking at Wikipedia more often. And I am not just talking about dinosaurs and outer space. You see this same phenomenon with knowledge about nutrition or childcare—the stuff that has to do with how we live our lives.

Even when we find new information that contradicts what we thought we knew, we're likely to be slow to change our minds. “A prevailing theory or paradigm is not overthrown by the accumulation of contrary evidence,” writes Richard Zeckhauser, “but rather by a new paradigm that, for whatever reasons, begins to be accepted by scientists.”

In this view, scientific scholars are subject to status quo persistence. Far from being objective decoders of the empirical evidence, scientists have decided preferences about the scientific beliefs they hold. From a psychological perspective, this preference for beliefs can be seen as a reaction to the tensions caused by cognitive dissonance.

A lot of scientific advancement happens only when the old guard dies off. Many years ago Max Planck offered this insight: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

While we have the best intentions and our minds change slowly, a lot of what we think we know is actually just temporary knowledge, to be updated in the future by more complete knowledge. I think this is why Nassim Taleb argues that we should read Seneca and not worry about someone like Jonah Lehrer bringing us sexy narratives of the latest discoveries. It turns out most of these discoveries are based on very little data and, while they may add to our cumulative knowledge, they are not likely to be around in 10 years.

The Half-life of Facts is a good read that helps put what we think we understand about the world into perspective.

Follow your curiosity and read my interview with the author. Knowing that knowledge has a half-life isn't enough; we can use this knowledge to help us determine what to read.

Secrets from the Science of Persuasion

A great animation describing the fundamental principles of persuasion based on the research of Dr. Robert Cialdini, Professor Emeritus of Psychology and Marketing at Arizona State University.

Dr. Cialdini, if you're not familiar, is the author of Influence: The Psychology of Persuasion and the co-author of the New York Times, Wall Street Journal and Business Week International Bestseller Yes! 50 Scientifically Proven Ways to be Persuasive.

Learning about the six universals that guide human behavior could be the best 12 minutes of your day.

  1. Reciprocity
  2. Commitment
  3. Social Proof
  4. Liking
  5. Authority
  6. Scarcity

Cognitive Dissonance and Change Blindness

“Their judgment was based more on wishful thinking than on a sound calculation of probabilities; for the usual thing among men is that when they want something they will, without any reflection, leave that to hope, while they will employ the full force of reason in rejecting what they find unpalatable.”
— Thucydides, History of the Peloponnesian War

From Stalking the Black Swan: Research and Decision Making in a World of Extreme Volatility

When new information conflicts with our preexisting hypotheses, we have a problem that needs to be resolved. Cognitive dissonance refers to the state of tension that occurs when a person holds two ideas, beliefs, attitudes, or opinions that are psychologically inconsistent. This conflict manifests itself as a state of mental tension or dissonance, the intensity of which is visible in magnetic resonance imaging studies of the brain. The theory was developed in 1957 by Leon Festinger, who observed in a series of experiments that people would change their attitudes to make them more consistent with actions they had just taken. In popular usage, cognitive dissonance refers to the tendency to ignore information that conflicts with preexisting views, to rationalize certain behaviors to make them seem more consistent with self-image, or to change attitudes to make them consistent with actions already taken. In some cases, it is the equivalent of telling ourselves “little white lies,” but in other cases it no doubt contributes to logical errors like the “confirmation trap,” where people deliberately search for data to confirm existing views rather than challenge them.

Two major sources of cognitive dissonance are self-image (when the image we hold of ourselves is threatened) and commitment (when we've said something, we don't want to be criticized for changing our minds).

“Cognitive dissonance,” writes Ken Posner, “may manifest itself in a phenomenon known as change blindness. According to behavioral researchers”:

change blindness is a situation where people fail to notice change because it takes place slowly and incrementally. It is also called the “boiling frog syndrome,” referring to the folk wisdom that if you throw a frog in boiling water it will jump out, but if you put it into cold water that is gradually heated, the frog will never notice the change. Most of the studies in this area focus on difficulties in perceiving change visually, but researchers think there is a parallel to decision making.

“Change blindness,” Posner continues, “happens when we filter out the implications of new information rather than assigning them even partial weight in our thinking.”

The Human Mind has a Shut-Off Device

This passage is from Ryan Holiday in Trust Me, I'm Lying:

Once the mind has accepted a plausible explanation for something, it becomes a framework for all the information that is perceived after it. We’re drawn, subconsciously, to fit and contort all the subsequent knowledge we receive into our framework, whether it fits or not. Psychologists call this “cognitive rigidity”. The facts that built an original premise are gone, but the conclusion remains—the general feeling of our opinion floats over the collapsed foundation that established it.

Information overload, “busyness,” speed, and emotion all exacerbate this phenomenon. They make it even harder to update our beliefs or remain open-minded.

Reminds me of this quote from Charlie Munger:

[W]hat I'm saying here is that the human mind is a lot like the human egg, and the human egg has a shut-off device. When one sperm gets in, it shuts down so the next one can't get in. The human mind has a big tendency of the same sort. And here again, it doesn't just catch ordinary mortals; it catches the deans of physics. According to Max Planck, the really innovative, important new physics was never really accepted by the old guard. Instead a new guard came along that was less brain-blocked by its previous conclusions. And if Max Planck's crowd had this consistency and commitment tendency that kept their old conclusions intact in spite of disconfirming evidence, you can imagine what the crowd that you and I are part of behaves like.

If we get most of our plausible explanations from headlines (that is, newspapers, tweets, Facebook), we're in trouble. Conclusions based not on well-reasoned arguments, deep fluency, or facts but on headlines are the most troublesome. Map that onto how hard it is to update our beliefs and you can start to see the structural problem.

 
