
Bias from Self-Interest — Self-Deception and Denial to Reduce Pain or Increase Pleasure; Regret Avoidance (Tolstoy effect)

We can ignore reality, but we cannot ignore the consequences of reality.

Bias from self-interest affects everything from how we see and filter information to how we avoid pain. It affects our self-preservation instincts and helps us rationalize our choices. In short, it permeates everything.

***

Our Self-Esteem

Our self-esteem can be a very important aspect of personal well-being, adjustment and happiness. It has been reported that people with higher self-esteem are happier with their lives, have fewer interpersonal problems, achieve at a higher and more consistent level and give in less to peer pressure.

The strong motivation to preserve a positive and consistent self-image is more than evident in our lives.

We attribute success to our own abilities and failures to environmental factors, and we continuously rate ourselves as better than average on any subjective measure – ethics, beauty, and the ability to get along with others.

Look around – these positive illusions appear to be the rule rather than the exception in well-adjusted people.

However, sometimes life is harsh on us and gives few if any reasons for self-love.

We get fired, a relationship ends, and we end up making decisions which are not well aligned with our inner selves. And so we come up with ways to repair our damaged self-image.

Under the influence of bias from self-interest, we may find ourselves drifting away from facts and spinning them to the point they become acceptable. While the tendency is mostly harmless and episodic, there are cases when it grows extreme.

The imperfect and confusing realities of our lives can activate strong responses, which help us preserve ourselves and our fragile self-images. Usually amplified by love, death, or chemical dependency, a strong self-serving bias may leave a person with little capacity to assess the situation objectively.

In his speech, The Psychology of Human Misjudgment, Charlie Munger reflects on the extreme tendencies that serious criminals display in Tolstoy’s novels and beyond. Their defense mechanisms can be divided into two distinct types – they are either in denial of committing the crime at all or they think that the crime is justifiable in light of their hardships.

Munger labels these two cases the Tolstoy effect.

Avoiding Reality by Denying It

Denial occurs when we encounter a troubling fact about reality but decide to ignore it.

Imagine one day you notice a strange, dark spot on your skin. You feel a sudden sense of anxiety but soon go on with your day and forget about it. Weeks later it has not gone away and has slowly grown darker, and you eventually decide to take action and visit the doctor.

In such cases, small doses of denial might serve us well. We have time to absorb the information slowly and figure out the next steps for action, in case our darkest fears come true. However, once denial becomes a prolonged way of coping with troubling matters, causing our problems to amplify, we are bound to suffer the consequences.

These consequences vary. The mildest is a simple inability to move on with our lives.

Charlie Munger was startled to see a case of persistent denial in a family friend:

This first really hit me between the eyes when a friend of our family had a super-athlete, super-student son who flew off a carrier in the north Atlantic and never came back, and his mother, who was a very sane woman, just never believed that he was dead.

The case made him realize that denial is often amplified by intense feelings surrounding love and death. We deny in order to avoid pain.

While denial of the death of someone close is usually harmless and understandable, it can become a significant problem when we deny an issue that is detrimental to ourselves and others.

Good examples of such issues are physical dependencies, such as alcoholism or drug addiction.

Munger advises staying away from any opportunity to slip into an addiction, since the psychological effects are so damaging. The reality distortion in the minds of addicts leads them to believe that they remain in respectable condition, with reasonable prospects, even as their situation keeps deteriorating.

Rationalizing Our Choices

A less severe case of distortion, but no less foolish, is our tendency to rationalize the choices we have made.

Most of us have a positive concept of ourselves and we believe ourselves to be competent, moral and smart.

We can go to great lengths to preserve this self-image. No doubt we have all engaged in behaviors that are less than consistent with our inner self-image and then used phrases such as “not telling the truth is not lying,” “I didn’t have the time,” and “others are even worse” to justify our less than ideal actions.

This tendency can be explained in part by the engine that drives self-justification: cognitive dissonance. It is the state of tension that arises whenever we hold two opposing thoughts in our heads, such as “smoking is bad” and “I smoke two packs a day.”

Dissonance bothers us under any circumstances, but it becomes particularly unbearable when our self-concept is threatened by it. After all, we spend our lives trying to be consistent and meaningful. This drive to “save face” is so powerful that it often overrides the pure effects of rewards and punishments assumed by economic theory or observed in simple animal behavioral research.

The most obvious way to quiet the dissonance is to quit smoking. However, a smoker who has tried to quit and failed can also quiet it by adjusting the other belief – convincing herself that smoking is not all that bad. That is the simple, failure-free option: it lets her feel good about herself and requires hardly any effort. Having suspended our moral compass once and found rationales for bad but fixable choices, we give ourselves permission to repeat them in the future, and the vicious cycle continues.

The Vicious Cycle of Self-Justification


Carol Tavris and Elliot Aronson in their book Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts explain the vicious cycle of choices with an analogy of a pyramid.

Consider the case of two reasonably honest students at the beginning of the term. They face the temptation to cheat on an important test. One of them gives in and the other does not. How do you think they will feel about cheating a week later?

Most likely their initially torn opinions will have polarized in light of the choices they made. Now take this effect and amplify it over the term. By the time they are through with it, two things will have happened:
1) They will be very far from each other in their beliefs
2) They will be convinced that they have always felt strongly about the issue and their side of the argument

Just like those students, we are often at the top of the choice pyramid, facing a decision whose consequences are morally ambiguous. This first choice then starts a process of entrapment – action, justification, further action – which increases the intensity of our commitment.


Over time our choices reinforce themselves, and toward the bottom of the pyramid we find ourselves sliding into increasingly extreme views.

Consider the famous Stanley Milgram experiment, where two-thirds of the 3,000 subjects administered a life-threatening level of electric shock to another person. While this study is often used to illustrate our obedience to authority, it also demonstrates the effects of self-justification.

Simply imagine someone asking you, as a favor for the sake of science, to inflict 500V of potentially deadly and incredibly painful shock on another person. Chances are most of us would refuse under any circumstances.

Now suppose the researcher tells you he is interested in the effects of punishment on learning and that you will only have to inflict barely noticeable electric impulses on another person. You are even encouraged to try the lower level of 10V yourself to confirm that the pain is minor.

Once you have gone along, the experimenter asks you to increase the shock to 20V, which seems like a small increase, so you agree without thinking much. Then the cascade continues – if you gave a 20V shock, what is the harm in giving 30V? Suddenly you find yourself unable to draw the line, so you simply follow the instructions.

When people are asked in advance whether they would administer a shock above 450V, nearly nobody believes they would. Yet when facing the choice under these escalating circumstances, two-thirds of them did!

The implications here are powerful – if we don’t actively draw the line ourselves, our habits and circumstances will decide for us.

Making Smarter Choices

We will all do dumb things. We can’t help it. We are wired that way. However, we are not doomed to live in denial or keep striving to justify our actions. We always have the choice to correct our tendencies, once we recognize them.

A better understanding of our minds serves as the first step towards breaking the self-justification habit. It takes time, self-reflection and willingness to become more mindful about our behavior and reasons for our behavior, but it is well worth the effort.

The authors of Mistakes Were Made (But Not by Me) give the example of the conservative columnist William Safire, who wrote a column criticizing Hillary Clinton’s efforts to conceal the identity of her health care task force. A few years later Dick Cheney, a Republican whom Safire admired, made a move similar to Clinton’s by insisting on keeping his energy task force secret.

The alarm bell in Safire’s head rang, and he admits that the temptation to rationalize the situation and apply a double standard was enormous. However, he recognized the dissonance at work and ended up writing a similar column about Cheney.

We know that Safire’s ability to spot his own dissonance and do the fair thing is rare. People will bend over backward to reduce dissonance in a way that is favorable to them and their team. Resisting that urge is not easy to do, but it is much better than letting the natural psychological tendencies cripple the integrity of our behaviors. There are ways to make fairness easier.

Making Things Easier

On the personal level, Charlie Munger suggests we face two simple facts. First, fixable but unfixed bad performance is bad character and tends to create more of itself and cause more damage — a sort of Gresham's Law. Second, in demanding places like athletic teams, excuses and bad behavior will not get us far.

On the institutional level, Munger advises building a fair, meritocratic, demanding culture, plus personnel-handling methods that build up morale. His second piece of advice is to sever the worst offenders, when possible.

Munger expands on the second point by noting that we cannot, of course, let go of our own children, so we must instead try to fix them to the best of our ability. He gives a real-life example of a child who had the habit of taking candy from the stock of his father’s employer, with the excuse that he intended to replace it later. The father said words that never left the child:

“Son, it would be better for you to simply take all you want and call yourself a thief every time you do it.”

It turns out the child in this example was the dean of the University of Southern California business school, where Munger delivered the speech.

If we are effective, the lessons we teach our children will serve them well throughout their lives.

***

There is so much more to touch on with bias from self-interest, including its relation to hierarchy, how it distorts information, how it feeds our desire for self-preservation and scarcity, how it impacts group preservation, its relationship to territory, and more.

Bias From Self-Interest is part of the Farnam Street latticework of mental models.

Cognitive Dissonance and Change Blindness

“Their judgment was based more on wishful thinking than on a sound calculation of probabilities; for the usual thing among men is that when they want something they will, without any reflection, leave that to hope, while they will employ the full force of reason in rejecting what they find unpalatable.”
Thucydides, in History of the Peloponnesian War

From Stalking the Black Swan: Research and Decision Making in a World of Extreme Volatility

When new information conflicts with our preexisting hypotheses, we have a problem that needs to be resolved. Cognitive dissonance refers to the state of tension that occurs when a person holds two ideas, beliefs, attitudes, or opinions that are psychologically inconsistent. This conflict manifests itself as a state of mental tension or dissonance, the intensity of which is visible in magnetic resonance imaging studies of the brain. The theory was developed in 1957 by Leon Festinger, who observed in a series of experiments that people would change their attitudes to make them more consistent with actions they had just taken. In popular usage, cognitive dissonance refers to the tendency to ignore information that conflicts with preexisting views, to rationalize certain behaviors to make them seem more consistent with self-image, or to change attitudes to make them consistent with actions already taken. In some cases, it is the equivalent of telling ourselves “little white lies,” but in other cases it no doubt contributes to logical errors like the “confirmation trap,” where people deliberately search for data to confirm existing views rather than challenge them.

Two major sources of cognitive dissonance are self-image (when the image we hold of ourselves is threatened) and commitment (when we've said something, we don't want to be criticized for changing our minds).

“Cognitive dissonance,” writes Ken Posner, “may manifest itself in a phenomenon known as change blindness. According to behavioral researchers”:

change blindness is a situation where people fail to notice change because it takes place slowly and incrementally. It is also called the “boiling frog syndrome,” referring to the folk wisdom that if you throw a frog in boiling water it will jump out, but if you put it into cold water that is gradually heated, the frog will never notice the change. Most of the studies in this area focus on difficulties in perceiving change visually, but researchers think there is a parallel to decision making.

“Change blindness,” Posner continues, “happens when we filter out the implications of new information rather than assigning them even partial weight in our thinking.”

Charlie Munger: How to Teach Business School

From Charlie Munger at the 2011 Berkshire Hathaway Shareholders Meeting:

Costco of course is a business that became the best in the world in its category. And it did it with an extreme meritocracy, and an extreme ethical duty—self-imposed to take all its cost advantages as fast as it could accumulate them and pass them on to the customers. And of course they’ve created ferocious customer loyalty. It’s been a wonderful business to watch—and of course strange things happen when you do that and when you do that long enough. Costco has one store in Korea that will do over $400 million in sales this year. These are figures that can’t exist in retail, but of course they do. So that’s an example of somebody having the right managerial system, the right personnel solution, the right ethics, the right diligence, etcetera, etcetera. And that is quite rare. If once or twice in your lifetime you’re associated with such a business you’re a very lucky person.

The more normal business is a business like, say, General Motors, which became the most successful business of its kind in the world and wiped out its common shareholders… what, last year? That is a very interesting story—and if I were teaching business school I would have Value-Line-type figures that took me through the entire history of General Motors and I would try to relate the changes in the graph and data to what happened in the business. To some extent, they faced a really difficult problem—heavily unionized business, combined with great success, and very tough competitors that came up from Asia and elsewhere in Europe. That is a real problem which of course… to prevent wealth from killing people—your success turning into a disadvantage—is a big problem in business.

And so there are all these wonderful lessons in those graphs. I don’t know why people don’t do it. The graphs don’t even exist that I would use to teach. I can’t imagine anybody being dumb enough not to have the kind of graphs I yearn for. [Laughter] But so far as I know there’s no business school in the country that’s yearning for these graphs. Partly the reason they don’t want it is if you taught a history of business this way, you’d be trampling on the territories of all the professors and sub-disciplines—you’d be stealing some of their best cases. And in bureaucracies, even academic bureaucracies, people protect their own turf. And of course a lot of that happened at General Motors. [Applause]

I really think the world … that’s the way it should be taught. Harvard Business School once taught it much that way—and they stopped. And I’d like to make a case study as to why they stopped. [Laughter] I think I can successfully guess. It’s that the course of history of business trampled on the territory of barons of other disciplines like the baron of marketing, the baron of finance, the baron of whatever.

IBM is an interesting case. There’s just one after another that are just utterly fascinating. I don’t think they’re properly taught at all because nobody wants to do the full sweep.


17 Management Lessons from Ray Dalio

Ray Dalio, the sixty-one-year-old founder of Bridgewater Associates, the world’s biggest hedge fund, offers the following management advice. Dalio says “Taken together, these principles are meant to paint a picture of a process for the systematic pursuit of truth and excellence and for the rewards that accompany this pursuit. I put them in writing for people to consider in order to help Bridgewater and the people I care about most.”

1. Ego prevents growth. 

Two of the biggest impediments to truth and excellence are people’s egos and organizational bureaucracy. Most people like compliments and agreement, and they dislike criticisms and conflict. Yet recognizing mistakes and weaknesses is essential for rapid improvement and excellence. In our culture, there is nothing embarrassing about making mistakes and having weaknesses.

[…]

We need and admire people who can suspend their egos to get at truth and evolve toward excellence, so we ignore ego-based impediments to truth. We have a different type of environment in which some behaviors discouraged elsewhere are rewarded here (like challenging one’s superiors), and some behaviors encouraged elsewhere are punished here (like speaking behind a subordinate’s back).”

2. Think and act in a principled way and expect others to as well.

all outcomes are manifestations of forces that are at work to produce them, so whenever looking at specific outcomes, think about the forces that are behind them. Constantly ask yourself, “What is this symptomatic of?”

3. If you don’t mind being wrong on the way to being right, you will learn a lot

I once had a ski instructor who had taught Michael Jordan, the greatest basketball player of all time, how to ski. He explained how Jordan enjoyed his mistakes and got the most out of them. At the start of high school, Jordan was a mediocre basketball player; he became great because he loved using his mistakes to improve. I see it all the time. Intelligent people who are open to recognizing and learning from their mistakes substantially outperform people with the same abilities who aren't open in the same way.

4. Mistakes are ok. Not correcting them is not. 

Create a culture in which it is OK to fail but unacceptable not to identify, analyze and learn from mistakes. … A common mistake is to depersonalize the mistake, saying “we didn’t handle this well” rather than “Harry didn’t handle this well.” Again, this is because people are often uncomfortable connecting specific mistakes to specific people because of ego sensitivities. … it is essential that the diagnosis connect the mistakes to the specific individuals by name.

5. Everyone might have a voice but not all opinions are equally valued. 

Not all people’s opinions are equally valuable. Still, that important distinction is often unacknowledged in discussions. Prevent this by looking at people’s track records, noting their credentials, and evaluating how their arguments hold up when challenged.

6. There is a difference between debate, discussion, and teaching. 

Debate is generally among approximate equals; discussion is open-minded exploration among people of various levels of understanding; and teaching is between people of different levels of understanding.

7. Know when to keep your mouth shut.

Imagine if a group of us were trying to learn how to play golf with Tiger Woods, and he and a new golfer were debating how to swing the club. Would it be helpful, or harmful and plain silly, to treat their points of view equally, given that they have different levels of believability? It is better to listen to what Tiger Woods has to say, without constant interruptions by some know-nothing arguing with him.

“Two of the biggest impediments to truth and excellence are people’s egos and organizational bureaucracy.”

— Ray Dalio

8. Be careful not to lose personal responsibility via group decision making.

Too often groups will make a decision to do something without assigning personal responsibilities so it is not clear who is supposed to do what.

9. Don’t pick your battles. Fight them all.

If you see something wrong, even small things, deal with it. That is because 1) small things can be as symptomatic of serious underlying problems as big things, so looking into them, finding what they are symptomatic of, and resolving them will prevent big problems; 2) resolving small differences with people will prevent a more serious divergence of your views; and 3) in trying to help to train people, constant reinforcement of the desired behavior is helpful. The more battles you fight, the more opportunities you will have to get to know each other and the faster the evolutionary process will be.

10. All problems are manifestations of their root causes. Find the root cause. 

Keep asking why, and don’t forget to examine problems with people. In fact, since most things are done or not done because someone decided to do them or not to do them a certain way, most root causes can be traced to specific people, especially “the responsible party.” When the problem is attributable to a person, you have to ask why the person made the mistake to get at the real root cause. For example, a root cause discovery process might go something like this: The problem was due to bad programming. Why was there bad programming? Because Harry programmed it badly. Why did Harry program it badly? Because he wasn’t well trained and because he was in a rush. Why wasn’t he well trained? Did his manager know that he wasn’t well trained and let him do the job anyway, or did he not know?

11. Avoid Monday-morning quarterbacking.

That is “evaluating the merits of past decisions based on what you know now versus what you could have reasonably known at the time of the decision. Do this by asking the question, “what should an intelligent person have known in that situation,” as well as having a deep understanding of the person who made the decision (how do they think, what type of person are they, did they learn from the situation, etc?)”

12. Don’t undermine personal accountability with vagueness.

Use specific names. For example, don’t say “we” or “they” handled it badly. Also avoid: “We should…” or “We are…” Who is we? Exactly who should, who made a mistake, or who did a great job? Use specific names.

13. Self-sufficiency increases efficiency and accountability.

Try to equip departments to be as self-sufficient as possible to enhance efficiency. We do this because we don’t want to create a bureaucracy that forces departments to requisition resources from a pool that lacks the focus to do the job. For example, while people often argue that we should have a technology department, I am against that because building technology is a task, not a goal in and of itself. You build technology to…(fill in the blank, e.g., help service clients, help market, etc.). Keeping the tech resources outside the department means you would have people from various departments arguing about whether their project is more important than someone else’s in order to get resources, which isn’t good for efficiency. The tech people would be evaluated and managed by bureaucrats rather than the people they do the work for.

“Mistakes are ok. Not correcting them is not.”

— Ray Dalio

14. Constantly worry about what you are missing.

Even if you acknowledge you are a dumb shit and are following the principles and are designing around your weaknesses, understand that you still might be missing things. You will get better and be safer this way.

15. Identify the variables that really matter. 

Distinguish the important things from the unimportant things and deal with the important things first. Get the important things done very well, but not perfectly, because that's all you have time for. Chances are you won't have to deal with the unimportant things, which is better than not having time to deal with the important things. I often hear people say “wouldn't it be good to do this or that,” referring to nice-to-dos rather than important things: they must be wary of those nice-to-dos distracting them from far more important things that need to be done.

16. Use the phrase, “by and large”

Too often I hear discussions fail to progress when a statement is made and the person to whom it is made replies “not always,” leading to a discussion of the exceptions rather than the rule. For example, a statement like “the people in the XYZ Department are working too many hours” might lead to a response like “not all of them are; Sally and Bill are working normal hours,” which could lead to a discussion of whether Sally and Bill are working too long, which derails the discussion. Because nothing is 100% true, conversations can get off track if they turn to whether exceptions exist, which is especially foolish if both parties agree that the statement is by and large true. To avoid this problem, the person making such statements might use the term “by and large,” like “by and large, the people in the XYZ Department are working too many hours.” People hearing that should consider whether it is a “by and large” statement and treat it accordingly.

17. Most importantly

a) build the organization around goals rather than around tasks; b) make people personally responsible for achieving these goals; c) make departments as self-sufficient as possible so that they have control over the resources they need to achieve the goals; d) have the clearest possible delineation of responsibilities and reporting lines; e) have agreed-upon goals and tasks that everyone knows (from the people in the departments to the people outside the departments who oversee them); and f) hold people accountable for achieving the goals and doing the tasks.”

 

To learn more, check out this excellent profile of Dalio, which appeared in The New Yorker, and make sure to read all of Dalio's management principles.

* * *

Looking for more? Read The Contrarian's Guide to Leadership.

In his book Power: Why Some People Have It And Others Don’t, Jeffrey Pfeffer argues that intelligence and high performance are not needed to build power.

 

A Simple Checklist to Improve Decisions

We owe thanks to the publishing industry. Their ability to take a concept and fill an entire category with a shotgun approach is the reason that more people are talking about biases.

Unfortunately, talk alone will not eliminate them, but it is possible to take steps to counteract them. Reducing biases can make a huge difference in the quality of any decision, and it is easier than you think.

In a recent article for Harvard Business Review, Daniel Kahneman (and others) describe a simple way to detect bias and minimize its effects in the most common type of decisions people make: determining whether to accept, reject, or pass on a recommendation.

The Munger two-step process for making decisions is a more complete framework, but Kahneman's approach is a good way to help reduce biases in our decision-making.

If you’re short on time, here is a simple checklist that will get you started on the path toward improving your decisions (a rough code sketch for tracking these checks follows the list):

Preliminary Questions: Ask yourself

1. Check for Self-interested Biases

  • Is there any reason to suspect the team making the recommendation of errors motivated by self-interest?
  • Review the proposal with extra care, especially for overoptimism.

2. Check for the Affect Heuristic

  • Has the team fallen in love with its proposal?
  • Rigorously apply all the quality controls on the checklist.

3. Check for Groupthink

  • Were there dissenting opinions within the team?
  • Were they explored adequately?
  • Solicit dissenting views, discreetly if necessary.

Challenge Questions: Ask the recommenders

4. Check for Saliency Bias

  • Could the diagnosis be overly influenced by an analogy to a memorable success?
  • Ask for more analogies, and rigorously analyze their similarity to the current situation.

5. Check for Confirmation Bias

  • Are credible alternatives included along with the recommendation?
  • Request additional options.

6. Check for Availability Bias

  • If you had to make this decision again in a year’s time, what information would you want, and can you get more of it now?
  • Use checklists of the data needed for each kind of decision.

7. Check for Anchoring Bias

  • Do you know where the numbers came from? Can there be
  • …unsubstantiated numbers?
  • …extrapolation from history?
  • …a motivation to use a certain anchor?
  • Reanchor with figures generated by other models or benchmarks, and request new analysis.

8. Check for Halo Effect

  • Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another?
  • Eliminate false inferences, and ask the team to seek additional comparable examples.

9. Check for Sunk-Cost Fallacy, Endowment Effect

  • Are the recommenders overly attached to a history of past decisions?
  • Consider the issue as if you were a new CEO.

Evaluation Questions: Ask about the proposal

10. Check for Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect

  • Is the base case overly optimistic?
  • Have the team build a case taking an outside view; use war games.

11. Check for Disaster Neglect

  • Is the worst case bad enough?
  • Have the team conduct a premortem: Imagine that the worst has happened, and develop a story about the causes.

12. Check for Loss Aversion

  • Is the recommending team overly cautious?
  • Realign incentives to share responsibility for the risk or to remove risk.
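
To make the checks above easier to run consistently, you could keep them in a small script and walk through them for every recommendation you review. The sketch below is only an illustration of that idea, not anything from the HBR article; the Check structure, the abbreviated question wording, and the interactive review() loop are assumptions of this example.

    # Illustrative only: a minimal way to track the twelve checks during a review.
    # The wording is abbreviated from the checklist above; adapt as needed.
    from dataclasses import dataclass

    @dataclass
    class Check:
        bias: str      # the bias being screened for
        question: str  # the question to ask yourself or the recommenders
        remedy: str    # what to do if the answer raises a concern
        flagged: bool = False

    CHECKLIST = [
        Check("Self-interested biases",
              "Any reason to suspect motivated errors by the recommending team?",
              "Review the proposal with extra care, especially for overoptimism."),
        Check("Affect heuristic",
              "Has the team fallen in love with its proposal?",
              "Rigorously apply all the quality controls on the checklist."),
        Check("Groupthink",
              "Were dissenting opinions explored adequately?",
              "Solicit dissenting views, discreetly if necessary."),
        # ... the remaining nine checks (saliency, confirmation, availability,
        # anchoring, halo effect, sunk costs, overconfidence, disaster neglect,
        # loss aversion) follow the same pattern.
    ]

    def review(checklist):
        """Ask each question and return the checks that were flagged."""
        for check in checklist:
            answer = input(f"[{check.bias}] {check.question} (y/n) ")
            check.flagged = answer.strip().lower().startswith("y")
        return [c for c in checklist if c.flagged]

    if __name__ == "__main__":
        for concern in review(CHECKLIST):
            print(f"Concern: {concern.bias} -> {concern.remedy}")

The point is not the code itself but the discipline it enforces: the same questions, in the same order, for every recommendation.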

If you’re looking to dramatically improve your decision making, here is a great list of books to get started:

Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler and Cass R. Sunstein

Think Twice: Harnessing the Power of Counterintuition by Michael J. Mauboussin

Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You by Sydney Finkelstein, Jo Whitehead, and Andrew Campbell

Predictably Irrational: The Hidden Forces That Shape Our Decisions by Dan Ariely

Thinking, Fast and Slow by Daniel Kahneman

Judgment and Managerial Decision Making by Max Bazerman

Hindsight Bias


“Judgments about what is good and what is bad, what is worthwhile and what is a waste of talent, what is useful and what is less so, are judgments that seldom can be made in the present. They can safely be made only by posterity.”
— Tulving

***

Hindsight bias occurs when we look back in time and see events as more predictable than they actually were at the time a decision was made. This bias, also known as the “knew-it-all-along effect,” typically involves those annoying “I told you so” people who never really told you anything.

For instance, consider driving in the car with your partner and coming to a T in the road. Your partner decides to turn right, and four miles down the road, when you realize you are lost, you think, “I knew we should have taken that left.”

Hindsight bias can offer a number of benefits in the short run. For instance, it can be flattering to believe that our judgment is better than it actually is. And, of course, hindsight bias allows us to participate in one of our favorite pastimes — criticizing the decisions of others for their lack of foresight.

Aside from clouding our ability to reflect objectively on decisions, hindsight bias has several practical implications. For example, consider someone asked to review a paper who already knows the verdict of a previous reviewer, or a physician asked for a second opinion who already knows the result of the first. These judgments will likely be biased to some degree. Once we know an outcome, it becomes easy to find a plausible explanation for it.

Hindsight bias makes us less accountable for our decisions, less critical of ourselves, and overconfident in our ability to make decisions.

One of the most interesting things I discovered when researching hindsight bias was the impact on our legal system and the perceptions of jurors.

* * *

Harvard Professor Max Bazerman offers:

The processes that give rise to anchoring and overconfidence are also at play with the hindsight bias. According to this explanation, knowledge of an event's outcome works as an anchor by which individuals interpret their prior judgments of the event's likelihood. Due to the selective accessibility of the confirmatory information during information retrieval, adjustments to anchors are inadequate. Consequently, hindsight knowledge biases our perceptions of what we remember knowing in foresight. Furthermore, to the extent that various pieces of data about the event vary in support of the actual outcome, evidence that is consistent with the known outcome may become cognitively more salient and thus more available in memory. This tendency will lead an individual to justify a claimed foresight in view of “the facts provided.” Finally, the relevance of a particular piece of data may later be judged important to the extent to which it is representative of the final observed outcome.

In Cognitive Illusions, Rudiger Pohl offered the following explanations of hindsight bias:

Most prominent among the proposed explanations are cognitive accounts which assume that hindsight bias results from an inability to ignore the solution. Among the early approaches are the following three: (1) Fischhoff (1975) assumed an immediate and irreversible assimilation of the solution into one's knowledge base. As a consequence, the reconstructed estimate will be biased towards the solution. (2) Tversky and Kahneman (1974) proposed a cognitive heuristic for the anchoring effect, named anchoring and insufficient adjustment. The same mechanism may apply here, if the solution is assumed to serve as an “anchor” in the reconstruction process. The reconstruction starts from this anchor and is then adjusted in the direction of one's knowledge base. However, this adjustment process may stop too early, for example at the point where the first plausible value is reached, thus leading to a biased reconstruction. (3) Hell (1988) argued that the relative trace strengths of the original estimate and of the solution might predict the amount of hindsight bias. The stronger the trace strength of the solution relative to that of the original estimate, the larger hindsight bias should be.

Pohl also offers an evolutionary explanation of hindsight bias:

Finally, some authors argued that hindsight bias is not necessarily a bothersome consequence of a “faulty” information-processing system, but that it may rather represent an unavoidable by-product of an evolutionarily evolved function, namely adaptive learning. According to this view, hindsight bias is seen as the consequence of our most valuable ability to update previously held knowledge. This may be seen as a necessary process in order to prevent memory overload and thus to maintain normal cognitive functioning. Besides, updating allows us to keep our knowledge more coherent and to draw better inferences.

Ziva Kunda, in Social Cognition, offers the following explanation of why hindsight bias occurs:

Preceding events take on new meaning and importance as they are made to cohere with the known outcome. Now that we know that our friends have filed for divorce, any ambiguous behavior we have seen is reinterpreted as indicative of tension, any disagreement gains significance, and any signs of affection seem irrelevant. It now seems obvious that their marriage was doomed from the start… Moreover, having adjusted our interpretations in light of current knowledge, it is difficult to imagine how things could have happened differently.

When making likelihood judgments, we often rely on the availability heuristic: The more difficult it is for us to imagine an outcome, the more unlikely it seems. Therefore, the difficulty we experience imagining how things might have turned out differently makes us all the more convinced that the outcomes that did occur were bound to have occurred.

Hindsight bias has large implications for criminal trials. In Jury Selection, Hale Starr and Mark McCormick offer the following:

The effects of hindsight bias – which result in being held to a higher standard – are most critical for both criminal and civil defendants. The defense is more susceptible to the hindsight bias since their actions are generally the ones being evaluated for reasonableness in foresight-foreseeability. When jurors perceive that the results of particular actions were “reasonably” more likely after the outcome is known, defendants are judged as having been capable of knowing more than they knew at the time the action was taken and therefore as capable of preventing the “bad” outcome.

In post-verdict surveys jurors unknowingly demonstrate some of the effects of hindsight bias:

“I can't understand why the managers didn't try to get more information or use the information they had available. They should have known there would be safety problems at the plant.”

“The defendants should have known people would remove the safety shield around the tire. There should have been warnings so people wouldn't do that.”

“Even though he was a kid, he should have known that once he showed the others who had been drinking that he had a gun, things would get out of hand. He should have known guns invited violence.”

Jurors influenced by the hindsight bias look at the evidence presented and determine that the defendants knew or should have known their actions were unsafe, unwise, or created a dangerous situation. Hindsight bias often results in the judgment that the event was “an accident or tragedy waiting to happen.”

* * *

Protection Against Hindsight Bias

In Principles of Forecasting, Jon Scott Armstrong offers the following advice on how to protect yourself:

The surest protection against (hindsight bias) is disciplining ourselves to make explicit predictions, showing what we did in fact know (sounds like a decision journal). That record can also provide us with some protection against those individuals who are wont to second-guess us, producing exaggerated claims of what we should have known (and perhaps should have told them). If these observers look to this record, it may show them that we are generally less proficient as forecasters than they would like, while protecting us against charges of having blown a particular assignment. Having an explicit record can also protect us against overconfidence in our own forecasting ability: If we feel that we “knew all along” what was going to happen, then it is natural enough to think that we will have similar success in the future. Unfortunately, an exaggerated perception of a surprise-free past may portend a surprise-full future.

Documenting the reasons we made a forecast makes it possible for us to know not only how well the forecast did, but also where it went astray. For example, subsequent experiences may show that we used wrong (or misunderstood) inputs. In that case, we can, in principle, rerun the forecasting process with better inputs and assess the accuracy of our (retrospectively) revised forecasts. Perhaps we did have the right theory and procedures, but were applying them to a mistaken picture of then-current conditions…Of course inputs are also subject to hindsight bias, hence we need to record them explicitly as well. The essence of making sense out of outcome knowledge is reinterpreting the processes and conditions that produced the reported event.
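
Armstrong's "explicit predictions" are essentially what a decision journal records. As a rough illustration (not something from the book), a journal entry might capture the prediction, the stated confidence, and the inputs used at the time, so that the written record, rather than hindsight, is what gets scored later. The field names, the example entry, and the Brier-style score below are all assumptions of this sketch.

    # Illustrative sketch of a decision-journal entry; all names and values are hypothetical.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class ForecastEntry:
        made_on: str                                 # date the prediction was written down
        prediction: str                              # the explicit prediction, stated up front
        probability: float                           # stated confidence, between 0.0 and 1.0
        inputs: List[str] = field(default_factory=list)  # data and assumptions used at the time
        outcome: Optional[bool] = None               # recorded later; never edited in hindsight

        def brier_score(self) -> float:
            """Score the forecast once the outcome is known (lower is better)."""
            if self.outcome is None:
                raise ValueError("Outcome not yet recorded.")
            return (self.probability - float(self.outcome)) ** 2

    entry = ForecastEntry(
        made_on="2012-01-15",
        prediction="Project X ships by the end of Q3",
        probability=0.7,
        inputs=["vendor quote from January", "team velocity over the last three sprints"],
    )

    # Later: record what actually happened and score the original, written forecast.
    entry.outcome = False
    print("Brier score:", entry.brier_score())  # (0.7 - 0.0)**2 = 0.49

Revisiting entries like this also shows where the recorded inputs, not just the forecast, were wrong, which is exactly the rerunning of the forecasting process Armstrong describes.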

Hindsight Bias is part of the Farnam Street latticework of mental models.