
Mental Model: Conjunctive and Disjunctive Events Bias

Welcome to the conjunctive and disjunctive events bias.

Why are we so optimistic in our estimates of a project's cost and schedule? Why are we so surprised when something inevitably goes wrong? Because of the human tendency to underestimate the probability of disjunctive events.

According to Daniel Kahneman and his long-time co-author Amos Tversky (1974): “A complex system, such as a nuclear reactor or the human body, will malfunction if any of its essential components fails.” They continue, “Even when the likelihood of failure in each component is slight, the probability of an overall failure can be high if many components are involved.”

In Seeking Wisdom: From Darwin to Munger, Peter Bevelin writes:

A project is composed of a series of steps where all must be achieved for success. Each individual step has some probability of failure. We often underestimate the large number of things that may happen in the future or all opportunities for failure that may cause a project to go wrong. Humans make mistakes, equipment fails, technologies don’t work as planned, unrealistic expectations, biases including sunk cost-syndrome, inexperience, wrong incentives, changing requirements, random events, ignoring early warning signals are reasons for delays, cost overruns and mistakes. Often we focus too much on the specific base project case and ignore what normally happens in similar situations (base rate frequency of outcomes—personal and others). Why should some project be any different from the long-term record of similar ones? George Bernard Shaw said: “We learn from history that man can never learn anything from history.”

The more independent steps that are involved in achieving a scenario, the more opportunities for failure and the less likely it is that the scenario will happen. We often underestimate the number of steps, people, and decisions involved.

Add to this that we often forget that the reliability of a system is a function of the whole system. The weakest link sets the upper limit for the whole chain.
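To make this concrete: when every independent step must succeed, the project's overall success probability is the product of the per-step probabilities. A minimal sketch, with hypothetical step counts and probabilities chosen only for illustration:

```python
# Probability that a project succeeds when every independent step must succeed.
# The step probabilities below are made up for illustration.

def project_success_probability(step_probs):
    """Multiply the per-step success probabilities together."""
    result = 1.0
    for p in step_probs:
        result *= p
    return result

# Ten steps, each 95% likely to go right: overall success is only about 60%.
steps = [0.95] * 10
print(round(project_success_probability(steps), 3))  # 0.599
```

Even with every step at 95%, the chain as a whole is barely better than a coin flip — which is exactly the intuition our estimates miss.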

* * *

In “Thinking, Fast and Slow,” Kahneman offers the following example:

Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with the issues of discrimination and social justice, and also participated in antinuclear demonstrations.

Which alternative is most probable:
Linda is a bank teller.
Linda is a bank teller and is active in the feminist movement.

…About 85% to 95% of undergraduates at several major universities chose the second option, contrary to logic.

The word fallacy is used, in general, when people fail to apply a logical rule that is obviously relevant. Amos and I introduced the idea of a conjunction fallacy, which people commit when they judge a conjunction of two events to be more probable than one of the events in a direct comparison.

…The judgments of probability that our respondents offered (in the Linda problem) corresponded precisely to the judgments of representativeness (similarity to stereotypes). Representativeness belongs to a cluster of closely related basic assessments that are likely to be generated together. The most representative outcomes combine with the personality description to produce the most coherent stories. The most coherent stories are not necessarily the most probable, but they are plausible, and the notions of coherence, plausibility, and probability are easily confused by the unwary.
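The logical rule the respondents failed to apply is that a conjunction can never be more probable than either of its conjuncts: P(A and B) = P(A) × P(B given A) ≤ P(A). A small sketch with made-up numbers (these are illustrative, not estimates for the actual Linda problem):

```python
# The conjunction rule: P(A and B) <= P(A), no matter what the conditional is.
# Numbers are hypothetical, chosen only to illustrate the rule.

p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.9   # even if nearly every such teller is a feminist...
p_both = p_teller * p_feminist_given_teller

# ...the conjunction still cannot exceed the single event.
print(p_both <= p_teller)  # True
```

No matter how plausible the added detail makes the story, adding a condition can only shrink the probability.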

* * *

From Max Bazerman’s Judgment in Managerial Decision Making:

Which of the following instances appears most likely? Which appears second most likely?
A. Drawing a red marble from a bag containing 50 percent red marbles and 50 percent white marbles.
B. Drawing a red marble seven times in succession, with replacement (i.e., a selected marble is put back into the bag before the next marble is selected), from a bag containing 90 percent red marbles and 10 percent white marbles.
C. Drawing at least one red marble in seven tries, with replacement, from a bag containing 10 percent red marbles and 90 percent white marbles.

The most common ordering of preference is B-A-C. Interestingly, the correct order of likelihood is C (52%), A (50%), and B (48%). This result illustrates a general bias to overestimate the probability of conjunctive events, or events that must occur in conjunction with one another, and to underestimate the probability of disjunctive events, or events that occur independently. Thus, when multiple events all need to occur (choice B), we overestimate the true likelihood of this happening, while if only one of many events needs to occur (choice C), we underestimate the true likelihood of this occurring.
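The three probabilities are easy to check directly. A quick sketch:

```python
# A: one draw from a bag that is 50% red.
p_a = 0.5

# B: seven red draws in a row, with replacement, from a bag that is 90% red.
p_b = 0.9 ** 7          # ~0.478

# C: at least one red in seven draws, with replacement, from a bag that is
# 10% red (i.e., the complement of seven white draws in a row).
p_c = 1 - 0.9 ** 7      # ~0.522

print(round(p_a, 2), round(p_b, 2), round(p_c, 2))  # 0.5 0.48 0.52
```

Note that B and C are complements built from the same 0.9-per-draw figure: the conjunction drags it down below 50%, while the disjunction pushes it above.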

The overestimation of conjunctive events offers a powerful explanation for the problems that typically occur with projects that require multistage planning. Individuals, businesses, and governments frequently fall victim to the conjunctive-events bias in terms of timing and budgets. Home remodelling, new product ventures, and public works projects seldom finish on time.

* * *

Astronomy Professor Carl Sagan said in Carl Sagan: A Life in the Cosmos: “The Chernobyl and Challenger disasters remind us that the highly visible technological systems in which enormous national prestige had been invested can nevertheless experience catastrophic failure.”

Safety is a function of the total system — that is, the interaction of all of its components. If even one component fails (and it need not be a key one), the system may fail. Assume an airplane has 2,000 independent parts or systems, each designed to work with a probability of 99.9%. All the parts need to work together for the airplane to function; even so, the probability that at least one of the parts fails (causing the plane to malfunction) is about 86%.
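The airplane arithmetic can be checked in two lines. Assuming independent parts, each with the 99.9% per-part reliability that yields the quoted ~86% figure:

```python
# Probability that at least one of many independent parts fails.
n_parts = 2000
p_work = 0.999          # per-part reliability consistent with the ~86% result

p_all_work = p_work ** n_parts   # every part works: ~0.135
p_any_failure = 1 - p_all_work   # at least one failure: ~0.865

print(round(p_any_failure, 2))  # 0.86
```

A per-part reliability of "three nines" sounds excellent, yet across 2,000 parts a failure somewhere becomes the likely outcome — the disjunctive structure dominates.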

* * *

From “Judgment under Uncertainty: Heuristics and Biases,” by Amos Tversky and Daniel Kahneman (1974):

Studies of choice among gambles and of judgments of probability indicate that people tend to overestimate the probability of conjunctive events (Cohen, Chesnick, and Haran, 1972, 24) and to underestimate the probability of disjunctive events. These biases are readily explained as effects of anchoring.

The stated probability of the elementary event (success at any one stage) provides a natural starting point for the estimation of the probabilities of both conjunctive and disjunctive events. Since adjustment from the starting point is typically insufficient, the final estimates remain too close to the probabilities of the elementary events in both cases.

Note the overall probability of a conjunctive event is lower than the probability of each elementary event, whereas the overall probability of a disjunctive event is higher than the probability of each elementary event. As a consequence of anchoring, the overall probability will be overestimated in conjunctive problems and underestimated in disjunctive problems.
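These relationships are easy to verify numerically. A small sketch, with an arbitrary elementary probability of 0.95 across 20 components (the figures are illustrative only):

```python
# Elementary event: one component works with probability p.
p, n = 0.95, 20

# Conjunctive event: all n components work. The overall probability
# falls well below the elementary probability p.
conjunctive = p ** n              # ~0.358, versus p = 0.95

# Disjunctive event: at least one component fails. The overall probability
# rises well above the elementary failure probability 1 - p.
disjunctive = 1 - p ** n          # ~0.642, versus 1 - p = 0.05

print(round(conjunctive, 3), round(disjunctive, 3))  # 0.358 0.642
```

Anchoring on the elementary figure (0.95, or 0.05 for failure) and adjusting insufficiently therefore produces an overestimate in the conjunctive case and an underestimate in the disjunctive case.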

Biases in the evaluation of compound events are particularly significant in the context of planning. The successful completion of an undertaking, such as the development of a new product or thesis, typically has a conjunctive character: for the undertaking to succeed, each of a series of events must occur. Even when each of these events is very likely, the overall probability of success can be quite low if the number of events is large.

The general tendency to overestimate the probability of conjunctive events leads to unwarranted optimism in the evaluation of the likelihood that a plan will succeed or that a project will be completed on time. Conversely, disjunctive structures are typically encountered in the evaluation of risks. A complex system, such as a nuclear reactor or the human body, will malfunction if any of its essential components fails. Even when the likelihood of failure in each component is slight, the probability of an overall failure can be high if many components are involved. Because of anchoring, people will tend to underestimate the probabilities of failure in complex systems. Thus, the direction of the anchoring bias can sometimes be inferred from the structure of the event. The chain-like structure of conjunctions leads to overestimation; the funnel-like structure of disjunctions leads to underestimation.

* * *

Biases in the evaluation of compound events are particularly significant in the context of planning. Any complex undertaking has the character of a conjunctive event: lots of things have to click into place in order for the whole thing to work. Even when each individual event is very likely, the overall probability can be very low. In general, people greatly overestimate the probability of the conjunctive event, leading to massive time and cost overruns in real projects.

Conversely, disjunctive structures are typically encountered in the evaluation of risks. A complex system, such as a nuclear reactor or a human body, will malfunction if just one key component fails. Even if the probability of failure of any one event is very low, the overall probability of some event going wrong is very high. People consistently underestimate the probability of complex systems, like the Challenger, going wrong.

Conjunctive and Disjunctive Events Bias is part of the Farnam Street Latticework of Mental Models.