Tag Archives: Michael Mauboussin

Daniel Kahneman in Conversation with Michael Mauboussin on Intuition, Causality, Loss Aversion and More

The Santa Fe Institute Board of Trustees Chair Michael Mauboussin interviews Nobel Prize winner Daniel Kahneman. The wide-ranging conversation covers disciplined intuition, causality, base rates, loss aversion and so much more.

Here’s an excerpt from Kahneman I think you’ll enjoy.

[Gary Klein's] Sources of Power is a very eloquent book on expert intuition with magnificent examples, and so he is really quite hostile to my point of view, basically.

We spent years working on that, on the question of when can intuitions be trusted? What’s the boundary between trustworthy and untrustworthy intuitions?

I would summarize the answer as saying there is one thing you should not do. People’s confidence in their intuition is not a good guide to their validity. Confidence is something else entirely, and maybe we can talk about confidence separately later, but confidence is not it.

What there is, if you want to know whether you can trust intuition, it really is like deciding on a painting, whether it’s genuine or not. You can look at the painting all you want, but asking about the provenance is usually the best guide about whether a painting is genuine or not.

Similarly for expertise and intuition, you have to ask not how happy the individual is with his or her own intuitions, but first of all, you have to ask about the domain. Is the domain one where there is enough regularity to support intuitions? That’s true in some medical domains, it certainly is true in chess, it is probably not true in stock picking, and so there are domains in which intuition can develop and others in which it cannot. Then you have to ask whether, if it’s a good domain, one in which there are regularities that can be picked up by the limited human learning machine. If there are regularities, did the individual have an opportunity to learn those regularities? That primarily has to do with the quality of the feedback.

Those are the questions that I think should be asked, so there is a wide domain where intuitions can be trusted, and they should be trusted, and in a way, we have no option but to trust them because most of the time, we have to rely on intuition because it takes too long to do anything else.

Then there is a wide domain where people have equal confidence but are not to be trusted, and that may be another essential point about expertise. People typically do not know the limits of their expertise, and that certainly is true in the domain of finances, of financial analysis and financial knowledge. There is no question that people who advise others about finances have expertise about finance that their advisees do not have. They know how to look at balance sheets, they understand what happens in conversations with analysts.

There is a great deal that they know, but they do not really know what is going to happen to a particular stock next year. They don’t know that, that is one of the typical things about expert intuition in that we know domains where we have it, there are domains where we don’t, but we feel the same confidence and we do not know the limits of our expertise, and that sometimes is quite dangerous.


Making Decisions in a Complex Adaptive System


In Think Twice: Harnessing the Power of Counterintuition, Mauboussin does a good job adding to the work we’ve already done on complex adaptive systems:

You can think of a complex adaptive system in three parts (see the image at the top of this post). First, there is a group of heterogeneous agents. These agents can be neurons in your brain, bees in a hive, investors in a market, or people in a city. Heterogeneity means each agent has different and evolving decision rules that both reflect the environment and attempt to anticipate change in it. Second, these agents interact with one another, and their interactions create structure— scientists often call this emergence. Finally, the structure that emerges behaves like a higher-level system and has properties and characteristics that are distinct from those of the underlying agents themselves. … The whole is greater than the sum of the parts.
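Mauboussin's three parts map neatly onto a few lines of code. The sketch below is my own illustration (a bare-bones "voter model," not an example from the book): heterogeneous agents each start with their own choice, they interact by imitating randomly met peers, and the emergent property — a drift toward population-wide consensus — belongs to the system rather than to any individual agent.

```python
import random

def simulate_consensus(n_agents=100, steps=5000, seed=42):
    """Minimal complex-adaptive-system sketch (a simple 'voter model').

    1. Heterogeneous agents: each starts with its own choice (0 or 1).
    2. Interaction: an agent copies the choice of a randomly met peer.
    3. Emergence: the population drifts toward consensus, a
       system-level property no individual agent is aiming for.
    """
    rng = random.Random(seed)
    choices = [rng.randint(0, 1) for _ in range(n_agents)]
    initial_majority = max(sum(choices), n_agents - sum(choices)) / n_agents
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        choices[i] = choices[j]  # imitate a randomly met peer
    final_majority = max(sum(choices), n_agents - sum(choices)) / n_agents
    return initial_majority, final_majority

before, after = simulate_consensus()
print(f"majority share: {before:.2f} -> {after:.2f}")
```

No agent wants consensus; the coordination is a property of the interactions, which is the sense in which the whole is greater than the sum of the parts.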


The inability to understand the system based on its components prompted Nobel Prize winner and physicist Philip Anderson to draft the essay “More Is Different.” Anderson wrote, “The behavior of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of the simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear.”

Mauboussin comments that we are fooled by randomness:

The problem goes beyond the inscrutable nature of complex adaptive systems. Humans have a deep desire to understand cause and effect, as such links probably conferred humans with evolutionary advantage. In complex adaptive systems, there is no simple method for understanding the whole by studying the parts, so searching for simple agent-level causes of system-level effects is useless. Yet our minds are not beyond making up a cause to relieve the itch of an unexplained effect. When a mind seeking links between cause and effect meets a system that conceals them, accidents will happen.

Misplaced Focus on the Individual

One mistake we make is extrapolating the behaviour of an individual component, say a single person, to explain the entire system. Yet when we have to solve a problem involving a complex system, we often address an individual component. In so doing, we ignore Garrett Hardin’s first law of ecology (“you can never do merely one thing”) and become, to borrow Nassim Taleb’s term, fragilistas.

That unintended system-level consequences arise from even the best-intentioned individual-level actions has long been recognized. But the decision-making challenge remains for a couple of reasons. First, our modern world has more interconnected systems than before. So we encounter these systems with greater frequency and, most likely, with greater consequence. Second, we still attempt to cure problems in complex systems with a naïve understanding of cause and effect.


When I speak with executives from around the world going through a period of poor performance, it doesn’t take long for them to mention they want to hire a star from another company. “If only we had Kate,” they’ll say, “we could smash the competition and regain our footing.”

At first, poaching stars from competitors or even teams within the same organization seems like a winning strategy. But once the star comes over, the results often fail to materialize.

What we fail to grasp is that a star’s performance is part of an ecosystem, and isolating individual performance from that ecosystem is incredibly hard without considering the ecosystem as a whole. (Reversion to the mean likely accounts for some of the star’s fading as well.)

Three Harvard professors concluded, “When a company hires a star, the star’s performance plunges, there is a sharp decline in the functioning of the group or team the person works with, and the company’s market value falls.”

If it sounds like a lot of work to think this through at many levels, it should be. Why should it be easy?

Another example of this at an organizational level has to do with innovation. Most people want to solve the innovation problem. Ignoring for a second that that is the improper framing, how do most organizations go about this? They copy what the most successful organizations do. I can’t count the number of times the solution to an organization’s “innovation problem” is to be more like Google. Well-intentioned executives blindly copy approaches by others such as 20% innovation time, without giving an ounce of thought to the role the ecosystem plays.

Isolating and focusing on an individual part of a complex adaptive system without an appreciation and understanding of that system itself is sure to lead to disaster.

What Should We Do?

So this raises the question: what should we do when we find ourselves dealing with a complex adaptive system? Mauboussin provides three pieces of advice:

1. Consider the system at the correct level.

Remember the phrase “more is different.” The most prevalent trap is extrapolating the behavior of individual agents to gain a sense of system behavior. If you want to understand the stock market, study it at the market level. Consider what you see and read from individuals as entertainment, not as education. Similarly, be aware that the function of an individual agent outside the system may be very different from that function within the system. For instance, mammalian cells have the same metabolic rates in vitro, whether they are from shrews or elephants. But the metabolic rate of cells in small mammals is much higher than the rate of those in large mammals. The same structural cells work at different rates, depending on the animals they find themselves in.

2. Watch for tightly coupled systems.

A system is tightly coupled when there is no slack between items, allowing a process to go from one stage to the next without any opportunity to intervene. Aircraft, space missions, and nuclear power plants are classic examples of complex, tightly coupled systems. Engineers try to build in buffers or redundancies to avoid failure, but frequently don’t anticipate all possible contingencies. Most complex adaptive systems are loosely coupled, where removing or incapacitating one or a few agents has little impact on the system’s performance. For example, if you randomly remove some investors, the stock market will continue to function fine. But when the agents lose diversity and behave in a coordinated fashion, a complex adaptive system can behave in a tightly coupled fashion. Booms and crashes in financial markets are an illustration.
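The diversity point can be made concrete with a simple threshold model of cascades (my choice of illustration, in the spirit of Mark Granovetter's threshold models, not an example from the book). Each agent acts once the share of others already acting exceeds its personal threshold. With diverse thresholds the system is loosely coupled and a small shock stays contained; when thresholds cluster together, the system behaves as if tightly coupled and the same shock sweeps through everyone:

```python
def cascade_size(thresholds, seed_fraction=0.05):
    """Fraction of agents ultimately acting, where each agent acts
    once the share of agents already acting reaches its threshold."""
    acting = seed_fraction
    while True:
        new_acting = sum(1 for t in thresholds if t <= acting) / len(thresholds)
        new_acting = max(new_acting, seed_fraction)  # the seed never backs out
        if new_acting == acting:
            return acting
        acting = new_acting

n = 100
diverse = [0.1 + 0.8 * i / n for i in range(n)]  # thresholds spread over [0.1, 0.9)
clustered = [0.04] * n                            # everyone trips at the same point

print(cascade_size(diverse))    # prints 0.05 -- the shock stays contained
print(cascade_size(clustered))  # prints 1.0  -- the same shock sweeps the system
```

The individual agents are identical in capability in both runs; only the loss of diversity turns a loosely coupled system into a tightly coupled one.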

3. Use simulations to create virtual worlds.

Dealing with complex systems is inherently tricky because the feedback is equivocal, information is limited, and there is no clear link between cause and effect. Simulation is a tool that can help our learning process. Simulations are low cost, provide feedback, and have proved their value in other domains like military planning and pilot training.
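Even a crude "virtual world" illustrates why simulation helps. The sketch below (all task numbers are invented for the example, not from the book) runs a Monte Carlo over a three-task plan and asks how often the project finishes within the total you would get by assuming every task takes its most likely time:

```python
import random

def simulate_plan(task_times, n_runs=10_000, seed=1):
    """Draw each task's duration from a triangular distribution between
    optimistic and pessimistic bounds; return simulated project totals."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in task_times))
    return totals

# (optimistic, most-likely, pessimistic) durations in days -- made-up numbers
tasks = [(2, 3, 10), (4, 5, 15), (1, 2, 8)]
naive = sum(mode for _, mode, _ in tasks)  # 10 days if everything goes typically
totals = simulate_plan(tasks)
on_time = sum(t <= naive for t in totals) / len(totals)
print(f"plan finishes within the 'typical' {naive} days "
      f"in {on_time:.0%} of simulated worlds")
```

The point estimate hides the right tail; the simulated distribution makes the equivocal feedback of the real world visible before you commit.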

Still Curious? Think Twice: Harnessing the Power of Counterintuition.

How Situations Influence Decisions

Michael Mauboussin, the first guest on my podcast, The Knowledge Project, explains in Think Twice: Harnessing the Power of Counterintuition how enormously our situations influence our decisions.

Mistakes born out of situations are difficult to avoid, in part because the influences on us are operating at a subconscious level. “Making good decisions in the face of subconscious pressure,” Mauboussin writes, “requires a very high degree of background knowledge and self-awareness.”

How do you feel when you read the word “treasure”? Do you feel good? What images come to mind? If you are like most people, just ruminating on “treasure” gives you a little lift. Our minds naturally make connections and associate ideas. So if someone introduces a cue to you— a word, a smell, a symbol— your mind often starts down an associative path. And you can be sure the initial cue will color a decision that waits at the path’s end. All this happens outside of your perception.

People around us also influence our decisions, often with good reason. Social influence arises for a couple of reasons. The first is asymmetric information, a fancy phrase meaning someone knows something you don’t. In those cases, imitation makes sense because the information upgrade allows you to make better decisions.

Peer pressure, or the desire to be part of the in-group, is a second source of social influence. For good evolutionary reasons, humans like to be part of a group— a collection of interdependent individuals— and naturally spend a good deal of time assessing who is “in” and who is “out.” Experiments in social psychology have repeatedly confirmed this.

We explain behaviour based on an individual’s choices and disposition rather than the situation; that is, we associate bad behaviour with the person, not the circumstances. Unless, of course, we’re talking about ourselves. This is “the fundamental attribution error”, a phrase coined by Lee Ross, a social psychologist at Stanford University.

This sword has two edges, as the power of situations can work for good or for evil. “Some of the greatest atrocities known to mankind,” Mauboussin writes, “resulted from putting normal people into bad situations.”

We believe our choices are independent of circumstance; however, the evidence points in another direction.

Some Wine With Your Music?

Consider how something as simple as the music playing in a store influences what wine we purchase.

Imagine strolling down the supermarket aisle and coming upon a display of French and German wines, roughly matched for price and quality. You do some quick comparisons, place a German wine in your cart, and continue shopping. After you check out, a researcher approaches and asks why you bought the German wine. You mention the price, the wine’s dryness, and how you anticipate it will go nicely with a meal you are planning. The researcher then asks whether you noticed the German music playing and whether it had any bearing on your decision. Like most, you would acknowledge hearing the music and avow that it had nothing to do with your selection.

But this isn’t a hypothetical: it’s an actual study, and the results affirm that the environment influences our decisions.

In this test, the researchers placed the French and German wines next to each other, along with small national flags. Over two weeks, the scientists alternated playing French accordion music and German Bierkeller pieces and watched the results. When French music played, French wines represented 77 percent of the sales. When German music played, consumers selected German wines 73 percent of the time. The music made a huge difference in shaping purchases. But that’s not what the shoppers thought.

While the customers acknowledged that the music made them think of either France or Germany, 86 percent denied the tunes had any influence on their choice.


This is an example of priming, which psychologists formally define as “the incidental activation of knowledge structures by the current situational context.”1 Priming happens all the time, and for it to be most effective, the prime must connect strongly to our situation’s goals.

Another example of how situations influence us is the default. In a fast-moving world of non-stop bits and bytes, the default is the path of least resistance: it’s the System 1 option. Moving away from the default is labour-intensive for our brains. Studies have repeatedly shown that most people go with defaults.

This applies to a wide array of choices, from insignificant issues like the ringtone on a new cell phone to consequential issues like financial savings, educational choice, and medical alternatives. Richard Thaler, an economist, and Cass Sunstein, a law professor, call the relationship between choice presentation and the ultimate decision “choice architecture.” They convincingly argue that we can easily nudge people toward a particular decision based solely on how we arrange the choices for them.

One context for decision making is how choices are structured. Knowing that many people opt for the default option, we can influence (for better or worse) large groups of people.

Mauboussin relates a story about a prominent psychologist popular on the speaking circuit that “underscores how underappreciated choice architecture remains.”

When companies call to invite him to speak, he offers them two choices. Either they can pay him his set fee and get a standard presentation, or they can pay him nothing in exchange for the opportunity to work with him on an experiment to improve choice architecture (e.g., redesign a form or Web site). Of course, the psychologist benefits by getting more real-world results on choice architecture, but it seems like a pretty good deal for the company as well, because an improved architecture might translate into financial benefits vastly in excess of his speaking fee. He noted ruefully that so far not one company has taken him up on his experiment offer.

(As a brief aside, I engage in public speaking on a fairly regular basis. I’ve toyed with similar ideas. Once I even went as far as offering to speak for no pre-set fee, only “value added” as judged by the client. They opted for the fee.)

Another great example of how environments affect behaviour is Stanley Milgram’s famous experiment on obedience to authority. “Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process,” Milgram wrote. The Stanford Prison Experiment is yet another example.


The key point is that situations are generally more powerful than we think, but we can do things to resist the pull of “unwelcome social influence.” Mauboussin offers four tips.

1. Be aware of your situation.

You can think of this in two parts. There is the conscious element, where you can create a positive environment for decision making in your own surroundings by focusing on process, keeping stress to an acceptable level, being a thoughtful choice architect, and making sure to diffuse the forces that encourage negative behaviors.

Then there is coping with the subconscious influences. Control over these influences requires awareness of the influence, motivation to deal with it, and the willingness to devote attention to address possible poor decisions. In the real world, satisfying all three control conditions is extremely difficult, but the path starts with awareness.

2. Consider the situation first and the individual second.

This concept, called attributional charity, insists that you evaluate the decisions of others by starting with the situation and then turning to the individuals, not the other way around. While easier for Easterners than Westerners, most of us consistently underestimate the role of the situation in assessing the decisions we see others make. Try not to make the fundamental attribution error.

3. Watch out for the institutional imperative.

Warren Buffett, the celebrated investor and chairman of Berkshire Hathaway, coined the term institutional imperative to explain the tendency of organizations to “mindlessly” imitate what peers are doing. There are typically two underlying drivers of the imperative. First, companies want to be part of the in-group, much as individuals do. So if some companies in an industry are doing mergers, chasing growth, or expanding geographically, others will be tempted to follow. Second are incentives. Executives often reap financial rewards by following the group. When decision makers make money from being part of the crowd, the draw is nearly inescapable.

One example comes from a Financial Times interview with the former chief executive officer of Citigroup Chuck Prince in 2007, before the brunt of the financial crisis. “When the music stops, things will be complicated,” offered Prince, demonstrating that he had some sense of what was to come. “But as long as the music is playing, you’ve got to get up and dance.” The institutional imperative is rarely a good dance partner.

4. Avoid inertia.

Periodically revisit your processes and ask whether they are serving their purpose. Organizations sometimes adopt routines and structures that become crystallized, impeding positive change. Efforts to reform education in the United States, for example, have been met with resistance from teachers and administrators who prefer the status quo.

We like to think that we’re better than the situation: that we follow a decision-making process, rationally weigh the facts, consider alternatives, and determine the best course of action. Others are easily influenced; we are not. And that is precisely where we go wrong.

Decision making is fundamentally a social exercise, something I cover in my Re:Think Decision Making workshop.

1. “Automaticity of Social Behavior: Direct Effects of Trait Construction and Stereotype Activation on Action”

The Wisdom of Crowds and The Expert Squeeze

As networks harness the wisdom of crowds, the ability of experts to add value in their predictions is steadily declining. This is the expert squeeze.

In Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin, the first guest on my podcast, The Knowledge Project, explains the expert squeeze and its implications for how we make decisions.

As networks harness the wisdom of crowds and computing power grows, the ability of experts to add value in their predictions is steadily declining. I call this the expert squeeze, and evidence for it is mounting. Despite this trend, we still pine for experts— individuals with special skill or know-how— believing that many forms of knowledge are technical and specialized. We openly defer to people in white lab coats or pinstripe suits, believing they hold the answers, and we harbor misgivings about computer-generated outcomes or the collective opinion of a bunch of tyros.

The expert squeeze means that people stuck in old habits of thinking are failing to use new means to gain insight into the problems they face. Knowing when to look beyond experts requires a totally fresh point of view, and one that does not come naturally. To be sure, the future for experts is not all bleak. Experts retain an advantage in some crucial areas. The challenge is to know when and how to use them.

The Value of Experts

So how can we manage this in our role as decision maker? The first step is to classify the problem.

The figure above (“The Value of Experts”) helps to guide this process. The second column from the left covers problems that have rules-based solutions with limited possible outcomes. Here, someone can investigate the problem based on past patterns and write down rules to guide decisions. Experts do well with these tasks, but once the principles are clear and well defined, computers are cheaper and more reliable. Think of tasks such as credit scoring or simple forms of medical diagnosis. Experts agree about how to approach these problems because the solutions are transparent and for the most part tried and true.


Now let’s go to the opposite extreme, the column on the far right that deals with probabilistic fields with a wide range of outcomes. Here there are no simple rules. You can only express possible outcomes in probabilities, and the range of outcomes is wide. Examples include economic and political forecasts. The evidence shows that collectives outperform experts in solving these problems.


The middle two columns are the remaining province for experts. Experts do well with rules-based problems with a wide range of outcomes because they are better than computers at eliminating bad choices and making creative connections between bits of information.

Once you’ve classified the problem, you can turn to the best method for solving it.

… computers and collectives remain underutilized guides for decision making across a host of realms including medicine, business, and sports. That said, experts remain vital in three capacities. First, experts must create the very systems that replace them. … Of course, the experts must stay on top of these systems, improving the market or equation as need be.

Next, we need experts for strategy. I mean strategy broadly, including not only day-to-day tactics but also the ability to troubleshoot by recognizing interconnections as well as the creative process of innovation, which involves combining ideas in novel ways. Decisions about how best to challenge a competitor, which rules to enforce, or how to recombine existing building blocks to create novel products or experiences are jobs for experts.

Finally, we need people to deal with people. A lot of decision making involves psychology as much as it does statistics. A leader must understand others, make good decisions, and encourage others to buy in to the decision.

So what practical steps can you take to make the expert squeeze work for you instead of against you? Mauboussin offers three tips.

1. Match the problem you face with the most appropriate solution.

What we know is that experts do a poor job in many settings, suggesting that you should try to supplement expert views with other approaches.

2. Seek diversity.

(Philip) Tetlock’s work shows that while expert predictions are poor overall, some are better than others. What distinguishes predictive ability is not who the experts are or what they believe, but rather how they think. Borrowing from Archilochus— through Isaiah Berlin— Tetlock sorted experts into hedgehogs and foxes. Hedgehogs know one big thing and try to explain everything through that lens. Foxes tend to know a little about a lot of things and are not married to a single explanation for complex problems. Tetlock finds that foxes are better predictors than hedgehogs. Foxes arrive at their decisions by stitching “together diverse sources of information,” lending credence to the importance of diversity. Naturally, hedgehogs are periodically right— and often spectacularly so— but do not predict as well as foxes over time. For many important decisions, diversity is the key at both the individual and collective levels.
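The value of diversity here isn't just rhetoric; it's arithmetic. Scott Page's "diversity prediction theorem" (my addition, not from the excerpt) says the crowd's squared error equals the average individual's squared error minus the variance of the predictions, so a collective literally cannot predict worse than its average member:

```python
def crowd_decomposition(predictions, truth):
    """Diversity prediction theorem:
    crowd error = average individual error - prediction diversity."""
    n = len(predictions)
    crowd_mean = sum(predictions) / n
    crowd_error = (crowd_mean - truth) ** 2
    avg_individual_error = sum((p - truth) ** 2 for p in predictions) / n
    diversity = sum((p - crowd_mean) ** 2 for p in predictions) / n
    return crowd_error, avg_individual_error, diversity

# Invented forecasts of a quantity whose true value turns out to be 50
preds = [30, 45, 55, 70, 62]
crowd, avg_ind, div = crowd_decomposition(preds, truth=50)
print(crowd, avg_ind, div)
assert abs(crowd - (avg_ind - div)) < 1e-9  # the identity holds exactly
```

The decomposition makes the point precise: holding individual accuracy fixed, the more the forecasters disagree, the better the crowd does.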

3. Leverage technology to sidestep the squeeze when possible.

Flooded with candidates and aware of the futility of most interviews, Google decided to create algorithms to identify attractive potential employees. First, the company asked seasoned employees to fill out a three-hundred-question survey, capturing details about their tenure, their behavior, and their personality. The company then compared the survey results to measures of employee performance, seeking connections. Among other findings, Google executives recognized that academic accomplishments did not always correlate with on-the-job performance. This novel approach enabled Google to sidestep problems with ineffective interviews and to start addressing the discrepancy.

Learning the difference between when experts help and when they hurt can go a long way toward avoiding stupidity. This starts with identifying the type of problem you’re facing and then weighing the pros and cons of the various approaches to solving it.

Still curious? Follow up by reading Generalists vs. Specialists, Think Twice: Harnessing the Power of Counterintuition, and reviewing the work of Philip Tetlock on why how you think matters more than what you think.

Countering the Inside View and Making Better Decisions

You can reduce the number of mistakes you make by thinking about problems more clearly.

In his book Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin discusses how we can “fall victim to simplified mental routines that prevent us from coping with the complex realities inherent in important judgment calls.” One of those routines is the inside view, which we’re going to talk about in this article. But first, let’s get a bit of context.

No one wakes up thinking, “I am going to make bad decisions today.” Yet we all make them. What is particularly surprising is some of the biggest mistakes are made by people who are, by objective standards, very intelligent. Smart people make big, dumb, and consequential mistakes.


Mental flexibility, introspection, and the ability to properly calibrate evidence are at the core of rational thinking and are largely absent on IQ tests. Smart people make poor decisions because they have the same factory settings on their mental software as the rest of us, and that software isn’t designed to cope with many of today’s problems.

We don’t spend enough time thinking and learning from the process. Generally we’re pretty ambivalent about the process by which we make decisions.

… typical decision makers allocate only 25 percent of their time to thinking about the problem properly and learning from experience. Most spend their time gathering information, which feels like progress and appears diligent to superiors. But information without context is falsely empowering.

That reminds me of what Daniel Kahneman wrote in Thinking, Fast and Slow:

A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.

So we’re not really gathering information so much as seeking support for our existing intuition, the very thing a good decision process should help root out.

Ego-Induced Blindness

One prevalent error we make is that we tend to favour the inside view over the outside view.

An inside view considers a problem by focusing on the specific task and by using information that is close at hand, and makes predictions based on that narrow and unique set of inputs. These inputs may include anecdotal evidence and fallacious perceptions. This is the approach that most people use in building models of the future and is indeed common for all forms of planning.


The outside view asks if there are similar situations that can provide a statistical basis for making a decision. Rather than seeing a problem as unique, the outside view wants to know if others have faced comparable problems and, if so, what happened. The outside view is an unnatural way to think, precisely because it forces people to set aside all the cherished information they have gathered.

When your inside view is more positive than the outside view, you are effectively making an argument against the base rate. You’re saying (knowingly or, more likely, unknowingly) that this time is different. Our brains are all too happy to help us construct this argument.

Mauboussin argues that we embrace the inside view for a few primary reasons. First, we’re optimistic by nature. Second, there is the “illusion of optimism” (we see our future as brighter than that of others). Finally, there is the illusion of control (we think that chance events are subject to our control).

One interesting point is that while we’re bad at looking at the outside view when it comes to ourselves, we’re better at it when it comes to other people.

In fact, the planning fallacy embodies a broader principle. When people are forced to look at similar situations and see the frequency of success, they tend to predict more accurately. If you want to know how something is going to turn out for you, look at how it turned out for others in the same situation. Daniel Gilbert, a psychologist at Harvard University, ponders why people don’t rely more on the outside view, “Given the impressive power of this simple technique, we should expect people to go out of their way to use it. But they don’t.” The reason is most people think of themselves as different, and better, than those around them.

So it’s mostly ego: I’m better than the people who tackled this problem before me. We see the differences between situations and use them to rationalize why things are different this time.

Consider this:

We incorrectly think that differences are more valuable than similarities.

After all, anyone can see what’s the same but it takes true insight to see what’s different, right? We’re all so busy trying to find differences that we forget to pay attention to what is the same.

How to Incorporate the Outside View into Your Decisions

In Think Twice, Mauboussin distills the work of Kahneman and Tversky into four steps and adds some commentary.

1. Select a Reference Class

Find a group of situations, or a reference class, that is broad enough to be statistically significant but narrow enough to be useful in analyzing the decision that you face. The task is generally as much art as science, and is certainly trickier for problems that few people have dealt with before. But for decisions that are common—even if they are not common for you— identifying a reference class is straightforward. Mind the details. Take the example of mergers and acquisitions. We know that the shareholders of acquiring companies lose money in most mergers and acquisitions. But a closer look at the data reveals that the market responds more favorably to cash deals and those done at small premiums than to deals financed with stock at large premiums. So companies can improve their chances of making money from an acquisition by knowing what deals tend to succeed.

2. Assess the Distribution of Outcomes

Once you have a reference class, take a close look at the rate of success and failure. … Study the distribution and note the average outcome, the most common outcome, and extreme successes or failures.


Two other issues are worth mentioning. The statistical rate of success and failure must be reasonably stable over time for a reference class to be valid. If the properties of the system change, drawing inference from past data can be misleading. This is an important issue in personal finance, where advisers make asset allocation recommendations for their clients based on historical statistics. Because the statistical properties of markets shift over time, an investor can end up with the wrong mix of assets.

Also keep an eye out for systems where small perturbations can lead to large-scale change. Since cause and effect are difficult to pin down in these systems, drawing on past experiences is more difficult. Businesses driven by hit products, like movies or books, are good examples. Producers and publishers have a notoriously difficult time anticipating results, because success and failure are based largely on social influence, an inherently unpredictable phenomenon.
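To make step 2 concrete, here is a minimal sketch of assessing a distribution of outcomes. The reference class here is hypothetical (illustrative acquirer returns in comparable deals, not real data); the point is simply to note the base rate of success, the average and typical outcome, and the extremes, as the excerpt suggests:

```python
from statistics import mean, median

# Hypothetical reference class: acquirers' percentage returns in 12
# comparable deals (illustrative numbers, not data from the book).
outcomes = [-12.0, -8.5, -6.0, -4.0, -3.0, -1.5, 0.5, 1.0, 2.5, 4.0, 7.0, 15.0]

# Base rate: how often did similar situations work out?
base_rate = sum(1 for x in outcomes if x > 0) / len(outcomes)

print(f"Success rate (base rate): {base_rate:.0%}")
print(f"Average outcome: {mean(outcomes):.1f}%")
print(f"Typical (median) outcome: {median(outcomes):.1f}%")
print(f"Extremes: worst {min(outcomes):.1f}%, best {max(outcomes):.1f}%")
```

Even this crude summary forces the question Mauboussin wants you to ask: before you trust your inside view, what actually happened to the people who were in your position before?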

3. Make a Prediction

With the data from your reference class in hand, including an awareness of the distribution of outcomes, you are in a position to make a forecast. The idea is to estimate your chances of success and failure. For all the reasons that I’ve discussed, the chances are good that your prediction will be too optimistic.

Sometimes when you find the right reference class, you see the success rate is not very high. So, to improve your chance of success, you have to do something different from everyone else.

4. Assess the Reliability of Your Prediction and Fine-Tune

How good we are at making decisions depends a great deal on what we are trying to predict. Weather forecasters, for instance, do a pretty good job of predicting what the temperature will be tomorrow. Book publishers, on the other hand, are poor at picking winners, with the exception of those books from a handful of best-selling authors. The worse the record of successful prediction is, the more you should adjust your prediction toward the mean (or other relevant statistical measure). When cause and effect is clear, you can have more confidence in your forecast.
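The adjustment in step 4 can be sketched as a simple shrinkage toward the reference-class mean, with a reliability weight between 0 and 1 reflecting how predictable the domain is. The function name, the weighting scheme, and the numbers are illustrative assumptions, not Mauboussin's exact method:

```python
def adjusted_estimate(inside_view: float, base_rate_mean: float,
                      reliability: float) -> float:
    """Shrink an inside-view estimate toward the reference-class mean.

    reliability: 0.0 (outcomes are mostly luck, so use the mean) up to
    1.0 (highly predictable domain, so trust the inside view).
    Illustrative weighting only, not a formula from the book.
    """
    return base_rate_mean + reliability * (inside_view - base_rate_mean)

# A weather-style forecast (high reliability) barely moves toward the mean...
print(adjusted_estimate(inside_view=30.0, base_rate_mean=20.0, reliability=0.9))  # 29.0
# ...while a book-sales-style forecast (low reliability) gets pulled
# almost all the way back to the reference-class average.
print(adjusted_estimate(inside_view=30.0, base_rate_mean=20.0, reliability=0.1))  # 21.0
```

The worse the track record of prediction in the domain, the smaller the reliability weight, and the more your forecast should collapse toward what typically happens.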


The main lesson we can take from this is that we tend to focus on what’s different whereas the best decisions often focus on just the opposite: what’s the same. While this situation seems a little different, it’s almost always the same.

As Charlie Munger has said: "If you notice, the plots are very similar. The same plot comes back time after time."

Particulars may vary but, unless those particulars are the variables that govern the outcome of the situation, the pattern remains. If we're going to focus on what's different rather than what's the same, we'd best be sure the variables we're clinging to matter.

Michael Mauboussin on Intuition, Experts, Technology, and Making Better Decisions

Michael Mauboussin, Credit Suisse

Welcome to The Knowledge Project, an experimental podcast aimed at acquiring wisdom through interviews with key luminaries from across the globe to gain insights into how they think, live, and connect ideas. The core themes will seem familiar to readers: decision making, leadership, innovation. But the podcast also touches on questions about what it means to live a good life.


The first episode of The Knowledge Project features Michael Mauboussin, the head of Global Financial Strategies at Credit Suisse. He’s also written numerous books, including More Than You Know: Finding Financial Wisdom in Unconventional Places, Think Twice: Harnessing the Power of Counterintuition, and most recently The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. More importantly, Mauboussin spends more time thinking about thinking than most people.

In this episode we explore parenting, daily routines, reading, and how to make better decisions.


A transcript is available for members.

Here's a list of the books mentioned in the podcast:


In this excerpt from the podcast, Mauboussin comments on the role of intuition in the decision making process:

The way I like to think about this, and by the way, there's a great book by David Myers on this, called "Intuition." It's a book I really would recommend. It's one of the better and more thoughtful treatments of the subject.

The way I think about this is, intuition is very domain-specific. Specifically, I would use the language of Danny Kahneman – System one, System two. System one is our experiential system. It’s fast, it’s automatic, but it’s not very malleable. It’s difficult to train.

Our System two, of course, our analytical system, is slower, more purposeful, more deliberate but more trainable. Intuition applies when you participate in a particular activity to a sufficient amount that you effectively train your System one.

So things go from your slow system to your fast system. Where would this work, for instance? It would work in things like, obviously, chess. Chess masters, we know, chunk. They can see the board very quickly and know who's at an advantage and who's not.

But it's not going to work… So, the key characteristic is that it's going to work in what I would call stable, linear environments. Athletics would be another example. For long stretches of history, certain elements of warfare would work as well.

But if you get into unstable, non-linear environments, all bets are going to be off. There is a great quote from Greg Northcraft, which I love, when he says you have to differentiate between experience and expertise. Intuition relates to this.

He said expertise… An expert is someone who has a predictive model that works, and so just because you’ve been doing something for a long time doesn’t mean that you have a predictive model that works.

I would say intuition should be used with a lot of caution.

The key is to have disciplined intuition.

(Danny Kahneman) said, “You know, you’re going to have these base rates, or statistical ways of thinking about things, and then you’re going to have your intuition. How do you use those two things, and in what order?”

The argument he made was you should always start with the base rate, the statistical approach, and then layer in your intuition. He called it "disciplined intuition." Otherwise, if you go with your intuition first, you're going to seek out things that support your point of view.

I always think about it that way. I know that a lot of people make decisions using their gut or their intuition, but I don’t know that that’s the best way to do it in most settings. Some settings, yes, but most settings, no.
