Michael Mauboussin on Intuition, Experts, Technology, and Making Better Decisions

Michael Mauboussin, Credit Suisse

Welcome to The Knowledge Project, an experimental podcast aimed at acquiring wisdom through interviews with key luminaries from across the globe to gain insights into how they think, live, and connect ideas. The core themes will seem familiar to readers: decision making, leadership, and innovation. The podcast also touches on questions about what it means to live a good life.

***

The first episode of The Knowledge Project features Michael Mauboussin, the head of Global Financial Strategies at Credit Suisse. He’s also written numerous books, including More Than You Know: Finding Financial Wisdom in Unconventional Places, Think Twice: Harnessing the Power of Counterintuition, and most recently The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. More importantly, Mauboussin spends more time thinking about thinking than most people.

In this episode we explore parenting, daily routines, reading, and how to make better decisions.

***

Here is a list of books mentioned in the podcast:

***

In this excerpt from the podcast, Mauboussin comments on the role of intuition in the decision making process:

The way I like to think about this, and by the way there’s a great book by David Myers on this called “Intuition,” which I really would recommend. It’s one of the better and more thoughtful treatments of the subject.

The way I think about this is, intuition is very domain-specific. Specifically, I would use the language of Danny Kahneman – System one, System two. System one is our experiential system. It’s fast, it’s automatic, but it’s not very malleable. It’s difficult to train.

Our System two, of course, our analytical system, is slower, more purposeful, more deliberate, but more trainable. Intuition applies when you have participated in a particular activity enough that you have effectively trained your System one.

So things go from your slow system to your fast system. Where would this work, for instance? It would work in things like chess, obviously. Chess masters, we know, chunk. They can see the board very quickly and know who’s at an advantage and who’s not.

But it’s not going to work everywhere. The key characteristic is that it’s going to work in what I would call stable, linear environments. Athletics would be another example. And for long parts of history, certain elements of warfare would work as well.

But if you get into unstable, non-linear environments, all bets are going to be off. There is a great quote from Greg Northcraft, which I love, when he says you have to differentiate between experience and expertise. Intuition relates to this.

He said expertise… An expert is someone who has a predictive model that works, and so just because you’ve been doing something for a long time doesn’t mean that you have a predictive model that works.

I would say intuition should be used with a lot of caution.

The key is to have disciplined intuition.

(Danny Kahneman) said, “You know, you’re going to have these base rates, or statistical ways of thinking about things, and then you’re going to have your intuition. How do you use those two things, and in what order?”

The argument he made was that you should always start with the base rate, the statistical approach, and then layer in your intuition. He called it “disciplined intuition.” Otherwise, if you go with your intuition first, you’re going to seek out things that support your point of view.

I always think about it that way. I know that a lot of people make decisions using their gut or their intuition, but I don’t know that that’s the best way to do it in most settings. Some settings, yes, but most settings, no.
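To make the “base rate first, intuition second” sequence concrete, here is a toy sketch in Python. It is my own illustration, not a method from the podcast or from Kahneman: it anchors an estimate on the outside-view base rate and then lets a small, explicit weight pull it toward the inside-view intuition. The base rate, intuitive estimate, and weight below are hypothetical.

```python
def disciplined_estimate(base_rate: float, intuition: float,
                         weight_on_intuition: float = 0.25) -> float:
    """Anchor on the base rate, then nudge toward the intuitive view.

    base_rate: outside-view frequency, e.g. 0.10 if roughly 10% of
        comparable projects succeed (hypothetical number).
    intuition: your inside-view estimate for this specific case.
    weight_on_intuition: how much credence the inside view gets; keep it
        small unless your intuition was trained in a stable, linear domain.
    """
    if not 0.0 <= weight_on_intuition <= 1.0:
        raise ValueError("weight_on_intuition must be between 0 and 1")
    return (1 - weight_on_intuition) * base_rate + weight_on_intuition * intuition


# Base rate says 10% of comparable projects succeed; your gut says 40%.
# Starting from the base rate keeps the estimate anchored: 0.175, not 0.40.
print(disciplined_estimate(0.10, 0.40))
```

Reversing the order, starting from the gut number and then adjusting it with evidence you went looking for, is exactly the confirmation-seeking pattern the quote warns against.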

Developing a Mental Framework for Effective Thinking

Becoming a better thinker means understanding the way you think and developing a way of approaching problems that allows you to see things through multiple lenses. These lenses, or mental models, are built on the foundations of physics, biology, math, psychology, history, and economics. The more tools you have in your mental toolbox, the better equipped you are to make good decisions.

These tools also allow you to better understand when to follow and when to reject conventional wisdom. Ideally you want to go through them checklist style — just run right through them — asking what applies.

Consilient Thinker

John Snow was a doctor based in London during the acute cholera outbreak of the summer of 1854. He represents a powerful example of the impact a lollapalooza effect can have. A lollapalooza is when several ideas combine to produce an unusually powerful result. Snow had also developed systems to ease the pain of surgery with ether and chloroform.

In the book The Ghost Map, author Steven Johnson explains:

Snow was a truly consilient thinker, in the sense of the term as it was originally formulated by the Cambridge philosopher William Whewell in the 1840s (and recently popularized by Harvard biologist E. O. Wilson). “The Consilience of Inductions,” Whewell wrote, “takes place when an Induction, obtained from one class of facts, coincides with an Induction obtained from another different class. This Consilience is a test of the truth of the Theory in which it occurs.” Snow’s work was constantly building bridges between different disciplines, some which barely existed as functional sciences in his day, using data on one scale of investigation to make predictions about behavior on other scales. In studying ether and chloroform, he had moved from the molecular properties of the gas itself, to its circulation of those properties throughout the body’s overall system, to the psychological effects produced by these biological changes. He even ventured beyond the natural world into the design of technology that would best reflect our understanding of the anesthetics. Snow was not interested in individual, isolated phenomena; he was interested in chains and networks in the movement from scale to scale. His mind tripped happily from molecules to cells to brains to machines, and it was precisely that consilient study that helped Snow uncover so much about this nascent field in such a shockingly short amount of time.

Suspending belief in the prevailing theory of how disease spread, Snow ended up rejecting the miasma theory, which held that cholera traveled via “bad air.” He did this through science: he conducted interviews with residents and traced the majority of cases back to a single water source. His willingness to challenge conventional thinking, and to approach the problem through multiple lenses, led him to the deadly source and changed municipal water systems from that day forward.

***

Elements of the mental framework

Charlie Munger is a strong advocate of a mental framework. In Damn Right: Behind the Scenes with Berkshire Hathaway Billionaire Charlie Munger, he offered five simple notions that help solve complex problems.

In The Focused Few: Taking a Multidisciplinary Approach to Focus Investing, Richard Rockwood explores concepts from many disciplines. Added together, they yield a useful mental checklist.

Element 1: Invert

In The Focused Few, Rockwood writes:

Inverting, or thinking problems through backward, is a great way to understand information. Charlie Munger provides the best illustration I have ever seen of this type of thinking.

During a speech he offered an example of how a situation could be examined using the inversion process. He discussed the development process of Coca-Cola from the perspective of a person creating a soda company from scratch and examining the key issues that would need to be resolved to make it a reality.

He listed some of the issues the entrepreneur would need to address:

  • What kind of properties should the new drink strive for, and what are those it should avoid? One property the drink should not have is an aftertaste. Consumers should be able to consume large quantities over a period of time and not be deterred by an unpleasant aftertaste.
  • The soda should be developed in such a manner that it can be shipped in large quantities at minimal costs. This makes it easier to develop an efficient, large-scale distribution system.
  • Keeping the soda formulation a secret will help alleviate competition and create a certain aura of mystique around the product.
  • The company also can deter competition by expanding the business as quickly as possible. For example, the distribution system could be expanded until it reaches a critical mass that competitors would find hard to duplicate without massive capital expenditures.

Element 2: First- and second-level thinking

In The Focused Few, Rockwood writes:

Let’s examine the decision-making process by breaking it down into two components. The first component, first-level thinking, generally occurs when you make decisions quickly based on a simple theme or common sense. For example, a person may decide to invest in a company simply because its products are trendy. Making decisions based on first-level reasoning has significant problems, however. Common sense “is wonderful at making sense of the world, but not necessarily at understanding it.” (Duncan Watts, Everything Is Obvious: How Common Sense Fails Us)

The danger is that you may think you understand a particular situation when in fact you have only developed a likely story.

Second-level thinkers, in contrast, approach decisions differently. What kinds of questions should a second-level thinker ask?

In his book, The Most Important Thing: Uncommon Sense for the Thoughtful Investor, Howard Marks provides a useful list of questions to ask.

  1. What is the range of likely future outcomes?
  2. Which outcome do I think will occur?
  3. What is the probability that I’m right?
  4. What is the prevailing consensus?
  5. How does my expectation differ from the consensus?
  6. How does the current price for the asset comport with the consensus view of the future—and with mine?
  7. Is the consensus psychology that is incorporated into the price too bullish or bearish?
  8. What will happen to the asset’s price if the consensus turns out to be right, and what if I’m right?

Element 3: Use decision trees


In The Focused Few, Rockwood writes:

Decision trees are excellent tools for helping you decide on a course of action. They enable you to lay out several possible scenarios, investigate their possible outcomes, and create a balanced picture of the risks and rewards associated with each.

[…]

Let’s examine the decision-tree process in greater detail. First, identify the decision and the outcome alternatives available at each point. After you lay out each course of action, determine which option has the greatest value to you. Start by assigning a cash value to each possible outcome (i.e., what the expected value would be if that particular outcome were to occur). Next, look at each break, or point of uncertainty, in the tree and estimate the probability of each outcome occurring. If you use percentages, the combined total must equal 100% at each break point. If you use fractions, these must add up to 1.

After these two steps have been taken (i.e., the values of the outcomes have been entered and the probabilities have been estimated), it is time to begin calculating the expected values of the various branches in the decision tree.
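To make the arithmetic concrete, here is a minimal sketch of the expected-value step in Python. The decision (launch a product now versus run a pilot first), the payoffs, and the probabilities are hypothetical and are not taken from Rockwood’s book.

```python
from dataclasses import dataclass


@dataclass
class Outcome:
    value: float        # cash value if this outcome occurs
    probability: float  # chance of this outcome at the break point


def expected_value(outcomes: list[Outcome]) -> float:
    # The probabilities at each break point must sum to 1 (or 100%).
    total = sum(o.probability for o in outcomes)
    if abs(total - 1.0) > 1e-9:
        raise ValueError(f"Probabilities must sum to 1, got {total}")
    return sum(o.value * o.probability for o in outcomes)


launch_now = [Outcome(value=500_000, probability=0.4),   # strong demand
              Outcome(value=-200_000, probability=0.6)]  # weak demand

run_pilot = [Outcome(value=300_000, probability=0.7),    # pilot works, smaller upside
             Outcome(value=-50_000, probability=0.3)]    # pilot fails, small loss

print("Launch now:", expected_value(launch_now))  # 0.4*500k - 0.6*200k = 80,000
print("Run pilot:", expected_value(run_pilot))    # 0.7*300k - 0.3*50k = 195,000
```

In this made-up example the pilot branch carries the higher expected value even though its best case is smaller; that is exactly the kind of trade-off the tree makes visible.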

Element 4: The multidisciplinary approach

When trying to resolve a difficult situation or determining exactly why a product has been, and may continue to be, successful, it helps to think about the problem by creating a checklist that incorporates the vital components of other disciplines.

The Focused Few goes on to explore more of the elements of multidisciplinary thinking.

Fooled By Randomness


I don’t want you to make the same mistake I did.

I waited too long before reading Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Taleb. He wrote it before The Black Swan and Antifragile, the books that propelled him into intellectual celebrity. Interestingly, Fooled by Randomness contains semi-explored gems of the ideas that those later best-sellers would develop.

***
Hindsight Bias

Part of the argument that Fooled by Randomness presents is that when we look back at things that have happened we see them as less random than they actually were.

It is as if there were two planets: the one in which we actually live and the one, considerably more deterministic, on which people are convinced we live. It is as simple as that: Past events will always look less random than they were (it is called the hindsight bias). I would listen to someone’s discussion of his own past realizing that much of what he was saying was just backfit explanations concocted ex post by his deluded mind.

***
The Courage of Montaigne

Writing on Montaigne as the role model for the modern thinker, Taleb also addresses his courage:

It certainly takes bravery to remain skeptical; it takes inordinate courage to introspect, to confront oneself, to accept one’s limitations—scientists are seeing more and more evidence that we are specifically designed by mother nature to fool ourselves.

***
Probability

Fooled by Randomness is about probability, not in a mathematical way but as skepticism.

In this book probability is principally a branch of applied skepticism, not an engineering discipline. …

Probability is not a mere computation of odds on the dice or more complicated variants; it is the acceptance of the lack of certainty in our knowledge and the development of methods for dealing with our ignorance. Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem or a brain teaser. Mother nature does not tell you how many holes there are on the roulette table, nor does she deliver problems in a textbook way (in the real world one has to guess the problem more than the solution).

“Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem,” which is fascinating given how we tend to solve problems. In decisions under uncertainty, I discussed how risk and uncertainty are different things, which creates two types of ignorance.

Most decisions are not risk-based; they are uncertainty-based, and you either know you are ignorant or you have no idea that you are ignorant. There is a big distinction between the two. Trust me, you’d rather know you are ignorant.

***
Randomness Disguised as Non-Randomness

The core of the book is about luck that we understand as skill or “randomness disguised as non-randomness (that is determinism).”

This problem manifests itself most frequently in the lucky fool, “defined as a person who benefited from a disproportionate share of luck but attributes his success to some other, generally very precise, reason.”

Such confusion crops up in the most unexpected areas, even science, though not in such an accentuated and obvious manner as it does in the world of business. It is endemic in politics, as it can be encountered in the shape of a country’s president discoursing on the jobs that “he” created, “his” recovery, and “his predecessor’s” inflation.

These lucky fools are often fragilistas — they have no idea they are lucky fools. For example:

[W]e often have the mistaken impression that a strategy is an excellent strategy, or an entrepreneur a person endowed with “vision,” or a trader a talented trader, only to realize that 99.9% of their past performance is attributable to chance, and chance alone. Ask a profitable investor to explain the reasons for his success; he will offer some deep and convincing interpretation of the results. Frequently, these delusions are intentional and deserve to bear the name “charlatanism.”

This does not mean that all success is luck or randomness. There is a difference between “it is more random than we think” and “it is all random.”

Let me make it clear here: Of course chance favors the prepared! Hard work, showing up on time, wearing a clean (preferably white) shirt, using deodorant, and some such conventional things contribute to success—they are certainly necessary but may be insufficient as they do not cause success. The same applies to the conventional values of persistence, doggedness and perseverance: necessary, very necessary. One needs to go out and buy a lottery ticket in order to win. Does it mean that the work involved in the trip to the store caused the winning? Of course skills count, but they do count less in highly random environments than they do in dentistry.

No, I am not saying that what your grandmother told you about the value of work ethics is wrong! Furthermore, as most successes are caused by very few “windows of opportunity,” failing to grab one can be deadly for one’s career. Take your luck!

That last paragraph connects to something Charlie Munger once said: “Really good investment opportunities aren’t going to come along too often and won’t last too long, so you’ve got to be ready to act. Have a prepared mind.”

Taleb thinks of success in terms of degrees, so mild success might be explained by skill and labor, but outrageous success “is attributable to variance.”

***
Luck Makes You Fragile

One thing Taleb hits on that really stuck with me is that “that which came with the help of luck could be taken away by luck (and often rapidly and unexpectedly at that). The flipside, which deserves to be considered as well (in fact it is even more of our concern), is that things that come with little help from luck are more resistant to randomness.” How Antifragile.

Taleb argues this is the problem of induction: “it does not matter how frequently something succeeds if failure is too costly to bear.”

***
Noise and Signal

We confuse noise with signal.

…the literary mind can be intentionally prone to the confusion between noise and meaning, that is, between a randomly constructed arrangement and a precisely intended message. However, this causes little harm; few claim that art is a tool of investigation of the Truth— rather than an attempt to escape it or make it more palatable. Symbolism is the child of our inability and unwillingness to accept randomness; we give meaning to all manner of shapes; we detect human figures in inkblots.

All my life I have suffered the conflict between my love of literature and poetry and my profound allergy to most teachers of literature and “critics.” The French thinker and poet Paul Valery was surprised to listen to a commentary of his poems that found meanings that had until then escaped him (of course, it was pointed out to him that these were intended by his subconscious).

If we’re concerned about situations where randomness is confused with non-randomness, should we also be concerned about situations where non-randomness is mistaken for randomness, which would result in signal being ignored?

First, I am not overly worried about the existence of undetected patterns. We have been reading lengthy and complex messages in just about any manifestation of nature that presents jaggedness (such as the palm of a hand, the residues at the bottom of Turkish coffee cups, etc.). Armed with home supercomputers and chained processors, and helped by complexity and “chaos” theories, the scientists, semiscientists, and pseudoscientists will be able to find portents. Second, we need to take into account the costs of mistakes; in my opinion, mistaking the right column for the left one is not as costly as an error in the opposite direction. Even popular opinion warns that bad information is worse than no information at all.

If you haven’t yet, pick up a copy of Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. Don’t make the same mistake I did and wait to read this important book.


The Decision-Maker: A Tool For a Lifetime

Seymour Schulich

The first chapter in Seymour Schulich’s book, Get Smarter: Life and Business Lessons, offers a decision tool that adds to the simple pro-and-con list that many of us have used to make decisions. Schulich, a self-made billionaire, is one of Canada’s richest and best-known businessmen.

I learned this tool in a practical mathematics course more than fifty years ago and have used it for virtually every major decision of my adult life. It has never let me down and it will serve you well, too.

You all know the simple pro-and-con list? The one where you divide the page in two and simply list out all the pros and cons. Well, the Decision-Maker adds a twist to that. Here’s how it works.

On one sheet of paper, list all the positive things you can about the issue in question, then give each one a score from zero to ten—the higher the score, the more important it is to you.

On another sheet, list the negative points, and score them from zero to ten—only this time, ten means it’s a major drawback. Suppose you are thinking of buying a house, and you tour one that’s in your price range, except the owners have painted every room to look like a giant banana. If you really hate yellow and can’t stand the thought of lifting a paint brush, you might give “ugly yellow house” a ten, and if it’s not that big a deal, maybe a two or a three.

Now add up the scores. But here’s the rule.

If the positive score is at least double the negative score, you should do it—whatever “it” is. But if the positives don’t outweigh the negatives by that two-to-one ratio, don’t do it, or at least think twice about it.

Yes, that sounds simple. I agree. But I also don’t think that things need to be complicated in order to be effective.

The Decision-Maker is designed not to allow one or two factors to sway a major life decision in a disproportionate way. It forces you to strip away the emotion and really examine the relative importance of each point—which, of course, is why it works so well.
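For readers who like to see the mechanics spelled out, here is a minimal sketch of the Decision-Maker in Python, assuming the scoring and the two-to-one rule exactly as described above. The house-buying items and scores are made up for illustration.

```python
def decision_maker(pros: dict[str, int], cons: dict[str, int]) -> str:
    """Sum 0-10 scores for pros and cons, then apply the 2:1 rule."""
    positive = sum(pros.values())
    negative = sum(cons.values())
    if positive >= 2 * negative:
        return f"Do it (pros {positive} vs. cons {negative})"
    return f"Don't do it, or at least think twice (pros {positive} vs. cons {negative})"


# Hypothetical house purchase: higher numbers mean "matters more to me."
pros = {"great location": 9, "within budget": 7, "big yard": 5}
cons = {"ugly yellow paint": 3, "long commute": 8}

print(decision_maker(pros, cons))  # pros 21 vs. cons 11 -> think twice
```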

This tool works for groups too.

When we were considering whether to sell our royalty company, Franco-Nevada, to Newmont Mining, Franco’s executive team produced a collective Decision-Maker. We listed all the pros and cons, then the top four executives assigned their own point scores to each. We averaged them, the positives far outweighed the negatives, and we sold the company.

The Difference Between Seeing and Observing

The Art of Observation

In A Scandal in Bohemia, Sherlock Holmes teaches Watson the difference between seeing and observing:

“When I hear you give your reasons,” I remarked, “the thing always appears to me to be so ridiculously simple that I could easily do it myself, though at each successive instance of your reasoning, I am baffled until you explain your process. And yet I believe that my eyes are as good as yours.”

“Quite so,” he answered, lighting a cigarette, and throwing himself down into an armchair. “You see, but you do not observe. The distinction is clear. For example, you have frequently seen the steps which lead up from the hall to this room.”

“Frequently.”

“How often?”

“Well, some hundreds of times.”

“Then how many are there?”

“How many? I don’t know.”

“Quite so! You have not observed. And yet you have seen. That is just my point. Now, I know that there are seventeen steps, because I have both seen and observed.”

The difference between seeing and observing is fundamental to many aspects of life. Indeed, we can learn a lot from how Sherlock Holmes thinks. Even Nassim Taleb has chimed in on noticing with his discussion of noise and signal.

In the video below, Harvard Business School Professor Max Bazerman, author of The Power of Noticing: What the Best Leaders See, discusses how important it is not just to be able to focus, but to be a good noticer as well. What he’s really talking about is observation.

A number of years ago I had an opportunity to notice and I failed to do so and it’s been an obsession with me ever since. On March 10, 2005 I was hired by the U.S. Department of Justice in a landmark case that they were fighting against the tobacco industry. I was hired as a remedy witness. That is, I was hired to provide recommendations to the court about what the penalty would be if, in fact, the Department of Justice succeeded in its trial against the tobacco industry. I had spent a couple hundred hours working for the Department of Justice including submitting my written direct testimony which had been submitted to the court.

I was scheduled to be on the stand on May 4 where the tobacco industry attorneys would be asking me a series of questions. On April 30, a number of days before my May 4 testimony I was in Washington D.C. to meet with the Department of Justice attorneys to prepare for my time on the stand. When the day started the Department of Justice attorney that I had been working with said to me, “Professor Bazerman.” This occurred long after he had learned to call me Max. He said, “Professor Bazerman, the Department of Justice requests that you amend your testimony to note that the testimony would not be relevant if any of the following four conditions existed.”

He then read to me four legal conditions that I didn’t understand. When he was done talking I said to him, “Why would you ask me to amend my testimony when you know that I didn’t understand what you just said to me.” And his response was because if you don’t agree, there’s a significant chance that senior leadership in the Department of Justice will remove you from the case before you are on the stand on May 4. To which I said, “Okay, I don’t agree to those changes.” And his response was, “Good. Let’s continue with your preparation.” I was jarred by the fact that something very strange had occurred. But I was overwhelmed in life. I was trying to help this case and I didn’t quite know what had occurred. But to this day I’m critical of the fact that I took no action. I did appear on trial on May 4 and the trial ended in early June.

But on June 17 I woke up in a hotel room in London. I was working with another client at the time. And I woke up early at 5:00 a.m. and I opened up The New York Times web edition and I read a story about Matt Myers, the president of Tobacco Free Kids, who had come forward to the media with evidence about how Robert McCallum, the number two official in the Department of Justice, was involved in attempting to get him to change his testimony. And I then read basically the same account that I had experienced back on April 30. Matt Myers had the insight to know that he should do something with this information about what had occurred in terms of the attempt to tamper with this testimony. And at that point it was straightforward to come forward to the media to speak to congressional representatives about what happened. And my own role received media attention as well.

But to this day I’m still struck by the fact that I didn’t come forward on April 30 when, in the back of my mind I knew something had occurred. The reason I tell you this story is because I think a lot of our failure to notice happens when we’re busy. It happens when we don’t know exactly what’s happening. But I think it’s our job as executives, as leaders, as professionals to act when we’re pretty sure that there’s something wrong. It’s our job to notice and to not simply be passive when we can’t quite figure out the evidence. It’s our job to take steps to figure out what’s going on and to act on critical information.

***

Follow your curiosity and learn about why your decision-making environment matters.

The Relationship Between Design and Planning


While I’m not all that interested in military doctrine and tactics in and of themselves, I am interested in complex systems, how the weak win wars, and the lessons military leaders offer (for example, see the lessons of William McRaven and Stanley McChrystal).

This is how I found myself flipping through The U.S. Army / Marine Corps Counterinsurgency Field Manual, which was written to facilitate a common understanding of the problems inherent in counterinsurgency campaigns.

There was a fascinating section on the difference between designing and planning that caused me to pause and reflect.

While both activities seek to formulate ways to bring about preferable futures, they are cognitively different. Planning applies established procedures to solve a largely understood problem within an accepted framework. Design inquires into the nature of a problem to conceive a framework for solving that problem. In general, planning is problem solving, while design is problem setting. Where planning focuses on generating a plan—a series of executable actions—design focuses on learning about the nature of an unfamiliar problem.

When situations do not conform to established frames of reference — when the hardest part of the problem is figuring out what the problem is—planning alone is inadequate and design becomes essential. In these situations, absent a design process to engage the problem’s essential nature, planners default to doctrinal norms; they develop plans based on the familiar rather than an understanding of the real situation. Design provides a means to conceptualize and hypothesize about the underlying causes and dynamics that explain an unfamiliar problem. Design provides a means to gain understanding of a complex problem and insights towards achieving a workable solution.

To better understand the multifaceted problems many of us face today, it helps to talk with people who have different perspectives. This builds situational understanding. At best it can point the way to solutions, and at worst it helps us learn what to avoid.

Often we skip the information-gathering phase because it’s a lot of work and a lot of conversations. But this process helps us become informed rather than just opinionated.

The underlying premise is this: when participants achieve a level of understanding such that the situation no longer appears complex, they can exercise logic and intuition effectively. As a result, design focuses on framing the problem rather than developing courses of action.

Just as you can never step in the same river twice, design is not something you do once and walk away from. It’s an ongoing inquiry into the nature of problems and the various factors and relationships among them, undertaken to improve understanding. Constantly assessing the situation from a design perspective helps gauge the effectiveness of the planning and subsequent actions. If you don’t periodically reassess the situation, you might be solving a problem that no longer exists.

The U.S. Army / Marine Corps Counterinsurgency Field Manual is full of other thought-provoking content.
