Michael Mauboussin on Intuition, Experts, Technology, and Making Better Decisions

Michael Mauboussin, Credit Suisse

Welcome to The Knowledge Project, an experimental podcast aimed at acquiring wisdom through interviews with key luminaries from across the globe to gain insights into how they think, live, and connect ideas. The core themes will seem familiar to readers: Decision Making, Leadership, Innovation. But it also touches on questions about what it means to live a good life.


The first episode of The Knowledge Project features Michael Mauboussin, the head of Global Financial Strategies at Credit Suisse. He’s also written numerous books, including More Than You Know: Finding Financial Wisdom in Unconventional Places, Think Twice: Harnessing the Power of Counterintuition, and most recently The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. More importantly, Mauboussin spends more time thinking about thinking than most people.

In this episode we explore parenting, daily routines, reading, and how to make better decisions.


Here is a list of the books mentioned in the podcast:


In this excerpt from the podcast, Mauboussin comments on the role of intuition in the decision making process:

The way I like to think about this, and by the way, there’s a great book on this by David Myers called “Intuition.” It’s a book I really would recommend; it’s one of the better, more thoughtful treatments of the subject.

The way I think about this is, intuition is very domain-specific. Specifically, I would use the language of Danny Kahneman – System one, System two. System one is our experiential system. It’s fast, it’s automatic, but it’s not very malleable. It’s difficult to train.

Our System two, of course, our analytical system, is slower, more purposeful, more deliberate but more trainable. Intuition applies when you participate in a particular activity to a sufficient amount that you effectively train your System one.

So things go from your slow system to your fast system. Where would this work, for instance? It would work in things like chess, obviously. Chess masters, we know, chunk: they can see the board very quickly and know who’s at an advantage and who’s not.

But it’s not going to work everywhere. The key characteristic is that it’s going to work in what I would call stable, linear environments. Athletics would be another example. And for long stretches of history, certain elements of warfare would qualify as well.

But if you get into unstable, non-linear environments, all bets are going to be off. There is a great quote from Greg Northcraft, which I love, when he says you have to differentiate between experience and expertise. Intuition relates to this.

He said expertise… An expert is someone who has a predictive model that works, and so just because you’ve been doing something for a long time doesn’t mean that you have a predictive model that works.

I would say intuition should be used with a lot of caution.

The key is to have disciplined intuition.

(Danny Kahneman) said, “You know, you’re going to have these base rates, or statistical ways of thinking about things, and then you’re going to have your intuition. How do you use those two things, and in what order?”

The argument he made was that you should always start with the base rate, the statistical approach, and then layer in your intuition. He called it “disciplined intuition.” Otherwise, if you go with your intuition first, you’re going to seek out things that support your point of view.

I always think about it that way. I know that a lot of people make decisions using their gut or their intuition, but I don’t know that that’s the best way to do it in most settings. Some settings, yes, but most settings, no.
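The “base rate first, then layer in intuition” procedure can be sketched in a few lines. This is an illustrative toy, not anything from the podcast; the `predictive_validity` weight, which captures how much to trust your evidence, is my own assumption:

```python
def disciplined_estimate(base_rate, intuition, predictive_validity):
    """Start at the base rate, then move toward the intuitive
    estimate only in proportion to how predictive you believe
    the evidence is (0 = worthless, 1 = perfect)."""
    return base_rate + predictive_validity * (intuition - base_rate)

# Base rate: ~10% of startups in a sector reach profitability.
# Your gut says this one has a 70% chance. If you judge your
# evidence to be only weakly predictive (validity 0.2):
print(round(disciplined_estimate(0.10, 0.70, 0.2), 2))  # 0.22
```

With validity 0 you simply report the base rate; with validity 1 you go entirely with your gut. Anything in between disciplines the intuition toward the statistics.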

How Using a Decision Journal can Help you Make Better Decisions

“Odds are you’re going to discover two things. First, you’re right a lot of the time. Second, it’s often for the wrong reasons.”

One question I’m often asked is what should a decision journal look like?

You should care enormously about whether you’re making good decisions or bad ones. After all, in most knowledge organizations, your product is decisions.

Research suggests that a good decision process matters more than analysis by a factor of six. A process or framework for making decisions, however, is only one part of an overall approach to making better decisions.

The way to test the quality of your decisions, whether individually or organizationally, is by testing the process by which they are made. And one way to do that is to use a decision journal.

You can think of a decision journal as quality control, something like what we’d find in a manufacturing plant.

Conceptually this is pretty easy but it requires some discipline and humility to implement and maintain. In consulting with various organizations on how to make better decisions I’ve seen everything from people who take great strides to improve their decision making to people who doctor decision journals for optics over substance.

“The idea,” says Michael Mauboussin, “is whenever you are making a consequential decision, write down what you decided, why you decided as you did, what you expect to happen, and if you’re so inclined, how you feel mentally and physically.”

Whenever you’re making a consequential decision either individually or as part of a group you take a moment and write down:

  1. The situation or context;
  2. The problem statement or frame;
  3. The variables that govern the situation;
  4. The complications or complexity as you see it;
  5. The alternatives that were seriously considered, and why they were not chosen (think: the work required to have an opinion);
  6. A paragraph explaining the range of outcomes;
  7. A paragraph explaining what you expect to happen and, importantly, the reasoning and the actual probabilities you assign to each outcome (the degree of confidence matters, a lot);
  8. The time of day the decision was made and how you feel physically and mentally (if you’re tired, for example, write it down).

Of course, this can be tailored to the situation and context. Specific decisions might include tradeoffs, weighting criteria, or other relevant factors.
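To keep entries consistent, the eight items above can be captured in a simple structure. A minimal sketch in Python; the field names and example values are my own illustration, not an official format:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DecisionEntry:
    """One decision-journal entry, mirroring the eight items above."""
    situation: str        # 1. situation or context
    problem_frame: str    # 2. problem statement or frame
    variables: list       # 3. variables that govern the situation
    complications: str    # 4. complications or complexity
    alternatives: dict    # 5. alternative -> why it was rejected
    outcome_range: str    # 6. range of outcomes
    expectations: dict    # 7. outcome -> probability assigned
    mental_state: str     # 8. how you feel, physically and mentally
    decided_at: datetime = field(default_factory=datetime.now)

entry = DecisionEntry(
    situation="Weighing a competing job offer",
    problem_frame="Maximize long-term learning, not short-term pay",
    variables=["compensation", "team quality", "growth prospects"],
    complications="Both roles look similar on paper",
    alternatives={"stay put": "limited growth in the current role"},
    outcome_range="From regret within a year to a career inflection point",
    expectations={"glad I moved": 0.6, "neutral": 0.3, "regret it": 0.1},
    mental_state="Well rested; decided mid-morning",
)
```

Because the probabilities are written down as numbers, the six-month review can compare them directly against what actually happened.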

One point worth noting: don’t spend too much time on the brief and obvious insight. Often these first thoughts are System One, not System Two. Any decision you’re journaling is inherently complex (and may involve non-linear systems). In such a world, small effects can cause disproportionate responses, whereas bigger ones can have no impact at all. Remember that causality is hard to trace, especially in complex domains.

One tip I’ve learned from helping organizations implement this is that there are two common ways people wiggle out of their own decision: hindsight bias and jargon.

I know we live in an age of computers but you simply must do this by hand because that will help reduce the odds of hindsight bias. It’s easy to look at a print-out and say, I didn’t see it that way. It’s a lot harder to look at your own writing and say the same thing.

Another thing to avoid is vague and ambiguous wording. If you’re talking in abstractions and fog, you’re not ready to make a decision, and you’ll find it easy to change the definitions to suit new information. This is where writing down the probabilities as you see them comes into play.

These journals should be reviewed on a regular basis—every six months or so. The review is an important part of the process. This is where you can get better. Realizing where you make mistakes, how you make them, what types of decisions you’re bad at, etc. will help you make better decisions if you’re rational enough. This is also where a coach can help. If you share your journal with someone, they can review it with you and help identify areas for improvement.

And keep in mind it’s not all about outcome. You might have made the right decision (which, in our sense means a good process) and had a bad outcome. We call that a bad break.

Odds are you’re going to discover two things right away. First, you’re right a lot of the time. Second, it’s often for the wrong reasons.

This can be somewhat humbling.

Let’s say you buy a stock and it goes up, but it goes up for reasons that are not the ones you thought. You’re probably thinking high-fives all around, right? But in a very real sense you were wrong. This feedback is incredibly important.

If you let it, the information provided by this journal will help identify cases where you think you know more than you do but in fact you’re operating outside your circle of competence.

It will also show you how your views change over time, when you tend to make better decisions, and how serious the deliberations were.

A Wonderfully Simple Heuristic to Recognize Charlatans

Nassim Nicholas Taleb

“For the Arab scholar and religious leader Ali Bin Abi-Taleb (no relation), keeping one’s distance from an ignorant person is equivalent to keeping company with a wise man.”

The idea of inversion isn’t new.

While we can learn a lot from what successful people do in the mornings, as Nassim Taleb points out, we can learn a lot from what failed people do before breakfast too.


Inversion is actually one of the most powerful mental models in our arsenal. Not only does inversion help us innovate but it also helps us deal with uncertainty.

“It is in the nature of things,” says Charlie Munger, “that many hard problems are best solved when they are addressed backward.”

Sometimes we can’t articulate what we want. Sometimes we don’t know. Sometimes there is so much uncertainty that the best approach is to attempt to avoid certain outcomes rather than attempt to guide towards the ones we desire. In short, we don’t always know what we want but we know what we don’t want.

Avoiding stupidity is often easier than seeking brilliance.

The “apophatic,” writes Nassim Taleb in Antifragile, “focuses on what cannot be said directly in words, from the Greek apophasis (saying no, or mentioning without meaning).”

The method began as an avoidance of direct description, leading to a focus on negative description, what is called in Latin via negativa, the negative way, after theological traditions, particularly in the Eastern Orthodox Church. Via negativa does not try to express what God is— leave that to the primitive brand of contemporary thinkers and philosophasters with scientistic tendencies. It just lists what God is not and proceeds by the process of elimination.

Statues are carved by subtraction.

Michelangelo was asked by the pope about the secret of his genius, particularly how he carved the statue of David, largely considered the masterpiece of all masterpieces. His answer was: “It’s simple. I just remove everything that is not David.”

Where Is the Charlatan?

Recall that the interventionista focuses on positive action—doing. Just like positive definitions, we saw that acts of commission are respected and glorified by our primitive minds and lead to, say, naive government interventions that end in disaster, followed by generalized complaints about naive government interventions, as these, it is now accepted, end in disaster, followed by more naive government interventions. Acts of omission, not doing something, are not considered acts and do not appear to be part of one’s mission.

I have used all my life a wonderfully simple heuristic: charlatans are recognizable in that they will give you positive advice, and only positive advice, exploiting our gullibility and sucker-proneness for recipes that hit you in a flash as just obvious, then evaporate later as you forget them. Just look at the “how to” books with, in their title, “Ten Steps for—” (fill in: enrichment, weight loss, making friends, innovation, getting elected, building muscles, finding a husband, running an orphanage, etc.).

We learn the most from the negative.

[I]n practice it is the negative that’s used by the pros, those selected by evolution: chess grandmasters usually win by not losing; people become rich by not going bust (particularly when others do); religions are mostly about interdicts; the learning of life is about what to avoid. You reduce most of your personal risks of accident thanks to a small number of measures.

Skill doesn’t always win.

In anything requiring a combination of skill and luck, the most skillful don’t always win. That’s one of the key messages of Michael Mauboussin’s book The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. This is hard for us to swallow because we intuitively feel that if you are successful you must have skill, for the same reason that if the outcome is good we assume you made a good decision. We can’t predict whether a person who has skills will succeed, but Taleb argues that we can “pretty much predict” that a person without skills will eventually have their luck run out.

Subtractive Knowledge
Taleb argues that the greatest “and most robust contribution to knowledge consists in removing what we think is wrong—subtractive epistemology.” He continues that “we know a lot more about what is wrong than what is right.” What does not work, that is, negative knowledge, is more robust than positive knowledge. This is because something we think is true can always turn out to be wrong, whereas something we have shown to be wrong cannot turn out to be right.

There is a whole book on the half-life of what we consider to be ‘knowledge or fact’ called The Half-Life of Facts. Basically, because of our partial understanding of the world, which is constantly evolving, we believe things that are not true. That’s not the only reason that we believe things that are not true but it’s a big one.

The thing is we’re not so smart. If I’ve only seen white swans, saying “all swans are white” may be accurate given my limited view of the world but we can never be sure that there are no black swans until we’ve seen everything.

Or as Taleb puts it: “since one small observation can disprove a statement, while millions can hardly confirm it, disconfirmation is more rigorous than confirmation.”

Most people attribute this philosophical argument to Karl Popper, but Taleb dug up evidence that it goes back to the “skeptical-empirical” medical schools of the postclassical era in the Eastern Mediterranean.

Being antifragile isn’t about what you do, but rather what you avoid. Avoid fragility. Avoid stupidity. Don’t be the sucker. …

Three Steps to Effective Decision Making


Making an important decision is never easy, but making the right decision is even more challenging. Effective decision-making isn’t just about accumulating information and going with what seems to make the most sense. Sometimes, internal biases can impact the way we seek out and process information, polluting the conclusions we reach in the process. It’s critical to be conscious of those tendencies and to accumulate the sort of fact-based and unbiased inputs that will result in the highest likelihood that a decision actually leads to the desired outcome.

In this video, Michael Mauboussin, Credit Suisse’s head of Global Financial Strategies, lays out three steps that can help focus a decision-maker’s thinking.

How do we take new information that comes in and integrate it with our point of view?

Typically we don’t really take new information into consideration. The first major barrier is something called the confirmation bias. Once you’ve decided on something and you think it’s the right way to think about it, you either blow off new information or, if it’s ambiguous, you interpret it in a way that’s favorable to you. The next problem, and we all have this, involves pseudodiagnostic and subtly diagnostic information. Pseudodiagnostic means information that isn’t very relevant but you think it is. Subtly diagnostic is information that is relevant but you don’t pay attention to it.

So the key in all of this is: we have this torrent of information coming in, and how do I sort it in a way that leads me to increase or decrease the probability of a particular event happening?
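The sorting Mauboussin describes maps cleanly onto Bayesian updating: a piece of evidence should move your probability only to the extent that it is more likely under one hypothesis than the other. A sketch with made-up numbers; note that pseudodiagnostic information, being roughly equally likely either way, shouldn’t move the estimate at all:

```python
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: revised probability of a hypothesis after
    seeing one piece of evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

prior = 0.30  # initial probability of the event

# Diagnostic evidence: twice as likely if the hypothesis is true.
print(round(update(prior, 0.8, 0.4), 3))  # 0.462 -- estimate moves up

# Pseudodiagnostic evidence: equally likely either way.
print(round(update(prior, 0.5, 0.5), 3))  # 0.3 -- no change
```

The likelihood ratio is what makes information diagnostic; if it is close to 1, the information is noise no matter how interesting it feels.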

Michael Mauboussin, Interview No. 4

Michael Mauboussin is the author of numerous books, including More Than You Know: Finding Financial Wisdom in Unconventional Places, Think Twice: Harnessing the Power of Counterintuition, and most recently The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing (a book that found its way to Warren Buffett’s desk.)

While Michael is well known in investment circles for his knowledge of biases and clarity of thinking, a lot of others are missing out on his insight. You need no more proof than listening to his interview on my podcast, The Knowledge Project.

His latest book looks at how both skill and luck play a role in outcomes; the two exist on a continuum. For instance, he estimates that basketball outcomes are roughly 12 percent luck, whereas hockey is roughly 53 percent luck. Skill plays a role everywhere, but talent means more in some arenas than in others.

As part of my irregular series of interviews, Michael and I talk about what advice he’d offer his younger self today, the definition of luck, decision journals, and how organizations can improve their decisions and more.

Let’s get started.

I believe you graduated with a B.A. in Government. How did you end up starting your career as a packaged food analyst?

My first job after college was in a training program at Drexel Burnham Lambert. We had classroom training and rotated through more than a dozen departments in the investment bank. It was during those rotations I realized that I enjoyed research and that it suited my skills reasonably well.

In the early to mid-1980s, Drexel had a great food industry analyst – to this day, I believe he’s the best analyst I’ve ever seen. So I naturally followed him closely. Shortly after I left Drexel, I was able to secure a job as a junior analyst working with two analysts – one following capital goods and the other food, beverage, and tobacco. From there I was able to secure a job as a senior analyst following the packaged food industry at County NatWest. Interestingly, County NatWest had taken over a good chunk of Drexel’s equity business after Drexel went bankrupt. So I was back in a familiar environment.

So I guess the answer has three elements. First, I was exposed to an inspirational analyst. Second, I found research to be an area I greatly enjoyed. And, finally, I was a complete failure at the job for which I was trained—a financial advisor. So failure played a big role as well.

If you could hop on the elevator with your younger self going into your first day on the job, what would you say?

I would probably suggest the motto of the Royal Society – “nullius in verba” – which roughly translates to “take nobody’s word for it.” Basically, the founders were urging their colleagues to avoid deferring to authority and to verify statements by considering facts. They wanted to make sure everyone would think for themselves.

In the world of investing, that means constant learning—which entails constant reading. So I would encourage my younger self to read widely, to constantly learn, and to develop points of view independent of what others say and based on facts. Specifically, I would recommend developing the habit of reading. Constantly ask good questions and seek to answer them.

I noticed you recently moved back to Credit Suisse after a stint at Legg Mason. Can you tell me a little about your job there and how it’s different?

Over the years I have been fortunate to have sponsors who have allowed me to do work that is a little off the beaten path. Brady Dougan, now chief executive officer of Credit Suisse, has been one of those people. So when the time came to make a job switch, I was lucky to have a conversation with someone with whom I worked before and who understood the kind of work I do.

My job covers four different parts of the bank, including corporate, investment banking, securities, and the private bank. As I work with talented colleagues in each of these areas, I get to live the firm’s objective of operating as one bank.

My actual research will continue to dwell on four areas. The first is capital markets theory, the attempt to better understand issues of market efficiency. Second is valuation, an area I’ve always been very focused on. Third is competitive strategy analysis—and in particular I’m keen on understanding the intersection of competitive strategy and valuation. Finally, I work on aspects of decision making. How do we make decisions, and what kinds of mistakes are we prone to?

So the types of problems I’m working on will be similar to the past but the constituencies I get to work with are more diverse.

It seems that today, more than ever, people are going to Wall Street with very similar backgrounds. How do you see the impact of this?

For the last few decades Wall Street has attracted a lot of bright people. By and large, the folks I deal with on the Street are smart, thoughtful, and motivated. The key to robust markets and organizations is diversity of thought. And I don’t personally find such diversity greatly lacking.

There are a couple of areas worth watching, though. There does seem to be evidence that hedge funds are increasingly moving into similar positions—often called a “crowded trade.” This was exemplified by the trades of Long-Term Capital Management. Crowded trades can be a big problem.

Somewhat related is the world of quantitative analysis. Many quants read the same papers, use the same data sets, and hence put on similar trades. We’ve seen some hiccups in the quantitative world—August 2007 is a good illustration—and there may well be more to come.

Your most recent book, The Success Equation, points out something that few people seem to consider: that most things in life are a mixture of luck and skill. What’s the mental model we can take away from this?

The main point is to think critically about the activity you’re participating in and consider how much luck contributes to the outcome. In some realms it’s negligible, such as a running race. But in others, it’s huge. Once you understand luck’s role, you can understand how to approach the activity more thoughtfully, including how you develop skill and interpret results.

But I can tell you that our minds are horrible at understanding luck. So any mental model has to overcome our natural tendency to think causally—that is, that good outcomes are the result of good skill and bad outcomes reflect bad skill.

You say, convincingly, that we need to accept the presence of luck so that we can understand where it is not a useful teacher. But we often interpret luck as a quality individuals possess, similar to “judgment” or “good instinct,” rather than as simply the expression of statistics in our lives. What are your thoughts about Napoleon’s famous question regarding a general being praised to him, “yes, yes, I know he is brilliant. But tell me, is he lucky?”

I have read that Napoleon quotation many times and don’t really know what he was trying to convey. Perhaps the answer lies in how you define luck.

Naturally, in writing a book about luck and skill I had to spend a lot of time trying to define luck. I settled on the idea that luck exists when three conditions are in place: it operates for an individual or organization; it can be good or bad; and it is reasonable to expect a different outcome to occur.

Another similar way to think about it is skill is what’s within your control and luck is what is outside your control. If there’s something you can do to improve your lot, I would call that skill.

Now I don’t want to deny that intuition exists. It does. It just happens to dwell in specific domains and hence is vastly rarer than people think. Specifically, intuition can be developed in activities that are stable and linear. This applies to chess, for example, or many sports. In these activities proper training can develop intuition. But in fields that are unstable and non-linear, all bets are off regarding intuition. The problem is we generally don’t distinguish the activity before considering how good intuition is likely to be.

So to answer the question, I wouldn’t want to bet on anyone who has truly succeeded by dint of luck because luck by definition is unlikely to persist.

You describe creeping determinism, the desire we have to give past events rational causes and thus make them inevitable. It is a myth of control. But we have a huge desire to see control. It seems preferable even to give control to someone else rather than to deny it existed at all. I’m not sure we are psychologically capable of saying that something in the past happened because of a conjunction of events and actions without any overriding intent or plan. What do you think?

This reminds me of something Victor Hugo said: “The mind, like nature, abhors a vacuum.” It is psychologically extremely difficult to attribute something to luck. The reason is that in the left hemisphere of our brains is a part that neuroscientists call the “interpreter.” The job of the interpreter is to create a cause for all the effects it sees. Now in most cases, the cause and effect relationships it comes up with make perfect sense. Throw a rock at a window and the window smashes. No problem.

The key is that the interpreter doesn’t know anything about luck. It didn’t get the memo. So the interpreter creates a story to explain results that are attributable solely to luck. The key is to realize that the interpreter operates in all of our brains all of the time. Almost always, it’s on the mark. But when you’re dealing with realms filled with luck, you can be sure that the interpreter will create a narrative that is powerful and false.

You talk about what happens when companies, for example, hire stars, and how very often that proves to be an expensive failure, because so much of the star’s success is attributed to the individual him or herself while the specific context of that success is ignored. It seems to me there must be hundreds of case studies proving this is true, and similarly an overwhelming amount of data supporting your thoughts on hiring sports stars on contracts that don’t take into account their declining skills. A vast amount of data supports your views, yet the hiring of stars continues; it must be one of the most quantitatively demonstrably false assumptions in the business and sports worlds. So why does it continue?

I think there are two related underlying factors. The first is a failure to recognize reversion to the mean. Let’s take sports as an example. If a player has a great year statistically, we can almost always conclude that he was skillful and lucky. He gets bid away by another team based on his terrific results. What happens next? Well, on average his skill may remain stable but his luck will run out. So his performance will revert to the mean. Reversion to the mean is a very subtle concept that most people think they understand but few actually do. Certainly, the aggregate behaviors of people suggest that they don’t understand reversion to the mean.
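Reversion to the mean is easy to demonstrate with a toy model (mine, not from the book): give each player a fixed skill plus fresh random luck each season, then watch the season-one leaders fall back toward the pack when their luck fails to repeat.

```python
import random

random.seed(42)
N = 1000

# Each player: a fixed skill level, plus new luck every season.
skill = [random.gauss(0, 1) for _ in range(N)]
season1 = [s + random.gauss(0, 1) for s in skill]
season2 = [s + random.gauss(0, 1) for s in skill]

# Identify the top 10% of performers in season one...
top = sorted(range(N), key=lambda i: season1[i], reverse=True)[: N // 10]

avg1 = sum(season1[i] for i in top) / len(top)
avg2 = sum(season2[i] for i in top) / len(top)

# ...and note their average falls in season two:
# the skill persisted, the luck did not.
print(f"top decile, season one: {avg1:.2f}")
print(f"same players, season two: {avg2:.2f}")
```

The season-one stars remain above average in season two, because their skill is real, but by noticeably less, because part of what got them to the top was luck that does not carry over.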

The second underlying factor is underestimating the role of social context in success. An individual within an organization is not operating independently; she is surrounded by colleagues and an infrastructure that contribute to her outcomes. When high-performing individuals go from one organization to another, the social context changes, and generally that has a negative impact on results.

Do you think we make similar mistakes when we promote people in organizations? Specifically, I’m thinking that hiring can get really complicated if you have to look at how long someone has been doing a job, the people they work with, the difficulty of the job itself, there are so many variables at play that it’s hard to tease out skill versus luck. How can we get better at this when it comes to promoting people internally?

This can be a challenge. But there’s an interesting angle here. Typically, the lower you are in an organization, the easier it is to measure your skill. That’s because basic functions are generally “algorithmic,” people are executing their jobs based on certain known principles. So outcomes are an accurate reflection of skill.

But as you move up in an organization, luck often plays a bigger role. For example, developing a strategy for a new product is no sure thing—luck can play a large role in shaping the strategy’s success or failure. Said differently, even strategies that are really well thought through will fail some percentage of the time as the result of bad luck.

So as people move up in organizations, it makes sense to pay more attention to the process of decision making than the outcomes alone. For example, I would argue that capital allocation is a CEO’s most important job. And capital allocation is inherently based on process.

So as individuals advance in their careers, their duties often slide towards the luck side of the continuum. Furthermore, you note that fluid intelligence peaks at around age 20. What does this say about leadership? If I were to be extreme, I would interpret you as saying that people assume or are given positions of leadership at the very time when they are least fitted to be leaders. For example, the median age of U.S. presidents is 54. That is not an age at which I should expect someone to deal well with complex or unprecedented issues and decisions. I don’t consider the corresponding increase in crystallized intelligence to be adequate compensation. And what does this say about leadership development, executive coaching, and the like? If I am an executive coach, should I be explaining to my clients that their biggest mistake will be to ignore the overwhelming role luck will play in their success as leaders?

There are a couple of issues here. First, as you mentioned, cognitive performance combines fluid intelligence, which peaks when you are young, and crystallized intelligence, which tends to grow throughout your life. For older people the problem is not that they don’t have the knowledge or wisdom to make good decisions, it’s that they tend to become cognitively lazy and fall back on rules of thumb that served them well in the past. Using terms that Danny Kahneman popularized, they rely more on System 1—fast, automatic, and difficult to train—and less on System 2, which is analytical, slower, and more purposeful. You can overcome this tendency by having others work with you on your decision making, ensuring that you’re looking at the proper options, and considering consequences as completely as possible.

If I were an executive coach, I would try to focus each individual on the facets they can control. Emphasizing what’s in your control allows you to adopt an attitude of equanimity toward luck. You’ve done all that you can, and from there you have to live with the results—good or bad.

I thought your description of Mark Granovetter’s research on phase transitions was fascinating. But this seems to me to contradict the idea of the fundamental attribution error and, in fact, to make fundamental attribution a real, existing phenomenon. If we talk about, for example, an organization that is trying to change a common behavior of its executives (say, getting them to stop taking calls or messages on their cellphones when they are in meetings), then it seems to me that if the CEO models this new behavior it has a great likelihood of becoming the norm. If he does not, then the likelihood is nil. So this would be an example of fundamental attribution. Would this not be the same for more complex issues, where a specific action or behavior by the leader of the organization would be the cause of a phase transition?

I think the common denominator of both of your thoughts is the role of social context. Granovetter’s model emphasizes how small changes in the network structure can lead to disproportionately different outcomes. This is very important for any kind of diffusion process—be it the flu, an idea, a new technology, or a social behavior. The spread of disease provides a vivid example. There are basically two essential parameters in understanding disease propagation: the contagiousness of the disease and the degree of interaction between people in the population. A disease only spreads quickly if contagiousness and interaction are high.

The disease metaphor works pretty well for the diffusion of any innovation or behavior. The main point is that most products, ideas, and diseases fail to propagate. Some of that is a function of the inherent nature of what’s trying to propagate and some is from the network itself.
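(Ed: To make the two parameters concrete, here is a toy simulation, not from the interview, of how contagiousness and interaction jointly determine whether something propagates. The function and parameter names are hypothetical illustrations.)

```python
import random

def simulate_spread(n_people, contacts_per_step, p_transmit, steps, seed=0):
    """Toy diffusion sketch: a disease, idea, or behavior spreads widely
    only when both contagiousness (p_transmit) and the degree of
    interaction (contacts_per_step) are high. Not a rigorous epidemic model."""
    rng = random.Random(seed)
    infected = {0}  # start with a single carrier
    for _ in range(steps):
        newly = set()
        for _person in infected:
            for _ in range(contacts_per_step):
                contact = rng.randrange(n_people)
                if contact not in infected and rng.random() < p_transmit:
                    newly.add(contact)
        infected |= newly
    return len(infected)

# Both parameters high -> wide propagation; both low -> it fizzles out.
wide = simulate_spread(1000, contacts_per_step=5, p_transmit=0.5, steps=10)
fizzle = simulate_spread(1000, contacts_per_step=1, p_transmit=0.05, steps=10)
```

Running both cases shows the asymmetry Mauboussin describes: most things fail to propagate, and small changes in either parameter produce disproportionately different outcomes.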

The fundamental attribution error says that when we observe the behavior of another person—this is especially true for bad behavior—we tend to attribute the behavior more to the individual than to the social context. But both studies and observations in life show that social context is deeply influential in shaping behavior.

Tying this back to your point, I think it’s crucial for leaders to acknowledge two realities. First, they operate in a social context which shapes their behavior. For example, Warren Buffett talks about the “institutional imperative,” which says among other things that a company will mindlessly imitate the behavior of peer companies.

Second, leaders have to recognize that they create a social context for the decisions of their employees. Some social contexts are not conducive to good decisions. For example, if employees feel too much stress, they will shorten the time horizons for their decisions. So an executive may say he or she is thinking for the long term but his or her behavior may be shaping an environment where employees are focused only on the here and now.

When you talk about the qualities that make a statistic useful, do you have any thoughts on organizations that are trying to be more evidence-based and quantitative in how they measure their performance, yet seem to have great trouble identifying useful statistics? Examples that come to mind are government departments and agencies, particularly those that do not provide a direct public service. What process should these organizations follow to identify useful statistics for measuring effectiveness? Government departments and agencies are notorious for confusing luck and skill.

I suggest two simple criteria for identifying a useful statistic. First, it should be persistent, or what statisticians call “reliable.” A statistic is persistent if it correlates highly with itself over time, and hence is frequently an indicator of skill. Next, it should be predictive, or what statisticians call “valid.” That is, the statistic correlates highly with what the organization is trying to achieve.

With these two criteria in mind, you can adopt a four-step process for finding useful statistics. Step one is to define your governing objective. What is your organization trying to achieve? Step two is to develop a theory of cause and effect to help identify the drivers of success. Step three is identifying the specific actions that people within the organization can take to serve the governing objective. And step four is to regularly evaluate the statistics to see if they are doing the job.

Naturally, luck plays a role in outcomes almost everywhere you look. But this process of selecting statistics gives you the best chance of properly emphasizing what is in the control of the organization.

To sort of round this interview out, I’d like to talk with you about a subject I suspect you spend a lot of time thinking about: improving decisions in organizations. One of your pieces of advice is to create a decision journal. Can you tell me what that looks like to you?

A decision journal is actually very simple to do in principle, but requires some discipline to maintain. The idea is whenever you are making a consequential decision, write down what you decided, why you decided as you did, what you expect to happen, and if you’re so inclined, how you feel mentally and physically. This need not take much time.
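(Ed: For readers who prefer a digital journal, here is one possible sketch of an entry, keyed to the fields Mauboussin lists. The helper function and file name are hypothetical, not his prescription; a paper notebook works just as well.)

```python
import datetime
import json
from pathlib import Path

def log_decision(path, decision, rationale, expected_outcome, feeling=None):
    """Append one decision-journal entry, as a JSON line, to a file.
    Fields mirror the interview: what you decided, why, what you expect,
    and (optionally) how you feel mentally and physically."""
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "decision": decision,
        "rationale": rationale,
        "expected_outcome": expected_outcome,
        "feeling": feeling,
    }
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")  # one JSON line per decision
    return entry

entry = log_decision(
    "journal.jsonl",  # hypothetical file name
    decision="Hire candidate A over candidate B",
    rationale="Stronger track record in comparable roles",
    expected_outcome="Ramped and productive within one quarter",
    feeling="Confident but slightly rushed",
)
```

The append-only format matters more than the tooling: because entries are timestamped and never edited, you can audit your reasoning later without hindsight bias creeping in.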

The value is that you document your thinking in real time and thus immunize yourself against hindsight bias—the pernicious tendency to think that you knew what was going to happen with more clarity than you actually did. The journal also allows you to audit your decision making process, looking for cases where you may have been right for the wrong reasons or wrong for the right reasons.

I imagine what people record in a decision journal is somewhat personal but can you give me an idea of what sorts of things you note?

Since I make few investment decisions, my journal doesn’t have a lot of the material that an investor would have. What I try to do is keep track of meetings, thoughts, and ideas so that I can return to them over time. What I can promise is that if you keep a journal with some detail, you’ll be surprised at how your views change over time.

People’s emotional state has a bearing on how they work. How do you go about accounting for that when making decisions?

Emotional state is enormously important. There is the obvious advice that everyone knows, such as don’t make a consequential decision at a point of high arousal—whether that arousal is positive or negative. We’ve already discussed stress, but I’ll reiterate the point. Too much stress is very debilitating for the process of making good decisions, especially for long-term decisions.

Finally, we all have different personalities and those differences portend strengths and weaknesses. Most great investors are somewhat indifferent about what others think. They feel comfortable following their conviction based on analysis. Investors who are highly attuned to others tend to struggle much more, because they have difficulty being a contrarian.

If there is a single change you could recommend to an organization to improve their decisions, what would it be?

Elizabeth Mannix and Margaret Neale, two well-known psychologists, have a great line in one of their survey papers. They write, “To implement policies and practices that increase diversity of the workforce without understanding how diverse individuals can come together to form effective teams is irresponsible.” I love that. So my answer would be that organizations need to learn how to create and manage diversity within their organizations. Most leaders have no idea how to do that.

Let’s end with a variant on my favorite question. You’ve just taken over a university and are required to pick 3 books for every student to read. What would they be and why?

This is an impossible question to answer!

I’d probably start with The Origin of Species by Charles Darwin. Understanding evolution strikes me as essential to be a good thinker. Naturally, much has come along to fortify Darwin’s ideas, but many of the core ideas are there. Plus, Darwin himself is a wonderful model: hardworking, humble, modest, always learning.

Next I’d go with something very modern, Daniel Kahneman’s Thinking, Fast and Slow. That this is the opus of the man who is likely the greatest living psychologist is reason alone to read it. But my motivation would be that learning how to make good decisions is perhaps the greatest skill you can have in your life. And with some sense of how you are likely to go wrong, perhaps you can increase your odds of getting it right.

Finally, I’d go with How to Read a Book by Mortimer Adler and Charles Van Doren. (Ed: see the cheat sheet.) This is a guide on how to be effective at what should be one of your most important activities in life: reading.

And I’d recommend you read all of Mauboussin’s books, starting with The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing.

(Note: A few of the questions were submitted by a friend of mine: Neil Cruickshank.)

How to Improve Decision Making in Your Organization

When Michael Mauboussin met Nobel Laureate Daniel Kahneman, he asked how we can improve our performance.

Kahneman, the author of Thinking, Fast and Slow, replied, almost without hesitation, that you should go down to the local drugstore and buy a very cheap notebook and start keeping track of your decisions.

Thinking Matters

Whenever you’re making a decision of consequence, take a moment to think. Write down the relevant variables that will govern the outcome, what you expect to happen, and why you expect it to happen. (Optionally, you can add how you feel about the decision and your confidence level in the outcome you expect.)

A journal of this nature will reduce hindsight bias and give you accurate and honest feedback.

It’ll also help you distinguish between when you’re right for the wrong reasons and when you’re wrong for the right reasons.

If you’re anything like me, one thing that you’ll discover is that, on the few occasions when you’re right, it’s often for the wrong reasons.

A Decision Journal

Somewhat surprisingly, few organizations keep track of what’s decided and why.

This seems idiotic when you consider that often thousands of dollars are spent making a decision. Of the few that do keep track of decisions, fewer will be honest about what’s actually discussed.

Even when it’s accurate, for example, few people will write down that the highest-paid person in the room, usually the boss, said to do X, and that’s why you’re doing it.

But that’s kinda the point, isn’t it?

If you can’t write down what was discussed, the relevant variables that govern the decision, and why you think something will play out, then you should seriously think about why you’re making the decision in the first place. It could be that you don’t understand what’s going on at all. If that’s the case, it’s important to know.

I get that we have to make decisions under uncertainty. But we’re not going to learn from those decisions if we don’t keep track of, and review, what’s decided and why.

While people come and go, organizations often seem to make the same mistakes over and over. Improving our ability to make decisions is simple, but not easy.

A decision journal will not only allow you to reduce your hindsight bias, but it will force you to make your rationale explicit upfront. This habit will often surface bad thinking that might have otherwise slipped by.

Still curious? Pair with Daniel Kahneman’s Favorite Approach For Making Better Decisions and How to create and use a decision journal.