How Using a Decision Journal can Help you Make Better Decisions

"Odds are you’re going to discover two things right away. First, you’re right a lot of the time. Second, it’s often for the wrong reasons."
“Odds are you’re going to discover two things. First, you’re right a lot of the time. Second, it’s often for the wrong reasons.”

We’ve talked quite a bit about decision journals, and the one question I get asked more than any other is: what should my decision journal look like?

After all, in most knowledge organizations, your product is decisions: you should care enormously whether you’re making good ones or bad ones. “There ought to be something that is the equivalent to the quality control you can find in manufacturing.”

We know that a good decision process matters more than analysis by a factor of six. A process or framework for making decisions, however, is only one part of an overall approach to making better decisions.

The way to test the quality of your decisions, whether individually or organizationally, is by testing the process by which they are made. And one way to do that is to use a decision journal.

Conceptually this is pretty easy but it requires some discipline and humility to implement and maintain. In consulting with various organizations on how to make better decisions I’ve seen everything from people who take great strides to improve their decision making to people who doctor decision journals for optics over substance.

“The idea,” says Michael Mauboussin, “is whenever you are making a consequential decision, write down what you decided, why you decided as you did, what you expect to happen, and if you’re so inclined, how you feel mentally and physically.”

Whenever you’re making a consequential decision, either individually or as part of a group, take a moment and write down the following (a simple template sketch follows the list):

  1. The situation or context;
  2. The problem statement or frame;
  3. The variables that govern the situation;
  4. The complications or complexity as you see it;
  5. Alternatives that were seriously considered and why they were not chosen (think: the work required to have an opinion);
  6. A paragraph explaining the range of outcomes;
  7. A paragraph explaining what you expect to happen and, importantly, the reasoning and actual probabilities you assign to each. (The degree of confidence matters, a lot.)
  8. Time of day the decision was made and how you feel physically and mentally (if you’re tired, for example, write it down.)

Of course, this can be tailored to the situation and context. Specific decisions might include tradeoffs, weighting criteria, or other relevant factors.

One point worth noting: don’t settle for the brief and obvious insight. Often these first thoughts are System 1, not System 2. Any decision you’re journaling is inherently complex (and may involve non-linear systems). In such a world, small effects can cause disproportionate responses, whereas bigger ones can have no impact at all. Remember that causality is hard to trace, especially in complex domains.

Review
One tip I’ve learned from helping organizations implement this is that there are two common ways people wiggle out of their own decisions: hindsight bias and jargon.

I know we live in an age of computers, but you simply must do this by hand; writing it out yourself helps reduce the odds of hindsight bias. It’s easy to look at a print-out and say, “I didn’t see it that way.” It’s a lot harder to look at your own writing and say the same thing.

Another thing to avoid is vague and ambiguous wording. If you’re talking in abstractions and fog, you’re not ready to make a decision, and you’ll find it easy to change the definitions to suit new information. This is where writing down the probabilities as you see them comes into play.

These journals should be reviewed on a regular basis—every six months or so. The review is an important part of the process. This is where you can get better. Realizing where you make mistakes, how you make them, what types of decisions you’re bad at, etc. will help you make better decisions if you’re rational enough. This is also where a coach can help. If you share your journal with someone, they can review it with you and help identify areas for improvement.

And keep in mind it’s not all about the outcome. You might have made the right decision (which, in our sense, means a good process) and had a bad outcome. We call that a bad break.

Odds are you’re going to discover two things right away. First, you’re right a lot of the time. Second, it’s often for the wrong reasons.

This can be somewhat humbling.

Let’s say you buy a stock and it goes up, but for reasons other than the ones you expected. You’re probably thinking high-fives all around, right? But in a very real sense you were wrong. This feedback is incredibly important.

If you let it, the information provided by this journal will help identify cases where you think you know more than you do but in fact you’re operating outside your circle of competence.

It will also show you how your views change over time, when you tend to make better decisions, and how serious the deliberations were.

A Wonderfully Simple Heuristic to Recognize Charlatans

Nassim Nicholas Taleb

“For the Arab scholar and religious leader Ali Bin Abi-Taleb (no relation), keeping one’s distance from an ignorant person is equivalent to keeping company with a wise man.”

The idea of inversion isn’t new.

While we can learn a lot from what successful people do in the mornings, as Nassim Taleb points out, we can learn a lot from what failed people do before breakfast too.

 

Inversion is actually one of the most powerful mental models in our arsenal. Not only does inversion help us innovate but it also helps us deal with uncertainty.

“It is in the nature of things,” says Charlie Munger, “that many hard problems are best solved when they are addressed backward.”

Sometimes we can’t articulate what we want. Sometimes we don’t know. Sometimes there is so much uncertainty that the best approach is to attempt to avoid certain outcomes rather than attempt to guide towards the ones we desire. In short, we don’t always know what we want but we know what we don’t want.

Avoiding stupidity is often easier than seeking brilliance.

The “apophatic,” writes Nassim Taleb in Antifragile, “focuses on what cannot be said directly in words, from the Greek apophasis (saying no, or mentioning without meaning).”

The method began as an avoidance of direct description, leading to a focus on negative description, what is called in Latin via negativa, the negative way, after theological traditions, particularly in the Eastern Orthodox Church. Via negativa does not try to express what God is— leave that to the primitive brand of contemporary thinkers and philosophasters with scientistic tendencies. It just lists what God is not and proceeds by the process of elimination.

Statues are carved by subtraction.

Michelangelo was asked by the pope about the secret of his genius, particularly how he carved the statue of David, largely considered the masterpiece of all masterpieces. His answer was: “It’s simple. I just remove everything that is not David.”

Where Is the Charlatan?

Recall that the interventionista focuses on positive action—doing. Just like positive definitions, we saw that acts of commission are respected and glorified by our primitive minds and lead to, say, naive government interventions that end in disaster, followed by generalized complaints about naive government interventions, as these, it is now accepted, end in disaster, followed by more naive government interventions. Acts of omission, not doing something, are not considered acts and do not appear to be part of one’s mission.

I have used all my life a wonderfully simple heuristic: charlatans are recognizable in that they will give you positive advice, and only positive advice, exploiting our gullibility and sucker-proneness for recipes that hit you in a flash as just obvious, then evaporate later as you forget them. Just look at the “how to” books with, in their title, “Ten Steps for—” (fill in: enrichment, weight loss, making friends, innovation, getting elected, building muscles, finding a husband, running an orphanage, etc.).

We learn the most from the negative.

[I]n practice it is the negative that’s used by the pros, those selected by evolution: chess grandmasters usually win by not losing; people become rich by not going bust (particularly when others do); religions are mostly about interdicts; the learning of life is about what to avoid. You reduce most of your personal risks of accident thanks to a small number of measures.

Skill doesn’t always win.

In anything requiring a combination of skill and luck, the most skillful don’t always win. That’s one of the key messages of Michael Mauboussin’s book The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. This is hard for us to swallow because we intuitively feel that if you are successful you must have skill, for the same reason that if an outcome is good we assume the decision behind it was good. We can’t predict whether a person who has skills will succeed, but Taleb argues that we can “pretty much predict” that a person without skills will eventually have their luck run out.

Subtractive Knowledge
Taleb argues that the greatest “and most robust contribution to knowledge consists in removing what we think is wrong—subtractive epistemology.” He continues that “we know a lot more about what is wrong than what is right.” What does not work, that is, negative knowledge, is more robust than positive knowledge. This is because it is a lot easier for something we believe to be true to fail than for something we know to be false to turn out right.

There is a whole book on the half-life of what we consider to be ‘knowledge or fact’ called The Half-Life of Facts. Basically, because of our partial understanding of the world, which is constantly evolving, we believe things that are not true. That’s not the only reason that we believe things that are not true but it’s a big one.

The thing is, we’re not so smart. If I’ve only seen white swans, saying “all swans are white” may be accurate given my limited view of the world, but we can never be sure that there are no black swans until we’ve seen everything.

Or as Taleb puts it: “since one small observation can disprove a statement, while millions can hardly confirm it, disconfirmation is more rigorous than confirmation.”

Most people attribute this philosophical argument to Karl Popper, but Taleb dug up some evidence that it goes back to the “skeptical-empirical” medical schools of the post-classical era in the Eastern Mediterranean.

Being antifragile isn’t about what you do, but rather what you avoid. Avoid fragility. Avoid stupidity. Don’t be the sucker. …

Three Steps to Effective Decision Making


Making an important decision is never easy, but making the right decision is even more challenging. Effective decision-making isn’t just about accumulating information and going with what seems to make the most sense. Sometimes, internal biases can impact the way we seek out and process information, polluting the conclusions we reach in the process. It’s critical to be conscious of those tendencies and to accumulate the sort of fact-based and unbiased inputs that will result in the highest likelihood that a decision actually leads to the desired outcome.

In this video, Michael Mauboussin, Credit Suisse’s Head of Financial Strategies, lays out three steps that can help focus a decision-maker’s thinking.

How do we take new information that comes in and integrate it with our point of view?

Typically we don’t really take into consideration new information. The first major barrier to that is something called the confirmation bias. Once you’ve decided on something and you think this is the right way to think about it, you either blow off new information or if it’s ambiguous you interpret it in a way that’s favorable to you. Now the next problem, and we all have this, is called pseudo and subtly-diagnostic information. Pseudodiagnostic means information that isn’t very relevant but you think it is. Subtly-diagnostic is information that is relevant and you don’t pay attention to it.

So the key in all of this is: we have this torrent of information coming in; how do I sort it in a way that leads me to increase or decrease my probabilities of a particular event happening?
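Mauboussin doesn’t prescribe a formula here, but one standard way to formalize “increase or decrease my probabilities” is Bayes’ rule. The sketch below, with made-up numbers, shows how genuinely diagnostic evidence should move an estimate a lot, while pseudodiagnostic evidence should barely move it.

```python
# A sketch of Bayesian updating; the probabilities below are illustrative only.
def update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Posterior probability that the thesis is true after seeing one piece of evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

prior = 0.60  # initial confidence the thesis is right

# Diagnostic information: much likelier to show up if the thesis is true than if it is false.
print(f"after diagnostic evidence:       {update(prior, 0.8, 0.3):.2f}")

# Pseudodiagnostic information: feels relevant but is almost as likely either way,
# so it should barely move the estimate.
print(f"after pseudodiagnostic evidence: {update(prior, 0.55, 0.50):.2f}")
```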

Michael Mauboussin, Interview No. 4

Michael Mauboussin is the author of numerous books, including More Than You Know: Finding Financial Wisdom in Unconventional Places, Think Twice: Harnessing the Power of Counterintuition, and most recently The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing (a book that found its way to Warren Buffett’s desk.)

While Michael is well known in investment circles for his knowledge of biases and clarity of thinking, a lot of others are missing out on his insight.

His latest book takes a look at how both skill and luck play a role in outcomes — they fall on a continuum. For instance, he estimates that basketball is about 12% luck whereas hockey is about 53% luck. Skill still plays a role, but it matters more in some arenas than others.

As part of my irregular series of interviews, Michael and I talk about what advice he’d offer his younger self today, the definition of luck, decision journals, and how organizations can improve their decisions and more.

Let’s get started.

I believe you graduated with a B.A. in Government. How did you end up starting your career as a packaged food analyst?

My first job after college was in a training program at Drexel Burnham Lambert. We had classroom training and rotated through more than a dozen departments in the investment bank. It was during those rotations I realized that I enjoyed research and that it suited my skills reasonably well.

In the early to mid-1980s, Drexel had a great food industry analyst – to this day, I believe he’s the best analyst I’ve ever seen. So I naturally followed him closely. Shortly after I left Drexel, I was able to secure a job as a junior analyst working with two analysts – one following capital goods and the other food, beverage, and tobacco. From there I was able to secure a job as a senior analyst following the packaged food industry at County NatWest. Interestingly, County NatWest had taken over a good chunk of Drexel’s equity business after Drexel went bankrupt. So I was back in a familiar environment.

So I guess the answer has three elements. First, I was exposed to an inspirational analyst. Second, I found research to be an area I greatly enjoyed. And, finally, I was a complete failure at the job for which I was trained—a financial advisor. So failure played a big role as well.

If you could hop on the elevator with your younger self going into your first day on the job, what would you say?

I would probably suggest the motto of the Royal Society – “nullius in verba” – which roughly translates to “take nobody’s word for it.” Basically, the founders were urging their colleagues to avoid deferring to authority and to verify statements by considering facts. They wanted to make sure everyone would think for themselves.

In the world of investing, that means constant learning—which entails constant reading. So I would encourage my younger self to read widely, to constantly learn, and to develop points of view independent of what others say and based on facts. Specifically, I would recommend developing the habit of reading. Constantly ask good questions and seek to answer them.

I noticed you recently moved back to Credit Suisse after a stint at Legg Mason. Can you tell me a little about your job there and how it’s different?

Over the years I have been fortunate to have sponsors who have allowed me to do work that is a little off the beaten path. Brady Dougan, now chief executive officer of Credit Suisse, has been one of those people. So when the time came to make a job switch, I was lucky to have a conversation with someone with whom I worked before and who understood the kind of work I do.

My job covers four different parts of the bank, including corporate, investment banking, securities, and the private bank. As I work with talented colleagues in each of these areas, I get to live the firm’s objective of operating as one bank.

My actual research will continue to dwell on four areas. The first is capital markets theory, the attempt to better understand issues of market efficiency. Second is valuation, an area I’ve always been very focused on. Third is competitive strategy analysis—and in particular I’m keen on understanding the intersection of competitive strategy and valuation. Finally, I work on aspects of decision making. How do we make decisions, and what kinds of mistakes are we prone to?

So the types of problems I’m working on will be similar to the past but the constituencies I get to work with are more diverse.

It seems that today, more than ever, people are going to Wall Street with very similar backgrounds. How do you see the impact of this?

For the last few decades Wall Street has attracted a lot of bright people. By and large, the folks I deal with on the Street are smart, thoughtful, and motivated. The key to robust markets and organizations is diversity of thought. And I don’t personally find such diversity greatly lacking.

There are a couple of areas worth watching, though. There does seem to be evidence that hedge funds are increasingly moving into similar positions—often called a “crowded trade.” This was exemplified by the trades of Long-Term Capital Management. Crowded trades can be a big problem.

Somewhat related is the world of quantitative analysis. Many quants read the same papers, use the same data sets, and hence put on similar trades. We’ve seen some hiccups in the quantitative world—August 2007 is a good illustration—and there may well be more to come.

Your most recent book, The Success Equation, points out something that few people seem to consider: that most things in life are a mixture of luck and skill. What’s the mental model we can take away from this?

The main point is to think critically about the activity you’re participating in and consider how much luck contributes to the outcome. In some realms it’s negligible, such as a running race. But in others, it’s huge. Once you understand luck’s role, you can understand how to approach the activity more thoughtfully, including how you develop skill and interpret results.

But I can tell you that our minds are horrible at understanding luck. So any mental model has to overcome our natural tendency to think causally—that is, that good outcomes are the result of good skill and bad outcomes reflect bad skill.

You say, convincingly, that we need to accept the presence of luck so that we can understand where it is not a useful teacher. But we often interpret luck as a quality individuals possess, similar to “judgment” or “good instinct,” rather than as simply the expression of statistics in our lives. What are your thoughts about Napoleon’s famous question regarding a general being praised to him, “yes, yes, I know he is brilliant. But tell me, is he lucky?”

I have read that Napoleon quotation many times and don’t really know what he was trying to convey. Perhaps the answer lies in how you define luck.

Naturally, in writing a book about luck and skill I had to spend a lot of time trying to define luck. I settled on the idea that luck exists when three conditions are in place: it operates for an individual or organization; it can be good or bad; and it is reasonable to expect a different outcome to occur.

Another similar way to think about it is skill is what’s within your control and luck is what is outside your control. If there’s something you can do to improve your lot, I would call that skill.

Now I don’t want to deny that intuition exists. It does. It just happens to dwell in specific domains and hence is vastly rarer than people think. Specifically, intuition can be developed in activities that are stable and linear. This applies to chess, for example, or many sports. In these activities proper training can develop intuition. But in fields that are unstable and non-linear, all bets are off regarding intuition. The problem is we generally don’t distinguish the activity before considering how good intuition is likely to be.

So to answer the question, I wouldn’t want to bet on anyone who has truly succeeded by dint of luck because luck by definition is unlikely to persist.

You describe creeping determinism, the desire we have to give past events rational causes and thus make them inevitable. It is a myth of control. But we have a huge desire to see control. It seems preferable even to give control to someone else rather than to deny it existed at all. I’m not sure we are psychologically capable of saying that something in the past happened because of a conjunction of events and actions without any overriding intent or plan. What do you think?

This reminds me of something Victor Hugo said: “The mind, like nature, abhors a vacuum.” It is psychologically extremely difficult to attribute something to luck. The reason is that in the left hemisphere of our brains is a part that neuroscientists call the “interpreter.” The job of the interpreter is to create a cause for all the effects it sees. Now in most cases, the cause and effect relationships it comes up with make perfect sense. Throw a rock at a window and the window smashes. No problem.

The key is that the interpreter doesn’t know anything about luck. It didn’t get the memo. So the interpreter creates a story to explain results that are attributable solely to luck. The key is to realize that the interpreter operates in all of our brains all of the time. Almost always, it’s on the mark. But when you’re dealing with realms filled with luck, you can be sure that the interpreter will create a narrative that is powerful and false.

You talk about what happens when companies, for example, hire stars, and how very often that proves to be an expensive failure, because so much of the star’s success is attributed to the individual him or herself, and the specific context of the star’s success is ignored. It seems to me there must be hundreds of case studies that prove this is true, and similarly an overwhelming amount of data that supports your thoughts on hiring sports stars on contracts that don’t take into account their declining skills. A vast amount of data supports your views, yet the hiring of stars continues; it must be one of the most quantitatively demonstrably false assumptions in the business and sports world. So why does it continue?

I think there are two related underlying factors. The first is a failure to recognize reversion to the mean. Let’s take sports as an example. If a player has a great year statistically, we can almost always conclude that he was skillful and lucky. He gets bid away by another team based on his terrific results. What happens next? Well, on average his skill may remain stable but his luck will run out. So his performance will revert to the mean. Reversion to the mean is a very subtle concept that most people think they understand but few actually do. Certainly, the aggregate behaviors of people suggest that they don’t understand reversion to the mean.

The second underlying factor is underestimating the role of social context in success. An individual within an organization is not operating independently; she is surrounded by colleagues and an infrastructure that contribute to her outcomes. When high-performing individuals go from one organization to another, the social context changes, and generally that has a negative impact on results.

Do you think we make similar mistakes when we promote people in organizations? Specifically, I’m thinking that hiring can get really complicated if you have to look at how long someone has been doing a job, the people they work with, and the difficulty of the job itself; there are so many variables at play that it’s hard to tease out skill versus luck. How can we get better at this when it comes to promoting people internally?

This can be a challenge. But there’s an interesting angle here. Typically, the lower you are in an organization, the easier it is to measure your skill. That’s because basic functions are generally “algorithmic,” people are executing their jobs based on certain known principles. So outcomes are an accurate reflection of skill.

But as you move up in an organization, luck often plays a bigger role. For example, developing a strategy for a new product is no sure thing—luck can play a large role in shaping the strategy’s success or failure. Said differently, even strategies that are really well thought through will fail some percentage of the time as the result of bad luck.

So as people move up in organizations, it makes sense to pay more attention to the process of decision making than the outcomes alone. For example, I would argue that capital allocation is a CEO’s most important job. And capital allocation is inherently based on process.

So as individuals advance in their careers, their duties often slide towards the luck side of the continuum. Furthermore, you note that fluid intelligence peaks at age 20. What does this say about leadership? If I were to be extreme, I would interpret you as saying that people assume or are given positions of leadership at the very time when they are least fitted to be leaders. For example, the median age of U.S. presidents is 54. That is not an age when I should expect someone to be able to deal well with complex or unprecedented issues and decisions. I don’t consider the corresponding increase in crystallized intelligence to be adequate compensation. And what does this say about leadership development, executive coaching, and the like? If I am an executive coach, should I be explaining to my clients that their biggest mistake will be to ignore the overwhelming role luck will play in their success as leaders?

There are a couple of issues here. First, as you mentioned, cognitive performance combines fluid intelligence, which peaks when you are young, and crystallized intelligence, which tends to grow throughout your life. For older people the problem is not that they don’t have the knowledge or wisdom to make good decisions, it’s that they tend to become cognitively lazy and fall back on rules of thumb that served them well in the past. Using terms that Danny Kahneman popularized, they rely more on System 1—fast, automatic, and difficult to train—and less on System 2, which is analytical, slower, and more purposeful. You can overcome this tendency by having others work with you on your decision making, ensuring that you’re looking at the proper options, and considering consequences as completely as possible.

If I were an executive coach, I would try to focus each individual on the facets they can control. Emphasizing what’s in your control allows you to adopt an attitude of equanimity toward luck. You’ve done all that you can, and from there you have to live with the results—good or bad.

I thought your description of Mark Granovetter’s research on phase transitions was fascinating. But this seems to me to contradict the idea of the fundamental attribution error, and in fact make fundamental attribution a real, existing phenomenon. If we talk about, for example, an organization that is trying to change a common behavior of its executives (say, getting them to stop taking calls or messages on their cellphones when they are in meetings), then it seems to me that if the CEO models this new behavior it has a great likelihood of becoming the norm. If he does not, then the likelihood is nil. So this would be an example of fundamental attribution. Would this not be the same for more complex issues, where a specific action or behavior by the leader of the organization would be the cause of a phase transition?

I think the common denominator of both of your thoughts is the role of social context. Granovetter’s model emphasizes how small changes in the network structure can lead to disproportionately different outcomes. This is very important for any kind of diffusion process—be it the flu, an idea, a new technology, or a social behavior. The spread of disease provides a vivid example. There are basically two essential parameters in understanding disease propagation: the contagiousness of the disease and the degree of interaction between people in the population. A disease only spreads quickly if contagiousness and interaction are high.
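As a rough illustration (not Granovetter’s actual model), a toy simulation using just those two parameters, contagiousness and number of contacts per step, shows how a small change in either one can tip the outcome from fizzling out to spreading widely. The population size and parameter values are arbitrary.

```python
# Toy contagion sketch: contagiousness p and contacts-per-step k determine whether
# something spreads. Each individual is infectious for exactly one step.
import random

def simulate(population=10_000, p=0.05, k=10, seed=1):
    """Return the fraction of the population eventually reached from one seed case."""
    random.seed(seed)
    infectious = {0}                          # currently infectious individuals
    susceptible = set(range(1, population))
    reached = set()
    while infectious:
        newly = set()
        for _ in infectious:                  # each infectious individual...
            for contact in random.sample(range(population), k):  # ...meets k random people
                if contact in susceptible and random.random() < p:
                    newly.add(contact)
        reached |= infectious
        susceptible -= newly
        infectious = newly
    return len(reached) / population

# Small changes in contagiousness tip the system from a fizzle to a large outbreak:
for p in (0.05, 0.10, 0.15):
    print(f"p={p:.2f}, k=10 -> share of population reached: {simulate(p=p):.2%}")
```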

The disease metaphor works pretty well for the diffusion of any innovation or behavior. The main point is that most products, ideas, and diseases fail to propagate. Some of that is a function of the inherent nature of what’s trying to propagate and some is from the network itself.

The fundamental attribution error says that when we observe the behavior of another person—this is especially true for bad behavior—we tend to attribute the behavior more to the individual than to the social context. But both studies and observations in life show that social context is deeply influential in shaping behavior.

Tying this back to your point, I think it’s crucial for leaders to acknowledge two realities. First, they operate in a social context which shapes their behavior. For example, Warren Buffett talks about the “institutional imperative,” which says among other things that a company will mindlessly imitate the behavior of peer companies.

Second, leaders have to recognize that they create a social context for the decisions of their employees. Some social contexts are not conducive to good decisions. For example, if employees feel too much stress, they will shorten the time horizons for their decisions. So an executive may say he or she is thinking for the long term but his or her behavior may be shaping an environment where employees are focused only on the here and now.

When you talk about the qualities that make a statistic useful, do you have any thoughts on organizations that are trying to be more evidence-based and quantitative in how they measure their performance, yet seem to have great trouble identifying useful statistics? Examples that come to mind would be government departments and agencies, particularly those that do not provide a direct public service. What is the process these organizations should follow to identify useful statistics for measuring effectiveness? Government departments and agencies are notorious for confusing luck and skill.

I suggest two simple criteria for identifying a useful statistic. First, it should be persistent, or what statisticians call “reliable.” A statistic is persistent if it correlates highly with itself over time, and hence is frequently an indicator of skill. Next, it should be predictive, or what statisticians call “valid.” That is, the statistic correlates highly with what the organization is trying to achieve.

With these two criteria in mind, you can adopt a four-step process for finding useful statistics. Step one is to define your governing objective. What is your organization trying to achieve? Step two is to develop a theory of cause and effect to help identify the drivers of success. Step three is identifying the specific actions that people within the organization can take to serve the governing objective. And step four is to regularly evaluate the statistics to see if they are doing the job.

Naturally, luck plays a role in outcomes almost everywhere you look. But this process of selecting statistics gives you the best chance of properly emphasizing what is in the control of the organization.
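As a sketch of how those two criteria could be checked in practice, the snippet below computes simple correlations on made-up data; the variable names and figures are illustrative only, not from Mauboussin.

```python
# A small sketch of the two criteria for a useful statistic, using plain Pearson
# correlations. The data below is made up for illustration.
from statistics import correlation  # Python 3.10+

# A candidate statistic measured for the same units (teams, branches, programs)
# in two consecutive periods, plus the outcome the organization actually cares about.
stat_year_1 = [0.32, 0.41, 0.29, 0.38, 0.45, 0.27]
stat_year_2 = [0.30, 0.43, 0.31, 0.36, 0.44, 0.29]
governing_objective = [1.1, 2.0, 0.9, 1.7, 2.3, 0.8]  # e.g. value delivered per unit

# Persistent ("reliable"): does the statistic correlate with itself over time?
persistence = correlation(stat_year_1, stat_year_2)

# Predictive ("valid"): does it correlate with what you are trying to achieve?
predictiveness = correlation(stat_year_1, governing_objective)

print(f"persistence:    {persistence:.2f}")    # high -> likely reflects skill
print(f"predictiveness: {predictiveness:.2f}")  # high -> worth managing toward
```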

To sort of round this interview out, I’d like to talk with you about a subject I suspect you spend a lot of time thinking about: improving decisions in organizations. One of your pieces of advice is to create a decision journal. Can you tell me what that looks like to you?

A decision journal is actually very simple to do in principle, but requires some discipline to maintain. The idea is whenever you are making a consequential decision, write down what you decided, why you decided as you did, what you expect to happen, and if you’re so inclined, how you feel mentally and physically. This need not take much time.

The value is that you document your thinking in real time and thus immunize yourself against hindsight bias—the pernicious tendency to think that you knew what was going to happen with more clarity than you actually did. The journal also allows you to audit your decision making process, looking for cases where you may have been right for the wrong reasons or wrong for the right reasons.

I imagine what people record in a decision journal is somewhat personal but can you give me an idea of what sorts of things you note?

Since I make few investment decisions, my journal doesn’t have a lot of the material that an investor would have. What I try to do is keep track of meetings, thoughts, and ideas so that I can return to them over time. What I can promise is that if you keep a journal with some detail, you’ll be surprised at how your views change over time.

People’s emotional state has a bearing on how they work. How do you go about accounting for that when making decisions?

Emotional state is enormously important. There is the obvious advice that everyone knows, such as don’t make a consequential decision at a point of high arousal—whether that arousal is positive or negative. We’ve already discussed stress, but I’ll reiterate the point. Too much stress is very debilitating for the process of making good decisions, especially for long-term decisions.

Finally, we all have different personalities and those differences portend strengths and weaknesses. Most great investors are somewhat indifferent about what others think. They feel comfortable following their conviction based on analysis. Investors who are highly attuned to others tend to struggle much more, because they have difficulty being a contrarian.

If there is a single change you could recommend to an organization to improve their decisions, what would it be?

Elizabeth Mannix and Margaret Neale, two well-known psychologists, have a great line in one of their survey papers. They write, “To implement policies and practices that increase diversity of the workforce without understanding how diverse individuals can come together to form effective teams is irresponsible.” I love that. So my answer would be that organizations need to learn how to create and manage diversity within their organizations. Most leaders have no idea how to do that.

Let’s end with a variant on my favorite question. You’ve just taken over a university and are required to pick 3 books for every student to read. What would they be and why?

This is an impossible question to answer!

I’d probably start with The Origin of Species by Charles Darwin. Understanding evolution strikes me as essential to be a good thinker. Naturally, much has come along to fortify Darwin’s ideas, but many of the core ideas are there. Plus, Darwin himself is a wonderful model: hardworking, humble, modest, always learning.

Next I’d go with something very modern, Daniel Kahneman’s Thinking, Fast and Slow. That this is the opus of the man who is likely the greatest living psychologist is reason alone to read it. But my motivation would be that learning how to make good decisions is perhaps the greatest skill you can have in your life. And with some sense of how you are likely to go wrong, perhaps you can increase your odds of getting it right.

Finally, I’d go with How to Read a Book by Mortimer Adler and Charles Van Doren. (Ed: see the cheat sheet.) This is a guide on how to be effective at what should be one of your most important activities in life: reading.

And I’d recommend you read all of Mauboussin’s books, starting with The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing.

(Note: A few of the questions were submitted by a friend of mine: Neil Cruickshank.)

How to Improve Decision Making in Your Organization

When Michael Mauboussin met Nobel Laureate Daniel Kahneman, he asked how we can improve our performance.

Kahneman, the author of Thinking, Fast and Slow, replied, almost without hesitation, that you should go down to the local drugstore and buy a very cheap notebook and start keeping track of your decisions.

Thinking Matters

Whenever you’re making a decision of consequence, take a moment to think. Write down the relevant variables that will govern the outcome, what you expect to happen, and why you expect it to happen. (Optionally, you can add how you feel about the decision and your confidence level in the outcome you expect.)

A journal of this nature will reduce hindsight bias and give you accurate and honest feedback.

It’ll also help you distinguish between when you’re right for the wrong reasons and when you’re wrong for the right reasons.

If you’re anything like me, one thing that you’ll discover is that, on the few occasions when you’re right, it’s often for the wrong reasons.

A Decision Journal

Somewhat surprisingly, few organizations keep track of what’s decided and why.

This seems idiotic when you consider that often thousands of dollars are spent making a decision. Of the few that do keep track of decisions, fewer will be honest about what’s actually discussed.

For example, few people will write down, accurate as it may be, that the highest-paid person in the room, usually the boss, said to do X, and that’s why you’re doing it.

But that’s kinda the point, isn’t it?

If you can’t write down what was discussed, the relevant variables that govern the decision, and why you think something will play out, then you should seriously think about why you’re making the decision in the first place. It could be that you don’t understand what’s going on at all. If that’s the case, it’s important to know.

I get that we have to make decisions under uncertainty. But we’re not going to learn from those decisions if we don’t keep track of, and review, what’s decided and why.

While people come and go, organizations often seem to make the same mistakes over and over. Improving our ability to make decisions is simple, but not easy.

A decision journal will not only allow you to reduce your hindsight bias, but it will force you to make your rationale explicit upfront. This habit will often surface bad thinking that might otherwise have slipped by.

Still curious? Pair with Daniel Kahneman’s Favorite Approach For Making Better Decisions and How to create and use a decision journal.

The Dreamliner — Innovation and Outsourcing

James Surowiecki writing in the New Yorker:

In the past, the F.A.A. was remarkably hesitant to take planes out of service. The problems with the DC-10 were well known to regulators for years before a 1979 crash forced them to ground the plane. But, again, those standards no longer apply. In the nineteen-seventies, after all, airplane crashes occurred with disturbing regularity. Today, they are extraordinarily rare; there hasn’t been a fatal airliner crash in the United States in almost four years. The safer we get, the safer we expect to be, so the performance bar keeps rising. And this, ultimately, is why the decision to give other companies responsibility for the Dreamliner now looks misguided. Boeing is in a business where the margin of error is small. It shouldn’t have chosen a business model where the chance of making a serious mistake was so large.

Some incomplete thoughts:

Michael Mauboussin reminds us that Clay Christensen, author of The Innovator’s Dilemma, believes that outsourcing only makes sense when components are modular.

But, of course, there is a ‘cost’ to being modular too.

“They still think they are in charge,” says Christensen commenting in general on outsourcing, “but they aren’t. They have outsourced their brains without realizing it.”

Christensen believes outsourcing is driven by ratios and returns.

Americans measure profitability by a ratio. There’s a problem with that. No banks accept deposits denominated in ratios. The way we measure profitability is in ‘tons of money’. You use the return on assets ratio if cash is scarce. But if there is actually a lot of cash, then that is causing you to economize on something that is abundant. …Modular disruptors should carry their low-cost business models up-market as fast as possible, to keep competing at the margin against higher-cost makers of proprietary products.

This, however, can lead to disaster.

If you study the root causes of business disasters and management missteps, you’ll often find a predisposition toward endeavors that offer immediate gratification. Many companies’ decision-making systems are designed to steer investments to initiatives that offer the most tangible returns, so companies often favor these and short-change investments in initiatives that are crucial to their long-term strategies.

Will anyone be able to “snap” together a plane in the future?

…if it’s becoming commoditized and modular, you cannot make money at that level in the stack. But the whole industry doesn’t become unprofitable; rather, it’s the activities above and below that original [product or service] where the money is made. And that has to be happening in the pharmaceutical industry, but I can’t see what it is yet. By example, the auto industry is becoming commoditized; cars are being assembled by sub-assemblies from tier-one suppliers. Anybody can get these modules and snap together a car. So it’s really hard to differentiate your car from anybody else’s car, so where the money is being made is in the subsystems that define the performance of the car, and by activities that sit on top of that, like OnStar. That’s where the money is made.

And these thoughts:

Before long, modularity rules, and commoditization sets in. When the relevant dimensions of your product’s performance are determined not by you but by the subsystems, it becomes difficult to earn anything more than subsistence returns. When your world becomes modular, you’ll need to look elsewhere in the value chain to make any serious money.

I’ll end with some more thoughts from Christensen:

But even in a modular architecture, successful companies still are integrated—just in a different place. Consider the computer industry in the 1990s. The computer’s basic performance was more than good enough. What did customers want instead? They wanted lower prices and a computer customized for their needs. Because the product’s functionality was more than good enough, companies like Dell could outsource the subsystems from which its machines were assembled. What was not good enough? The interface with the customer. By directly interacting with customers, Dell could ensure it delivered what customers wanted—convenience and customization. Value flowed to Dell and to the manufacturers of important subsystems that themselves were not good enough, like Microsoft and Intel.

In short, companies must be integrated across whatever interface drives performance along the dimension that customers value. In an industry’s early days, integration typically needs to occur across interfaces that drive raw performance—for example, design and assembly. Once a product’s basic performance is more than good enough, competition forces firms to compete on convenience or customization. In these situations, specialist firms emerge and the necessary locus of integration typically shifts to the interface with the customer.

The real take-away is knowing what industry you’re in, the complexity of your products, and the second and third order effects of outsourcing.

If you do outsource, pay attention to what you’re giving up in terms of information and complexity — what starts as “raw labor” easily moves up the value chain. Pay attention to the reasons for your outsourcing: are they all driven by financial ratios? If so, that’s a red flag. Try to think a decade ahead. I know this all sounds very ambiguous and difficult but if it were easy everyone would know the answer.

Untangling Skill and Luck in Business, Sports, and Investing


I just finished Michael Mauboussin’s latest book, The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing.

In the book Mauboussin goes beyond the general idea that luck is important to outcomes. He explains the type of interactions where luck is important and dives into why we have a difficult time comprehending the role of luck. “The basic challenge,” he writes, “is that we love stories and have a yearning to understand the relationship between cause and effect. As a result, statistical reasoning is hard, and we start to view the past as something that was inevitable.”

Mauboussin goes on to explain how we should untangle skill and luck so we can get a feel for where they fall on the luck-skill continuum. “Where an activity falls on that continuum,” he writes, “provides a great deal of insight into how to deal with it.” Skill tends to follow an arc – improving, stagnating, and then ultimately going lower.

In activities where the results are independent of one another, simple models effectively explain what we see. But when a past result affects a future result, predicting winners becomes very difficult. The most skillful don’t always win.

As a sports fan, I enjoyed the breakdown of the contribution of luck in some professional sports leagues, presented in the book as a chart estimating luck’s share for each league.

And something to keep in mind: “the contribution of luck has been rising steadily over time in most sports, which means that the players are all converging on an equal level of skill.” This doesn’t apply to basketball, though.
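A back-of-the-envelope sketch of why that follows: if an outcome is skill plus independent luck, then as the spread in skill narrows, luck accounts for a larger share of the variance in outcomes. The standard deviations below are illustrative, not estimates for any league.

```python
# Back-of-the-envelope sketch of the "paradox of skill": when outcomes are
# skill plus independent luck, luck's share of outcome variance rises as the
# spread in skill narrows. All numbers are illustrative.
def luck_share(skill_sd: float, luck_sd: float) -> float:
    """Fraction of outcome variance attributable to luck, assuming
    outcome = skill + luck with independent components."""
    return luck_sd**2 / (skill_sd**2 + luck_sd**2)

luck_sd = 1.0  # luck's spread stays roughly constant
for skill_sd in (2.0, 1.0, 0.5):  # players converge toward an equal level of skill
    share = luck_share(skill_sd, luck_sd)
    print(f"skill spread {skill_sd:.1f} -> luck explains {share:.0%} of outcome variance")
```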

Automatic decision making

The problem with this sort of automatic decision-making apparatus is that it only works under very specific circumstances. Intuition works when the environment is stable and an individual has the opportunity to spend a great deal of time learning about it. . . . Trouble arises when individuals rely too heavily on their experience in making automatic decisions. When we age, we tend to avoid exerting too much cognitive effort and deliberating extensively over a decision that needs to be made. We gradually come to rely more on rules of thumb. This means that we make poorer choices in environments that are complex and unstable.

Organizations also lose skill with age.

Probably the best explanation for why companies decline is that they fall prey to organizational rigidities. Companies must balance exploiting profitable markets with exploring new markets. Exploiting known markets requires optimizing processes and executing effectively, and leads to reliable, near-term successes. Exploring unknown markets requires search and experimentation and offers none of the immediate benefits of exploitation.

Finding the best balance between exploration and exploitation depends on the rate of change in the environment. When change comes slowly, the balance can tilt toward exploitation. When it comes quickly, an organization must dedicate more resources to exploration, since profits are quickly exhausted. In general, companies tend to lean more on exploitation, which increases efficiency and profits in the short run but makes the company rigid, a state of affairs that only grows worse with age. Similar to aging individuals, companies rely on methods and rules of thumb that worked well in the past rather than embrace novelty. Companies, too, follow an arc of skill.

Mauboussin also offers some tips. Deliberate practice helps develop differentiating skill when only a little luck is involved in the outcome. That is, the larger the impact of skill on the outcome, the more effective deliberate practice becomes. On the other hand, “where luck is rampant,” he writes, “we must think of skill in terms of a process, because the results don’t provide clear feedback.”

Checklists are also important because they guide behavior.

Sometimes, of course, you want to defend against luck. If you’re a heavy favorite, you want to neutralize the other team’s luck, so your aim should be to simplify the game. When you’re the underdog you want to make the game as complex as possible. Effort and strategy can compensate for a lack of skill.

And if you need another reason to read The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing, you can find a copy on Warren Buffett’s Desk.

Three Things to Consider in Order To Make an Effective Prediction

Michael Mauboussin commenting on Daniel Kahneman:

When asked which was his favorite paper of all-time, Daniel Kahneman pointed to “On the Psychology of Prediction,” which he co-authored with Amos Tversky in 1973. Tversky and Kahneman basically said that there are three things to consider in order to make an effective prediction: the base rate, the individual case, and how to weight the two. In luck-skill language, if luck is dominant you should place most weight on the base rate, and if skill is dominant then you should place most weight on the individual case. And the activities in between get weightings that are a blend.

In fact, there is a concept called the “shrinkage factor” that tells you how much you should revert past outcomes to the mean in order to make a good prediction. A shrinkage factor of 1 means that the next outcome will be the same as the last outcome and indicates all skill, and a factor of 0 means the best guess for the next outcome is the average. Almost everything interesting in life is in between these extremes.

To make this more concrete, consider batting average and on-base percentage, two statistics from baseball. Luck plays a larger role in determining batting average than it does in determining on-base percentage. So if you want to predict a player’s performance (holding skill constant for a moment), you need a shrinkage factor closer to 0 for batting average than for on-base percentage.
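As a sketch of the arithmetic, with illustrative (not official) base rates and shrinkage factors:

```python
# Minimal sketch of the shrinkage idea: the prediction is a weighted blend of the
# last observed outcome and the base-rate average. Numbers are illustrative only.
def predict(last_outcome: float, base_rate: float, c: float) -> float:
    """c = 1 -> repeat the last outcome (all skill); c = 0 -> revert fully to the mean (all luck)."""
    return c * last_outcome + (1 - c) * base_rate

league_avg_ba, league_avg_obp = 0.250, 0.320   # assumed base rates
player_ba, player_obp = 0.320, 0.400           # last season's results

# Batting average involves more luck, so it gets a shrinkage factor closer to 0
# than on-base percentage does.
print(f"projected BA:  {predict(player_ba, league_avg_ba, c=0.35):.3f}")
print(f"projected OBP: {predict(player_obp, league_avg_obp, c=0.65):.3f}")
```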

I’d like to add one more point that is not analytical but rather psychological. There is a part of the left hemisphere of your brain that is dedicated to sorting out causality. It takes in information and creates a cohesive narrative. It is so good at this function that neuroscientists call it the “interpreter.”

Now no one has a problem with the suggestion that future outcomes combine skill and luck. But once something has occurred, our minds quickly and naturally create a narrative to explain the outcome. Since the interpreter is about finding causality, it doesn’t do a good job of recognizing luck. Once something has occurred, our minds start to believe it was inevitable. This leads to what psychologists call “creeping determinism” – the sense that we knew all along what was going to happen. So while the single most important concept is knowing where you are on the luck-skill continuum, a related point is that your mind will not do a good job of recognizing luck for what it is.

Mauboussin is the author of The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing.