The Probability Distribution of the Future

The best colloquial definition of risk may be the following:

“Risk means more things can happen than will happen.”

We found it through the inimitable Howard Marks, but it's a quote from Elroy Dimson of the London Business School. Doesn't that capture it pretty well?

Another way to state it is: If there were only one thing that could happen, how much risk would there be, except in an extremely banal sense? You'd know the exact probability distribution of the future. If I told you there was a 100% probability that you'd get hit by a car today if you walked down the street, you simply wouldn't do it. You wouldn't call walking down the street a “risky gamble,” right? There's no gamble at all.

But the truth is that in practical reality, there aren't many 100% situations to bank on. Way more things can happen than will happen. That introduces great uncertainty into the future, no matter what type of future you're looking at: An investment, your career, your relationships, anything.

How do we deal with this in a pragmatic way? The investor Howard Marks puts it this way in one of his memos:

Key point number one in this memo is that the future should be viewed not as a fixed outcome that’s destined to happen and capable of being predicted, but as a range of possibilities and, hopefully on the basis of insight into their respective likelihoods, as a probability distribution.

This is the most sensible way to think about the future: A probability distribution where more things can happen than will happen. Knowing that we live in a world of great non-linearity and with the potential for unknowable and barely understandable Black Swan events, we should never become too confident that we know what's in store, but we can also appreciate that some things are a lot more likely than others. Learning to adjust probabilities on the fly as we get new information is called Bayesian updating.
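
To make that concrete, here is a minimal sketch of a single Bayesian update in Python. The numbers (a 30% prior, and the chances of a pilot test passing under each outcome) are entirely hypothetical; the point is just the mechanics of revising a probability as new evidence arrives.

```python
# A minimal sketch of a Bayesian update; all numbers are hypothetical.
# Prior: we think there's a 30% chance a project succeeds.
# Evidence: a pilot test passes. Suppose pilots pass 80% of the time
# for projects that ultimately succeed, but 40% of the time for
# projects that ultimately fail.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

posterior = bayes_update(prior=0.30,
                         p_evidence_if_true=0.80,
                         p_evidence_if_false=0.40)
print(f"Updated probability of success: {posterior:.2f}")  # ~0.46
```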

But.

Although the future is certainly a probability distribution, Marks makes another excellent point in the wonderful memo above: In reality, only one thing will happen. So you must make the decision: Are you comfortable if that one thing happens, whatever it might be? Even if it only has a 1% probability of occurring? Echoing the first lesson of biology, Warren Buffett stated that “In order to win, you must first survive.” You have to live long enough to play out your hand.

Which leads to an important second point: Uncertainty about the future does not necessarily equate to risk, because risk has another component: Consequences. You can only judge how “bad” a bad outcome really is if you know its (rough) magnitude. So in order to think about the future and about risk, we must learn to quantify.

It's like the old saying (usually uttered right before something terrible happens): What's the worst that could happen? Let's say you propose to undertake a six-month project that will cost your company $10 million, and you know there's a reasonable probability that it won't work. Is that risky?

It depends on the consequences of losing $10 million, and the probability of that outcome. It's that simple! (Simple, of course, does not mean easy.) A company with $10 billion in the bank might consider that a very low-risk bet even if it only had a 10% chance of succeeding.

In contrast, a company with only $10 million in the bank might consider it a high-risk bet even if it had only a 10% chance of failing. Maybe five $2 million projects with uncorrelated outcomes would make more sense to the latter company.

In the real world, risk = probability of failure x consequences. That concept, however, can be looked at through many lenses. Risk of what? Losing money? Losing my job? Losing face? Those things need to be thought through. When we observe others being “too risk averse,” we might want to think about which risks they're truly avoiding. Sometimes risk is not only financial. 
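
To put rough numbers on that formula, here is a small Python sketch using the hypothetical $10 million project from the example above. The figures are illustrative only; the takeaway is that the same expected loss carries very different consequences depending on the balance sheet behind it.

```python
# Rough sketch of risk = probability of failure x consequences,
# using the hypothetical $10 million project from the example above.

def expected_loss(p_failure, loss_if_failure):
    return p_failure * loss_if_failure

project_cost = 10_000_000   # the whole investment is lost on failure
p_fail = 0.90               # only a 10% chance of succeeding

loss = expected_loss(p_fail, project_cost)   # $9 million expected loss

# The consequences depend on what that loss does to the balance sheet.
for cash_on_hand in (10_000_000_000, 10_000_000):
    share = loss / cash_on_hand
    print(f"${loss:,.0f} expected loss is {share:.2%} "
          f"of a ${cash_on_hand:,} cash pile")
# -> 0.09% for the $10 billion company, 90% for the $10 million one.
```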

***

Let's cover one more under-appreciated but seemingly obvious aspect of risk, also pointed out by Marks: Knowing the outcome does not teach you about the risk of the decision.

This is an incredibly important concept:

If you make an investment in 2012, you’ll know in 2014 whether you lost money (and how much), but you won’t know whether it was a risky investment – that is, what the probability of loss was at the time you made it.

To continue the analogy, it may rain tomorrow, or it may not, but nothing that happens tomorrow will tell you what the probability of rain was as of today. And the risk of rain is a very good analogue (although I’m sure not perfect) for the risk of loss.

How many times do we see this simple dictum violated? Knowing that something worked out, we argue that it wasn't that risky after all. But what if, in reality, we were simply fortunate? This is the Fooled by Randomness effect.

The way to think about it is the following: The worst thing that can happen to a young gambler is that he wins the first time he goes to the casino. He might convince himself he can beat the system.

The truth is that most times we don't know the probability distribution at all. Because the world is not a predictable casino game — treating it as if it were is the error Nassim Taleb calls the Ludic Fallacy — the best we can do is guess.

With intelligent estimations, we can work to get the rough order of magnitude right, understand the consequences if we're wrong, and always be sure to never fool ourselves after the fact.

If you're into this stuff, check out Howard Marks' memos to his clients, or his excellent book, The Most Important Thing. Nate Silver also has an interesting, similar idea about the difference between risk and uncertainty. And lastly, another guy who understands risk pretty well is Jason Zweig, who we've interviewed on our podcast before.

***

If you liked this article you'll love:

Nassim Taleb on the Notion of Alternative Histories — “The quality of a decision cannot be solely judged based on its outcome.”

The Four Types of Relationships — As Seneca said, “Time discovers truth.”

Second-Order Thinking: What Smart People Use to Outperform

“Experience is what you got when you didn’t get what you wanted.”
— Howard Marks

***

Second Order Thinking

Successful decision making requires thoughtful attention to many separate aspects.

Decision making is as much art as science. The goal, if we have one, is not to make perfect decisions but rather to make better than average decisions and get better over time. Doing this requires better insight or making fewer errors. One of the ways to gain insight and make fewer mistakes is the use of second-order thinking.

In most of life, you can get a step ahead of others by going to the gym or the library, or even a better school. In thinking, however, a lot of what you'd think gets you ahead is only window dressing.

Would-be thinkers and deciders can attend the best schools, take the best courses, and, if they are lucky, attach themselves to the best mentors. Yet only a few of them will achieve the skills and superior insight necessary to be an above-average thinker.

But how do we become a better thinker in a world where everyone else is also smart and well-informed? How do we improve in a world that is increasingly becoming computerized?

You must find an edge. You must think differently.

Second-Order Thinking

In his exceptional book, The Most Important Thing, Howard Marks hits on the concept of second-order thinking, which he calls second-level thinking.

First-level thinking is simplistic and superficial, and just about everyone can do it (a bad sign for anything involving an attempt at superiority). All the first-level thinker needs is an opinion about the future, as in “The outlook for the company is favorable, meaning the stock will go up.” Second-level thinking is deep, complex and convoluted.

Second-order thinkers take into account a lot of what we put into our decision journals. Things like: What is the range of possible outcomes? What’s the probability I’m right? What’s the follow-on? How could I be wrong?

The real difference for me is that first-order thinkers are the people who look for things that are simple, easy, and defendable. Second-order thinkers push harder and don't accept the first conclusion.

“It’s not supposed to be easy. Anyone who finds it easy is stupid.”

— Charlie Munger

Marks writes:

First-level thinkers think the same way other first-level thinkers do about the same things, and they generally reach the same conclusions. By definition, this can’t be the route to superior results.

This is where things get interesting. Extraordinary performance comes from being different. It must be that way. Of course, below-average performance comes from being different too — on the downside.

The Necessity of Smart Divergence

“The problem is that extraordinary performance comes only from correct non-consensus forecasts, but non-consensus forecasts are hard to make, hard to make correctly and hard to act on,” writes Marks.

You can’t do the same things that other people are doing and expect to outperform. When you do what everyone else does, you're going to get the same results everyone else gets.

It's not enough to be different — you also need to be correct. The goal is not blind divergence but rather a way of thinking that sets you apart from others. A way of thinking that gives you an advantage.

We can look at this as a simple two-by-two matrix (via The Most Important Thing).

[Figure: Second-level thinking matrix]

I’m generalizing a bit here, but if your thoughts and behavior are conventional, you’re likely to get conventional results. Steve Jobs was right.

This is where loss aversion comes in. Most people are simply unwilling to be wrong because that means they might look like a fool. Yet this is a grave mistake.

The ability to risk looking like an idiot is necessary for being different. You never look like a fool if you look like everyone else.

“Worldly wisdom teaches that it is better for reputation to fail conventionally than to succeed unconventionally.”

— John Maynard Keynes

Conventional thinking and behavior are safe. But they almost guarantee mediocrity. To get an edge, you need to know when your performance is likely to be improved by being unconventional.

Second-order thinking takes a lot of work. It's not easy. However, this is a smart way to separate yourself from the masses.

Here’s a pro tip. If you want to have fun at work this week, try one of two things. First, start digging below the surface of people’s opinions: ask them why they think what they think. Second, ask them to take the other side of the argument.

Developing a Mental Framework for Effective Thinking

Becoming a better thinker means understanding the way you think and developing a way of approaching problems that allows you to see things from multiple lenses. These lenses, or mental models, are built on the foundations of physics, biology, math, psychology, as well as history and economics. The more tools you have in your mental toolbox the better able you will be to make an incrementally better decision.

These tools also allow you to better understand when to follow and when to reject conventional wisdom. Ideally you want to go through them checklist style — just run right through them — asking what applies.

Consilient Thinker

John Snow was a doctor based in London during the acute cholera outbreak of the summer of 1854. He is a powerful example of the impact a lollapalooza effect can have: several ideas combining to produce an unusually powerful result. Earlier in his career, Snow had developed systems to ease the pain of surgery with ether and chloroform.

In the book The Ghost Map, author Steven Johnson explains:

Snow was a truly consilient thinker, in the sense of the term as it was originally formulated by the Cambridge philosopher William Whewell in the 1840s (and recently popularized by Harvard biologist E. O. Wilson). “The Consilience of Inductions,” Whewell wrote, “takes place when an Induction, obtained from one class of facts, coincides with an Induction obtained from another different class. This Consilience is a test of the truth of the Theory in which it occurs.” Snow’s work was constantly building bridges between different disciplines, some which barely existed as functional sciences in his day, using data on one scale of investigation to make predictions about behavior on other scales. In studying ether and chloroform, he had moved from the molecular properties of the gas itself, to its circulation of those properties throughout the body’s overall system, to the psychological effects produced by these biological changes. He even ventured beyond the natural world into the design of technology that would best reflect our understanding of the anesthetics. Snow was not interested in individual, isolated phenomena; he was interested in chains and networks in the movement from scale to scale. His mind tripped happily from molecules to cells to brains to machines, and it was precisely that consilient study that helped Snow uncover so much about this nascent field in such a shockingly short amount of time.

Suspending belief in the prevailing theory of how diseases were spread, Snow ended up rejecting the miasma theory, which held that the disease traveled via “bad air.” He did this through science: he conducted interviews with residents and traced the majority of cases back to a single water source. His willingness to challenge conventional thinking, combined with his habit of approaching the problem through multiple lenses, led him to the deadly source and to changes in municipal water systems from that day forward.

***

Elements of the mental framework

Charlie Munger is a strong advocate of a mental framework. In Damn Right: Behind the Scenes with Berkshire Hathaway Billionaire Charlie Munger, he offered five simple notions that help solve complex problems.

In The Focused Few: Taking a Multidisciplinary Approach to Focus Investing, Richard Rockwood explores concepts from many disciplines. Adding them together can yield a useful mental checklist.

Element 1: Invert

In The Focused Few, Rockwood writes:

Inverting, or thinking problems through backward, is a great way to understand information. Charlie Munger provides the best illustration I have ever seen of this type of thinking.

During a speech he offered an example of how a situation could be examined using the inversion process. He discussed the development process of Coca-Cola from the perspective of a person creating a soda company from scratch and examining the key issues that would need to be resolved to make it a reality.

He listed some of the issues the entrepreneur would need to address:

  • What kind of properties should the new drink strive for, and what are those it should avoid? One property the drink should not have is an aftertaste. Consumers should be able to consume large quantities over a period of time and not be deterred by an unpleasant aftertaste.
  • The soda should be developed in such a manner that it can be shipped in large quantities at minimal costs. This makes it easier to develop an efficient, large-scale distribution system.
  • Keeping the soda formulation a secret will help alleviate competition and create a certain aura of mystique around the product.
  • The company also can deter competition by expanding the business as quickly as possible. For example, the distribution system could be expanded until it reaches a critical mass that competitors would find hard to duplicate without massive capital expenditures.

Element 2: First- and second-level thinking

In The Focused Few, Rockwood writes:

Let’s examine the decision-making process by breaking it down into two components. The first component, first-level thinking, generally occurs when you make decisions quickly based on a simple theme or common sense. For example, a person may decide to invest in a company simply because its products are trendy. Making decisions based on first-level reasoning has significant problems, however. Common sense “is wonderful at making sense of the world, but not necessarily at understanding it.” (Duncan Watts, Everything Is Obvious: How Common Sense Fails Us)

The danger is that you may think you understand a particular situation when in fact you have only developed a likely story.

Second-level thinkers, in contrast, approach decisions differently. What kinds of questions should a second-level thinker ask?

In his book, The Most Important Thing: Uncommon Sense for the Thoughtful Investor, Howard Marks provides a useful list of questions to ask.

  1. What is the range of likely future outcomes?
  2. Which outcome do I think will occur?
  3. What is the probability that I’m right?
  4. What is the prevailing consensus?
  5. How does my expectation differ from the consensus?
  6. How does the current price for the asset comport with the consensus view of the future — and with mine?
  7. Is the consensus psychology that is incorporated into the price too bullish or bearish?
  8. What will happen to the asset’s price if the consensus turns out to be right, and what if I’m right?

Element 3: Use decision trees

In The Focused Few, Rockwood writes:

Decision trees are excellent tools for helping you decide on a course of action. They enable you to lay out several possible scenarios, investigate their possible outcomes, and create a balanced picture of the risks and rewards associated with each.

[…]

Let’s examine the decision-tree process in greater detail. First, identify the decision and the outcome alternatives available at each point. After you lay out each course of action, determine which option has the greatest value to you. Start by assigning a cash value to each possible outcome (i.e., what the expected value would be if that particular outcome were to occur). Next, look at each break, or point of uncertainty, in the tree and estimate the probability of each outcome occurring. If you use percentages, the combined total must equal 100% at each break point. If you use fractions, these must add up to 1.

After these two steps have been taken (i.e., the values of the outcomes have been entered and the probabilities have been estimated), it is time to begin calculating the expected values of the various branches in the decision tree.
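
Here is a minimal sketch of that roll-back calculation in Python. The tree structure, payoffs, and probabilities are hypothetical, not from Rockwood's book; the sketch just shows outcomes valued in cash, probabilities that sum to 1 at each branch point, and expected values rolled back to the root.

```python
# Minimal decision-tree sketch with hypothetical payoffs and probabilities.
# Each branch point is a list of (probability, payoff_or_subtree) pairs;
# the probabilities at a branch point must sum to 1.

def expected_value(branches):
    total_p = sum(p for p, _ in branches)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * (expected_value(v) if isinstance(v, list) else v)
               for p, v in branches)

launch = [
    (0.6, [                   # the product works...
        (0.5, 12_000_000),    # ...and the market is strong
        (0.5, 4_000_000),     # ...and the market is weak
    ]),
    (0.4, -10_000_000),       # the product fails: the investment is lost
]
do_nothing = [(1.0, 0)]

print(f"EV(launch)     = {expected_value(launch):>12,.0f}")     #  800,000
print(f"EV(do nothing) = {expected_value(do_nothing):>12,.0f}") #        0
```

As the passage notes, the arithmetic is the easy part; the hard, judgment-laden work is estimating the probabilities and cash values at each branch in the first place.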

Element 4: The multidisciplinary approach

When trying to resolve a difficult situation, or to determine exactly why a product has been (and may continue to be) successful, it helps to think about the problem by creating a checklist that incorporates the vital components of other disciplines.

The Focused Few goes on to explore more of the elements of multidisciplinary thinking.