
The Probability Distribution of the Future

The best colloquial definition of risk may be the following:

“Risk means more things can happen than will happen.”

We found it through the inimitable Howard Marks, but it’s a quote from Elroy Dimson of the London Business School. Doesn’t that capture it pretty well?

Another way to state it is: If there were only one thing that could happen, how much risk would there be, except in an extremely banal sense? You’d know the exact probability distribution of the future. If I told you there was a 100% probability that you’d get hit by a car today if you walked down the street, you simply wouldn’t do it. You wouldn’t call walking down the street a “risky gamble,” right? There’s no gamble at all.

But the truth is that in practical reality, there aren’t many 100% situations to bank on. Way more things can happen than will happen. That introduces great uncertainty into the future, no matter what type of future you’re looking at: An investment, your career, your relationships, anything.

How do we deal with this in a pragmatic way? The investor Howard Marks puts it this way:

Key point number one in this memo is that the future should be viewed not as a fixed outcome that’s destined to happen and capable of being predicted, but as a range of possibilities and, hopefully on the basis of insight into their respective likelihoods, as a probability distribution.

This is the most sensible way to think about the future: A probability distribution where more things can happen than will happen. Knowing that we live in a world of great non-linearity and with the potential for unknowable and barely understandable Black Swan events, we should never become too confident that we know what’s in store, but we can also appreciate that some things are a lot more likely than others. Learning to adjust probabilities on the fly as we get new information is called Bayesian updating.
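Bayesian updating can be made concrete in a few lines of code. The sketch below uses entirely made-up numbers (a prior chance of rain and two illustrative likelihoods) just to show the mechanics of revising a probability when new information arrives.

```python
# A minimal sketch of Bayesian updating with made-up numbers:
# we hold a prior belief about rain and revise it when evidence arrives.

def bayes_update(prior, likelihood, false_alarm_rate):
    """Return P(hypothesis | evidence) via Bayes' rule.

    prior            -- P(H), belief before seeing the evidence
    likelihood       -- P(E | H), chance of the evidence if H is true
    false_alarm_rate -- P(E | not H), chance of the evidence if H is false
    """
    p_evidence = likelihood * prior + false_alarm_rate * (1 - prior)
    return likelihood * prior / p_evidence

# Prior: 30% chance of rain. Evidence: dark clouds, which appear on
# 90% of rainy days but only 20% of dry days (illustrative numbers).
posterior = bayes_update(prior=0.30, likelihood=0.90, false_alarm_rate=0.20)
print(round(posterior, 3))  # belief jumps from 0.30 to roughly 0.659
```

The same function can be called again as each new piece of evidence comes in, feeding yesterday’s posterior back in as today’s prior.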

But.

Although the future is certainly a probability distribution, Marks makes another excellent point in the wonderful memo above: In reality, only one thing will happen. So you must make the decision: Are you comfortable if that one thing happens, whatever it might be? Even if it only has a 1% probability of occurring? Echoing the first lesson of biology, Warren Buffett stated that “In order to win, you must first survive.” You have to live long enough to play out your hand.

Which leads to an important second point: Uncertainty about the future does not necessarily equate with risk, because risk has another component: Consequences. The world is a place where “bad outcomes” are only “bad” if you know their (rough) magnitude. So in order to think about the future and about risk, we must learn to quantify.

It’s like the old saying (usually before something terrible happens): What’s the worst that could happen? Let’s say you propose to undertake a six month project that will cost your company $10 million, and you know there’s a reasonable probability that it won’t work. Is that risky?

It depends on the consequences of losing $10 million, and the probability of that outcome. It’s that simple! (Simple, of course, does not mean easy.) A company with $10 billion in the bank might consider that a very low-risk bet even if it only had a 10% chance of succeeding.

In contrast, a company with only $10 million in the bank might consider it a high-risk bet even if it had only a 10% chance of failing. Maybe five $2 million projects with uncorrelated outcomes would make more sense to the latter company.

In the real world, risk = probability of failure x consequences. That concept, however, can be looked at through many lenses. Risk of what? Losing money? Losing my job? Losing face? Those things need to be thought through. When we observe others being “too risk averse,” we might want to think about which risks they’re truly avoiding. Sometimes risk is not only financial. 
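The arithmetic above can be sketched in a few lines. This uses the article’s hypothetical numbers (a $10 million project with a 10% chance of succeeding) and simply compares the expected loss against each company’s bank balance.

```python
# A rough sketch of risk = probability of failure x consequences,
# using the article's hypothetical project and bank balances.

def expected_loss(p_failure, consequence):
    """Expected cost of a bet: chance of failure times what failure costs."""
    return p_failure * consequence

project_cost = 10_000_000
p_failure = 0.90  # the project has only a 10% chance of succeeding

loss = expected_loss(p_failure, project_cost)
for bank_balance in (10_000_000_000, 10_000_000):
    print(f"balance ${bank_balance:,}: expected loss is "
          f"{loss / bank_balance:.2%} of the bank")
# prints 0.09% for the $10B company and 90.00% for the $10M company --
# the same bet, but wildly different consequences.
```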

***

Let’s cover one more under-appreciated but seemingly obvious aspect of risk, also pointed out by Marks: Knowing the outcome does not teach you about the risk of the decision.

This is an incredibly important concept:

If you make an investment in 2012, you’ll know in 2014 whether you lost money (and how much), but you won’t know whether it was a risky investment – that is, what the probability of loss was at the time you made it.

To continue the analogy, it may rain tomorrow, or it may not, but nothing that happens tomorrow will tell you what the probability of rain was as of today. And the risk of rain is a very good analogue (although I’m sure not perfect) for the risk of loss.

How many times do we see this simple dictum violated? Knowing that something worked out, we argue that it wasn’t that risky after all. But what if, in reality, we were simply fortunate? This is the Fooled by Randomness effect.

The way to think about it is the following: The worst thing that can happen to a young gambler is that he wins the first time he goes to the casino. He might convince himself he can beat the system.
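A quick simulation makes the gambler’s mistake vivid. In the sketch below (illustrative parameters only), every bet is a losing proposition, yet a substantial fraction of gamblers still walk away winners by pure chance.

```python
# A minimal simulation of the "fooled by randomness" effect: many
# gamblers play a negative-expectation game; every bet carries the
# same risk, yet plenty of them end up ahead anyway.

import random

random.seed(42)  # fixed seed so the run is reproducible

def night_at_the_casino(n_bets=50, p_win=0.48):
    """Net result of n_bets even-money bets with a 48% win probability."""
    return sum(1 if random.random() < p_win else -1 for _ in range(n_bets))

results = [night_at_the_casino() for _ in range(10_000)]
winners = sum(1 for r in results if r > 0)
print(f"{winners / len(results):.0%} of gamblers walked away ahead")
# Typically around a third finish ahead, even though every single bet
# had a negative expectation -- the outcome reveals nothing about the risk.
```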

The truth is that most times we don’t know the probability distribution at all. Because the world is not a predictable casino game — an error Nassim Taleb calls the Ludic Fallacy — the best we can do is guess.

With intelligent estimations, we can work to get the rough order of magnitude right, understand the consequences if we’re wrong, and always be sure to never fool ourselves after the fact.

If you’re into this stuff, check out Howard Marks’ memos to his clients, or check out his excellent book, The Most Important Thing. Nate Silver also has an interesting similar idea about the difference between risk and uncertainty. And lastly, another guy that understands risk pretty well is Jason Zweig, who we’ve interviewed on our podcast before.

***

If you liked this article you’ll love:

Nassim Taleb on the Notion of Alternative Histories — “The quality of a decision cannot be solely judged based on its outcome.”

The Four Types of Relationships — As Seneca said, “Time discovers truth.”

Daniel Kahneman in Conversation with Michael Mauboussin on Intuition, Causality, Loss Aversion and More

Ever want to be the fly on the wall for a fascinating conversation? Well, here’s your chance. Santa Fe Institute Board of Trustees Chair Michael Mauboussin interviews Nobel Prize winner Daniel Kahneman. The wide-ranging conversation covers disciplined intuition, causality, base rates, loss aversion, and much more. You don’t want to miss this.

Here’s an excerpt from Kahneman I think you’ll enjoy. You can read the entire transcript here.

The Sources of Power is a very eloquent book on expert intuition with magnificent examples, and so he [its author, Gary Klein] is really quite hostile to my point of view, basically.

We spent years working on that, on the question of when can intuitions be trusted? What’s the boundary between trustworthy and untrustworthy intuitions?

I would summarize the answer as saying there is one thing you should not do. People’s confidence in their intuition is not a good guide to their validity. Confidence is something else entirely, and maybe we can talk about confidence separately later, but confidence is not it.


What there is, if you want to know whether you can trust intuition, it really is like deciding on a painting, whether it’s genuine or not. You can look at the painting all you want, but asking about the provenance is usually the best guide about whether a painting is genuine or not.

Similarly for expertise and intuition, you have to ask not how happy the individual is with his or her own intuitions, but first of all, you have to ask about the domain. Is the domain one where there is enough regularity to support intuitions? That’s true in some medical domains, it certainly is true in chess, it is probably not true in stock picking, and so there are domains in which intuition can develop and others in which it cannot. Then you have to ask whether, if it’s a good domain, one in which there are regularities that can be picked up by the limited human learning machine. If there are regularities, did the individual have an opportunity to learn those regularities? That primarily has to do with the quality of the feedback.

Those are the questions that I think should be asked, so there is a wide domain where intuitions can be trusted, and they should be trusted, and in a way, we have no option but to trust them because most of the time, we have to rely on intuition because it takes too long to do anything else.

Then there is a wide domain where people have equal confidence but are not to be trusted, and that may be another essential point about expertise. People typically do not know the limits of their expertise, and that certainly is true in the domain of finances, of financial analysis and financial knowledge. There is no question that people who advise others about finances have expertise about finance that their advisees do not have. They know how to look at balance sheets, they understand what happens in conversations with analysts.

There is a great deal that they know, but they do not really know what is going to happen to a particular stock next year. They don’t know that, that is one of the typical things about expert intuition in that we know domains where we have it, there are domains where we don’t, but we feel the same confidence and we do not know the limits of our expertise, and that sometimes is quite dangerous.

***

The Psychology of Risk and Reward


An excerpt from The Aspirational Investor: Taming the Markets to Achieve Your Life’s Goals that I think you’d enjoy.

Most of us have a healthy understanding of risk in the short term.

When crossing the street, for example, you would no doubt speed up to avoid an oncoming car that suddenly rounds the corner.

Humans are wired to survive: it’s a basic instinct that takes command almost instantly, enabling our brains to resolve ambiguity quickly so that we can take decisive action in the face of a threat.

The impulse to resolve ambiguity manifests itself in many ways and in many contexts, even those less fraught with danger. Glance at the (above) picture for no more than a couple of seconds. What do you see?

Some observers perceive the profile of a young woman with flowing hair, an elegant dress, and a bonnet. Others see the image of a woman stooped in old age with a wart on her large nose. Still others—in the gifted minority—are able to see both of the images simultaneously.

What is interesting about this illusion is that our brains instantly decide what image we are looking at, based on our first glance. If your initial glance was toward the vertical profile on the left-hand side, you were all but destined to see the image of the elegant young woman: it was just a matter of your brain interpreting every line in the picture according to the mental image that you already formed, even though each line can be interpreted in two different ways. Conversely, if your first glance fell on the central dark horizontal line that emphasizes the mouth and chin, your brain quickly formed an image of the older woman.

Regardless of your interpretation, your brain wasn’t confused. It simply decided what the picture was and filled in the missing pieces. Your brain resolved ambiguity and extracted order from conflicting information.

What does this have to do with decision making? Every bit of information can be interpreted differently according to our perspective. Ashvin Chhabra directs us to investing. I suggest you reframe this in the context of decision making in general.

Every trade has a seller and a buyer: your state of mind is paramount. If you are in a risk-averse mental framework, then you are likely to interpret a further fall in stocks as additional confirmation of your sell bias. If instead your framework is positive, you will interpret the same event as a buying opportunity.

The challenge of investing is compounded by the fact that our brains, which excel at resolving ambiguity in the face of a threat, are less well equipped to navigate the long term intelligently. Since none of us can predict the future, successful investing requires planning and discipline.

Unfortunately, when reason is in apparent conflict with our instincts—about markets or a “hot stock,” for example—it is our instincts that typically prevail. Our “reptilian brain” wins out over our “rational brain,” as it so often does in other facets of our lives. And as we have seen, investors trade too frequently, and often at the wrong time.

One way our brains resolve conflicting information is to seek out safety in numbers. In the animal kingdom, this is called “moving with the herd,” and it serves a very important purpose: helping to ensure survival. Just as a buffalo will try to stay with the herd in order to minimize its individual vulnerability to predators, we tend to feel safer and more confident investing alongside equally bullish investors in a rising market, and we tend to sell when everyone around us is doing the same. Even the so-called smart money falls prey to a herd mentality: one study, aptly titled “Thy Neighbor’s Portfolio,” found that professional mutual fund managers were more likely to buy or sell a particular stock if other managers in the same city were also buying or selling.

This comfort is costly. The surge in buying activity and the resulting bullish sentiment is self-reinforcing, propelling markets to react even faster. That leads to overvaluation and the inevitable crash when sentiment reverses. As we shall see, such booms and busts are characteristic of all financial markets, regardless of size, location, or even the era in which they exist.

Even though the role of instinct and human emotions in driving speculative bubbles has been well documented in popular books, newspapers, and magazines for hundreds of years, these factors were virtually ignored in conventional financial and economic models until the 1970s.

This is especially surprising given that, in 1952, a young PhD student from the University of Chicago, Harry Markowitz, published two very important papers. The first, entitled “Portfolio Selection,” published in the Journal of Finance, led to the creation of what we call modern portfolio theory, together with the widespread adoption of its important ideas such as asset allocation and diversification. It earned Harry Markowitz a Nobel Prize in Economics.

The second paper, entitled “The Utility of Wealth” and published in the prestigious Journal of Political Economy, was about the propensity of people to hold insurance (safety) and to buy lottery tickets at the same time. It delved deeper into the psychological aspects of investing but was largely forgotten for decades.

The field of behavioral finance really came into its own through the pioneering work of two academic psychologists, Amos Tversky and Daniel Kahneman, who challenged conventional wisdom about how people make decisions involving risk. Their work garnered Kahneman the Nobel Prize in Economics in 2002. Behavioral finance and neuroeconomics are relatively new fields of study that seek to identify and understand human behavior and decision making with regard to choices involving trade-offs between risk and reward. Of particular interest are the human biases that prevent individuals from making fully rational financial decisions in the face of uncertainty.

As behavioral economists have documented, our propensity for herd behavior is just the tip of the iceberg. Kahneman and Tversky, for example, showed that people who were asked to choose between a certain loss and a gamble, in which they could either lose more money or break even, would tend to choose the double down (that is, gamble to avoid the prospect of losses), a behavior the authors called “loss aversion.” Building on this work, Hersh Shefrin and Meir Statman, professors at the University of Santa Clara Leavey School of Business, have linked the propensity for loss aversion to investors’ tendency to hold losing investments too long and to sell winners too soon. They called this bias the disposition effect.

The lengthy list of behaviorally driven market effects often converge in an investor’s tale of woe. Overconfidence causes investors to hold concentrated portfolios and to trade excessively, behaviors that can destroy wealth. The illusion of control causes investors to overestimate the probability of success and underestimate risk because of familiarity—for example, causing investors to hold too much employer stock in their 401(k) plans, resulting in under-diversification. Cognitive dissonance causes us to ignore evidence that is contrary to our opinions, leading to myopic investing behavior. And the representativeness bias leads investors to assess risk and return based on superficial characteristics—for example, by assuming that shares of companies that make products you like are good investments.

Several other key behavioral biases come into play in the realm of investing. Framing can cause investors to make a decision based on how the question is worded and the choices presented. Anchoring often leads investors to unconsciously create a reference point, say for securities prices, and then adjust decisions or expectations with respect to that anchor. This bias might impede your ability to sell a losing stock, for example, in the false hope that you can earn your money back. Similarly, the endowment bias might lead you to overvalue a stock that you own and thus hold on to the position too long. And regret aversion may lead you to avoid taking a tough action for fear that it will turn out badly. This can lead to decision paralysis in the wake of a market crash, even though, statistically, it is a good buying opportunity.

Behavioral finance has generated plenty of debate. Some observers have hailed the field as revolutionary; others bemoan the discipline’s seeming lack of a transcendent, unifying theory. This much is clear: behavioral finance treats biases as mistakes that, in academic parlance, prevent investors from thinking “rationally” and cause them to hold “suboptimal” portfolios.

But is that really true? In investing, as in life, the answer is more complex than it appears. Effective decision making requires us to balance our “reptilian brain,” which governs instinctive thinking, with our “rational brain,” which is responsible for strategic thinking. Instinct must integrate with experience.

Put another way, behavioral biases are nothing more than a series of complex trade-offs between risk and reward. When the stock market is taking off, for example, a failure to rebalance by selling winners is considered a mistake. The same goes for a failure to add to a position in a plummeting market. That’s because conventional finance theory assumes markets to be inherently stable, or “mean-reverting,” so most deviations from the historical rate of return are viewed as fluctuations that will revert to the mean, or self-correct, over time.

But what if a precipitous market drop is slicing into your peace of mind, affecting your sleep, your relationships, and your professional life? What if that assumption about markets reverting to the mean doesn’t hold true and you cannot afford to hold on for an extended period of time? In both cases, it might just be “rational” to sell and accept your losses precisely when investment theory says you should be buying. A concentrated bet might also make sense, if you possess the skill or knowledge to exploit an opportunity that others might not see, even if it flies in the face of conventional diversification principles.

Of course, the time to create decision rules for extreme market scenarios and concentrated bets is when you are building your investment strategy, not in the middle of a market crisis or at the moment a high-risk, high-reward opportunity from a former business partner lands on your desk and gives you an adrenaline jolt. A disciplined process for managing risk in relation to a clear set of goals will enable you to use the insights offered by behavioral finance to your advantage, rather than fall prey to the common pitfalls. This is one of the central insights of the Wealth Allocation Framework. But before we can put these insights to practical use, we need to understand the true nature of financial markets.

Leonard Mlodinow: The Three Laws of Probability

“These three laws, simple as they are, form much of the basis of probability theory. Properly applied, they can give us much insight into the workings of nature and the everyday world.”

In his book, The Drunkard’s Walk, Leonard Mlodinow outlines the three key “laws” of probability.

The first law of probability is the most basic of all. But before we get to that, let’s look at this question.

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
Which is more probable?

Linda is a bank teller.
Linda is a bank teller and is active in the feminist movement.

To Kahneman and Tversky’s surprise, 87 percent of the subjects in the study judged that Linda being a bank teller and active in the feminist movement was more probable than Linda being a bank teller alone.

1. The probability that two events will both occur can never be greater than the probability that each will occur individually.

This is the conjunction fallacy.

Mlodinow explains:

Why not? Simple arithmetic: the chances that event A will occur = the chances that events A and B will occur + the chance that event A will occur and event B will not occur.
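Mlodinow’s arithmetic can be checked directly in code. The probabilities below are made-up numbers for illustration; the decomposition P(A) = P(A and B) + P(A and not B) holds regardless of what values you plug in.

```python
# Mlodinow's arithmetic in code: P(A) = P(A and B) + P(A and not B),
# so P(A and B) can never exceed P(A). Illustrative numbers only.

p_teller = 0.05                  # P(Linda is a bank teller) -- made up
p_feminist_given_teller = 0.30   # P(feminist | teller) -- made up

p_teller_and_feminist = p_teller * p_feminist_given_teller
p_teller_not_feminist = p_teller * (1 - p_feminist_given_teller)

# The two disjoint pieces reassemble into P(A)...
assert abs(p_teller - (p_teller_and_feminist + p_teller_not_feminist)) < 1e-12
# ...which is why the conjunction can never be more probable.
assert p_teller_and_feminist <= p_teller
print(p_teller_and_feminist, "<=", p_teller)  # 0.015 <= 0.05
```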

The interesting thing that Kahneman and Tversky discovered was that we don’t tend to make this mistake unless we know something about the subject.

“For example,” Mlodinow muses, “suppose Kahneman and Tversky had asked which of these statements seems most probable:”

Linda owns an International House of Pancakes franchise.
Linda had a sex-change operation and is now known as Larry.
Linda had a sex-change operation, is now known as Larry, and owns an International House of Pancakes franchise.

In this case it’s unlikely you would choose the last option.

Via The Drunkard’s Walk:

If the details we are given fit our mental picture of something, then the more details in a scenario, the more real it seems and hence the more probable we consider it to be—even though any act of adding less-than-certain details to a conjecture makes the conjecture less probable.

Or as Kahneman and Tversky put it, “A good story is often less probable than a less satisfactory… [explanation].”

2. If two possible events, A and B, are independent, then the probability that both A and B will occur is equal to the product of their individual probabilities.

Via The Drunkard’s Walk:

Suppose a married person has on average roughly a 1 in 50 chance of getting divorced each year. On the other hand, a police officer has about a 1 in 5,000 chance each year of being killed on the job. What are the chances that a married police officer will be divorced and killed in the same year? According to the above principle, if those events were independent, the chances would be roughly 1⁄50 × 1⁄5,000, which equals 1⁄250,000. Of course the events are not independent; they are linked: once you die, darn it, you can no longer get divorced. And so the chance of that much bad luck is actually a little less than 1 in 250,000.

Why multiply rather than add? Suppose you make a pack of trading cards out of the pictures of those 100 guys you’ve met so far through your Internet dating service, those men who in their Web site photos often look like Tom Cruise but in person more often resemble Danny DeVito. Suppose also that on the back of each card you list certain data about the men, such as honest (yes or no) and attractive (yes or no). Finally, suppose that 1 in 10 of the prospective soul mates rates a yes in each case. How many in your pack of 100 will pass the test on both counts? Let’s take honest as the first trait (we could equally well have taken attractive). Since 1 in 10 cards lists a yes under honest, 10 of the 100 cards will qualify. Of those 10, how many are attractive? Again, 1 in 10, so now you are left with 1 card. The first 1 in 10 cuts the possibilities down by 1⁄10, and so does the next 1 in 10, making the result 1 in 100. That’s why you multiply. And if you have more requirements than just honest and attractive, you have to keep multiplying, so . . . well, good luck.
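Both of Mlodinow’s multiplication examples are easy to reproduce. Using exact fractions avoids floating-point noise and makes the “keep multiplying” effect plain.

```python
# Mlodinow's two multiplication-rule examples, computed exactly.

from fractions import Fraction

# Trading cards: 1 in 10 are honest, 1 in 10 are attractive,
# and the two traits are treated as independent.
p_honest = Fraction(1, 10)
p_attractive = Fraction(1, 10)
p_both_traits = p_honest * p_attractive
print(p_both_traits)         # 1/100 -- about one card in the pack of 100

# Married police officer: 1/50 chance of divorce in a year, 1/5000
# chance of being killed on the job, treated (roughly) as independent.
p_divorce = Fraction(1, 50)
p_killed = Fraction(1, 5000)
print(p_divorce * p_killed)  # 1/250000
```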

And there are situations where probabilities should be added. That’s the next law.

“These occur when we want to know the chances of either one event or another occurring, as opposed to the earlier situation, in which we wanted to know the chance of one event and another event happening.”

3. If an event can have a number of different and distinct possible outcomes, A, B, C, and so on, then the probability that either A or B will occur is equal to the sum of the individual probabilities of A and B, and the sum of the probabilities of all the possible outcomes (A, B, C, and so on) is 1 (that is, 100 percent).

Via The Drunkard’s Walk:

When you want to know the chances that two independent events, A and B, will both occur, you multiply; if you want to know the chances that either of two mutually exclusive events, A or B, will occur, you add. Back to our airline: when should the gate attendant add the probabilities instead of multiplying them? Suppose she wants to know the chances that either both passengers or neither passenger will show up. In this case she should add the individual probabilities, which according to what we calculated above, would come to 55 percent.
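The airline calculation referenced above isn’t reproduced in this excerpt, so here is the addition rule on a plain six-sided die instead: mutually exclusive outcomes add, and all possible outcomes sum to 1.

```python
# The third law on a fair six-sided die: mutually exclusive events add,
# and the probabilities of all possible outcomes sum to 1.

from fractions import Fraction

outcomes = {face: Fraction(1, 6) for face in range(1, 7)}

# P(roll a 1 or a 2): the events are mutually exclusive, so add them.
p_one_or_two = outcomes[1] + outcomes[2]
print(p_one_or_two)            # 1/3

# Summing over every distinct outcome gives certainty.
print(sum(outcomes.values()))  # 1
```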

These three simple laws form the basis of probability. “Properly applied,” Mlodinow writes, “they can give us much insight into the workings of nature and the everyday world.” We use them all the time; we just don’t use them properly.