The Psychology of Risk and Reward

An excerpt from Ashvin Chhabra's The Aspirational Investor: Taming the Markets to Achieve Your Life's Goals that I think you'd enjoy.

Most of us have a healthy understanding of risk in the short term.

When crossing the street, for example, you would no doubt speed up to avoid an oncoming car that suddenly rounds the corner.

Humans are wired to survive: it’s a basic instinct that takes command almost instantly, enabling our brains to resolve ambiguity quickly so that we can take decisive action in the face of a threat.

The impulse to resolve ambiguity manifests itself in many ways and in many contexts, even those less fraught with danger. Glance at the picture above for no more than a couple of seconds. What do you see?

Some observers perceive the profile of a young woman with flowing hair, an elegant dress, and a bonnet. Others see the image of a woman stooped in old age with a wart on her large nose. Still others—in the gifted minority—are able to see both of the images simultaneously.

What is interesting about this illusion is that our brains instantly decide what image we are looking at, based on our first glance. If your initial glance was toward the vertical profile on the left-hand side, you were all but destined to see the image of the elegant young woman: it was just a matter of your brain interpreting every line in the picture according to the mental image you had already formed, even though each line can be interpreted in two different ways. Conversely, if your first glance fell on the central dark horizontal line that emphasizes the mouth and chin, your brain quickly formed an image of the older woman.

Regardless of your interpretation, your brain wasn’t confused. It simply decided what the picture was and filled in the missing pieces. Your brain resolved ambiguity and extracted order from conflicting information.

What does this have to do with decision making? Every bit of information can be interpreted differently according to our perspective. Ashvin Chhabra directs us to investing; I suggest reframing his point in the context of decision making in general.

Every trade has a seller and a buyer: your state of mind is paramount. If you are in a risk-averse mental framework, then you are likely to interpret a further fall in stocks as additional confirmation of your sell bias. If instead your framework is positive, you will interpret the same event as a buying opportunity.

The challenge of investing is compounded by the fact that our brains, which excel at resolving ambiguity in the face of a threat, are less well equipped to navigate the long term intelligently. Since none of us can predict the future, successful investing requires planning and discipline.

Unfortunately, when reason is in apparent conflict with our instincts—about markets or a “hot stock,” for example—it is our instincts that typically prevail. Our “reptilian brain” wins out over our “rational brain,” as it so often does in other facets of our lives. And as we have seen, investors trade too frequently, and often at the wrong time.

One way our brains resolve conflicting information is to seek out safety in numbers. In the animal kingdom, this is called “moving with the herd,” and it serves a very important purpose: helping to ensure survival. Just as a buffalo will try to stay with the herd in order to minimize its individual vulnerability to predators, we tend to feel safer and more confident investing alongside equally bullish investors in a rising market, and we tend to sell when everyone around us is doing the same. Even the so-called smart money falls prey to a herd mentality: one study, aptly titled “Thy Neighbor’s Portfolio,” found that professional mutual fund managers were more likely to buy or sell a particular stock if other managers in the same city were also buying or selling.

This comfort is costly. The surge in buying activity and the resulting bullish sentiment are self-reinforcing, propelling markets ever higher and faster. That leads to overvaluation and the inevitable crash when sentiment reverses. As we shall see, such booms and busts are characteristic of all financial markets, regardless of size, location, or even the era in which they exist.

Even though the role of instinct and human emotions in driving speculative bubbles has been well documented in popular books, newspapers, and magazines for hundreds of years, these factors were virtually ignored in conventional financial and economic models until the 1970s.

This is especially surprising given that, in 1952, a young PhD student from the University of Chicago, Harry Markowitz, published two very important papers. The first, entitled “Portfolio Selection” and published in the Journal of Finance, led to the creation of what we call modern portfolio theory, together with the widespread adoption of its important ideas such as asset allocation and diversification. It earned Markowitz a Nobel Prize in Economics.

The second paper, entitled “The Utility of Wealth” and published in the prestigious Journal of Political Economy, was about the propensity of people to hold insurance (safety) and to buy lottery tickets at the same time. It delved deeper into the psychological aspects of investing but was largely forgotten for decades.

The field of behavioral finance really came into its own through the pioneering work of two academic psychologists, Amos Tversky and Daniel Kahneman, who challenged conventional wisdom about how people make decisions involving risk. Their work garnered Kahneman the Nobel Prize in Economics in 2002. Behavioral finance and neuroeconomics are relatively new fields of study that seek to identify and understand human behavior and decision making with regard to choices involving trade-offs between risk and reward. Of particular interest are the human biases that prevent individuals from making fully rational financial decisions in the face of uncertainty.

As behavioral economists have documented, our propensity for herd behavior is just the tip of the iceberg. Kahneman and Tversky, for example, showed that people asked to choose between a certain loss and a gamble, in which they could either lose more money or break even, tend to choose the gamble, doubling down rather than accepting a sure loss, a behavior rooted in what the authors called “loss aversion.” Building on this work, Hersh Shefrin and Meir Statman, professors at Santa Clara University’s Leavey School of Business, have linked the propensity for loss aversion to investors’ tendency to hold losing investments too long and to sell winners too soon. They called this bias the disposition effect.

The lengthy list of behaviorally driven market effects often converges in an investor’s tale of woe. Overconfidence causes investors to hold concentrated portfolios and to trade excessively, behaviors that can destroy wealth. The illusion of control causes investors to overestimate the probability of success and to underestimate risk because of familiarity, leading them, for example, to hold too much employer stock in their 401(k) plans and end up under-diversified. Cognitive dissonance causes us to ignore evidence that is contrary to our opinions, leading to myopic investing behavior. And the representativeness bias leads investors to assess risk and return based on superficial characteristics, for example by assuming that shares of companies that make products you like are good investments.

Several other key behavioral biases come into play in the realm of investing. Framing can cause investors to make a decision based on how the question is worded and the choices presented. Anchoring often leads investors to unconsciously create a reference point, say for securities prices, and then adjust decisions or expectations with respect to that anchor. This bias might impede your ability to sell a losing stock, for example, in the false hope that you can earn your money back. Similarly, the endowment bias might lead you to overvalue a stock that you own and thus hold on to the position too long. And regret aversion may lead you to avoid taking a tough action for fear that it will turn out badly. This can lead to decision paralysis in the wake of a market crash, even though, statistically, it is a good buying opportunity.

Behavioral finance has generated plenty of debate. Some observers have hailed the field as revolutionary; others bemoan the discipline’s seeming lack of a transcendent, unifying theory. This much is clear: behavioral finance treats biases as mistakes that, in academic parlance, prevent investors from thinking “rationally” and cause them to hold “suboptimal” portfolios.

But is that really true? In investing, as in life, the answer is more complex than it appears. Effective decision making requires us to balance our “reptilian brain,” which governs instinctive thinking, with our “rational brain,” which is responsible for strategic thinking. Instinct must integrate with experience.

Put another way, behavioral biases are nothing more than a series of complex trade-offs between risk and reward. When the stock market is taking off, for example, a failure to rebalance by selling winners is considered a mistake. The same goes for a failure to add to a position in a plummeting market. That’s because conventional finance theory assumes markets to be inherently stable, or “mean-reverting,” so most deviations from the historical rate of return are viewed as fluctuations that will revert to the mean, or self-correct, over time.

But what if a precipitous market drop is slicing into your peace of mind, affecting your sleep, your relationships, and your professional life? What if that assumption about markets reverting to the mean doesn’t hold true and you cannot afford to hold on for an extended period of time? In both cases, it might just be “rational” to sell and accept your losses precisely when investment theory says you should be buying. A concentrated bet might also make sense, if you possess the skill or knowledge to exploit an opportunity that others might not see, even if it flies in the face of conventional diversification principles.

Of course, the time to create decision rules for extreme market scenarios and concentrated bets is when you are building your investment strategy, not in the middle of a market crisis or at the moment a high-risk, high-reward opportunity from a former business partner lands on your desk and gives you an adrenaline jolt. A disciplined process for managing risk in relation to a clear set of goals will enable you to use the insights offered by behavioral finance to your advantage, rather than fall prey to the common pitfalls. This is one of the central insights of the Wealth Allocation Framework. But before we can put these insights to practical use, we need to understand the true nature of financial markets.

Bias from Overconfidence: A Mental Model

“What a Man wishes, he will also believe” – Demosthenes

Bias from overconfidence is a natural human state. All of us believe good things about ourselves and our skills. As Peter Bevelin writes in Seeking Wisdom:

Most of us believe we are better performers, more honest and intelligent, have a better future, have a happier marriage, are less vulnerable than the average person, etc. But we can’t all be better than average.

This inherent baseline of overconfidence is especially strong when we project our beliefs about the future. Over-optimism is a form of overconfidence. Bevelin again:

We tend to overestimate our ability to predict the future. People tend to put a higher probability on desired events than undesired events.

The bias from overconfidence is insidious because so many factors can create and inflate it: emotional, cognitive, and social influences all play a part. The emotional component is easy to see, because believing bad things about ourselves or about our lives is painful.

The emotional and cognitive distortion that creates overconfidence is a dangerous and unavoidable accompaniment to any form of success.

Roger Lowenstein writes in When Genius Failed, “there is nothing like success to blind one to the possibility of failure.”

In Seeking Wisdom Bevelin writes:

What tends to inflate the price that CEOs pay for acquisitions? Studies found evidence of infection through three sources of hubris: 1) overconfidence after recent success, 2) a sense of self-importance, the belief that a high salary compared to other senior-ranking executives implies skill, and 3) the CEO’s belief in their own press coverage. The media tend to glorify the CEO and over-attribute business success to the role of the CEO rather than to other factors and people. This makes CEOs both more overconfident about their abilities and more committed to the actions that made them media celebrities.

This isn’t an effect confined to CEOs and large transactions. This feedback loop happens every day between employees and their managers. Or between students and professors, even peers and spouses.

Perhaps the most surprising, pervasive, and dangerous reinforcers of overconfidence are social incentives. Take a look at this example of social pressure on doctors, from Kahneman in Thinking, Fast and Slow:

Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure. Confidence is valued over uncertainty and there is a prevailing censure against disclosing uncertainty to patients.

An unbiased appreciation of uncertainty is a cornerstone of rationality—but that is not what people and organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred solution.

And what about those who don’t succumb to this social pressure to let the Overconfidence bias run wild?

Kahneman writes, “Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of the clients.”

It’s important to structure environments that allow for uncertainty, or the system will reward the most overconfident, not the most rational, of the decision-makers.

Making perfect forecasts isn’t the goal; self-awareness is, in the form of wide confidence intervals. Kahneman again, in Thinking, Fast and Slow:

For a number of years, professors at Duke University conducted a survey in which the chief financial officers of large corporations estimated the returns of the S&P index over the following year. The Duke scholars collected 11,600 such forecasts and examined their accuracy. The conclusion was straightforward: financial officers of large corporations had no clue about the short-term future of the stock market; the correlation between their estimates and the true value was slightly less than zero! When they said the market would go down, it was slightly more likely than not that it would go up. These findings are not surprising. The truly bad news is that the CFOs did not appear to know that their forecasts were worthless.

You don’t have to be right. You just have to know that you’re not very likely to be right.
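To see what a correlation “slightly less than zero” means in practice, here is a minimal Python sketch; the forecast and outcome numbers below are invented for illustration and are not the Duke data.

import numpy as np

# Hypothetical CFO forecasts of next-year market returns (%) and what actually happened.
# These figures are made up purely to show the calculation.
forecasts = np.array([8.0, 5.0, 10.0, 6.0, 12.0, 4.0, 9.0, 7.0])
actuals = np.array([2.0, 21.0, 16.0, -8.0, 11.0, 14.0, 1.0, -3.0])

# Pearson correlation between what was predicted and what happened.
corr = np.corrcoef(forecasts, actuals)[0, 1]
print(f"correlation between forecasts and outcomes: {corr:.2f}")
# Prints roughly -0.01 here. A correlation at or near zero means the forecasts carry
# essentially no information about what the market actually did.

The exact numbers don’t matter; the takeaway is that a forecaster can sound precise and confident while the statistical link between prediction and outcome is effectively nil.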

As always with the lollapalooza effect of overlapping, combining, and compounding psychological effects, this one has powerful partners in some of our other mental models. Overconfidence bias is often caused or exacerbated by: doubt-avoidance, inconsistency-avoidance, incentives, denial, believing-first-and-doubting-later, and the endowment effect.

So what are the ways of restraining Overconfidence bias?

The discipline to apply basic math, as prescribed by Munger: “One standard antidote to foolish optimism is trained, habitual use of the simple probability math of Fermat and Pascal, taught in my youth to high school sophomores.” (Pair with Fooled by Randomness.)
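To make that concrete, here is a minimal sketch of the kind of back-of-the-envelope probability math Munger is pointing to; the scenario and numbers are hypothetical, chosen only to illustrate the idea.

# A plan succeeds only if every one of its steps succeeds. Even when each step looks
# very likely on its own, the joint probability erodes quickly (hypothetical numbers).
step_probabilities = [0.90, 0.85, 0.80, 0.95, 0.90]
joint = 1.0
for p in step_probabilities:
    joint *= p
print(f"probability that every step succeeds: {joint:.2f}")  # about 0.52, not "almost certain"

# Expected value of a bet: a 10 percent shot at a big payoff can still be a bad deal.
p_win, payoff, cost = 0.10, 300.0, 50.0
expected_value = p_win * payoff - cost
print(f"expected value of the bet: {expected_value:.2f}")  # -20.00, negative despite the upside

Nothing here is sophisticated. The point is that writing the arithmetic down, rather than trusting a confident gut feeling, is what keeps optimism in check.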

And in Seeking Wisdom, Bevelin reminds us that “Overconfidence can cause unreal expectations and make us more vulnerable to disappointment.” A few sentences later he advises us to “focus on what can go wrong and the consequences.”

Build in some margin of safety in decisions. Know how you will handle things if they go wrong. Surprises occur in many unlikely ways. Ask: How can I be wrong? Who can tell me if I'm wrong?

Bias from Overconfidence is a Farnam Street mental model.

A System for Remembering What You Read

Last year, I read 161 books cover-to-cover. And that doesn't include the ones that I started to read and put down. In the process, I learned a lot about what works and what doesn't work. People often ask me how I remember what I read.

I have a system that I use for non-fiction books that enables me to remember quite a bit. And when I can't remember something, I generally know where to look to find the answers.

Here are some of the tips that work for me:

  • Learn How to Read a Book.
  • Start with the index, the table of contents, and the preface. This will give you a good sense of the book.
  • Be OK with deciding that now is not the time to read the book.
  • Read one book at a time.
  • Put it down if you lose interest.
  • Mark up the book while reading it. Questions. Thoughts. And, more important, connections to other ideas.
  • At the end of each chapter, without looking back, write some notes on the main points/arguments/take-aways. Then look back through the chapter and write down anything you missed.
  • Specifically note anything that was in the chapter that you can apply somewhere else.
  • When you're done with the book, take out a blank sheet of paper and explain the core ideas or arguments of the book to yourself. Where you have problems, go back and review your notes. This is the Feynman technique.
  • Put the book down for a week.
  • Pick the book back up, reread all of your notes/highlights/marginalia/etc. Time is a good filter — what's still important? Note this on the inside of the cover with a reference to the page number.
  • Put any notes that you want to keep in your commonplace book.

One thing that most people don't appreciate enough is that what you read makes a huge difference in how well you remember things.

We fail to remember a lot of the stuff we read because it's not building on any existing knowledge. We're often trying to learn complex things (that change rapidly) without understanding the basic things (which change slowly or not at all). Or, worse still, we're uncritically letting other people do the thinking for us. This is the adult equivalent of regurgitating the definition of a boldface word in our high school textbook.

Both of these habits lead to the illusion of knowledge and to overconfidence.

I'd argue that a better approach is to build a latticework of mental models: acquire core multi-disciplinary knowledge and use it as your foundation. This is the best investment, because that core knowledge doesn't change, or if it does, it changes very slowly, and it's what everything else gets built on. When you connect what you read to that foundation, not only do you get a better idea of how things fit together, but you also strengthen those connections in your head.

If you're looking to acquire worldly wisdom, time is your best filter. It makes sense to focus on learning the core ideas over multiple disciplines. These remain constant. And when you have a solid foundation, it's easier to build upon because you connect what you're learning to that (now very solid) foundation.

This Time is Different: The Four Most Costly Words Ever Spoken

When we look at situations we're always looking for what's unique. We should, however, give more thought to similarities.

“This time is different” could be the four most costly words ever spoken. It's not the words that are costly so much as the conclusions they encourage us to draw.

We incorrectly think that differences are more valuable than similarities.

After all, anyone can see what's the same but it takes true insight to see what's different, right? We're all so busy trying to find differences that we forget to pay attention to what is the same.

Imagine sitting in a meeting where people are about to make the same mistake they made last year on the same decision. Let's say, for example, Jack has a history of messing up the annual tax returns. He's a good guy. He's nice to everyone. In fact, he buys everyone awesome Christmas presents. But the last three years—the period he's been in charge of tax returns—have been nothing short of a disaster causing more work for you and your department.

The assignment for the tax return comes up and Jack is once again nominated.

Before you have a chance to voice your concerns, one of your co-workers speaks up: “I know Jack has dropped the ball on this assignment in the past but I think this time is different. He's been working hard to make sure he's better organized.”

That's all it takes. Conversation over — everyone is focused on what's unique about this time and it's unlikely, despite ample evidence, that you'll be able to convince them otherwise.

In part, people want to believe in Jack because he's a nice guy. In part, we're all focused on why this time is different and we'll ignore evidence to the contrary.

Focusing on what's different makes it easy to forget historical context. We lose touch with the base rate. We only see the evidence that supports our view (confirmation bias). We become optimistic and overconfident.

History provides context. And what history shows us is that no matter how unique things are today there are a lot of similarities with the past.

Consider investors and the dotcom bubble. Collectively, people saw it as unprecedented and unique: a massive, unparalleled transformation.

That reasoning, combined with a blindness to what was the same about this situation and previous ones, encouraged us to draw conclusions that proved costly. We reasoned that everything would change and everyone who owned internet companies would prosper and the old non-internet companies would quickly go into bankruptcy.

All of a sudden profits didn't matter. Nor did revenue. They would come in time, we hoped. Market share mattered no matter how costly it was to acquire.

More than that, if you didn't buy now you'd miss out. These companies would take over the world and you'd be left out.

We got so caught up in what was different that we forgot to see what was the same.

And there were historical parallels: automobiles, radio, television, and airplanes, to name a few. In their day, these innovations completely transformed the world as well. You can consider them the dotcoms of yesteryear.

And how did these massively transformational industries end up for investors?

At one time there were allegedly over 70 different auto manufacturing operations in the United States. Only three of them survived (and even some of those required government funds).

If you catch yourself reasoning based on “this time is different” remember that you are probably speculating. While you may be right, odds are, this time is not different. You just haven't looked for the similarities.

Nate Silver: Confidence Kills Predictions

Best known for accurate election predictions, statistician Nate Silver is also the author of The Signal and the Noise: Why So Many Predictions Fail—But Some Don’t. Heather Bell, Managing Editor of Journal of Indexes, recently spoke with Silver.

IU: What do you see as the common theme among bad predictions? What most often leads people astray?
Silver: A lot of it is overconfidence. People tend to underestimate the uncertainty that is intrinsic to a problem. If you ask someone to estimate a confidence interval that’s supposed to cover 90 percent of all outcomes, it usually covers only about 50 percent. You get upside outcomes and downside outcomes in the market more often than people realize.

There are a variety of reasons for this. Part of it is that we can sometimes get stuck in the recent past and examples that are most familiar to us, kind of what Daniel Kahneman called “the availability heuristic,” where we assume that the current trend will always perpetuate itself, when actually it can be an anomaly or a fluke, or where we always think that the period we’re living through is the “signal,” so to speak. That’s often not true—sometimes you’re living in the outlier period, like when you have a housing bubble period that you haven’t historically had before.

Overconfidence is the core linkage between most of the failures of predictions that we’ve looked at. Obviously, you can look at that in a more technical sense and see where sometimes people are fitting models where they don’t have as much data as they think, but the root of it comes down to a failure to understand that it’s tough to be objective and that we often come at a problem with different biases and perverse incentives—and if we don’t check those, we tend to get ourselves into trouble.

IU: What standards or conditions must be met, in your opinion, for something to be considered “predictable”?
Silver: I tend not to think in terms of black and white absolutes. There are two ways to define “predictable,” I’d say. One is by asking, How well are we able to model the system? The other is more of a cosmic predictability: How intrinsically random is something over the long run?

I look at baseball as an example. Even the best teams only win about two-thirds of their games. Even the best hitters only get on base about 40 percent of the time. In that sense, baseball is highly unpredictable. In another sense though, baseball is very easy to measure relative to a lot of other things. It’s easy to set up models for it, and the statistics are of very high quality. A lot of smart people have worked on the problem. As a result, we are able to measure and quantify the uncertainty pretty accurately. We still can’t predict who’s going to win every game, but we are doing a pretty good job with that. Things are predictable in theory, but our capabilities are not nearly as strong.

Predictability is a tricky question, but I always say we almost always have some notion of what’s going to happen next, but it’s just never a perfect notion. The question is more, Where do you sit along that spectrum?

A Decision-Making Magic Trick

Two important nuggets from an interview with Chip Heath, co-author of Decisive (more here), on improving our ability to make better decisions:

A decision-making magic trick

The closest thing to a decision-making magic trick that I’ve found is the question, “What would you advise your best friend to do if they were in your situation?” So often when I ask that question, people blurt out an answer and their eyes get wide. They’re shocked at how easy it is when you just imagine you’re advising someone else.

This time isn't so different

Businesses make decisions about mergers and acquisitions that are worth hundreds of millions of dollars, and to the senior leader it seems like, “Well, this is a different situation than the last acquisition we made.” And yet in that room making the decision is a set of people who have probably seen a dozen acquisitions, but they don’t take the time to do even the equivalent of the three-out-of-five-stars rating that we would get from Amazon.com.

The kind of decisions that senior people make always present themselves as though they are completely different than anything else. Big decisions are subtle in a way because they all seem to come one at a time. The advantage of smaller decisions is we realize we are in a repeated situation where we’re going to see the same thing a lot.

Yet lots of big decisions end up having that property as well. If we take the time to move our mental spotlight around, we can always find other decisions that are similar to this one. We think the chief financial officer we’re trying to hire is in a unique position in the company, and that this is a unique position in time with unique demands. But the fact is, we’ve made other senior hires and we know how that process goes. Stepping back and taking in the broader context is just as useful for senior leaders as it is for the frontline worker who’s making a decision on the 35th mortgage application or the 75th customer complaint.