As Warren Buffett says, temperament is more important than IQ, because poor temperament takes all the horsepower of your brain and dramatically reduces its output. “A lot of people start out with 400-horsepower motors but only get a hundred horsepower of output,” he said. “It’s way better to have a 200-horsepower motor and get it all into output.”
With that in mind, Michael Mauboussin, the very first guest on our podcast, writes about five pitfalls that reduce the horsepower of your brain.
Five Pitfalls to Avoid
1. Overconfidence
The first pitfall Mauboussin mentions is overconfidence. Acting as though you are smarter than you are is a recipe for disaster.
Researchers have found that people consistently overrate their abilities, knowledge, and skill. This is especially true in areas outside of their expertise. For example, professional securities analysts and money managers were presented with ten requests for information that they were unlikely to know (e.g., total area of Lake Michigan in square miles). They were asked to respond to each question with both an answer and a “confidence range”—high and low boundaries within which they were 90% certain the true number resided. On average, the analysts chose ranges wide enough to accommodate the correct answer only 64% of the time. Money managers were even less successful at 50%.
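The scoring behind that quiz is simple to make concrete. The sketch below (with invented ranges and answers, purely for illustration) computes the hit rate of a respondent's stated 90% confidence ranges; a well-calibrated respondent should land near 0.90, while the analysts above landed at 0.64.

```python
# Score a calibration quiz: each answer is a (low, high) range the
# respondent claims is 90% likely to contain the true value.

def hit_rate(ranges, truths):
    """Fraction of (low, high) ranges that contain the true value."""
    hits = sum(1 for (low, high), truth in zip(ranges, truths)
               if low <= truth <= high)
    return hits / len(truths)

# Hypothetical respondent whose ranges are too narrow (overconfidence).
ranges = [(400, 600), (10, 20), (1000, 2000), (5, 8), (30, 50)]
truths = [550, 15, 1500, 9, 100]

print(hit_rate(ranges, truths))  # 0.6 -- well short of the 90% target
```

A hit rate persistently below the stated confidence level is the signature of overconfidence: the ranges feel generous from the inside but are too narrow in fact.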
Edward Russo and Paul Schoemaker, in their article “Managing Overconfidence,” argue that this confidence quiz measures how well we recognize the gap between what we think we know and what we actually know. They point out that good decision-making means knowing the limits of your knowledge. Warren Buffett echoes the point with his circle of competence concept: investors should define their circle of competence, or area of expertise, and stay within it. Overconfidence in our expertise can lead to poor decisions. In the words of Will Rogers, “It’s not what we don’t know that gets us into trouble, it’s what we know that ain’t so.”
Mauboussin suggests you define your circle of competence and avoid overestimating your abilities. To compensate for the inevitable errors you will still make, he suggests adding a margin of safety. Another idea is to do the work necessary to have an opinion and then test that opinion by seeking feedback from other people.
2. Anchoring and Adjustment
This is when data or information, even if irrelevant, skews our subsequent judgments. “In considering a decision,” Mauboussin writes, “we often give disproportionate weight to the first information we receive. As a result, initial impressions, ideas, estimates, or data anchor our subsequent thoughts.”
A good decision process helps counter this by ensuring you look at decisions from various angles, consider a wide variety of sources, start from zero, and adapt to reality.
3. Improper Framing
I’ll let Mauboussin explain this before I take over.
People’s decisions are affected by how a problem, or set of circumstances, is presented. Even the same problem framed in different—and objectively equal—ways can cause people to make different choices. One example is what Richard Thaler calls “mental accounting.” Say an investor buys a stock at $50 per share that surges to $100. Many investors divide the value of the stock into two distinct parts—the initial investment and the quick profit. And each is treated differently—the original investment with caution and the profit portion with considerably less discipline. Thaler and Eric Johnson call this “the house money effect.”
This effect is not limited to individuals. Hersh Shefrin documents how the committee in charge of Santa Clara University’s endowment portfolio succumbed to this effect. Because of strong market performance, the endowment crossed a preordained absolute level (the frame) ahead of the time line set by the university president. The result? The university took some of the “house money” and added riskier investment classes to its portfolio, including venture capital, hedge funds, and private placements. Classic economic theory assumes frame independence: all money is treated the same. But empirical evidence shows that the frame indeed shapes decisions.
One of the most significant insights from prospect theory is that people exhibit significant aversion to losses when making choices between risky outcomes, no matter how small the stakes. In fact, Kahneman and Tversky find that a loss has about two and a half times the impact of a gain of the same size. In other words, people feel a lot worse about a loss of a given size than they feel good about a gain of similar magnitude. This asymmetry is what we call loss aversion.
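That "two and a half times" figure can be made concrete with Tversky and Kahneman's estimated value function. The parameters below (α ≈ 0.88, λ ≈ 2.25) are their published estimates; treat this as an illustrative sketch of the asymmetry, not a model of any particular investor.

```python
# Tversky-Kahneman value function: gains are valued as x^alpha,
# losses as -lambda * (-x)^alpha. Lambda ~ 2.25 captures the finding
# that losses loom roughly 2-2.5x larger than equal-sized gains.

ALPHA = 0.88   # diminishing sensitivity to larger amounts
LAMBDA = 2.25  # loss-aversion coefficient

def value(x):
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

gain = value(100)    # felt value of a $100 gain
loss = value(-100)   # felt value of a $100 loss

print(round(abs(loss) / gain, 2))  # 2.25 -- the loss stings over twice as much
```

The ratio is constant here because λ multiplies the whole loss branch; the psychological point is simply that a dollar lost is weighted far more heavily than a dollar gained.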
To describe this loss aversion, Shefrin and Meir Statman coined the term “disposition effect,” which they amusingly suggest is shorthand for “predisposition toward get-evenitis.” Because it is difficult for investors to make peace with their losses, they tend to sell their winners too soon and hold on to their losers too long. They don’t want to take a loss on a stock; they want to at least get even, even though the original rationale for purchasing the stock no longer appears valid. This is an important insight for all investors, including those who adopt the expectations investing approach.
Framing is often shaped by incentives as well, because we tend to see the world through our incentives rather than as it really is.
Consider this example from Charlie Munger of how Lloyd’s Insurance rewarded people:
They were paid a percentage of the gross volume that went through. And paying everybody a percentage of the gross, when what you’re really interested in is the net, is a system — given the natural bias of human beings toward doing what’s in their own interest even though it has terrible consequences for other people — that really did Lloyd’s in.
People were rewarded for doing stupid things because, from their frame of reference, they were acting rationally. It’s hard to see the reality of a system you are a part of. To counter this bias, you need to step outside the system. You can do this by adjusting time frames (thinking about both the long term and the short term), considering second- and third-order effects, looking at the issue in different ways, and seeking out conflicting opinions.
4. Irrational Escalation of a Commitment
Sometimes we just keep digging when the best thing to do is cut our losses. Past decisions create what economists call sunk costs, which are past investments that cannot be recovered. And if we’re the ones who made the decisions for those investments, we’re likely to rationalize that the future is brighter. Otherwise, we’d have to admit that we were wrong and that’s painful.
While the irrational escalation of a commitment can sometimes pay off, counting on it is not a sound probabilistic way to think.
Sticking to a good decision process is a good way to avoid irrational escalation of a commitment. Other ideas include considering only future benefits and looking inside yourself to see if you’re trying to avoid admitting a mistake. As Daniel Dennett says, “you should become a connoisseur of your own mistakes.”
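The "consider only future benefits" advice reduces to a simple rule: compare your options on incremental future value and leave past spending out of the comparison entirely. A minimal sketch (the project and the numbers are invented for illustration):

```python
# Sunk-cost-free decision rule: rank options by expected FUTURE net
# value only. Money already spent appears nowhere in the comparison.

def best_option(options):
    """options: dict of name -> (future_cost, future_benefit)."""
    return max(options, key=lambda name: options[name][1] - options[name][0])

sunk_cost = 50_000  # already spent -- deliberately unused below

options = {
    "continue project": (30_000, 40_000),      # net future value: +10,000
    "abandon and redeploy": (5_000, 25_000),   # net future value: +20,000
}

print(best_option(options))  # "abandon and redeploy"
```

Note that `sunk_cost` never enters the calculation: however painful the $50,000 already spent, it is identical under every option and so cannot change which option is best.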
5. Confirmation Trap
This trap comes from justifying your own point of view, and it’s one we can easily observe in others yet fail to see in ourselves (again, this comes down to relativity).
Investors tend to seek out information that supports their existing point of view while avoiding information that contradicts their opinion. This trap not only affects where investors go for information but also how they interpret the information they receive—too much weight is given to confirming evidence and not enough to disconfirming evidence. Investors often fall into the confirmation trap after making an investment decision. For example, once investors purchase a stock, they seek evidence that confirms their thesis and dismiss or discount information that disconfirms it. This leads to a loss of objectivity.
It’s this loss of objectivity that kills us. The way to counter this is to “ask questions and conduct analysis that challenges your most cherished and firmly held beliefs” and seek out disconfirming evidence, something Charles Darwin mastered.
Still Curious? Follow your brain over to Grey Thinking.