Tag: Hindsight Bias

Philip Tetlock: Ten Commandments for Aspiring Superforecasters

The Knowledge Project interview with Philip Tetlock deconstructs our ability to make accurate predictions into specific components, which he learned through his work on The Good Judgment Project.

In Superforecasting: The Art and Science of Prediction, Tetlock and his co-author Dan Gardner set out to distill the ten key themes that have been “experimentally demonstrated to boost accuracy” in the real world.

1. Triage

Focus on questions where your hard work is likely to pay off. Don’t waste time either on easy “clocklike” questions (where simple rules of thumb can get you close to the right answer) or on impenetrable “cloud-like” questions (where even fancy statistical models can’t beat the dart-throwing chimp). Concentrate on questions in the Goldilocks zone of difficulty, where effort pays off the most.

For instance, don't ask, “Who will win the World Series in 2050?” That's unknowable and impossible to forecast. The question becomes more interesting when we come closer to home. Asking in April who will win the World Series for the upcoming season, and how much justifiable confidence we can have in that answer, is a different proposition. While our confidence in the answer is still low, it is far higher than anything we could muster for the 2050 winner. At worst, we can narrow the range of outcomes. This allows us to move back along the continuum from uncertainty toward risk.

Certain classes of outcomes have well-deserved reputations for being radically unpredictable (e.g., oil prices, currency markets). But we usually don’t discover how unpredictable outcomes are until we have spun our wheels for a while trying to gain analytical traction. Bear in mind the two basic errors it is possible to make here. We could fail to try to predict the potentially predictable or we could waste our time trying to predict the unpredictable. Which error would be worse in the situation you face?

2. Break seemingly intractable problems into tractable sub-problems.

This is Fermi-style thinking. Enrico Fermi designed the first atomic reactor. When he wasn't doing that, he loved to tackle challenging questions such as “How many piano tuners are there in Chicago?” At first glance this seems very difficult. Fermi started by decomposing the problem into smaller parts and putting them into the buckets of knowable and unknowable. By working at a problem this way you expose what you don't know or, as Tetlock and Gardner put it, you “flush ignorance into the open.” It's better to air your assumptions and discover your errors quickly than to hide behind jargon and fog. Superforecasters are excellent at Fermi-izing — even when it comes to seemingly unquantifiable things like love.

The surprise is how often remarkably good probability estimates arise from a remarkably crude series of assumptions and guesstimates.
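
To make Fermi-izing concrete, here is a minimal sketch in Python of the piano-tuner estimate. Every number is a rough assumption chosen for illustration; the value of the exercise is that each guess is exposed and can be challenged separately.

```python
# A rough Fermi decomposition of "How many piano tuners are in Chicago?"
# Every number below is a guess; the point is to expose the assumptions.

population = 2_700_000              # people in Chicago (roughly)
people_per_household = 2.5          # average household size
households = population / people_per_household

share_with_piano = 0.05             # fraction of households owning a piano
pianos = households * share_with_piano

tunings_per_piano_per_year = 1      # a piano gets tuned about once a year
tunings_needed = pianos * tunings_per_piano_per_year

tunings_per_tuner_per_day = 4       # including travel time
working_days_per_year = 250
tunings_per_tuner_per_year = tunings_per_tuner_per_day * working_days_per_year

tuners = tunings_needed / tunings_per_tuner_per_year
print(f"Estimated piano tuners in Chicago: {tuners:.0f}")  # prints ~54
```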

3. Strike the right balance between inside and outside views.

Echoing Michael Mauboussin, who cautioned that we should pay attention to what's the same, Tetlock and Gardner add a historical perspective:

Superforecasters know that there is nothing new under the sun. Nothing is 100% “unique.” Language purists be damned: uniqueness is a matter of degree. So superforecasters conduct creative searches for comparison classes even for seemingly unique events, such as the outcome of a hunt for a high-profile terrorist (Joseph Kony) or the standoff between a new socialist government in Athens and Greece’s creditors. Superforecasters are in the habit of posing the outside-view question: How often do things of this sort happen in situations of this sort?

The planning fallacy, in which we estimate a project from the inside while ignoring how long comparable projects actually took, is a derivative of this.
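
One hypothetical way to operationalize the outside-view question is to start from a base rate for the comparison class and then nudge it with your case-specific (inside) view. The numbers and the 70/30 weighting below are illustrative assumptions, not a recipe from the book.

```python
# Illustrative blend of an outside view (base rate) with an inside view
# (case-specific estimate). The 70/30 weighting is an arbitrary starting
# point that a forecaster would revise as evidence accumulates.

base_rate = 0.30        # e.g., fraction of similar projects finished on time
inside_view = 0.80      # your gut estimate for *this* project

weight_on_outside = 0.7
forecast = weight_on_outside * base_rate + (1 - weight_on_outside) * inside_view
print(f"Blended forecast: {forecast:.2f}")  # 0.45
```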

4. Strike the right balance between under- and overreacting to evidence.

Belief updating is to good forecasting as brushing and flossing are to good dental hygiene. It can be boring, occasionally uncomfortable, but it pays off in the long term. That said, don’t suppose that belief updating is always easy because it sometimes is. Skillful updating requires teasing subtle signals from noisy news flows—all the while resisting the lure of wishful thinking.

Savvy forecasters learn to ferret out telltale clues before the rest of us. They snoop for nonobvious lead indicators, about what would have to happen before X could, where X might be anything from an expansion of Arctic sea ice to a nuclear war in the Korean peninsula. Note the fine line here between picking up subtle clues before everyone else and getting suckered by misleading clues.

The key here is rational Bayesian updating of your beliefs. This is the same ethos behind Charlie Munger's thoughts on killing your best-loved ideas. The world doesn't work the way we want it to, but it does signal to us when things change. If we pay attention and adapt, we let the world do most of the work for us.
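
The mechanics of the Bayesian updating Tetlock has in mind are simple, even if the judgment calls are not. A minimal sketch with made-up probabilities:

```python
# Bayes' rule: update a prior belief in light of new evidence.
# All probabilities here are illustrative.

prior = 0.40                 # P(hypothesis) before the news
p_evidence_if_true = 0.75    # P(evidence | hypothesis is true)
p_evidence_if_false = 0.20   # P(evidence | hypothesis is false)

# P(evidence), by the law of total probability
p_evidence = prior * p_evidence_if_true + (1 - prior) * p_evidence_if_false

posterior = prior * p_evidence_if_true / p_evidence
print(f"Belief moves from {prior:.2f} to {posterior:.2f}")  # 0.40 -> 0.71
```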

5. Look for the clashing causal forces at work in each problem.

For every good policy argument, there is typically a counterargument that is at least worth acknowledging. For instance, if you are a devout dove who believes that threatening military action never brings peace, be open to the possibility that you might be wrong about Iran. And the same advice applies if you are a devout hawk who believes that soft “appeasement” policies never pay off. Each side should list, in advance, the signs that would nudge them toward the other.

There are no paint-by-number rules here. Synthesis is an art that requires reconciling irreducibly subjective judgments. If you do it well, engaging in this process of synthesizing should transform you from a cookie-cutter dove or hawk into an odd hybrid creature, a dove-hawk, with a nuanced view of when tougher or softer policies are likelier to work.

If you really want to have fun at meetings (and simultaneously decrease your popularity with your bosses), start asking what would cause them to change their mind. Never forget that having an opinion is hard work. You really need to concentrate on and wrestle with the problem.

6. Strive to distinguish as many degrees of doubt as the problem permits but no more.

This could easily be called “nuance matters.” The more degrees of uncertainty you can distinguish, the better.

As in poker, you have an advantage if you are better than your competitors at separating 60/40 bets from 40/60—or 55/45 from 45/55. Translating vague-verbiage hunches into numeric probabilities feels unnatural at first but it can be done. It just requires patience and practice.

7. Strike the right balance between under- and overconfidence, between prudence and decisiveness.

Superforecasters understand the risks both of rushing to judgment and of dawdling too long near “maybe.” They routinely manage the trade-off between the need to take decisive stands (who wants to listen to a waffler?) and the need to qualify their stands (who wants to listen to a blowhard?). They realize that long-term accuracy requires getting good scores on both calibration and resolution—which requires moving beyond blame-game ping-pong. It is not enough just to avoid the most recent mistake. They have to find creative ways to tamp down both types of forecasting errors—misses and false alarms—to the degree a fickle world permits such uncontroversial improvements in accuracy.
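
Calibration and resolution are typically measured with a proper scoring rule such as the Brier score, which the Good Judgment Project used to rank forecasters. A tiny sketch with hypothetical forecasts:

```python
# Brier score: mean squared difference between forecast probabilities and
# outcomes (1 = event happened, 0 = it didn't). Lower is better; always
# answering 0.5 scores 0.25, and confident correct forecasts approach 0.

forecasts = [0.9, 0.6, 0.2, 0.7]   # hypothetical probability forecasts
outcomes  = [1,   1,   0,   0]     # what actually happened

brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
print(f"Brier score: {brier:.3f}")  # 0.175
```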

8. Look for the errors behind your mistakes but beware of rearview-mirror hindsight biases.

It's easy to justify or rationalize your failure. Don't. Own it and keep score with a decision journal. You want to learn where you went wrong and determine ways to get better. And don't just look at failures. Evaluate successes as well so you can determine when you were just plain lucky.

9. Bring out the best in others and let others bring out the best in you.

Master the fine art of team management, especially perspective taking (understanding the arguments of the other side so well that you can reproduce them to the other’s satisfaction), precision questioning (helping others to clarify their arguments so they are not misunderstood), and constructive confrontation (learning to disagree without being disagreeable). Wise leaders know how fine the line can be between a helpful suggestion and micromanagerial meddling or between a rigid group and a decisive one or between a scatterbrained group and an open-minded one.

10. Master the error-balancing bicycle.

Implementing each commandment requires balancing opposing errors. Just as you can’t learn to ride a bicycle by reading a physics textbook, you can’t become a superforecaster by reading training manuals. Learning requires doing, with good feedback that leaves no ambiguity about whether you are succeeding—“I’m rolling along smoothly!”—or whether you are failing—“crash!”

As with anything, doing more of it doesn't mean you're getting better at it. You need to do more than just go through the motions.  The way to get better is deliberate practice.

And finally …

“It is impossible to lay down binding rules,” Helmuth von Moltke warned, “because two cases will never be exactly the same.” Guidelines (or maps) are the best we can do in a world where nothing represents the whole. As George Box said, “All models are wrong, but some are useful.”

***

Mark Steed, a former member of The Good Judgment Project, offered us 13 ways to make better decisions.

Fooled By Randomness


I don't want you to make the same mistake I did.

I waited too long before reading Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Taleb. He wrote it before The Black Swan and Antifragile, the books that propelled him into intellectual celebrity. Interestingly, Fooled by Randomness contains semi-explored gems of the ideas that would later grow into those best sellers.

***
Hindsight Bias

Part of the argument that Fooled by Randomness presents is that when we look back at things that have happened we see them as less random than they actually were.

It is as if there were two planets: the one in which we actually live and the one, considerably more deterministic, on which people are convinced we live. It is as simple as that: Past events will always look less random than they were (it is called the hindsight bias). I would listen to someone’s discussion of his own past realizing that much of what he was saying was just backfit explanations concocted ex post by his deluded mind.

***
The Courage of Montaigne

Writing on Montaigne as the role model for the modern thinker, Taleb also addresses his courage:

It certainly takes bravery to remain skeptical; it takes inordinate courage to introspect, to confront oneself, to accept one’s limitations—scientists are seeing more and more evidence that we are specifically designed by mother nature to fool ourselves.

***
Probability

Fooled by Randomness is about probability, not in a mathematical way but as skepticism.

In this book probability is principally a branch of applied skepticism, not an engineering discipline. …

Probability is not a mere computation of odds on the dice or more complicated variants; it is the acceptance of the lack of certainty in our knowledge and the development of methods for dealing with our ignorance. Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem or a brain teaser. Mother nature does not tell you how many holes there are on the roulette table, nor does she deliver problems in a textbook way (in the real world one has to guess the problem more than the solution).

“Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem,” which is fascinating given how we tend to solve problems. In decisions under uncertainty, I discussed how risk and uncertainty are different things, which creates two types of ignorance.

Most decisions are not risk-based; they are uncertainty-based, and you either know you are ignorant or you have no idea you are ignorant. There is a big distinction between the two. Trust me, you'd rather know you are ignorant.

***
Randomness Disguised as Non-Randomness

The core of the book is about luck that we mistake for skill, or “randomness disguised as non-randomness (that is determinism).”

This problem manifests itself most frequently in the lucky fool, “defined as a person who benefited from a disproportionate share of luck but attributes his success to some other, generally very precise, reason.”

Such confusion crops up in the most unexpected areas, even science, though not in such an accentuated and obvious manner as it does in the world of business. It is endemic in politics, as it can be encountered in the shape of a country’s president discoursing on the jobs that “he” created, “his” recovery, and “his predecessor’s” inflation.

These lucky fools are often fragilistas — they have no idea they are lucky fools. For example:

[W]e often have the mistaken impression that a strategy is an excellent strategy, or an entrepreneur a person endowed with “vision,” or a trader a talented trader, only to realize that 99.9% of their past performance is attributable to chance, and chance alone. Ask a profitable investor to explain the reasons for his success; he will offer some deep and convincing interpretation of the results. Frequently, these delusions are intentional and deserve to bear the name “charlatanism.”

This does not mean that all success is luck or randomness. There is a difference between “it is more random than we think” and “it is all random.”

Let me make it clear here: Of course chance favors the prepared! Hard work, showing up on time, wearing a clean (preferably white) shirt, using deodorant, and some such conventional things contribute to success—they are certainly necessary but may be insufficient as they do not cause success. The same applies to the conventional values of persistence, doggedness and perseverance: necessary, very necessary. One needs to go out and buy a lottery ticket in order to win. Does it mean that the work involved in the trip to the store caused the winning? Of course skills count, but they do count less in highly random environments than they do in dentistry.

No, I am not saying that what your grandmother told you about the value of work ethics is wrong! Furthermore, as most successes are caused by very few “windows of opportunity,” failing to grab one can be deadly for one’s career. Take your luck!

That last paragraph connects to something Charlie Munger once said: “Really good investment opportunities aren't going to come along too often and won't last too long, so you've got to be ready to act. Have a prepared mind.”

Taleb thinks of success in terms of degrees, so mild success might be explained by skill and labour, but outrageous success “is attributable to variance.”
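
Taleb's lucky-fool point is easy to demonstrate with a simulation of my own (not from the book): give thousands of hypothetical traders nothing but coin-flip years and count how many end up with track records that look like skill.

```python
import random

# Give 10,000 hypothetical traders five years each of pure coin-flip
# performance and count how many "beat the market" every single year.
# Expectation: roughly 10,000 * 0.5**5 ≈ 312 flawless five-year records,
# produced by chance alone.

random.seed(1)
traders, years = 10_000, 5

flawless = sum(
    all(random.random() < 0.5 for _ in range(years))  # a lucky year, 5 times over
    for _ in range(traders)
)
print(f"{flawless} of {traders} traders look brilliant through luck alone")
```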

***
Luck Makes You Fragile

One thing Taleb hits on that really stuck with me is that “that which came with the help of luck could be taken away by luck (and often rapidly and unexpectedly at that). The flipside, which deserves to be considered as well (in fact it is even more of our concern), is that things that come with little help from luck are more resistant to randomness.” How Antifragile.

Taleb argues this is the problem of induction: “it does not matter how frequently something succeeds if failure is too costly to bear.”

***
Noise and Signal

We confuse noise and signal.

…the literary mind can be intentionally prone to the confusion between noise and meaning, that is, between a randomly constructed arrangement and a precisely intended message. However, this causes little harm; few claim that art is a tool of investigation of the Truth—rather than an attempt to escape it or make it more palatable. Symbolism is the child of our inability and unwillingness to accept randomness; we give meaning to all manner of shapes; we detect human figures in inkblots.

All my life I have suffered the conflict between my love of literature and poetry and my profound allergy to most teachers of literature and “critics.” The French thinker and poet Paul Valery was surprised to listen to a commentary of his poems that found meanings that had until then escaped him (of course, it was pointed out to him that these were intended by his subconscious).

If we're concerned about situations where randomness is confused with non-randomness, should we also be concerned with situations where non-randomness is mistaken for randomness, which would result in signal being ignored?

First, I am not overly worried about the existence of undetected patterns. We have been reading lengthy and complex messages in just about any manifestation of nature that presents jaggedness (such as the palm of a hand, the residues at the bottom of Turkish coffee cups, etc.). Armed with home supercomputers and chained processors, and helped by complexity and “chaos” theories, the scientists, semiscientists, and pseudoscientists will be able to find portents. Second, we need to take into account the costs of mistakes; in my opinion, mistaking the right column for the left one is not as costly as an error in the opposite direction. Even popular opinion warns that bad information is worse than no information at all.

If you haven't yet, pick up a copy of Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. Don't make the same mistake I did and wait to read this important book.


Michael Mauboussin: Two Tips to Improve The Quality of Your Decisions

Michael Mauboussin, chief investment strategist at Legg Mason and our first podcast guest, offers two simple techniques to improve the quality of your decision making: a decision journal and a checklist.

1. Create a decision journal and start using it.

Many years ago I first met Danny Kahneman. Kahneman is one of the preeminent psychologists in the world; he won a Nobel Prize for economics in 2002, even though he's never taught an economics class.

When I posed him the question, what is a single thing an investor can do to improve his or her performance, he said almost without hesitation: go down to a local drugstore and buy a very cheap notebook and start keeping track of your decisions. And the specific idea is whenever you're making a consequential decision, something going in or out of the portfolio, just take a moment to think, write down what you expect to happen, why you expect it to happen and then, actually, and this is optional, but probably a great idea, is write down how you feel about the situation, both physically and even emotionally. Just, how do you feel? I feel tired. I feel good, or this stock is really draining me. Whatever you think.

The key to doing this is that it prevents something called hindsight bias, which is, no matter what happens in the world, we tend to look back on our decision-making process and tilt it in a way that looks more favorable to us, right? So we have a bias to explain what has happened.

When you've got a decision-making journal, it gives you accurate and honest feedback of what you were thinking at that time. And there can be situations, by the way, where you buy a stock and it goes up, but it goes up for reasons very different than what you thought was going to happen. Having that feedback as a way to almost check yourself periodically is extremely valuable. So that's, I think, a very inexpensive and not very time-consuming, but very, very valuable way of giving yourself essential feedback, because our minds won't do it normally.
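
Kahneman's suggestion is deliberately low-tech, a cheap notebook, but if you prefer to keep the journal digitally, here is a minimal sketch of what one entry might capture. The fields are my own guess at what Mauboussin describes, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DecisionJournalEntry:
    """One record per consequential decision, written before the outcome is known."""
    decision: str              # what you decided (e.g., "add XYZ to the portfolio")
    expected_outcome: str      # what you expect to happen
    reasoning: str             # why you expect it to happen
    feelings: str = ""         # optional: physical and emotional state at the time
    decided_on: date = field(default_factory=date.today)
    review_notes: str = ""     # filled in later: what actually happened, and why

entry = DecisionJournalEntry(
    decision="Add XYZ to the portfolio",
    expected_outcome="Earnings recover within two quarters",
    reasoning="New management team and improving return on capital",
    feelings="Slightly anxious, tired",
)
print(entry.decision, "->", entry.expected_outcome)
```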

2. Use a checklist. 

Mauboussin: So the best work on this I've seen is by Atul Gawande, who is a surgeon in Boston who wrote a book a couple of years ago called The Checklist Manifesto, and one of the points he makes in there is that when you go from field to field, wherever checklists have been used correctly and with fidelity, they've been extremely effective in improving outcomes. So we all know none of us would step on an airplane today without the pilot having gone through the checklist. There's been a big move into medicine, especially, for example, in surgery, where checklists have really made substantial inroads in reducing infections, and hence mortality, and into other areas like construction as well.

So the question is, how do you become more systematic in applying what you know? And I'll just mention one other thing on this. Gawande talks about two kinds of checklists. By the way, this branch is right out of aviation. One is called a do-confirm checklist, a do-confirm, and that just basically says, hey, just do your normal analysis the way you've always done it and been trained to do, but stop periodically just to confirm that you've covered all the bases. So as an analyst that might mean saying, hey, I'm going to do really thorough evaluation work. I might look very carefully at return on capital trends. I might study the competitive strategy position. You are just going to do all that stuff, but you're going to stop every now and then, just to check to make sure you've done everything.

The second kind of checklist is called a read-do checklist. This is for when you get into a difficult situation, for example you're a pilot and one of your engines goes out; the read-do will guide how you should approach that problem. So you don't have to think about it so much, you just sort of go through it systematically. And so for an investor that might be, hey, what happens when a company misses a quarter? What happens when they have a negative announcement or an executive departure? Sometimes that means sell the stock. Sometimes that means buy more. Sometimes it means do nothing, and a read-do checklist can help guide some of that thinking as well. So it's really a way to be structured and consistent in your analysis.
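
To make the two checklist types concrete, here is a minimal sketch. The checklist items are hypothetical examples for an investing analyst, not anything Gawande or Mauboussin prescribes.

```python
# Two checklist styles borrowed from aviation, per The Checklist Manifesto:
# do-confirm (work from memory, then pause and confirm nothing was skipped)
# and read-do (in a defined trigger situation, read each step and do it).

DO_CONFIRM = [
    "Reviewed return on capital trends",
    "Studied the competitive strategy position",
    "Checked the valuation against base rates",
]

READ_DO = {
    "company misses a quarter": [
        "Re-read the original investment thesis",
        "Decide whether the miss changes the thesis",
        "Choose: buy more, do nothing, or sell",
    ],
}

def still_to_confirm(completed: set[str]) -> list[str]:
    """Return any do-confirm items not yet checked off."""
    return [item for item in DO_CONFIRM if item not in completed]

print("Still to confirm:", still_to_confirm({"Reviewed return on capital trends"}))
print("On an earnings miss:", READ_DO["company misses a quarter"])
```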

Michael Mauboussin Explains How Cognitive Biases Lead You Astray

In this video, Michael Mauboussin, Chief Investment Strategist at Legg Mason, explains three ways cognitive biases lead your brain astray.

One tip Mauboussin offers to counter hindsight bias is to keep a decision-making journal.

Michael Mauboussin is the author of More Than You Know: Finding Financial Wisdom in Unconventional Places and, more recently, Think Twice: Harnessing the Power of Counterintuition.

17 Management Lessons from Ray Dalio

Ray Dalio, the sixty-one-year-old founder of Bridgewater Associates, the world’s biggest hedge fund, offers the following management advice. Dalio says, “Taken together, these principles are meant to paint a picture of a process for the systematic pursuit of truth and excellence and for the rewards that accompany this pursuit. I put them in writing for people to consider in order to help Bridgewater and the people I care about most.”

1. Ego prevents growth. 

Two of the biggest impediments to truth and excellence are people’s egos and organizational bureaucracy. Most people like compliments and agreement, and they dislike criticisms and conflict. Yet recognizing mistakes and weaknesses is essential for rapid improvement and excellence. In our culture, there is nothing embarrassing about making mistakes and having weaknesses.

[…]

We need and admire people who can suspend their egos to get at truth and evolve toward excellence, so we ignore ego-based impediments to truth. We have a different type of environment in which some behaviors discouraged elsewhere are rewarded here (like challenging one’s superiors), and some behaviors encouraged elsewhere are punished here (like speaking behind a subordinate’s back).”

2. Think and act in a principled way and expect others to as well.

all outcomes are manifestations of forces that are at work to produce them, so whenever looking at specific outcomes, think about the forces that are behind them. Constantly ask yourself, “What is this symptomatic of?”

3. If you don’t mind being wrong on the way to being right, you will learn a lot

I once had a ski instructor who had taught Michael Jordan, the greatest basketball player of all time, how to ski. He explained how Jordan enjoyed his mistakes and got the most out of them. At the start of high school, Jordan was a mediocre basketball player; he became great because he loved using his mistakes to improve. I see it all the time. Intelligent people who are open to recognizing and learning from their mistakes substantially outperform people with the same abilities who aren't open in the same way.

4. Mistakes are ok. Not correcting them is not. 

Create a culture in which it is OK to fail but unacceptable not to identify, analyze and learn from mistakes. … A common mistake is to depersonalize the mistake, saying “we didn’t handle this well” rather than “Harry didn’t handle this well.” Again, this is because people are often uncomfortable connecting specific mistakes to specific people because of ego sensitivities. … it is essential that the diagnosis connect the mistakes to the specific individuals by name.

5. Everyone might have a voice but not all opinions are equally valued. 

Not all people’s opinions are equally valuable. Still, that important distinction is often unacknowledged in discussions. Prevent this by looking at people’s track records, noting their credentials, and evaluating how their arguments hold up when challenged.

6. There is a difference between debate, discussion, and teaching. 

Debate is generally among approximate equals; discussion is open-minded exploration among people of various levels of understanding; and teaching is between people of different levels of understanding.

7. Know when to keep your mouth shut.

Imagine if a group of us were trying to learn how to play golf with Tiger Woods, and he and a new golfer were debating how to swing the club. Would it be helpful or harmful and plain silly to treat their points of view equally, because they have different levels of believability? It is better to listen to what Tiger Woods has to say, without constant interruptions by some know-nothing arguing with him.

“Two of the biggest impediments to truth and excellence are people’s egos and organizational bureaucracy.”

— Ray Dalio

8. Be careful not to lose personal responsibility via group decision making.

Too often groups will make a decision to do something without assigning personal responsibilities so it is not clear who is supposed to do what.

9. Don’t pick your battles. Fight them all.

If you see something wrong, even small things, deal with it. That is because 1) small things can be as symptomatic of serious underlying problems as big things, so looking into them, finding what they are symptomatic of, and resolving them will prevent big problems; 2) resolving small differences with people will prevent a more serious divergence of your views; and 3) in trying to help to train people, constant reinforcement of the desired behavior is helpful. The more battles you fight, the more opportunities you will have to get to know each other and the faster the evolutionary process will be.

10. All problems are manifestations of their root causes. Find the root cause. 

Keep asking “Why?” and don’t forget to examine problems with people. In fact, since most things are done or not done because someone decided to do them or not to do them a certain way, most root causes can be traced to specific people, especially “the responsible party.” When the problem is attributable to a person, you have to ask why the person made the mistake to get at the real root cause. For example, a root cause discovery process might go something like this: The problem was due to bad programming. Why was there bad programming? Because Harry programmed it badly. Why did Harry program it badly? Because he wasn’t well trained and because he was in a rush. Why wasn’t he well trained? Did his manager know that he wasn’t well trained and let him do the job anyway, or did he not know?

11. Avoid Monday-morning quarterbacking.

That is “evaluating the merits of past decisions based on what you know now versus what you could have reasonably known at the time of the decision. Do this by asking the question, “what should an intelligent person have known in that situation,” as well as having a deep understanding of the person who made the decision (how do they think, what type of person are they, did they learn from the situation, etc?)”

12. Don’t undermine personal accountability with vagueness.

Use specific names. For example, don’t say “we” or “they” handled it badly. Also avoid: “We should…” or “We are…” Who is we? Exactly who should, who made a mistake, or who did a great job? Use specific names.

13. Self-sufficiency increases efficiency and accountability.

Try to equip departments to be as self-sufficient as possible to enhance efficiency. We do this because we don’t want to create a bureaucracy that forces departments to requisition resources from a pool that lacks the focus to do the job. For example, while people often argue that we should have a technology department, I am against that because building technology is a task, not a goal in and of itself. You build technology to…(fill in the blank, e.g., help service clients, help market, etc.). Keeping the tech resources outside the department means you would have people from various departments arguing about whether their project is more important than someone else’s in order to get resources, which isn’t good for efficiency. The tech people would be evaluated and managed by bureaucrats rather than the people they do the work for.

“Mistakes are ok. Not correcting them is not.”

— Ray Dalio

14. Constantly worry about what you are missing.

Even if you acknowledge you are a dumb shit and are following the principles and are designing around your weaknesses, understand that you still might be missing things. You will get better and be safer this way.

15. Identify the variables that really matter. 

Distinguish the important things from the unimportant things and deal with the important things first. Get the important things done very well, but not perfectly, because that's all you have time for. Chances are you won't have to deal with the unimportant things, which is better than not having time to deal with the important things. I often hear people say “wouldn't it be good to do this or that,” referring to nice-to-do rather than important things: they must be wary of those nice-to-do’s distracting them from the far more important things that need to be done.

16. Use the phrase “by and large”

Too often I hear discussions fail to progress when a statement is made and the person to whom it is made replies “not always,” leading to a discussion of the exceptions rather than the rule. For example, a statement like “the people in the XYZ Department are working too many hours” might lead to a response like “not all of them are; Sally and Bill are working normal hours,” which could lead to a discussion of whether Sally and Bill are working too long, which derails the discussion. Because nothing is 100% true, conversations can get off track if they turn to whether exceptions exist, which is especially foolish if both parties agree that the statement is by and large true. To avoid this problem, the person making such statements might use the term “by and large,” as in “by and large, the people in the XYZ Department are working too many hours.” People hearing that should consider whether it is a “by and large” statement and treat it accordingly.

17. Most importantly

a) build the organization around goals rather than around tasks; b) make people personally responsible for achieving these goals; c) make departments as self-sufficient as possible so that they have control over the resources they need to achieve the goals; d) have the clearest possible delineation of responsibilities and reporting lines; e) have agreed-upon goals and tasks that everyone knows (from the people in the departments to the people outside the departments who oversee them); and f) hold people accountable for achieving the goals and doing the tasks.”

 

To learn more, check out this excellent profile of Dalio, which appeared in The New Yorker, and make sure to read all of Dalio's management principles.

* * *

Looking for more? Read The Contrarian's Guide to Leadership.

In his book Power: Why Some People Have It And Others Don’t, Jeffrey Pfeffer argues that intelligence and high performance are not needed to build power.

 

Is Everything Obvious Once You Know The Answer?

Reading Duncan Watts' new book Everything is Obvious: Once You Know The Answer can make you uncomfortable.

Common sense is particularly well adapted to handling the complexity of everyday situations. We get into trouble when we project our common sense onto situations outside the realm of everyday life.

Applying common sense in these areas, Watts argues, “turns out to suffer from a number of errors that systematically mislead us. Yet because of the way we learn from experience—even experiences that are never repeated or that take place in other times and places—the failings of commonsense reasoning are rarely apparent to us.”

We think we have the answers but we don't. Most real-world problems are more complex than we think. “When policy makers sit down, say, to design some scheme to alleviate poverty, they invariably rely on their own common-sense ideas about why it is that poor people are poor, and therefore how best to help them.” This is where we get into trouble. “A quick look at history,” Watts argues, “suggests that when common sense is used for purposes beyond the everyday, it can fail spectacularly.”

According to Watts, commonsense reasoning suffers from three types of errors, which reinforce one another. First is that our mental model of individual behaviour is systematically flawed. Second, our mental model of complex systems (collective behaviour) is equally flawed. Lastly—and most interesting, in my view—is that “we learn less from history than we think we do, and that this misperception in turn skews our perception of the future.”

Whenever something interesting happens—a book by an unknown author rocketing to the top of the best-seller list, an unknown search engine increasing in value more than 100,000 times in less than 10 years, the housing bubble collapsing—we instinctively want to know why. We look for an explanation. “In this way,” Watts says, “we deceive ourselves into believing that we can make predictions that are impossible.”

“By providing ready explanations for whatever particular circumstances the world throws at us, commonsense explanations give us the confidence to navigate from day to day and relieve us of the burden of worrying about whether what we think we know is really true, or is just something we happen to believe.”

Once we know the outcome, our brains weave a clever story based on the aspects of the situation that seem relevant (at least, relevant in hindsight). We convince ourselves that we fully understand things that we don't.

Is Netflix successful, as Reed Hastings argues, because of their culture? Which aspects of their culture make them successful? Do companies with a similar culture exist that fail? “The paradox of common sense, then, is that even as it helps us make sense of the world, it can actively undermine our ability to understand it.”

The key to improving your ability to make decisions, then, is to figure out what kinds of predictions we can make and how we can improve our accuracy.

One problem with making predictions is knowing what variables to look at and how to weigh them. Even if we get the variables and the relative importance of one factor to another correct, these predictions still depend on how much the future resembles the past. As Warren Buffett says, “the rearview mirror is always clearer than the windshield.”

Relying on historical data is also problematic because big strategic decisions are infrequent. “If you could make millions, or even hundreds, of such bets,” Watts argues, “it would make sense to go with the historical probability. But when facing a decision about whether or not to lead the country into war, or to make some strategic acquisition, you cannot count on getting more than one attempt. … making one-off strategic decisions is therefore ill suited to statistical models or crowd wisdom.”

Watts finds it ironic that organizations using the best practices in strategy planning can also be the most vulnerable to planning errors. This is the strategy paradox.

Michael Raynor, author of The Strategy Paradox, argues that the main cause of strategic failure is not bad strategy but great strategy that happens to be wrong. Bad strategy is characterized by lack of vision, muddled leadership, and inept execution, which is more likely to lead to mediocrity than colossal failure. Great strategy, on the other hand, is marked by clarity of vision, bold leadership, and laser-focused execution. Great strategy can lead to great successes, as it did with the iPod, but it can also lead to enormous failures, as it did with Betamax. “Whether great strategy succeeds or fails therefore depends entirely on whether the initial vision happens to be right or not. And that is not just difficult to know in advance, but impossible.” Raynor argues that the solution is to develop methods for planning that account for strategic uncertainty. (I'll eventually get around to reviewing The Strategy Paradox—it was a great read.)

Rather than trying to predict an impossible future, another idea is to react to changing circumstances as rapidly as possible, dropping alternatives that are not working no matter how promising they seem and diverting resources to those that are succeeding. This sounds an awful lot like evolution (variation and selection).

Watts and Raynor's solution to overcome our inability to predict the future echoes Peter Palchinsky's principles. The Palchinsky Principles, as related by Tim Harford in Adapt (review), are: “first, seek out new ideas and try new things; second, when trying something new do it on a scale where failure is survivable; third, seek out feedback and learn from your mistakes as you go along.”

Of course this experimental approach has limits. The US can't fight a war in half of Iraq with one strategy and in the other half with a different approach to see which one works best. Watts says, “for decisions like these, it's unlikely that an experimental approach will be of much help.”

In the end, Watts concludes that planners need to learn to behave more “like what the development economist William Easterly calls searchers.” As Easterly put it:

A Planner thinks he already knows the answer; he thinks of poverty as a technical engineering problem that his answers will solve. A Searcher admits he doesn't know the answers in advance; he believes that poverty is a complicated tangle of political, social, historical, institutional, and technological factors…and hopes to find answers to individual problems by trial and error…A Planner believes outsiders know enough to impose solutions. A Searcher believes only insiders have enough knowledge to find solutions, and that most solutions must be homegrown.

Still curious? Read Everything is Obvious: Once You Know The Answer.