The Art of Thinking Clearly

Rolf Dobelli’s book, The Art of Thinking Clearly, is a compendium of systematic errors in decision making. While the list of fallacies is not complete, it’s a great launching pad into the best of what others have already figured out.

To avoid frivolous gambles with the wealth I had accumulated over the course of my literary career, I began to put together a list of … systematic cognitive errors, complete with notes and personal anecdotes — with no intention of ever publishing them. The list was originally designed to be used by me alone. Some of these thinking errors have been known for centuries; others have been discovered in the last few years. Some came with two or three names attached to them. … Soon I realized that such a compilation of pitfalls was not only useful for making investing decisions but also for business and personal matters. Once I had prepared the list, I felt calmer and more levelheaded. I began to recognize my own errors sooner and was able to change course before any lasting damage was done. And, for the first time in my life, I was able to recognize when others might be in the thrall of these very same systematic errors. Armed with my list, I could now resist their pull — and perhaps even gain an upper hand in my dealings.

Dobelli’s goal is to learn to recognize and evade the biggest errors in thinking. In so doing, he believes we might “experience a leap in prosperity. We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.”

Let’s take a look at some of the content.

Guarding Against Survivorship Bias

People systematically overestimate their chances of success. Guard against it by frequently visiting the graves of once-promising projects, investments, and careers. It is a sad walk but one that should clear your mind.

Pattern Recognition

When it comes to pattern recognition, we are oversensitive. Regain your scepticism. If you think you have discovered a pattern, first consider it pure chance. If it seems too good to be true, find a mathematician and have the data tested statistically.
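To make “have the data tested statistically” a bit more concrete, here is a minimal permutation-test sketch. It is my own illustration, not from the book, and the two series are made-up numbers: shuffle one series many times and see how often pure chance produces a correlation at least as strong as the one you think you found.

```python
# Minimal permutation test: could this "pattern" be pure chance?
# Hypothetical, made-up data; requires Python 3.10+ for statistics.correlation.
import random
import statistics

random.seed(42)

# Suppose you noticed that your "signal" seems to track your returns.
signal  = [0.3, 1.1, -0.4, 0.8, 0.2, 1.5, -0.1, 0.9, 0.6, 1.2]
returns = [0.1, 0.9, -0.2, 0.5, 0.4, 1.1,  0.0, 0.7, 0.3, 1.0]

observed = statistics.correlation(signal, returns)  # Pearson r

# Shuffle the returns many times: if chance alone often produces a correlation
# at least this strong, the "pattern" deserves your skepticism.
trials = 10_000
as_strong = 0
for _ in range(trials):
    shuffled = random.sample(returns, len(returns))
    if abs(statistics.correlation(signal, shuffled)) >= abs(observed):
        as_strong += 1

p_value = as_strong / trials
print(f"observed r = {observed:.2f}, permutation p-value ~ {p_value:.4f}")
```

A small p-value only means the pattern is unlikely to be a shuffling artifact; it says nothing about whether the relationship will hold out of sample.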

Fighting Against Confirmation Bias

[T]ry writing down your beliefs — whether in terms of worldview, investments, marriage, health care, diet, career strategies — and set out to find disconfirming evidence. Axing beliefs that feel like old friends is hard work but imperative.

Dating Advice and Contrast

If you are seeking a partner, never go out in the company of your supermodel friends. People will find you less attractive than you really are. Go alone or, better yet, take two ugly friends.

Think Different

Fend it off (availability bias) by spending time with people who think differently than you do—people whose experiences and expertise are different from yours.

Guard Against Chauffeur Knowledge

Be on the lookout for chauffeur knowledge. Do not confuse the company spokesperson, the ringmaster, the newscaster, the schmoozer, the verbiage vendor, or the cliche generator with those who possess true knowledge. How do you recognize the difference? There is a clear indicator: True experts recognize the limits of what they know and what they do not know.

The Swimmer’s Body Illusion

Professional swimmers don’t have perfect bodies because they train extensively. Rather, they are good swimmers because of their physiques. How their bodies are designed is a factor for selection and not the result of their activities. … Whenever we confuse selection factors with results, we fall prey to what Taleb calls the swimmer’s body illusion. Without this illusion, half of advertising campaigns would not work. But this bias has to do with more than just the pursuit of chiseled cheekbones and chests. For example, Harvard has the reputation of being a top university. Many highly successful people have studied there. Does this mean that Harvard is a good school? We don’t know. Perhaps the school is terrible, and it simply recruits the brightest students around.

Peer Pressure

A simple experiment, carried out in the 1950s by legendary psychologist Solomon Asch, shows how peer pressure can warp common sense. A subject is shown a line drawn on paper, and next to it three lines—numbered 1, 2, and 3—one shorter, one longer, and one the same length as the original one. He or she must indicate which of the three lines corresponds to the original one. If the person is alone in the room, he gives correct answers because the task is really quite simple. Now five other people enter the room; they are all actors, which the subject does not know. One after another, they give wrong answers, saying “number 1,” although it’s very clear that number 3 is the correct answer. Then it is the subject’s turn again. In one-third of cases, he will answer incorrectly to match the other people’s responses.

Rational Decision Making and The Sunk Cost Fallacy

The sunk cost fallacy is most dangerous when we have invested a lot of time, money, energy, or love in something. This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes. … Rational decision making requires you to forget about the costs incurred to date. No matter how much you have already invested, only your assessment of the future costs and benefits counts.

Avoid Negative Black Swans

But even if you feel compelled to continue as such, avoid surroundings where negative Black Swans thrive. This means: Stay out of debt, invest your savings as conservatively as possible, and get used to a modest standard of living—no matter whether your big breakthrough comes or not.

Disconfirming Evidence

“We all are learning, modifying, or destroying ideas all the time. Rapid destruction of your ideas when the time is right is one of the most valuable qualities you can acquire. You must force yourself to consider arguments on the other side.” — Charlie Munger

The confirmation bias is alive and well in the business world. One example: An executive team decides on a new strategy. The team enthusiastically celebrates any sign that the strategy is a success. Everywhere the executives look, they see plenty of confirming evidence, while indications to the contrary remain unseen or are quickly dismissed as “exceptions” or “special cases.” They have become blind to disconfirming evidence.

(Update: While the book will likely make you smarter, there is some question as to where some of the ideas came from.)

Still curious? Read The Art of Thinking Clearly.

The Lost Genius of Irrationality

The design of better heuristics … seems to be an extraordinary area for improving social behavior.

Rory Sutherland gave a great talk (below) at TEDxOxford, which is well worth your time. Rory explains how some rules work for individuals but not groups, the effect of naming a behavior, the origins of the word designated driver, and the “football illusion.”

Some rules work from the assumption of human agency, but once you understand the extent to which our actions are actually influenced by the actions of others, that assumption falls down. Rory offers the example of a speed camera.

The person in the right hand lane (left hand lane in North America) should at all times go a bit faster than the person in the lane to the left of them. And so what the speed camera does is it tries to encourage everybody to drive at the same speed in both lanes, which is deeply inimical to us instinctively because everything about us says the right lane must go a bit faster. And there are good reasons for this. If you don’t have a difference in speed between the lanes it’s impossible for people to change lanes when they need to.

How to Make Better Decisions In Life And Work

You’re probably not as effective at making decisions as you could be.

Don’t worry. I’m going to show you how you can make better decisions in work and life.

We’re going to explore Chip and Dan Heath’s new book, Decisive. It’s going to help us make better decisions both as individuals and in groups.

But before we get into that, you should think about a tough decision you’re grappling with right now. Having a decision working in your mind as you’re reading this post will help make the advice in here real.

Ok, let’s dig in.

“A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.”
Daniel Kahneman in Thinking, Fast and Slow

We’re quick to jump to conclusions because we give too much weight to the information in front of us and we fail to search for new information, which might disprove our thoughts.

Nobel Prize-winning psychologist Daniel Kahneman called this tendency “what you see is all there is.” But that’s not the only reason we don’t make good decisions — there are many others. We’re overconfident. We look for information that fits our thoughts and ignore information that doesn’t. We are overly influenced by authority. We choose the short term over the long term. Once we’ve made a decision, we find it hard to change our minds. In short, our brains are flawed. I could go on.

Knowing about these and other biases isn’t enough; it doesn’t help us fix the problem. We need a framework for making decisions. In Decisive, the Heaths introduce a four-step process designed to counteract many biases.

In keeping with Kahneman’s visual metaphor, the Heaths refer to the tendency to see only what’s in front of us as a “spotlight” effect.

And that, in essence, is the core difficulty of decision making. What’s in the spotlight will rarely be everything we need to make good decisions, but we won’t always remember to shift the light.

Most of us rarely use a process for thinking about things. If we do use one it’s likely to be the pros-and-cons list. While better than nothing, this approach is still deeply flawed because it doesn’t really account for biases.

The Four Villains of Decision Making

  1. Narrow Framing: “… the tendency to define our choices too narrowly, to see them in binary terms. We ask, ‘Should I break up with my partner or not?’ instead of ‘What are the ways I could make this relationship better?’”
  2. Confirmation Bias: “When people have the opportunity to collect information from the world, they are more likely to select information that supports their preexisting attitudes, beliefs, and actions.” We pretend we want the truth, yet all we really want is reassurance.
  3. Short-term Emotion: “When we’ve got a difficult decision to make, our feelings churn. We replay the same arguments in our head. We agonize about our circumstances. We change our minds from day to day. If our decision was represented on a spreadsheet, none of the numbers would be changing—there’s no new information being added—but it doesn’t feel that way in our heads.”
  4. Overconfidence: “People think they know more than they do about how the future will unfold.”

The Heaths came up with a process to help us overcome these villains and make better choices. “We can’t deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you’ll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

They call this WRAP. “At its core, the WRAP model urges you to switch from ‘auto spotlight’ to manual spotlight.”

All in all, this was a great book. We tend to focus our efforts on analysis: if a decision turns out to be wrong, we assume the analysis must have been the problem. Not only does this ignore the fact that you can have ‘bad outcomes’ with good decisions, but it also places your spotlight on the analysis at the cost of the process by which the decision was made. More to come on this …

Read this next: What Matters More in Decisions: Analysis or Process?

What Matters More in Decisions: Analysis or Process?

Think of the last major decision your company made.

Maybe it was an acquisition, a large purchase, or perhaps it was whether to launch a new product.

Odds are three things went into that decision: (1) It probably relied on the insights of a few key executives; (2) it involved some sort of fact gathering and analysis; and (3) it was likely enveloped in some sort of decision process—whether formal or informal—that translated the analysis into a decision.

Now how would you rate the quality of your organization’s strategic decisions?

If you’re like most executives, the answer wouldn’t be positive:

In a recent McKinsey Quarterly survey of 2,207 executives, only 28 percent said that the quality of strategic decisions in their companies was generally good, 60 percent thought that bad decisions were about as frequent as good ones, and the remaining 12 percent thought good decisions were altogether infrequent.

How could it be otherwise?

Product launches are frequently behind schedule and over budget. Strategic plans often ignore even the anticipated response of competitors. Mergers routinely fail to live up to the promises made in press releases. The persistence of these problems across time and organizations, both large and small, would indicate that we can make better decisions.

Looking at how organizations make decisions is a good place to start if we’re trying to improve the quality of decisions and remove cognitive biases.

While we often make decisions with our gut, these decisions leave us susceptible to biases. To counter the gut decision, many organizations gather data and analyze their decisions. The widespread belief is that analysis reduces biases.

But is putting your faith in analysis any better than using your gut? What does the evidence say? Is there a better way?

Dan Lovallo and Olivier Sibony set out to find out. Lovallo is a professor at the University of Sydney and Sibony is a director at McKinsey & Company. Together they studied 1,048 “major” business decisions over five years.

What they discovered will surprise you.

Most of the decisions were not made based on gut calls but rather rigorous analysis. In short, most people did all the legwork we think we’re supposed to do: they delivered large quantities of detailed analysis. Yet this wasn’t enough. “Our research indicates that, contrary to what one might assume, good analysis in the hands of managers who have good judgment won’t naturally yield good decisions.”

These two quotes by Warren Buffett and Charlie Munger explain how analysis can easily go astray.

I have no use whatsoever for projections or forecasts. They create an illusion of apparent precision. The more meticulous they are, the more concerned you should be. We never look at projections … — Warren Buffett

[Projections] are put together by people who have an interest in a particular outcome, have a subconscious bias, and its apparent precision makes it fallacious. They remind me of Mark Twain’s saying, ‘A mine is a hole in the ground owned by a liar.’ Projections in America are often a lie, although not an intentional one, but the worst kind because the forecaster often believes them himself. — Charlie Munger

But Lovallo and Sibony didn’t only look at analysis, they also asked executives about the process. Did they, for example, “explicitly explore and discuss major uncertainties or discuss viewpoints that contradicted the senior leader’s?”

So what matters more, process or analysis? After comparing the results, they determined that “process mattered more than analysis—by a factor of six.”

This finding does not mean that analysis is unimportant, as a closer look at the data reveals: almost no decisions in our sample made through a very strong process were backed by very poor analysis. Why? Because one of the things an unbiased decision-making process will do is ferret out poor analysis. The reverse is not true; superb analysis is useless unless the decision process gives it a fair hearing.

To illustrate the weakness of how most organizations make decisions, Sibony used an interesting analogy: the legal system.

Imagine walking into a courtroom where the trial consists of a prosecutor presenting PowerPoint slides. In 20 pretty compelling charts, he demonstrates why the defendant is guilty. The judge then challenges some of the facts of the presentation, but the prosecutor has a good answer to every objection. So the judge decides, and the accused man is sentenced.

That wouldn’t be due process, right? So if you would find this process shocking in a courtroom, why is it acceptable when you make an investment decision? Now of course, this is an oversimplification, but this process is essentially the one most companies follow to make a decision. They have a team arguing only one side of the case. The team has a choice of what points it wants to make and what way it wants to make them. And it falls to the final decision maker to be both the challenger and the ultimate judge. Building a good decision-making process is largely ensuring that these flaws don’t happen.

Understanding biases doesn’t make you immune to them. A disciplined decision process is the best place to improve the quality of decisions and guard against common decision-making biases.

Still curious? Read this next: A process to make better decisions.

The inspiration for this post comes from Chip and Dan Heath in Decisive.

Searching Google, And Finding Ourselves

Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains, speaks to how the digital age is transforming what it means to ‘search.’

When we talk about “searching” these days, we’re almost always talking about using Google to find something online.

That’s a big change for a word that long carried existential connotations — a word that had been bound up in our sense of what it meant to be human. We didn’t just search for car keys or missing socks. We searched for truth, for meaning, for transcendence. Searching was an act of exploration that took us out into the world, beyond ourselves, in order to know the world, and ourselves, more fully.

In its original form, the Google search engine did just that. It transported us out into a messy and confusing world — the world of the web — with the intent of helping us make sense of it.

But that’s less true now. Google’s big goal is no longer to read the web. It’s to read us.

… These days, Google’s search engine doesn’t push us outward so much as turn us inward. It gives us information that fits the pattern of behavior and thinking we’ve displayed in the past. It reinforces our biases rather than challenging them, and subverts the act of searching in its most meaningful sense.

As Eli Pariser writes in The Filter Bubble: “When technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens.”

(h/t Annie)

Still curious? Check out The Filter Bubble — What the Internet is Hiding From You and DuckDuckGo.

The 12 cognitive biases that prevent you from being rational

io9 produced a great overview of some cognitive biases.

First, the difference between cognitive biases and logical fallacies:

A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).

Confirmation Bias

We love to agree with people who agree with us. It’s why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes.

Ingroup Bias

Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know.

Gambler’s Fallacy

We tend to put a tremendous amount of weight on previous events, believing that they’ll somehow influence future outcomes.
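As a quick illustration (mine, not from the io9 piece), a short simulation shows why this is a fallacy for independent events: after a streak of heads, a fair coin is still just as likely to come up heads as tails.

```python
# Simulate a fair coin and ask: after three heads in a row, what fraction of
# the time does the next flip come up heads? (Answer: still about 50%.)
import random

random.seed(0)
flips = [random.choice("HT") for _ in range(200_000)]

streak = 3
next_after_streak = [
    flips[i + streak]
    for i in range(len(flips) - streak)
    if flips[i:i + streak] == ["H"] * streak
]

heads_rate = next_after_streak.count("H") / len(next_after_streak)
print(f"P(heads | {streak} heads in a row) ~ {heads_rate:.3f}")  # ~ 0.500
```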

Post-Purchase Rationalization (aka commitment and consistency bias)

[A] kind of built-in mechanism that makes us feel better after we make crappy decisions, especially at the cash register. Also known as Buyer’s Stockholm Syndrome, it’s a way of subconsciously justifying our purchases — especially expensive ones.

Neglecting Probability (aka Insensitivity To Base Rates)

It’s the same phenomenon that makes us worry about getting killed in an act of terrorism as opposed to something far more probable, like falling down the stairs or accidental poisoning.
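A worked base-rate example makes the point concrete (the scenario and numbers are hypothetical, purely for illustration): even a fairly accurate test for a rare condition produces mostly false positives, because the tiny base rate dominates the arithmetic.

```python
# Hypothetical numbers: a condition affects 1 person in 10,000, and a test is
# 99% sensitive with a 1% false-positive rate. Bayes' rule gives the chance
# that a positive result actually indicates the condition.
base_rate   = 1 / 10_000   # P(condition)
sensitivity = 0.99         # P(positive | condition)
false_pos   = 0.01         # P(positive | no condition)

p_positive = sensitivity * base_rate + false_pos * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) ~ {p_condition_given_positive:.2%}")  # ~ 0.98%
```

Neglecting the 1-in-10,000 base rate is what makes the intuitive answer (“the test is 99% accurate, so a positive result is 99% reliable”) so far off.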

Observational Selection Bias (Availability Bias?)

This is that effect of suddenly noticing things we didn’t notice that much before — but we wrongly assume that the frequency has increased.

Status-Quo Bias

We humans tend to be apprehensive of change, which often leads us to make choices that guarantee that things remain the same, or change as little as possible.

Negativity Bias

People tend to pay more attention to bad news — and it’s not just because we’re morbid. Social scientists theorize that it’s on account of our selective attention and that, given the choice, we perceive negative news as being more important or profound.

Bandwagon Effect (aka social proof)

Though we’re often unconscious of it, we love to go with the flow of the crowd.

Projection Bias

We tend to assume that most people think just like us — though there may be no justification for it. This cognitive shortcoming often leads to a related effect known as the false consensus bias where we tend to believe that people not only think like us, but that they also agree with us.

The Current Moment Bias

We humans have a really hard time imagining ourselves in the future and altering our behaviors and expectations accordingly.

Anchoring Effect

Also known as the relativity trap, this is the tendency we have to compare and contrast only a limited set of items. It’s called the anchoring effect because we tend to fixate on a value or number that in turn gets compared to everything else.

Still curious? Check out the Farnam Street Latticework of Mental Models.

“Everyone has filters to select information that receives attention.”

These excerpts were taken from Roger Fisher’s excellent book Getting It Done: How to Lead When You’re Not in Charge.

Everyone has filters to select information that receives attention. If we don’t consciously choose them we fall back on unconscious ones. Typically these default rules for selecting data limit the useful information we receive. Like a magician’s sleight of hand, they direct attention away from true action.

Do you think that what you know is more important than what you don’t?

We often equate the information we have with the information that should guide a decision. This creates two difficulties. First, we assume that what we don’t already know isn’t worth knowing. We stop looking for information when there may still be important things to learn. Secondly, we think that because we have a certain piece of information, it should figure into the decision: “If it is true, it is relevant.” Years are spent fighting over what happened yesterday for every day that is spent figuring out what ought to be done tomorrow.

Are you biased toward vivid data?

We all pay undue attention to a good story. Information that has an emotional impact gobbles up attention. Dry information is ignored. As you try to accomplish something you may find that the bankruptcy of one competitor holds your attention more than an improved product being marketed by another. At work no one has packaged important information to make it easy to take in.

Are you trapped in your own point of view?

Russians have a saying that “everyone looks at the world from the bell tower of his own village.” We all know that we tend to judge ourselves more charitably than we judge others. You notice your own contributions to a success more than those of others. You tend to minimize, even to yourself, the role you played in a failure. Because we focus on data that favor us or cast others in a bad light, we neglect large amounts of relevant information.

What Is Critical Thinking?

Prompted by our dysfunctional national dialogue, Hamilton College professor Paul Gary Wyckoff articulates the critical thinking skills he wants his students to learn.

1. The ability to think empirically, not theoretically. By this I mean the habit of constantly checking one’s views against evidence from the real world, and the courage to change positions if better explanations come along. …

2. The ability to think in terms of multiple, rather than single, causes. When you drop a book, it will fall on the floor — a single-cause event. But most of the interesting things in the world have multiple causes; educational success, for example, is affected by a student’s aptitude, but also by the educational achievements of the student’s parents, the quality of the school he or she attends, and the attitudes and intelligence of the other students in that school. In such cases, simple comparisons become unreliable guides to action, because the effects of intervening variables haven’t been screened out (a small regression sketch after this list illustrates the point). …

3. The ability to think in terms of the sizes of things, rather than only in terms of their direction. Our debates are largely magnitude-free, but decisions in a world with constrained resources always demand a sense of the sizes of various effects. …

4. The ability to think like foxes, not hedgehogs. In his seminal book, Expert Political Judgment, Philip Tetlock followed Isaiah Berlin in distinguishing between hedgehogs, who know one big thing and apply that understanding to everything around them, and foxes, who know many small things and pragmatically apply a “grab bag” of knowledge to make modest predictions about the world. In his study of hundreds of foreign policy experts over 20 years, Tetlock showed that foxes outperform hedgehogs in making predictions, and hence tend to make better decisions. …

5. The ability to understand one’s own biases. An expanding literature in psychology and behavioral economics suggests that we are full of unconscious biases, and a failure to understand these biases contributes to poor decision-making. Perhaps the most common and dangerous of these is confirmation bias, the tendency to seek out information in accordance with our previous views and ignore or dismiss information contrary to those views. …
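As promised in point 2 above, here is a minimal sketch (synthetic data, entirely illustrative) of what “screening out intervening variables” looks like in practice: a naive comparison overstates the effect of school quality on test scores because parental education drives both, while a regression that includes the intervening variable recovers something close to the true effect.

```python
# Synthetic example of a multiple-cause situation: parental education affects
# both school quality and test scores, so a single-cause comparison is misleading.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

parent_edu = rng.normal(size=n)                      # intervening variable
school_q   = 0.8 * parent_edu + rng.normal(size=n)   # better-educated parents choose better schools
score      = 2.0 * parent_edu + 0.5 * school_q + rng.normal(size=n)  # true school effect is 0.5

# Naive single-cause estimate: regress score on school quality alone.
X1 = np.column_stack([np.ones(n), school_q])
naive = np.linalg.lstsq(X1, score, rcond=None)[0][1]

# Multiple-cause estimate: control for parental education as well.
X2 = np.column_stack([np.ones(n), school_q, parent_edu])
adjusted = np.linalg.lstsq(X2, score, rcond=None)[0][1]

print(f"naive effect of school quality    ~ {naive:.2f}")     # inflated, around 1.5
print(f"adjusted effect of school quality ~ {adjusted:.2f}")  # close to the true 0.5
```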