Rolf Dobelli’s book, The Art of Thinking Clearly, is a compendium of systematic errors in decision making. While the list of fallacies is not complete, it’s a great launching pad into the best of what others have already figured out.
To avoid frivolous gambles with the wealth I had accumulated over the course of my literary career, I began to put together a list of … systematic cognitive errors, complete with notes and personal anecdotes — with no intention of ever publishing them. The list was originally designed to be used by me alone. Some of these thinking errors have been known for centuries; others have been discovered in the last few years. Some came with two or three names attached to them. … Soon I realized that such a compilation of pitfalls was not only useful for making investing decisions but also for business and personal matters. Once I had prepared the list, I felt calmer and more levelheaded. I began to recognize my own errors sooner and was able to change course before any lasting damage was done. And, for the first time in my life, I was able to recognize when others might be in the thrall of these very same systematic errors. Armed with my list, I could now resist their pull — and perhaps even gain an upper hand in my dealings.
Dobelli’s goal is to learn to recognize and evade the biggest errors in thinking. In so doing, he believes we might “experience a leap in prosperity. We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.”
Let’s take a look at some of the content.
Guarding Against Survivorship Bias
People systematically overestimate their chances of success. Guard against it by frequently visiting the graves of once-promising projects, investments, and careers. It is a sad walk but one that should clear your mind.
Guarding Against the Clustering Illusion
When it comes to pattern recognition, we are oversensitive. Regain your scepticism. If you think you have discovered a pattern, first consider it pure chance. If it seems too good to be true, find a mathematician and have the data tested statistically.
Fighting Against Confirmation Bias
[T]ry writing down your beliefs — whether in terms of worldview, investments, marriage, health care, diet, career strategies — and set out to find disconfirming evidence. Axing beliefs that feel like old friends is hard work but imperative.
Dating Advice and Contrast
If you are seeking a partner, never go out in the company of your supermodel friends. People will find you less attractive than you really are. Go alone or, better yet, take two ugly friends.
Fending Off the Availability Bias
Fend off the availability bias by spending time with people who think differently than you do—people whose experiences and expertise are different from yours.
Guard Against Chauffeur Knowledge
Be on the lookout for chauffeur knowledge. Do not confuse the company spokesperson, the ringmaster, the newscaster, the schmoozer, the verbiage vendor, or the cliché generator with those who possess true knowledge. How do you recognize the difference? There is a clear indicator: True experts recognize the limits of what they know and what they do not know.
The Swimmer’s Body Illusion
Professional swimmers don’t have perfect bodies because they train extensively. Rather, they are good swimmers because of their physiques. How their bodies are designed is a factor for selection and not the result of their activities. … Whenever we confuse selection factors with results, we fall prey to what Taleb calls the swimmer’s body illusion. Without this illusion, half of advertising campaigns would not work. But this bias has to do with more than just the pursuit of chiseled cheekbones and chests. For example, Harvard has the reputation of being a top university. Many highly successful people have studied there. Does this mean that Harvard is a good school? We don’t know. Perhaps the school is terrible, and it simply recruits the brightest students around.
Social Proof and Peer Pressure
A simple experiment, carried out in the 1950s by legendary psychologist Solomon Asch, shows how peer pressure can warp common sense. A subject is shown a line drawn on paper, and next to it three lines—numbered 1, 2, and 3—one shorter, one longer, and one the same length as the original one. He or she must indicate which of the three lines corresponds to the original one. If the person is alone in the room, he gives correct answers because the task is really quite simple. Now five other people enter the room; they are all actors, which the subject does not know. One after another, they give wrong answers, saying “number 1,” although it’s very clear that number 3 is the correct answer. Then it is the subject’s turn again. In one-third of cases, he will answer incorrectly to match the other people’s responses.
Rational Decision Making and The Sunk Cost Fallacy
The sunk cost fallacy is most dangerous when we have invested a lot of time, money, energy, or love in something. This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes. … Rational decision making requires you to forget about the costs incurred to date. No matter how much you have already invested, only your assessment of the future costs and benefits counts.
Avoid Negative Black Swans
But even if you feel compelled to continue as such, avoid surroundings where negative Black Swans thrive. This means: Stay out of debt, invest your savings as conservatively as possible, and get used to a modest standard of living—no matter whether your big breakthrough comes or not.
“We all are learning, modifying, or destroying ideas all the time. Rapid destruction of your ideas when the time is right is one of the most valuable qualities you can acquire. You must force yourself to consider arguments on the other side.” — Charlie Munger
The confirmation bias is alive and well in the business world. One example: An executive team decides on a new strategy. The team enthusiastically celebrates any sign that the strategy is a success. Everywhere the executives look, they see plenty of confirming evidence, while indications to the contrary remain unseen or are quickly dismissed as “exceptions” or “special cases.” They have become blind to disconfirming evidence.
Still curious? Read The Art of Thinking Clearly.