The Art of Thinking Clearly

(Update: While the book will likely make you smarter, there is some question as to where some of the ideas came from.)

Rolf Dobelli's book, The Art of Thinking Clearly, is a compendium of systematic errors in decision making. While the list of fallacies is not complete, it's a great launching pad into the best of what others have already figured out.

To avoid frivolous gambles with the wealth I had accumulated over the course of my literary career, I began to put together a list of … systematic cognitive errors, complete with notes and personal anecdotes — with no intention of ever publishing them. The list was originally designed to be used by me alone. Some of these thinking errors have been known for centuries; others have been discovered in the last few years. Some came with two or three names attached to them. … Soon I realized that such a compilation of pitfalls was not only useful for making investing decisions but also for business and personal matters. Once I had prepared the list, I felt calmer and more levelheaded. I began to recognize my own errors sooner and was able to change course before any lasting damage was done. And, for the first time in my life, I was able to recognize when others might be in the thrall of these very same systematic errors. Armed with my list, I could now resist their pull — and perhaps even gain an upper hand in my dealings.

Dobelli's goal is to learn to recognize and evade the biggest errors in thinking. In so doing, he believes we might “experience a leap in prosperity. We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.”

Let's take a look at some of the content.

Guarding Against Survivorship Bias

People systematically overestimate their chances of success. Guard against it by frequently visiting the graves of once-promising projects, investments, and careers. It is a sad walk but one that should clear your mind.

Pattern Recognition

When it comes to pattern recognition, we are oversensitive. Regain your scepticism. If you think you have discovered a pattern, first consider it pure chance. If it seems too good to be true, find a mathematician and have the data tested statistically.

Fighting Against Confirmation Bias

[T]ry writing down your beliefs — whether in terms of worldview, investments, marriage, health care, diet, career strategies — and set out to find disconfirming evidence. Axing beliefs that feel like old friends is hard work but imperative.

Dating Advice and Contrast

If you are seeking a partner, never go out in the company of your supermodel friends. People will find you less attractive than you really are. Go alone or, better yet, take two ugly friends.

Think Different

Fend off the availability bias by spending time with people who think differently than you do—people whose experiences and expertise are different from yours.

Guard Against Chauffeur Knowledge

Be on the lookout for chauffeur knowledge. Do not confuse the company spokesperson, the ringmaster, the newscaster, the schmoozer, the verbiage vendor, or the cliche generator with those who possess true knowledge. How do you recognize the difference? There is a clear indicator: True experts recognize the limits of what they know and what they do not know.

The Swimmer's Body Illusion

Professional swimmers don’t have perfect bodies because they train extensively. Rather, they are good swimmers because of their physiques. How their bodies are designed is a factor for selection and not the result of their activities. … Whenever we confuse selection factors with results, we fall prey to what Taleb calls the swimmer’s body illusion. Without this illusion, half of advertising campaigns would not work. But this bias has to do with more than just the pursuit of chiseled cheekbones and chests. For example, Harvard has the reputation of being a top university. Many highly successful people have studied there. Does this mean that Harvard is a good school? We don’t know. Perhaps the school is terrible, and it simply recruits the brightest students around.

Peer Pressure

A simple experiment, carried out in the 1950s by legendary psychologist Solomon Asch, shows how peer pressure can warp common sense. A subject is shown a line drawn on paper, and next to it three lines—numbered 1, 2, and 3—one shorter, one longer, and one the same length as the original. He or she must indicate which of the three lines corresponds to the original. If the person is alone in the room, he gives correct answers, because the task is really quite simple. Now five other people enter the room; they are all actors, which the subject does not know. One after another, they give wrong answers, saying “number 1,” although it’s very clear that number 3 is the correct answer. Then it is the subject’s turn again. In one-third of cases, he will answer incorrectly to match the other people’s responses.

Rational Decision Making and The Sunk Cost Fallacy

The sunk cost fallacy is most dangerous when we have invested a lot of time, money, energy, or love in something. This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes. … Rational decision making requires you to forget about the costs incurred to date. No matter how much you have already invested, only your assessment of the future costs and benefits counts.

Avoid Negative Black Swans

But even if you feel compelled to continue as such, avoid surroundings where negative Black Swans thrive. This means: Stay out of debt, invest your savings as conservatively as possible, and get used to a modest standard of living—no matter whether your big breakthrough comes or not.

Disconfirming Evidence

The confirmation bias is alive and well in the business world. One example: An executive team decides on a new strategy. The team enthusiastically celebrates any sign that the strategy is a success. Everywhere the executives look, they see plenty of confirming evidence, while indications to the contrary remain unseen or are quickly dismissed as “exceptions” or “special cases.” They have become blind to disconfirming evidence.

Still curious? Read The Art of Thinking Clearly.

How you can instantly improve your marriage

Most of us see what we want to see.

If we're arguing with a spouse, we start seeing all of their faults. After all, it's not my fault; it's yours. Once we've labeled someone as, say, selfish, the label becomes self-reinforcing thanks to the availability and confirmation biases. Our view becomes so clouded that we can't appreciate our partner's positive attributes.

Not only do we search for information that agrees with us but we fail to notice anything to the contrary. “We see,” writes Aaron Beck in his book Love is Never Enough, “each other through the bias of negative frames.”

There is a way for couples to fight the tendency to only notice what's wrong: keep a “marriage diary,” with a list of all the things your partner does that you like.

In his book, Beck describes a couple, Karen and Ted, who are having marriage troubles. Beck suggested they keep a marriage diary.

After I proposed to Karen and Ted that each take notes of what the other did that was pleasing during the previous week, Karen reported the following:

Ted was great. I was really upset by some of my clients. They are a real pain. … Anyhow, I told Ted about it. He was very sympathetic. He didn't try to tell me what to do. He said that if he was in my position, he would probably feel frustrated, too. He said that my clients are tough to deal with. I felt a lot better.

Each of Ted's actions pleased Karen, who remarked, “They were like presents.” Although Ted had done similar things for Karen in the past, they had been erased from her memory because of the negative view of Ted.

The same was also true for Ted.

Psychologist Mark Kane Goldstein has used this method to help husbands and wives keep “track of their partner's pleasant actions.”

Each spouse is given several sheets of graph paper on which to record whatever his or her partner does that is pleasing. The spouse rates these acts on a ten-point scale, indicating degree of satisfaction. Dr. Goldstein found that 70 percent of the couples who tried this simple method reported an improvement in their relationship.

Simply by shifting their focus away from the negative and onto little pleasures, couples were more aware of their satisfaction.

Psychologists call this “considering the opposite.”

The Media’s Coverage of School Shootings

Jason Kottke dug up a Roger Ebert review of Gus Van Sant's Elephant, a fictionalized account of a Columbine-like school shooting. Ebert discusses the media's behavior while reporting these kinds of events.

Let me tell you a story. The day after Columbine, I was interviewed for the Tom Brokaw news program. The reporter had been assigned a theory and was seeking sound bites to support it. “Wouldn't you say,” she asked, “that killings like this are influenced by violent movies?” No, I said, I wouldn't say that. “But what about ‘Basketball Diaries'?” she asked. “Doesn't that have a scene of a boy walking into a school with a machine gun?” The obscure 1995 Leonardo DiCaprio movie did indeed have a brief fantasy scene of that nature, I said, but the movie failed at the box office (it grossed only $2.5 million), and it's unlikely the Columbine killers saw it.

The reporter looked disappointed, so I offered her my theory. “Events like this,” I said, “if they are influenced by anything, are influenced by news programs like your own. When an unbalanced kid walks into a school and starts shooting, it becomes a major media event. Cable news drops ordinary programming and goes around the clock with it. The story is assigned a logo and a theme song; these two kids were packaged as the Trench Coat Mafia. The message is clear to other disturbed kids around the country: If I shoot up my school, I can be famous. The TV will talk about nothing else but me. Experts will try to figure out what I was thinking. The kids and teachers at school will see they shouldn't have messed with me. I'll go out in a blaze of glory.”

In short, I said, events like Columbine are influenced far less by violent movies than by CNN, the NBC Nightly News and all the other news media, who glorify the killers in the guise of “explaining” them. I commended the policy at the Sun-Times, where our editor said the paper would no longer feature school killings on Page 1. The reporter thanked me and turned off the camera. Of course the interview was never used. They found plenty of talking heads to condemn violent movies, and everybody was happy.

The evolutionary roots of human behaviour

Anthony Gottlieb writing in the New Yorker:

Indeed, the guilty secret of psychology and of behavioral economics is that their experiments and surveys are conducted almost entirely with people from Western, industrialized countries, mostly of college age, and very often students of psychology at colleges in the United States. This is particularly unfortunate for evolutionary psychologists, who are trying to find universal features of our species. American college kids, whatever their charms, are a laughable proxy for Homo sapiens. The relatively few experiments conducted in non-Western cultures suggest that the minds of American students are highly unusual in many respects, including their spatial cognition, responses to optical illusions, styles of reasoning, coöperative behavior, ideas of fairness, and risk-taking strategies. Joseph Henrich and his colleagues at the University of British Columbia concluded recently that U.S. college kids are “one of the worst subpopulations one could study” when it comes to generalizing about human psychology. Their main appeal to evolutionary psychologists is that they’re readily available. Man’s closest relatives are all long extinct; breeding experiments on humans aren’t allowed (they would take far too long, anyway); and the mental life of our ancestors left few fossils.

He concludes:

Barash muses, at the end of his book, on the fact that our minds have a stubborn fondness for simple-sounding explanations that may be false. That’s true enough, and not only at bedtime. It complements a fondness for thinking that one has found the key to everything. Perhaps there’s an evolutionary explanation for such proclivities.

Still curious? Check out Barash's book, Homo Mysterious: Evolutionary Puzzles of Human Nature, for yourself.

The Noise Bottleneck: When More Information is Harmful


When consuming information, we strive for more signal and less noise. The problem is a cognitive illusion: we feel like the more information we consume the more signal we receive.

While this is probably true on an absolute basis, Nassim Taleb argues, in this excerpt from Antifragile, that it is not true on a relative basis. He calls it the noise bottleneck.

Taleb argues that the more data you consume, the higher the ratio of noise to signal, the less you know about what's going on, and the more inadvertent trouble you are likely to cause.


The Institutionalization Of Neuroticism

Imagine someone of the type we call neurotic in common parlance. He is wiry, looks contorted, and speaks with an uneven voice. His neck moves around when he tries to express himself. When he has a small pimple, his first reaction is to assume that it is cancerous, that the cancer is of the lethal type, and that it has already spread. His hypochondria is not just in the medical department: he incurs a small setback in business and reacts as if bankruptcy were both near and certain. In the office, he is tuned to every single possible detail, systematically transforming every molehill into a mountain. The last thing you want in life is to be in the same car with him when stuck in traffic on your way to an important appointment. The expression “overreact” was designed with him in mind: he does not have reactions, just overreactions.

Compare him to someone with the opposite temperament, imperturbable, with the calm under fire that is considered necessary to become a leader, military commander, or mafia godfather. Usually unruffled and immune to small information—they can impress you with their self-control in difficult circumstances. For a sample of a composed, calm, and pondered voice, listen to an interview of “Sammy the Bull” Salvatore Gravano, who was involved in the murder of nineteen people (all competing mobsters). He speaks with minimal effort. In the rare situations when he is angry, unlike the neurotic fellow, everyone knows it and takes it seriously.

The supply of information to which we are exposed under modernity is transforming humans from the equable second fellow to the neurotic first. For the purpose of our discussion, the second fellow only reacts to real information, the first largely to noise. The difference between the two fellows will show us the difference between noise and signal. Noise is what you are supposed to ignore; signal what you need to heed.

Indeed, we have been loosely mentioning “noise” earlier in the book; time to be precise about it. In science, noise is a generalization beyond the actual sound to describe random information that is totally useless for any purpose, and that you need to clean up to make sense of what you are listening to. Consider, for example, elements in an encrypted message that have absolutely no meaning, just randomized letters to confuse the spies, or the hiss you hear on a telephone line that you try to ignore in order to focus on the voice of your interlocutor.

Noise and Signal

If you want to accelerate someone’s death, give him a personal doctor.

One can see from the tonsillectomy story that access to data increases intervention—as with neuroticism. Rory Sutherland signaled to me that those with a personal doctor on staff should be particularly vulnerable to naive interventionism, hence iatrogenics; doctors need to justify their salaries and prove to themselves that they have some work ethic, something “doing nothing” doesn’t satisfy (Editor's note: the same forces apply to leaders, managers, etc.). Indeed, at the time of writing, the personal doctor of the late singer Michael Jackson is being sued for something that is equivalent to overintervention-to-stifle-antifragility (but it will take the law courts a while before they become familiar with the concept). Conceivably, the same happened to Elvis Presley. So with overmedicated politicians and heads of state.

Likewise those in corporations or in policymaking (like Fragilista Greenspan) endowed with a sophisticated statistics department and therefore getting a lot of “timely” data are capable of overreacting and mistaking noise for information —Greenspan kept an eye on such fluctuations as the sales of vacuum cleaners in Cleveland “to get a precise idea about where the economy is going”, and, of course micromanaged us into chaos.

In business and economic decision-making, data causes severe side effects —data is now plentiful thanks to connectivity; and the share of spuriousness in the data increases as one gets more immersed into it. A not well discussed property of data: it is toxic in large quantities —even in moderate quantities.

The previous two chapters showed how you can use and take advantage of noise and randomness; but noise and randomness can also use and take advantage of you, particularly when totally unnatural —the data you get on the web or thanks to the media.

The more frequently you look at data, the more noise you are disproportionately likely to get (rather than the valuable part, called the signal); hence the higher the noise-to-signal ratio. And there is a confusion that is not psychological at all, but inherent in the data itself. Say you look at information on a yearly basis, for stock prices or the fertilizer sales of your father-in-law’s factory, or inflation numbers in Vladivostok. Assume further that for what you are observing, at the yearly frequency the ratio of signal to noise is about one to one (say half noise, half signal)—it means that about half of changes are real improvements or degradations, the other half comes from randomness. This ratio is what you get from yearly observations. But if you look at the very same data on a daily basis, the composition would change to 95% noise, 5% signal. And if you observe data on an hourly basis, as people immersed in the news and market price variations do, the split becomes 99.5% noise to 0.5% signal. That is two hundred times more noise than signal—which is why anyone who listens to news (except when very, very significant events take place) is one step below sucker.

There is a biological story with information. I have been repeating that in a natural environment, a stressor is information. So too much information would be too much stress, exceeding the threshold of antifragility. In medicine, we are discovering the healing powers of fasting, as the avoidance of too many hormonal rushes that come with the ingestion of food. Hormones convey information to the different parts of our system, and too much of it confuses our biology. Here again, as with the story of the news received at too high a frequency, too much information becomes harmful. And in Chapter x (on ethics) I will show how too much data (particularly when sterile) causes statistics to be completely meaningless.

Now let’s add the psychological to this: we are not made to understand the point, so we overreact emotionally to noise. The best solution is to only look at very large changes in data or conditions, never small ones.

Just as we are not likely to mistake a bear for a stone (but likely to mistake a stone for a bear), it is almost impossible for someone rational with a clear, uninfected mind, one who is not drowning in data, to mistake a vital signal, one that matters for his survival, for noise. Significant signals have a way to reach you. In the tonsillectomies, the best filter would have been to only consider the children who are very ill, those with periodically recurring throat inflammation.

There was even more noise coming from the media and its glorification of the anecdote. Thanks to it, we are living more and more in virtual reality, separated from the real world, a little bit more every day, while realizing it less and less. Consider that every day, 6,200 persons die in the United States, many of preventable causes. But the media only reports the most anecdotal and sensational cases (hurricanes, freak incidents, small plane crashes) giving us a more and more distorted map of real risks. In an ancestral environment, the anecdote, the “interesting” is information; no longer today. Likewise, by presenting us with explanations and theories the media induces an illusion of understanding the world.

And the understanding of events (and risks) on the part of members of the press is so retrospective that they would put the security checks after the plane ride, or what the ancients call post bellum auxilium, send troops after the battle. Owing to domain dependence, we forget the need to check our map of the world against reality. So we are living in a more and more fragile world, while thinking it is more and more understandable.

To conclude, the best way to mitigate interventionism is to ration the supply of information, as naturalistically as possible. This is hard to accept in the age of the internet. It has been very hard for me to explain that the more data you get, the less you know what’s going on, and the more iatrogenics you will cause.
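Taleb's frequency arithmetic can be sketched numerically. The model below is my assumption, not spelled out in the excerpt beyond the yearly 1:1 baseline: over an observation window, real change (signal) accumulates linearly with the window length, while random fluctuation (noise) grows only with its square root, so shrinking the window shrinks the signal-to-noise ratio.

```python
import math

# Calibration: drift and noise scale chosen so that a one-year window
# gives signal/noise == 1 (Taleb's "half noise, half signal" baseline).
m = 1.0  # drift per year: the "real" change (signal)
s = 1.0  # noise scale per sqrt(year)

def signal_to_noise(window_years):
    """Ratio of accumulated signal to typical noise over one window."""
    signal = m * window_years            # grows linearly with time
    noise = s * math.sqrt(window_years)  # grows with the square root of time
    return signal / noise

for label, window in [("yearly", 1.0),
                      ("daily", 1.0 / 365),
                      ("hourly", 1.0 / (365 * 24))]:
    ratio = signal_to_noise(window)
    noise_share = 1.0 / (1.0 + ratio)  # fraction of each observed move that is noise
    print(f"{label:>6} window: noise share ~ {noise_share:.1%}")
```

Under this calibration the yearly window is half noise, the daily window about 95% noise, and the hourly window about 99% noise, which reproduces the shape of Taleb's numbers (his 99.5% hourly figure implies a slightly different calibration).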


The noise bottleneck is really a paradox. We think that the more information we consume, the more signal we'll receive. But the mind doesn't work like that. When the volume of information increases, our ability to distinguish the relevant from the irrelevant is compromised. We place too much emphasis on irrelevant data and lose sight of what's really important.

Still curious? Read The Pot Belly of Ignorance.

Why First Impressions Don’t Matter Much For Experiences

A recent article in the WSJ, “Hidden Ways Hotels Court Guests Faster”, focused on how hotels are trying to dazzle guests with first impressions.

Jeremy McCarthy, a hotel executive, argues this is why “upon arriving to a luxury hotel, you are often greeted in the lobby by a friendly face, an offer to assist with your luggage, and sometimes a welcome beverage or a refreshing chilled towel to help wipe away the stress of travel.”

Research, however, seems to show that, while we remember people by first impressions, we don't really remember experiences the same way. With experiences, we seem to remember the peak moments and how they end. McCarthy writes:

An example of the research that supports this “peak-end” theory is the work on colonoscopy patients done by psychologist Daniel Kahneman. Kahneman found that after a painful colonoscopy treatment, patients would forget about the overall duration of the pain they experienced and would instead remember their experience based on the peak moments of pain and on how it ended.

A patient whose colonoscopy lasted an agonizing 25 minutes, for example (Patient B), would rate the experience better and would happily come back a year later for his follow-up appointment, as long as the treatment ended with less pain. Another patient (Patient A), who only had around 8 minutes of total pain, wouldn’t come back next year because he remembers the pain of how the experience ended.
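The two-patient comparison can be sketched with toy numbers (the per-minute pain ratings below are invented for illustration, not Kahneman's actual data): under the peak-end rule, remembered pain tracks the average of the worst moment and the final moment, not the total amount of pain endured.

```python
# Illustrative pain ratings (0-10 scale), one per minute of the procedure.
patient_a = [2, 4, 6, 8, 8, 7, 7, 8]               # ~8 minutes, ends on high pain
patient_b = [2, 4, 6, 8, 8, 7, 5, 4, 3, 2, 1, 1]   # ~12 minutes, tapers off

def peak_end(ratings):
    """Remembered pain under the peak-end rule: mean of peak and final rating."""
    return (max(ratings) + ratings[-1]) / 2

def total_pain(ratings):
    """Duration-sensitive total that memory largely ignores."""
    return sum(ratings)

for name, r in [("Patient A", patient_a), ("Patient B", patient_b)]:
    print(f"{name}: total = {total_pain(r)}, remembered (peak-end) = {peak_end(r)}")
```

In this toy data, Patient B endures slightly more total pain (51 vs. 50) yet, because the procedure tapers off, comes away with a far milder memory (peak-end score 4.5 vs. 8.0), matching the pattern described in the quoted study.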

The implications of this are pretty clear. If you run a hotel, for example, you want to focus more on the departure than the arrival.

I'm left with more questions about this research than answers, so if you know of any good books/blogs/articles on this please pass them along.