Future Babble: Why expert predictions fail and why we believe them anyway
Future Babble has come out to mixed reviews. I think the book would interest anyone seeking wisdom.
Here are some of my notes:
First, a little background: predictions fail because the world is too complicated to be predicted with accuracy, and we’re wired to avoid uncertainty. However, we shouldn’t blindly believe experts. The world is divided into two: foxes and hedgehogs. The fox knows many things, whereas the hedgehog knows one big thing. Foxes beat hedgehogs when it comes to making predictions.
What we should ask is: in a non-linear world, why would we think oil prices can be predicted? Practically since the dawn of the oil industry in the nineteenth century, experts have been forecasting the price of oil. They’ve been wrong ever since. And yet this dismal record hasn’t caused us to give up on the enterprise of forecasting oil prices.
One of psychology’s fundamental insights, wrote psychologist Daniel Gilbert, is that judgements are generally the products of non-conscious systems that operate quickly, on the basis of scant evidence, and in a routine manner, and then pass their hurried approximations to consciousness, which slowly and deliberately adjusts them. … (One consequence of this is that) appearance equals reality. In the ancient environment in which our brains evolved, that was a good rule, which is why it became hard-wired into the brain and remains there to this day. (As an example of this,) psychologists have shown that people often stereotype “baby-faced” adults as innocent, helpless, and needy.
We have a hard time with randomness. If we try, we can understand it intellectually, but as countless experiments have shown, we don’t get it intuitively. This is why someone who plunks one coin after another into a slot machine without winning will have a strong and growing sense—the gambler’s fallacy—that a jackpot is “due,” even though every result is random and therefore unconnected to what came before. … and people believe that a sequence of random coin tosses that goes “THTHHT” is far more likely than the sequence “THTHTH” even though they are equally likely.
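The equal-likelihood point is just arithmetic: every specific sequence of six fair tosses has probability (1/2)^6 = 1/64, however patterned or “random” it looks. A quick sketch (my illustration, not from the book) makes this concrete by enumerating every possible sequence:

```python
from itertools import product

# Enumerate all possible sequences of 6 fair coin tosses.
sequences = ["".join(s) for s in product("HT", repeat=6)]

# Each sequence is equally likely: probability 1/64 apiece.
p = 1 / len(sequences)

# Both the "random-looking" and the "patterned" sequence are in the list,
# and each has exactly the same probability.
assert "THTHHT" in sequences and "THTHTH" in sequences
print(p)  # 0.015625, i.e. 1/64 for either sequence
```

The gambler’s fallacy is the intuition that the patterned sequence is rarer; the enumeration shows the intuition has no basis, since every specific sequence occupies exactly one slot out of 64.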
People are particularly disinclined to see randomness as the explanation for an outcome when their own actions are involved. Gamblers rolling dice tend to concentrate and throw harder for higher numbers, softer for lower. Psychologists call this the “illusion of control.” … they also found the illusion is stronger when it involves prediction. In a sense, the “illusion of control” should be renamed the “illusion of prediction.”
Overconfidence is a universal human trait closely related to an equally widespread phenomenon known as “optimism bias.” Ask smokers about the risk of getting lung cancer from smoking and they’ll say it’s high. But their risk? Not so high. … The evolutionary advantage of this bias is obvious: It encourages people to take action and makes them more resilient in the face of setbacks.
… How could so many experts have been so wrong? … A crucial component of the answer lies in psychology. For all the statistics and reasoning involved, the experts derived their judgements, to one degree or another, from what they felt to be true. And in doing so they were fooled by a common bias. … This tendency to take current trends and project them into the future is the starting point of most attempts to predict. Very often, it’s also the end point. That’s not necessarily a bad thing. After all, tomorrow typically is like today. Current trends do tend to continue. But not always. Change happens. And the further we look into the future, the more opportunity there is for current trends to be modified, bent, or reversed. Predicting the future by projecting the present is like driving with no hands. It works while you are on a long stretch of straight road, but even a gentle curve is trouble, and a sharp turn always ends in a flaming wreck.
When people attempt to judge how common something is—or how likely it is to happen in the future—they try to think of an example of that thing. If an example is recalled easily, it must be common. If it’s harder to recall, it must be less common. … Again, this is not a conscious calculation. The “availability heuristic” is a tool of the unconscious mind.
“deviating too far from consensus leaves one feeling potentially ostracized from the group, with the risk that one may be terminated.” (Robert Shiller) … It’s tempting to think that only ordinary people are vulnerable to conformity, that esteemed experts could not be so easily swayed. Tempting, but wrong. As Shiller demonstrated, “groupthink” is very much a disease that can strike experts. In fact, psychologist Irving Janis coined the term “groupthink” to describe expert behavior. In his 1972 classic, Victims of Groupthink, Janis investigated four high-level disasters: the defence of Pearl Harbour, the Bay of Pigs invasion, and the escalation of the wars in Korea and Vietnam. He demonstrated that conformity among highly educated, skilled, and successful people working in their fields of expertise was a root cause in each case.
(On corporate use of scenario planning) … Scenarios are not predictions, emphasizes Peter Schwartz, the guru of scenario planning. “They are tools for testing choices.” The idea is to have a clever person dream up a number of very different futures, usually three or four. … Managers then consider the implications of each, forcing them out of the rut of the status quo and into thinking about what they would do if confronted with real change. The ultimate goal is to make decisions that would stand up well in a wide variety of contexts. No one denies there may be some value in such exercises. But how much value? The consultants who offer scenario-planning services are understandably bullish, but ask them for evidence and they typically point to examples of scenarios that accurately foreshadowed the future. That is silly, frankly. For one thing, it contradicts their claim that scenarios are not predictions. For another, all the misses would have to be considered, and the misses vastly outnumber the hits. … Consultants also cite the enormous popularity of scenario planning as proof of its enormous value… Lack of evidence aside, there are more disturbing reasons to be wary of scenarios. Remember that what drives the availability heuristic is not how many examples the mind can recall but how easily they are recalled. … And what are scenarios? Vivid, colourful, dramatic stories. Nothing could be easier to remember or recall. And so being exposed to a dramatic scenario about (whatever)… will make the depicted events feel much more likely to happen.
(On not having control) At its core, torture is a process of psychological destruction, and that process almost always begins with the torturer explicitly telling the victim he is powerless. “I decide when you can eat and sleep. I decide when you suffer, how you suffer, if it will end. I decide if you live or die.” … Knowing what will happen in the future is a form of control, even if we cannot change what will happen. … Uncertainty is potent… people who experienced the mild-but-unpredictable shocks felt much more fear than those who got the strong-but-predictable shocks.
Our profound aversion to uncertainty helps explain what would otherwise be a riddle: why do people pay so much attention to dark and scary predictions? Why do gloomy forecasts so often outnumber optimistic predictions, take up more media space, and sell more books? Part of this predilection for gloom is simply an outgrowth of what is sometimes called negativity bias: our attention is drawn more swiftly by bad news or images, and we are more likely to remember them than cheery information. … People whose brains gave priority to bad news were much less likely to be eaten by lions or die some other unpleasant death. … (Negative) predictions are supported by our intuitive pessimism, so they feel right to us. And that conclusion is bolstered by our attraction to certainty. As strange as it sounds, we want to believe the expert predicting a dark future: being certain of it is less tormenting than suspecting it. Certainty is always preferable to uncertainty, even when what’s certain is disaster.
Researchers have also shown that financial advisors who express considerable confidence in their stock forecasts are more trusted than those who are less confident, even when their objective records are the same. … This “confidence heuristic,” like the availability heuristic, isn’t necessarily a conscious decision path. We may not actually say to ourselves, “She’s so sure of herself, she must be right”…
(on our love for stories) Confirmation bias also plays a critical role for the very simple reason that none of us is a blank slate. Every human brain is a vast warehouse of beliefs and assumptions about the world and how it works. Psychologists call these “schemas.” We love stories that fit our schemas; they’re the cognitive equivalent of beautiful music. But a story that doesn’t fit – a story that contradicts basic beliefs – is dissonant.
… What makes this mass delusion possible is the different emphasis we put on predictions that hit and those that miss. We ignore misses, even when they lie scattered by the dozen at our feet; we celebrate hits, even when we have to hunt for them and pretend there was more to them than luck.
By giving us the sense that we should have predicted what is now the present, or even that we actually did predict it when we did not, hindsight strongly suggests that we can predict the future. This is an illusion, and yet it seems only logical – which makes it a particularly persuasive illusion.