
The Noise Bottleneck: When More Information is Harmful


When consuming information, we strive for more signal and less noise. The problem is a cognitive illusion: we feel that the more information we consume, the more signal we receive.

While this is probably true on an absolute basis, Nassim Taleb argues in this excerpt from Antifragile that it is not true on a relative basis. He calls it the noise bottleneck.

Taleb argues that as you consume more data, the ratio of noise to signal increases: the less you know about what's going on, and the more inadvertent trouble you are likely to cause.


The Institutionalization Of Neuroticism

Imagine someone of the type we call neurotic in common parlance. He is wiry, looks contorted, and speaks with an uneven voice. His neck moves around when he tries to express himself. When he has a small pimple his first reaction is to assume that it is cancerous, that the cancer is of the lethal type, and that it has already spread. His hypochondria is not just in the medical department: he incurs a small setback in business and reacts as if bankruptcy were both near and certain. In the office, he is tuned to every single possible detail, systematically transforming every molehill into a mountain. The last thing you want in life is to be in the same car with him when stuck in traffic on your way to an important appointment. The expression overreact was designed with him in mind: he does not have reactions, just overreactions.

Compare him to someone with the opposite temperament, imperturbable, with the calm under fire that is considered necessary to become a leader, military commander, or mafia godfather. Usually unruffled and immune to small information —they can impress you with their self-control in difficult circumstances. For a sample of a composed, calm, and pondered voice, listen to an interview with “Sammy the Bull” Salvatore Gravano, who was involved in the murder of nineteen people (all competing mobsters). He speaks with minimal effort. In the rare situations when he is angry, unlike with the neurotic fellow, everyone knows it and takes it seriously.

The supply of information to which we are exposed under modernity is transforming humans from the equable second fellow to the neurotic first. For the purpose of our discussion, the second fellow only reacts to real information, the first largely to noise. The difference between the two fellows will show us the difference between noise and signal. Noise is what you are supposed to ignore; signal what you need to heed.

Indeed, we have been loosely mentioning “noise” earlier in the book; time to be precise about it. In science, noise is a generalization beyond the actual sound to describe random information that is totally useless for any purpose, and that you need to clean up to make sense of what you are listening to. Consider, for example, elements in an encrypted message that have absolutely no meaning, just randomized letters to confuse the spies, or the hiss you hear on a telephone line and that you try to ignore in order to just focus on the voice of your interlocutor.

Noise and Signal

If you want to accelerate someone’s death, give him a personal doctor.

One can see from the tonsillectomy story that access to data increases intervention —as with neuroticism. Rory Sutherland signaled to me that those with a personal doctor on staff should be particularly vulnerable to naive interventionism, hence iatrogenics; doctors need to justify their salaries and prove to themselves that they have some work ethic, something “doing nothing” doesn't satisfy (Editor's note: the same forces apply to leaders, managers, etc.). Indeed, at the time of writing, the personal doctor of the late singer Michael Jackson is being sued for something that is equivalent to overintervention-to-stifle-antifragility (but it will take the law courts a while before they become familiar with the concept). Conceivably, the same happened to Elvis Presley. So with overmedicated politicians and heads of state.

Likewise those in corporations or in policymaking (like Fragilista Greenspan) endowed with a sophisticated statistics department and therefore getting a lot of “timely” data are capable of overreacting and mistaking noise for information —Greenspan kept an eye on such fluctuations as the sales of vacuum cleaners in Cleveland “to get a precise idea about where the economy is going,” and, of course, micromanaged us into chaos.

In business and economic decision-making, data causes severe side effects —data is now plentiful thanks to connectivity, and the share of spuriousness in the data increases as one gets more immersed in it. A little-discussed property of data: it is toxic in large quantities —even in moderate quantities.

The previous two chapters showed how you can use and take advantage of noise and randomness; but noise and randomness can also use and take advantage of you, particularly when totally unnatural —the data you get on the web or thanks to the media.

The more frequently you look at data, the more noise you are disproportionately likely to get (rather than the valuable part, called the signal); hence the higher the noise-to-signal ratio. And there is a confusion that is not psychological at all, but inherent in the data itself. Say you look at information on a yearly basis, for stock prices or the fertilizer sales of your father-in-law's factory, or inflation numbers in Vladivostok. Assume further that for what you are observing, at the yearly frequency the ratio of signal to noise is about one to one (say half noise, half signal) —it means that about half of the changes are real improvements or degradations, and the other half come from randomness. This ratio is what you get from yearly observations. But if you look at the very same data on a daily basis, the composition would change to 95% noise, 5% signal. And if you observe data on an hourly basis, as people immersed in the news and market price variations do, the split becomes 99.5% noise to 0.5% signal. That is two hundred times more noise than signal —which is why anyone who listens to the news (except when very, very significant events take place) is one step below sucker.
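Taleb's yearly/daily/hourly split isn't arbitrary: it falls out of a standard drift-plus-noise model. The sketch below (my construction, not from the excerpt; the parameter names `mu` and `sigma` are assumptions) treats the signal over an interval as scaling linearly with time while the noise's standard deviation scales with the square root of time, so sampling more frequently shrinks the signal share roughly like the square root of the interval.

```python
import math

def signal_share(dt_years, mu=1.0, sigma=1.0):
    """Fraction of an observed change attributable to signal, under a
    simple drift-plus-noise model: signal grows like mu * dt, while the
    noise's standard deviation grows like sigma * sqrt(dt).
    With mu == sigma, the split is 50:50 at dt = 1 year."""
    signal = mu * dt_years
    noise = sigma * math.sqrt(dt_years)
    return signal / (signal + noise)

# Sampling the same process yearly, daily, and hourly: the signal share
# collapses from about half to a few percent, close to Taleb's figures.
for label, dt in [("yearly", 1.0), ("daily", 1 / 365), ("hourly", 1 / 8760)]:
    print(f"{label:>7}: {signal_share(dt):6.1%} signal")
```

Under these toy parameters the daily share comes out near 5% and the hourly share near 1%, in the same ballpark as the 95/5 and 99.5/0.5 splits quoted above; the exact numbers depend on the assumed drift and volatility.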

There is a biological story with information. I have been repeating that in a natural environment, a stressor is information. So too much information would be too much stress, exceeding the threshold of antifragility. In medicine, we are discovering the healing powers of fasting, as the avoidance of the hormonal rushes that come with the ingestion of food. Hormones convey information to the different parts of our system, and too much of them confuses our biology. Here again, as with the story of the news received at too high a frequency, too much information becomes harmful. And in Chapter x (on ethics) I will show how too much data (particularly when sterile) causes statistics to be completely meaningless.

Now let’s add the psychological to this: we are not made to understand the point, so we overreact emotionally to noise. The best solution is to only look at very large changes in data or conditions, never small ones.

Just as we are not likely to mistake a bear for a stone (but likely to mistake a stone for a bear), it is almost impossible for someone rational, with a clear, uninfected mind, one who is not drowning in data, to mistake a vital signal, one that matters for his survival, for noise. Significant signals have a way of reaching you. In the tonsillectomy story, the best filter would have been to consider only the children who are very ill, those with periodically recurring throat inflammation.

There was even more noise coming from the media and its glorification of the anecdote. Thanks to it, we are living more and more in virtual reality, separated from the real world, a little bit more every day, while realizing it less and less. Consider that every day, 6,200 persons die in the United States, many of preventable causes. But the media only reports the most anecdotal and sensational cases (hurricanes, freak incidents, small plane crashes) giving us a more and more distorted map of real risks. In an ancestral environment, the anecdote, the “interesting” is information; no longer today. Likewise, by presenting us with explanations and theories the media induces an illusion of understanding the world.

And the understanding of events (and risks) on the part of members of the press is so retrospective that they would put the security checks after the plane ride, or what the ancients call post bellum auxilium, send troops after the battle. Owing to domain dependence, we forget the need to check our map of the world against reality. So we are living in a more and more fragile world, while thinking it is more and more understandable.

To conclude, the best way to mitigate interventionism is to ration the supply of information, as naturalistically as possible. This is hard to accept in the age of the internet. It has been very hard for me to explain that the more data you get, the less you know what’s going on, and the more iatrogenics you will cause.


The noise bottleneck is really a paradox. We think the more information we consume, the more signal we'll receive. Only the mind doesn't work like that. When the volume of information increases, our ability to distinguish the relevant from the irrelevant becomes compromised. We place too much emphasis on irrelevant data and lose sight of what's really important.

Still Curious? Read The Pot Belly of Ignorance.


Nassim Taleb: People Kept Telling Me I Was an Idiot

Nassim Taleb at UPenn talking about antifragility:

There’s something called action bias. People think that doing something is necessary. Like in medicine and a lot of places. Like every time I have an MBA—except those from Wharton, because they know what’s going on!—they tell me, “Give me something actionable.” And when I was telling them, “Don’t sell out-of-the-money options,” when I give them negative advice, they don’t think it’s actionable. So they say, “Tell me what to do.” All these guys are bust. They don’t understand: you live long by not dying, you win in chess by not losing—by letting the other person lose. So negative investment is not a sissy strategy. It is an active one.

“The average doesn't matter when you're fragile.”


Related: Intervention Bias

Still curious? Nassim Taleb is the author of The Black Swan, Fooled By Randomness, and most recently, The Bed of Procrustes.

Suppressing Volatility Makes the World Less Predictable and More Dangerous

I recommend reading Nassim Taleb's recent article (PDF) in Foreign Affairs. It's the ultimate example of iatrogenics by the fragilista.

If you don't have time here are my notes:

  • Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks.
  • Seeking to restrict variability seems to be good policy (who does not prefer stability to chaos?), so it is with very good intentions that policymakers unwittingly increase the risk of major blowups.
  • Because policymakers believed it was better to do something than to do nothing, they felt obligated to heal the economy rather than wait and see if it healed on its own.
  • Those who seek to prevent volatility on the grounds that any and all bumps in the road must be avoided paradoxically increase the probability that a tail risk will cause a major explosion. Consider as a thought experiment a man placed in an artificially sterilized environment for a decade and then invited to take a ride on a crowded subway; he would be expected to die quickly.
  • But although these controls might work in some rare situations, the long-term effect of any such system is an eventual and extremely costly blowup whose cleanup costs can far exceed the benefits accrued.
  • … Government interventions are laden with unintended—and unforeseen—consequences, particularly in complex systems, so humans must work with nature by tolerating systems that absorb human imperfections rather than seek to change them.
  • Although it is morally satisfying, the film (Inside Job) naively overlooks the fact that humans have always been dishonest and regulators have always been behind the curve.
  • Humans must try to resist the illusion of control: just as foreign policy should be intelligence-proof (it should minimize its reliance on the competence of information-gathering organizations and the predictions of “experts” in what are inherently unpredictable domains), the economy should be regulator-proof, given that some regulations simply make the system itself more fragile.
  • The “turkey problem” occurs when a naive analysis of stability is derived from the absence of past variations.
  • Imagine someone who keeps adding sand to a sand pile without any visible consequence, until suddenly the entire pile crumbles. It would be foolish to blame the collapse on the last grain of sand rather than the structure of the pile, but that is what people do consistently, and that is the policy error.
  • As with a crumbling sand pile, it would be foolish to attribute the collapse of a fragile bridge to the last truck that crossed it, and even more foolish to try to predict in advance which truck might bring it down.
  • Obama's mistake illustrates the illusion of local causal chains—that is, confusing catalysts for causes and assuming that one can know which catalyst will produce which effect.
  • Governments are wasting billions of dollars on attempting to predict events that are produced by interdependent systems and are therefore not statistically understandable at the individual level.
  • Most explanations being offered for the current turmoil in the Middle East follow the “catalysts as causes” confusion. The riots in Tunisia and Egypt were initially attributed to rising commodity prices, not to stifling and unpopular dictatorships.
  • Again, the focus is wrong even if the logic is comforting. It is the system and its fragility, not events, that must be studied—what physicists call “percolation theory,” in which the properties of the terrain are studied rather than those of a single element of the terrain.
  • Humans fear randomness—a healthy ancestral trait inherited from a different environment. Whereas in the past, which was a more linear world, this trait enhanced fitness and increased chances of survival, it can have the reverse effect in today's complex world, making volatility take the shape of nasty Black Swans hiding behind deceptive periods of “great moderation.”
  • But alongside the “catalysts as causes” confusion sit two mental biases: the illusion of control and the action bias (the illusion that doing something is always better than doing nothing). This leads to the desire to impose man-made solutions. Greenspan's actions were harmful, but it would have been hard to justify inaction in a democracy where the incentive is to always promise a better outcome than the other guy, regardless of the actual delayed cost.
  • As Seneca wrote in De clementia, “repeated punishment, while it crushes the hatred of a few, stirs the hatred of all … just as trees that have been trimmed throw out again countless branches.”
  • The Romans were wise enough to know that only a free man under Roman law could be trusted to engage in a contract; by extension, only a free people can be trusted to abide by a treaty.
  • As Jean-Jacques Rousseau put it, “A little bit of agitation gives motivation to the soul, and what really makes the species prosper is not peace so much as freedom.” With freedom comes some unpredictable fluctuation. This is one of life's packages: there is no freedom without noise—and no stability without volatility.


Still curious? Nassim Taleb's newest book is Antifragile: Things That Gain from Disorder. He is also the author of The Black Swan, Fooled By Randomness, and The Bed of Procrustes.

Problem Solving Tools


Do you know of any good problem solving tools? Well, I didn't. My approach seemed to consist mostly of dumb luck.

That works most of the time, but feels inadequate for someone looking to improve their ability to make good decisions.

So I did what any person preferring reading to reality TV does and purchased a lot of books on problem solving. I convinced myself that investing a little time to find some problem-solving tools that marginally improved my ability to solve problems effectively would pay off handsomely over a long lifetime.

The book with the most problem-solving tools is one that I didn't think I'd enjoy at all: Problem Solving 101 by Ken Watanabe.

This book offered a simple way to deal with problems that I can still recall today: (1) understand the current situation; (2) identify the root cause of the problem; (3) develop an effective action plan; and (4) execute until the problem is solved. While simple—and remarkably effective—the process is not easy to execute.

If you've ever found yourself in the middle of a problem-solving meeting, you know that our bias towards action causes us to want to skip steps 1 and 2. We're prone to action. We want to shoot first and ask questions later.

This bias makes the simple four-step approach above almost painful. If we think we understand the problem, our minds naturally see steps 1 and 2 as a waste. The next time you find yourself in an unfortunate problem-solving meeting, ask yourself a few questions: Are we addressing a problem or a symptom? If you're addressing a problem, does everyone in the room agree on the problem? How will we know we've solved the problem?

Think about how doctors diagnose patients.

When you visit a doctor, they first ask you questions about your symptoms and then take your temperature. They might run a blood test or two, or maybe order an X-ray. They collect information that can be used to identify the root cause of your illness. After they've determined, and hopefully confirmed, a diagnosis, they decide what to prescribe. While the process isn't the most efficient, it leads to good outcomes more often than not.

If you want to learn to solve problems better, you should buy Problem Solving 101. If you're really motivated, cut your cable subscription and read Judgment and Managerial Decision Making too. Exercising your brain is time well spent.

Still Curious? Check out these books on decision making.

Predicting the Improbable

One natural human bias is that we tend to draw strong conclusions based on few observations. This bias, misconceptions of chance, shows itself in many ways, including the gambler's and hot hand fallacies. Such biases may induce public opinion and the media to call for dramatic swings in policies or regulation in response to highly improbable events. These biases are made even worse by our natural tendency to “do something.”


An event like an earthquake happens, making it more available in our mind.

We think the event is more probable than the evidence would support, so we run out and buy earthquake insurance. Over many years, as the earthquake fades from our mind (making it less available), we believe, paradoxically, that the risk is lower (based on recent evidence), so we cancel our policy. …

Some events are hard to predict. Prediction becomes even more complicated when you consider not only whether an event will happen but also when. The article below points out that experts base their predictions on inference from observing the past and are just as prone to these biases as the rest of us.

Why do people over infer from recent events?

There are two plausible but apparently contradictory intuitions about how people over-infer from observing recent events.

The gambler's fallacy claims that people expect rapid reversion to the mean.

For example, upon observing three outcomes of red in roulette, gamblers tend to think that black is now due and tend to bet more on black (Croson and Sundali 2005).

The hot hand fallacy claims that upon observing an unusual streak of events, people tend to predict that the streak will continue. (See Misconceptions of Chance)

The hot hand fallacy term originates from basketball where players who scored several times in a row are believed to have a “hot hand”, i.e. are more likely to score at their next attempt.

Recent behavioural theory has proposed a foundation to reconcile the apparent contradiction between the two types of over-inference. The intuition behind the theory can be explained with reference to the example of roulette play.

A person believing in the law of small numbers thinks that small samples should look like the parent distribution, i.e. that the sample should be representative of the parent distribution. Thus, the person believes that out of, say, 6 spins, 3 should be red and 3 should be black (ignoring green). If observed outcomes in the small sample differ from the 50:50 ratio, immediate reversal is expected. Thus, somebody observing 2 reds in 2 consecutive spins believes that black is “due” on the 3rd spin to restore the 50:50 ratio.

Now suppose such a person is uncertain about the fairness of the roulette wheel. Upon observing an improbable event (6 reds in 6 spins, say), the person starts to doubt the fairness of the roulette wheel, because a long streak does not correspond to what he believes a random sequence should look like. The person then revises his model of the data-generating process and starts to believe the event on a streak is more likely. The upshot of the theory is that the same person may at first (when the streak is short) believe in reversion of the trend (the gambler's fallacy) and later – when the streak is long – in continuation of the trend (the hot hand fallacy).
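The theory can be made concrete with a toy model (my own construction, loosely following the law-of-small-numbers framework described above; the prior of 95%, the urn of 3 reds and 3 blacks, and the 0.8 bias are all illustrative assumptions, not figures from the article). The observer wrongly models a fair wheel as draws without replacement from a small urn, so short streaks make her expect reversal; a long streak is impossible under the urn model, so she switches to believing the wheel is biased and expects continuation.

```python
def predicted_prob_red(streak):
    """Observer's probability that the NEXT spin is red after `streak` reds,
    for an observer who mixes two hypotheses by Bayes' rule:
      - "fair":   spins mismodeled as draws without replacement from an urn
                  of 3 reds and 3 blacks (small samples "should" balance out);
      - "biased": red comes up with probability 0.8 on every spin, iid."""
    prior_fair = 0.95
    # Likelihood of observing `streak` reds under each hypothesis.
    like_fair = 1.0
    for i in range(streak):
        like_fair *= max(3 - i, 0) / (6 - i)   # reds left / balls left
    like_biased = 0.8 ** streak
    post_fair = prior_fair * like_fair / (
        prior_fair * like_fair + (1 - prior_fair) * like_biased
    )
    # Predicted chance of red under each hypothesis, mixed by posterior.
    p_red_fair = max(3 - streak, 0) / (6 - streak) if streak < 6 else 0.0
    return post_fair * p_red_fair + (1 - post_fair) * 0.8

# Short streaks: the urn model dominates, so she expects reversal
# (gambler's fallacy). Long streaks: the urn model is ruled out, so she
# expects continuation (hot hand fallacy).
for k in range(1, 6):
    print(f"after {k} reds: P(next red) = {predicted_prob_red(k):.3f}")
```

With these numbers the predicted probability of red sits below 0.5 for streaks of one to three, then jumps to 0.8 once the streak is too long to have come from the imagined urn, reproducing the reversal-then-continuation pattern the theory describes.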

Continue Reading

Information Without Context

Information without context is falsely empowering and incredibly dangerous.

As an adult, have you ever picked up a child's shape-sorter and tried to put the square item through the round hole? Of course not. Adults know better — or at least we're supposed to. Yet we often take square solutions and cram them into round problems.

Consider, for example, a project that falls behind schedule. A project manager is apt to adopt whatever solution worked the last time a project was falling behind schedule. If more people were added last time and that produced a successful outcome why not do it again? Our tendency to stick with what has worked in the past, regardless of why it worked, creates a powerful illusion that we are solving the problem or doing the right thing.

When posed a difficult question by an informed reporter, politicians often answer something related but simpler. The politician treats what should be a complex topic as something black and white and portrays the topic as simpler than it really is (reductive bias). In the corporate world we do the same thing when we take something that worked previously (or somewhere else) and blindly apply it to the next problem without giving due consideration to why it worked.

Maybe we're just becoming an intellectually lazy society, constantly looking for the next soundbite from “experts” on how to do something better. We like the easy solution.

In Think Twice, Michael Mauboussin writes: “Consultants, researchers, and practitioners often observe some success, seek common attributes among them and proclaim that those attributes can lead others to succeed. This simply does not work.”

Our brains may be adult, yet we demonstrate a very childlike level of consideration. Decision makers often fail to ask key questions, such as: What's different about this project? Under what circumstances is adding more people likely to work? Am I doing this because someone else is doing it?

Adopting best practices has become the reason to do something in and of itself. It is, after all, hard to challenge the logic of best practices. But what do best practices mean? Whom are they best for? What makes them successful? Can we replicate them in our company? Our culture? Our circumstances? Do we have the necessary skills? What are the side effects? What are the incentives? … More often than not, we embrace a solution without understanding under which conditions it succeeds or fails.

I think there are some parallels between business decision-making and medicine. In medicine, our understanding of the particulars can never be complete: misdiagnosing a patient is common, so doctors look at each patient as a new mystery.

A doctor applying the same thoughtlessness spewed by management consultants might determine that all people with a fever have a cold. However, we know people are more complex than this simple correlation. Medical practitioners know the difference between correlation and cause. A fever by itself tells the doctor something, but not everything. It could indicate a cold, or it could be something more serious. Doctors, like good decision makers, check the context and seek out information that might disprove their diagnosis.