Tag: Antifragile

The Green Lumber Fallacy: The Difference between Talking and Doing

“Clearly, it is unrigorous to equate skills at doing with skills at talking.”
— Nassim Taleb


Before we get to the meat, let's review an elementary idea in biology that will be relevant to our discussion.

If you're familiar with evolutionary theory, you know that populations of organisms are constantly subjected to “selection pressures” — the rigors of their environment which lead to certain traits being favored and passed down to their offspring and others being thrown into the evolutionary dustbin.

Biologists dub these advantages in reproduction “fitness” — as in, the famous lengthening of giraffe necks gave them greater “fitness” in their environment because it helped them reach high-up, untouched leaves.

Fitness is generally a relative concept: Since organisms must compete for scarce resources, fitness is measured in terms of the reproductive advantage one organism holds over another.

By the same token, a trait that provides great fitness in one environment may be useless or even disadvantageous in another. (Imagine draining a pond: Any fitness advantage held by a really incredible fish becomes instantly worthless without water.) Traits also relate to circumstance: An advantage at one time could be a disadvantage at another, and vice versa.

This makes fitness an all-important concept in biology: Traits are selected for if they provide fitness to the organism within a given environment.

Got it? OK, let's get back to the practical world.


The Black Swan thinker Nassim Taleb has an interesting take on fitness and selection in the real world:  People who are good “doers” and people who are good “talkers” are often selected for different traits. Be careful not to mix them up.

In his book Antifragile, Taleb uses this idea to explain a heuristic he once used when hiring traders on Wall Street:

The more interesting their conversation, the more cultured they are, the more they will be trapped into thinking that they are effective at what they are doing in real business (something psychologists call the halo effect, the mistake of thinking that skills in, say, skiing translate unfailingly into skills in managing a pottery workshop or a bank department, or that a good chess player would be a good strategist in real life).

Clearly, it is unrigorous to equate skills at doing with skills at talking. My experience of good practitioners is that they can be totally incomprehensible–they do not have to put much energy into turning their insights and internal coherence into elegant style and narratives. Entrepreneurs are selected to be doers, not thinkers, and doers do, they don't talk, and it would be unfair, wrong, and downright insulting to measure them in the talk department.

In other words, the selection pressures on an entrepreneur are very different from those on a corporate manager or bureaucrat: Entrepreneurs and risk takers succeed or fail not so much on their ability to talk, explain, and rationalize as on their ability to get things done.

While the two can often go together, Nassim figured out that they frequently don't. We judge people as ignorant when it's really us who are ignorant.

When you think about it, there's no a priori reason great intellectualizing and great doing must go together: Being able to hack together an incredible piece of code gives you great fitness in the world of software development, while doing great theoretical computer science probably gives you better fitness in academia. The two skills don't have to be connected. Great economists don't usually make great investors.

But we often confuse the two realms. We're tempted to think that a great investor must be fluent in behavioral economics, or a great CEO fluent in McKinsey-esque management narratives, but in the real world we see this intuition constantly violated.

The investor Walter Schloss worked 9-to-5, barely left his office, and was never considered an especially high-IQ man, yet he compiled one of the great investment records of all time. A young Mark Zuckerberg could hardly be described as a prototypical manager or businessperson, yet he built one of the most profitable companies in the world by finding others who complemented his weaknesses.

There are a thousand examples: Our narratives about the type of knowledge or experience we must have or the type of people we must be in order to become successful are often quite wrong; in fact, they border on naive. We think people who talk well can do well, and vice versa. This is simply not always so.

We won't claim that great doers cannot be great talkers, rationalizers, or intellectuals. Sometimes they are. But if you're seeking to understand the world properly, it's good to understand that the two traits are not always co-located. Success, especially in some “narrow” area like plumbing, programming, trading, or marketing, is often achieved by rather non-intellectual folks. Their evolutionary fitness doesn't come from the ability to talk, but to do. This is part of reality.


Taleb calls this idea the Green Lumber Fallacy, after a story in the book What I Learned Losing a Million Dollars. Taleb describes it in Antifragile:

In one of the rare noncharlatanic books in finance, descriptively called What I Learned Losing a Million Dollars, the protagonist makes a big discovery. He remarks that a fellow named Joe Siegel, one of the most successful traders in a commodity called “green lumber,” actually thought it was lumber painted green (rather than freshly cut lumber, called green because it had not been dried). And he made it his profession to trade the stuff! Meanwhile the narrator was into grand intellectual theories and narratives of what caused the price of commodities to move and went bust.

It is not just that the successful expert on lumber was ignorant of central matters like the designation “green.” He also knew things about lumber that nonexperts think are unimportant. People we call ignorant might not be ignorant.

The fact that predicting the order flow in lumber and the usual narrative had little to do with the details one would assume from the outside are important. People who do things in the field are not subjected to a set exam; they are selected in the most non-narrative manner — nice arguments don't make much difference. Evolution does not rely on narratives, humans do. Evolution does not need a word for the color blue.

So let us call the green lumber fallacy the situation in which one mistakes a source of visible knowledge — the greenness of lumber — for another, less visible from the outside, less tractable, less narratable.

The main takeaway is that the real causative factors of success are often hidden from us. We think that knowing the intricacies of green lumber is more important than keeping a close eye on the order flow. We seduce ourselves into overestimating the impact of our intellectualism and then wonder why “idiots” are getting ahead. (Probably hustle and competence.)

But for “skin in the game” operations, selection and evolution don't care about great talk and ideas unless they translate into results. They care what you do with the thing more than that you know the thing. They care about actually avoiding risk rather than your extensive knowledge of risk management theories. (Of course, in many areas of modernity there is no skin in the game, so talking and rationalizing can be and frequently are selected for.)

As Taleb did with his hiring heuristic, this should teach us to be a little skeptical of taking good talkers at face value, and to be a little skeptical when we see “unexplainable” success in someone we consider “not as smart.” There might be a disconnect we're not seeing because we're seduced by narrative. (A problem someone like Lee Kuan Yew avoided by focusing exclusively on what worked.)

And we don't have to give up our intellectual pursuits in order to appreciate this nugget of wisdom; Taleb is right, but it's also true that combining the rigorous, skeptical knowledge of “what actually works” with an ever-improving theory structure of the world might be the best combination of all — selected for in many more environments than simple git-er-done ability, which can be extremely domain and environment dependent. (The green lumber guy might not have been much good outside the trading room.)

After all, Taleb himself was both a successful trader and the highest level of intellectual. Even he can't resist a little theorizing.

Intervention Bias: When to Step In and When to Leave Things Alone

Nassim Taleb, author of many books, including The Black Swan, Fooled By Randomness, and The Bed of Procrustes, is in the process of writing a new book on fragility.

He's taken the sample chapter down from his website, but I thought his thoughts on our human bias to intervene were well worth pondering.

We humans have a natural, seemingly innate, bias to think that systems do not improve on their own, without our intervention or guidance — coupled with the Aristotelian notion that we ourselves know where things should be going.

The intervention bias is the direct result of our refusal to accept antifragility, manifested, as we saw, in the absence of a direct word for it. So let us examine a broader aspect of the idea with some psychological traits we humans have.

The absence of a direct word for antifragility from main human vocabularies is quite alarming. You can therefore see that the topic of this book is not just antifragility, but also the defects leading to its absence from human vocabulary.

There is a mental defect psychologists call the illusion of control that leads to a default to action rather than inaction, even when the benefits of inaction might be greater than those of action. Hence the intervention bias (doing something seems better than doing nothing, which is fine except that there are cases in which it gets us in trouble). The illusion of control was meant to show how “irrational” (according to some norm of behavior) we humans can be by giving ourselves the illusion of managing the uncontrollable around us: for instance, gamblers cannot resist the pressure to do something in order to improve the outcome, such as throwing the die with violence when they need a high number, or throwing it softly in order to get a low one. Traders cannot resist wearing the same “lucky” shirt (often unwashed) to improve their day, and feel they need to find similar ways to take control of their destiny.

This mental bias leads to all manner of patently “irrational” actions, such as belief in the paranormal and alternative medicine, often put under the umbrella of magical thinking. Now the irony is that while this bias was devised to expose patently nonscientific fields, it largely affects many things you learn in college, particularly in social science. Many matters we deem scientific are just the fruit of that very illusion of control masquerading as science, with, of course, actions to “improve” mankind.

Why is the scientific illusion of control worse than the pedestrian version? Because, tout simplement, these gamblers' superstitions are benign, not much worse than doing nothing; they may even be beneficial in hidden ways, and in the right environment. But a doctor tinkering with your system, or an army playing with a complex system with opaque causal links, say by invading Iraq, giving chemicals to kids and threatening their brain balance, or intervening in the environment, is far worse than nothing.

This variant of the illusion of control leads to the denigration of acts of omission (not doing something, letting things run their own course, leaving nature or the human body alone) as compared to doing something (such as operating on a patient or prescribing medication). This, we will see, is the reason medicine used, until recent history, to kill more patients than it saved (and did not even come close to realizing it), and economists of the sophisticated, equation-carrying variety, I hope to convince you, have been particularly harmful to the economic health of societies: central bankers and finance ministers, by tinkering with economic life, have caused massive instability.

Taleb posted an interesting table regarding intervention bias.