Suppressing Volatility Makes the World Less Predictable and More Dangerous
I recommend reading Nassim Taleb's recent article (PDF) in Foreign Affairs. It's the ultimate example of iatrogenics by the fragilista.
If you don't have time, here are my notes:
- Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks.
- Seeking to restrict variability seems to be good policy (who does not prefer stability to chaos?), so it is with very good intentions that policymakers unwittingly increase the risk of major blowups.
- Because policymakers believed it was better to do something than to do nothing, they felt obligated to heal the economy rather than wait and see if it healed on its own.
- Those who seek to prevent volatility on the grounds that any and all bumps in the road must be avoided paradoxically increase the probability that a tail risk will cause a major explosion. Consider as a thought experiment a man placed in an artificially sterilized environment for a decade and then invited to take a ride on a crowded subway; he would be expected to die quickly.
- But although these controls might work in some rare situations, the long-term effect of any such system is an eventual and extremely costly blowup whose cleanup costs can far exceed the benefits accrued.
- … Government interventions are laden with unintended—and unforeseen—consequences, particularly in complex systems, so humans must work with nature by tolerating systems that absorb human imperfections rather than seek to change them.
- Although it is morally satisfying, the film (Inside Job) naively overlooks the fact that humans have always been dishonest and regulators have always been behind the curve.
- Humans must try to resist the illusion of control: just as foreign policy should be intelligence-proof (it should minimize its reliance on the competence of information-gathering organizations and the predictions of “experts” in what are inherently unpredictable domains), the economy should be regulator-proof, given that some regulations simply make the system itself more fragile.
- The “turkey problem” occurs when a naive analysis of stability is derived from the absence of past variations.
- Imagine someone who keeps adding sand to a sand pile without any visible consequence, until suddenly the entire pile crumbles. It would be foolish to blame the collapse on the last grain of sand rather than the structure of the pile, but that is what people do consistently, and that is the policy error.
- As with a crumbling sand pile, it would be foolish to attribute the collapse of a fragile bridge to the last truck that crossed it, and even more foolish to try to predict in advance which truck might bring it down.
- Obama's mistake illustrates the illusion of local causal chains—that is, confusing catalysts for causes and assuming that one can know which catalyst will produce which effect.
- Governments are wasting billions of dollars on attempting to predict events that are produced by interdependent systems and are therefore not statistically understandable at the individual level.
- Most explanations being offered for the current turmoil in the Middle East follow the “catalysts as causes” confusion. The riots in Tunisia and Egypt were initially attributed to rising commodity prices, not to stifling and unpopular dictatorships.
- Again, the focus is wrong even if the logic is comforting. It is the system and its fragility, not events, that must be studied—what physicists call “percolation theory,” in which the properties of the terrain are studied rather than those of a single element of the terrain.
- Humans fear randomness—a healthy ancestral trait inherited from a different environment. Whereas in the past, which was a more linear world, this trait enhanced fitness and increased chances of survival, it can have the reverse effect in today's complex world, making volatility take the shape of nasty Black Swans hiding behind deceptive periods of "great moderation."
- But alongside the "catalysts as causes" confusion sit two mental biases: the illusion of control and the action bias (the illusion that doing something is always better than doing nothing). This leads to the desire to impose man-made solutions. Greenspan's actions were harmful, but it would have been hard to justify inaction in a democracy where the incentive is to always promise a better outcome than the other guy, regardless of the actual delayed cost.
- As Seneca wrote in De clementia, “repeated punishment, while it crushes the hatred of a few, stirs the hatred of all … just as trees that have been trimmed throw out again countless branches.”
- The Romans were wise enough to know that only a free man under Roman law could be trusted to engage in a contract; by extension, only a free people can be trusted to abide by a treaty.
- As Jean-Jacques Rousseau put it, “A little bit of agitation gives motivation to the soul, and what really makes the species prosper is not peace so much as freedom.” With freedom comes some unpredictable fluctuation. This is one of life's packages: there is no freedom without noise—and no stability without volatility.
Still curious? Nassim Taleb's newest book is Antifragile: Things That Gain from Disorder. He is also the author of The Black Swan, Fooled by Randomness, and The Bed of Procrustes.