
The Butterfly Effect: Everything You Need to Know About This Powerful Mental Model

“You could not remove a single grain of sand from its place without thereby … changing something throughout all parts of the immeasurable whole.”

— Fichte, The Vocation of Man (1800)
***

The Basics

In one of Stephen King’s greatest works, 11/22/63, a young man named Jake discovers a portal in a diner’s pantry which leads back to 1958. After a few visits and some experiments, Jake deduces that altering history is possible. However long he stays in the past, only two minutes go by in the present. He decides to live in the past until 1963 so he can prevent the assassination of President John F. Kennedy, believing that this change will greatly benefit humanity. After years of stalking Lee Harvey Oswald, Jake manages to prevent him from shooting Kennedy.

Upon returning to the present, he expects to find the world improved as a result. Instead, the opposite has happened. Earthquakes occur everywhere, his old home is in ruins, and nuclear war has destroyed much of the world. (As King wrote in an article for Marvel Spotlight, “Not good to fool with Father Time.”) Distraught, Jake returns to 1958 once again and resets history.

In addition to being a masterful work of speculative fiction, 11/22/63 is a classic illustration of how everything in the world is interconnected.

The butterfly effect is the idea that small things can have non-linear impacts on a complex system. The concept is commonly illustrated by a butterfly flapping its wings and causing a typhoon.

Of course, a single act like the butterfly flapping its wings cannot cause a typhoon. Small events can, however, serve as catalysts that act on starting conditions.

And as John Gribbin writes in his cult-classic work Deep Simplicity, “some systems … are very sensitive to their starting conditions, so that a tiny difference in the initial ‘push’ you give them causes a big difference in where they end up, and there is feedback, so that what a system does affects its own behavior.”

In the foreword to The Butterfly Effect in Competitive Markets by Dr. Rajagopal, Tom Breuer writes:

Simple systems, with few variables, can nonetheless show unpredictable and sometimes chaotic behavior…[Albert] Libchaber conducted a series of seminal experiments. He created a small system in his lab to study convection (chaotic system behavior) in a cubic millimeter of helium. By gradually warming this up from the bottom, he could create a state of controlled turbulence. Even this tightly controlled environment displayed chaotic behavior: complex unpredictable disorder that is paradoxically governed by “orderly” rules.

… [A] seemingly stable system (as in Libchaber’s 1 ccm cell of helium) can be exposed to very small influences (like heating it up a mere 0.001 degree), and can transform from orderly convection into wild chaos. Although [such systems are] governed by deterministic phenomena, we are nonetheless unable to predict how [they] will behave over time.

What the Butterfly Effect Is Not

The point of the butterfly effect is not to get leverage. As General Stanley McChrystal writes in Team of Teams:

In popular culture, the term “butterfly effect” is almost always misused. It has become synonymous with “leverage”—the idea of a small thing that has a big impact, with the implication that, like a lever, it can be manipulated to a desired end. This misses the point of Lorenz’s insight. The reality is that small things in a complex system may have no effect or a massive one, and it is virtually impossible to know which will turn out to be the case.

Benjamin Franklin offered a poetic perspective in his variation of a proverb that’s been around since the 14th century in English and the 13th century in German, long before the identification of the butterfly effect:

For want of a nail the shoe was lost,
For want of a shoe the horse was lost,
For want of a horse the rider was lost,
For want of a rider the battle was lost,
For want of a battle the kingdom was lost,
And all for the want of a horseshoe nail.

The lack of one horseshoe nail could be inconsequential, or it could indirectly cause the loss of a war. There is no way to predict which outcome will occur. (If you want an excellent kids’ book to start teaching this to your children, check out If You Give a Mouse a Cookie.)

In this post, we will seek to unravel the butterfly effect from its many incorrect connotations, and build an understanding of how it affects our individual lives and the world in general.

Edward Lorenz and the Discovery of the Butterfly Effect

“It used to be thought that the events that changed the world were things like big bombs, maniac politicians, huge earthquakes, or vast population movements, but it has now been realized that this is a very old-fashioned view held by people totally out of touch with modern thought. The things that change the world, according to Chaos theory, are the tiny things. A butterfly flaps its wings in the Amazonian jungle, and subsequently a storm ravages half of Europe.”

— from Good Omens, by Terry Pratchett and Neil Gaiman
***

Although the concept of the butterfly effect has long been debated, the identification of it as a distinct effect is credited to Edward Lorenz (1917–2008). Lorenz was a meteorologist and mathematician who successfully combined the two disciplines to create chaos theory. During the 1950s, Lorenz searched for a means of predicting the weather, as he found linear models to be ineffective.

In an experiment to model a weather prediction, he entered the initial condition as 0.506 instead of 0.506127. The result was surprising: over time, the simulation produced a completely different weather pattern. A tiny change in the initial conditions had enormous long-term implications. By 1963, he had formulated his ideas enough to publish an award-winning paper entitled “Deterministic Nonperiodic Flow.” In it, Lorenz writes:

Subject to the conditions of uniqueness, continuity, and boundedness … a central trajectory, which in a certain sense is free of transient properties, is unstable if it is nonperiodic. A noncentral trajectory … is not uniformly stable if it is nonperiodic, and if it is stable at all, its very stability is one of its transient properties, which tends to die out as time progresses. In view of the impossibility of measuring initial conditions precisely, and thereby distinguishing between a central trajectory and a nearby noncentral trajectory, all nonperiodic trajectories are effectively unstable from the point of view of practical prediction.

In simpler language, he theorized that weather prediction models are inaccurate because knowing the precise starting conditions is impossible, and a tiny change can throw off the results. In order to make the concept understandable to non-scientific audiences, Lorenz began to use the butterfly analogy.

A small error in the initial data magnifies over time.
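Lorenz’s accident is easy to reproduce today. Here is a minimal sketch in Python (our illustration, using the three equations and classic parameter values from his 1963 paper) that runs the same simulation twice, with starting points differing by one part in a million, and prints how far apart the two trajectories drift:

```python
import numpy as np

# Lorenz's 1963 convection model, with his classic parameter values.
def lorenz_deriv(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def simulate(s0, steps=25_000, dt=0.001):
    """Crude Euler integration; accurate enough to show the divergence."""
    s, path = np.array(s0, dtype=float), []
    for _ in range(steps):
        s = s + dt * lorenz_deriv(s)
        path.append(s.copy())
    return np.array(path)

a = simulate([1.0, 1.0, 1.0])
b = simulate([1.0, 1.0, 1.000001])  # perturbed by one part in a million

for t in (5, 10, 15, 20, 25):
    i = int(t / 0.001) - 1
    print(f"t = {t:2d}: separation = {np.linalg.norm(a[i] - b[i]):.6f}")
```

The two runs track each other closely at first, then drift apart until they are no more alike than two randomly chosen states of the system: the same behavior Lorenz saw in his printouts.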

In speeches and interviews, he explained that a butterfly has the potential to create tiny changes which, while not creating a typhoon themselves, could alter its trajectory. A flapping wing represents the minuscule changes in atmospheric pressure, and these changes compound as a model progresses. Given that small, nearly imperceptible changes can have massive implications in complex systems, Lorenz concluded that accurate long-range weather prediction was impossible. Elsewhere in the paper, he writes:

If, then, there is any error whatever in observing the present state—and in any real system such errors seem inevitable—an acceptable prediction of an instantaneous state in the distant future may well be impossible.

… In view of the inevitable inaccuracy and incompleteness of weather observations, precise very-long-range forecasting would seem to be nonexistent.

Lorenz always stressed that there is no way of knowing what exactly tipped a system. The butterfly is a symbolic representation of an unknowable quantity.

Furthermore, he aimed to contest the use of predictive models that assume a linear, deterministic progression and ignore the potential for derailment. Even the smallest error in an initial setup renders the model useless as inaccuracies compound over time. The exponential growth of errors in a predictive model is known as deterministic chaos. It occurs in most systems, regardless of their simplicity or complexity.
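Deterministic chaos needs almost nothing to appear. A minimal sketch using the logistic map, a standard one-line model from chaos theory (our example, not one from Lorenz’s paper), shows the same exponential error growth:

```python
# The logistic map x -> r*x*(1-x) is chaotic at r = 4: a tiny error
# in the starting value roughly doubles with every step.
r = 4.0
x, y = 0.3, 0.3 + 1e-10  # identical except in the tenth decimal place

for step in range(1, 51):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: error = {abs(x - y):.2e}")
```

Within fifty steps, an error of one part in ten billion has grown to the full size of the system, and the two runs bear no relation to each other.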

The butterfly effect is somewhat humbling—a model that exposes the flaws in other models. It shows prediction to be less reliable than we assume: because errors grow exponentially, we have no means of making accurate long-term forecasts of chaotic systems.

Prior to the work of Lorenz, people assumed that an approximate idea of initial conditions would lead to an approximate prediction of the outcome. In Chaos: Making a New Science, James Gleick writes:

The models would churn through complicated, somewhat arbitrary webs of equations, meant to turn measurements of initial conditions … into a simulation of future trends. The programmers hoped the results were not too grossly distorted by the many unavoidable simplifying assumptions. If a model did anything too bizarre … the programmers would revise the equations to bring the output back in line with expectation… Models proved dismally blind to what the future would bring, but many people who should have known better acted as though they believed the results.

One theoretician declared, “The basic idea of Western science is that you don’t have to take into account the falling of a leaf on some planet in another galaxy when you’re trying to account for the motion of a billiard ball on a pool table on earth.”

An illustration of two weather trajectories starting from very slightly different initial conditions. The trajectories are similar at first, before deviating further and further.

Lorenz’s findings were revolutionary because they proved this assumption to be entirely false. He found that without a perfect idea of initial conditions, predictions are useless—a shocking revelation at the time.

During the early days of computers, many people believed they would enable us to understand complex systems and make accurate predictions. People had been at the mercy of the weather for millennia, and now they wanted to take control. With one innocent mistake, Lorenz shook the forecasting world, sending ripples which (appropriately) spread far beyond meteorology.

Ray Bradbury, the Butterfly Effect, and the Arrow of Time

Ray Bradbury’s classic science fiction story A Sound of Thunder predates the identification of chaos theory and the butterfly effect. Set in 2055, it tells of a man named Eckels who travels back 65 million years to shoot a dinosaur. Warned not to deviate from the tour guide’s plan, Eckels (along with his guide and the guide’s assistant) heads off to kill a Tyrannosaurus rex that is about to die anyway, crushed by a falling tree. Eckels panics at the sight of the creature and steps off the path, leaving his guide to kill the T. rex. The guide is enraged and orders Eckels to remove the bullets before the trio returns to 2055. Upon arrival, they are confused to find that the world has changed. Language is altered and an evil dictator is now in charge. A confused Eckels notices a crushed butterfly stuck to his boot and realizes that in stepping off the path, he killed the insect and changed the future. Bradbury writes:

Eckels felt himself fall into a chair. He fumbled crazily at the thick slime on his boots. He held up a clod of dirt, trembling, “No, it cannot be. Not a little thing like that. No!”

Embedded in the mud, glistening green and gold and black, was a butterfly, very beautiful and very dead.

“Not a little thing like that! Not a butterfly!” cried Eckels.

It fell to the floor, an exquisite thing, a small thing that could upset balances and knock down a line of small dominoes and then big dominoes and then gigantic dominoes, all down the years across Time. Eckels' mind whirled. It couldn't change things. Killing one butterfly couldn't be that important! Could it?

Bradbury envisioned the passage of time as fragile and liable to be disturbed by minor changes. In the decades since the publication of A Sound of Thunder, physicists have examined its accuracy. Obviously, we cannot time travel, so there is no way of knowing how plausible the story is beyond what predictive models suggest. Bradbury’s work raises the questions of what time is and whether it is deterministic.

Physicists refer to the Arrow of Time—the non-reversible progression of entropy (disorder). As time moves forward, matter becomes more and more chaotic and does not spontaneously return to its original state. If you break an egg, it remains broken and cannot spontaneously re-form, for example. The Arrow of Time gives us a sense of past, present, and future. Arthur Eddington (the astronomer and physicist who coined the term) explained:

Let us draw an arrow arbitrarily. If as we follow the arrow we find more and more of the random element in the state of the world, then the arrow is pointing towards the future; if the random element decreases the arrow points towards the past. That is the only distinction known to physics. This follows at once if our fundamental contention is admitted that the introduction of randomness is the only thing which cannot be undone.

In short, the passage of time as we perceive it does exist, conditional to the existence of entropy. As long as entropy is non-reversible, time can be said to exist. The closest thing we have to a true measurement of time is a measurement of entropy. If the progression of time is nothing but a journey towards chaos, it makes sense for small changes to affect the future by amplifying chaos.
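As a toy illustration of this one-way journey (an analogy only, not a physical simulation), we can start with a perfectly ordered sequence, apply random swaps, and measure disorder as the number of out-of-order pairs. Disorder climbs toward its maximum and hovers there; it never spontaneously returns to zero:

```python
import random

random.seed(0)

def disorder(seq):
    """Count out-of-order pairs: a crude stand-in for entropy."""
    return sum(1 for i in range(len(seq))
               for j in range(i + 1, len(seq)) if seq[i] > seq[j])

deck = list(range(20))  # perfectly ordered: disorder = 0
for step in range(501):
    if step % 100 == 0:
        print(f"after {step:3d} random swaps: disorder = {disorder(deck)}")
    i, j = random.randrange(20), random.randrange(20)
    deck[i], deck[j] = deck[j], deck[i]
```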

We do not yet know if entropy creates time or is a byproduct of it. Consequently, we cannot know if changing the past would change the future. Would stepping on a butterfly shift the path of entropy? Did Eckels step off the path of his own free will, or was that event predetermined? Was the dictatorial future he returned to always meant to be?

These interconnected concepts — the butterfly effect, chaos theory, determinism, free will, time travel — have captured many imaginations since their discovery. Films ranging from It’s a Wonderful Life to Donnie Darko and The Butterfly Effect have explored the complexities of cause and effect. Once again, it is important to note that works of fiction tend to view the symbolic butterfly as the cause of an effect. According to Lorenz’s original writing, though, the point is that small details can tip the balance without being identifiable.

The Butterfly Effect in Business

Marketplaces are, in essence, chaotic systems that are influenced by tiny changes. This makes it difficult to predict the future, as the successes and failures of businesses can appear random. Periods of economic growth and decline sprout from nowhere. This is the result of the exponential impact of subtle stimuli—the economic equivalent of the butterfly effect. Breuer explains:

We live in an interconnected, or rather a hyper-connected society. Organizations and markets “behave” like networks. This triggers chaotic (complex) rather than linear behavior.

Preparing for the future and seeing logic in the chaos of consumer behavior is not easy. Once-powerful giants collapse as they fall behind the times. Tiny start-ups rise from the ashes and take over industries. Small alterations in existing technology transform how people live their lives. Fads capture everyone’s imagination, then disappear.

Businesses have two options in this situation: build a timeless product or service, or race to keep up with change. Many businesses opt for a combination of the two. For example, Doc Martens continues selling the classic 1460 boot, while bringing out new designs each season. This approach requires extreme vigilance and attention to consumer desires, in an attempt to both remain relevant and appear timeless. Businesses leverage the compounding impact of small tweaks that aim to generate interest in all they have to offer.

In The Butterfly Effect in Competitive Markets, Dr. Rajagopal writes that

most global firms are penetrating bottom-of-the-pyramid market segments by introducing small changes in technology, value perceptions, [and] marketing-mix strategies, and driving production on an unimagined scale of magnitude to derive a major effect on markets. …Procter & Gamble, Kellogg’s, Unilever, Nestlé, Apple, and Samsung, have experienced this effect in their business growth…. Well-managed companies drive small changes in their business strategies by nipping the pulse of consumers….

Most firms use such effect by making a small change in their strategy in reference to produce, price, place, promotion, … posture (developing corporate image), and proliferation…to gain higher market share and profit in a short span.

For most businesses, incessant small changes are the most effective way to produce the metaphorical typhoon. These iterations keep consumers engaged while preserving brand identity. If these small tweaks fail, the impact is hopefully not too great. But if they succeed and compound, the rewards can be monumental.

By nature, all markets are chaotic, and what seem like inconsequential alterations can propel a business up or down. Rajagopal explains how the butterfly effect connects to business:

Globalization and frequent shifts in consumer preferences toward products and services have accelerated chaos in the market due to the rush of firms, products, and business strategies. Chaos theory in markets addresses the behavior of strategic and dynamic moves of competing firms that are highly sensitive to existing market conditions triggering the butterfly effect.

The initial conditions (economic, social, cultural, political) in which a business sets up are vital influences on its success or failure. Lorenz found that the smallest change in the preliminary conditions created a different outcome in weather predictions, and we can consider the same to be true for businesses. The first few months and years are a crucial time, when rates of failure are highest and the basic brand identity forms. Any of the early decisions, achievements, or mistakes have the potential to be the wing flap that creates a storm.

Benoit Mandelbrot on the Butterfly Effect in Economics

International economies can be thought of as a single system, wherein each part influences the others. Much like the atmosphere, the economy is a complex system in which we see only the visible outcomes—rain or shine, boom or bust. With the advent of globalization and improved communication technology, the economy is even more interconnected than in the past. One episode of market volatility can cause problems for the entire system. The butterfly effect in economics refers to the compounding impact of small changes. As a consequence, it is nearly impossible to make accurate predictions for the future or to identify the precise cause of an inexplicable change. Long periods of stability are followed by sudden declines, and vice versa.

Benoit Mandelbrot (the “father of fractals”) began applying the butterfly effect to economics several decades ago. In a 1999 article for Scientific American, he explained his findings. Mandelbrot saw how unstable markets could be, and he cited an example of a company which saw its stock drop 40% in one day, followed by another 6%, before rising by 10%—the typhoon created by an unseen butterfly. When Mandelbrot looked at traditional economic models, he found that they did not even allow for the occurrence of such events. Standard models denied the existence of dramatic market shifts. Mandelbrot writes in Scientific American:

According to portfolio theory, the probability of these large fluctuations would be a few millionths of a millionth of a millionth of a millionth. (The fluctuations are greater than 10 standard deviations.) But in fact, one observes spikes on a regular basis—as often as every month—and their probability amounts to a few hundredths.
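The arithmetic behind those numbers is easy to check. A minimal sketch, assuming (as the portfolio theory Mandelbrot criticizes does) that daily price changes follow a normal distribution:

```python
import math

# Probability of a move beyond 10 standard deviations, in either direction,
# under a normal distribution: the two-sided Gaussian tail.
p_theory = math.erfc(10 / math.sqrt(2))
print(f"Gaussian 10-sigma probability: {p_theory:.1e}")  # ~1.5e-23

# Mandelbrot's observation: such spikes arrive roughly monthly, i.e. on the
# order of once in ~20 trading days.
p_observed = 1 / 20
print(f"Observed frequency: {p_observed}")  # a few hundredths
```

The gap between the two numbers, some twenty orders of magnitude, is Mandelbrot’s point.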

If these changes are unpredictable, what causes them? Mandelbrot’s answer lay in his work on fractals. To explain fractals would require a whole separate post, so we will go with Mandelbrot’s own simplified description: “A fractal is a geometric shape that can be separated into parts, each of which is a reduced-scale version of the whole.” He goes on to explain the connection:

In finance, this concept is not a rootless abstraction but a theoretical reformulation of a down-to-earth bit of market folklore—namely that movements of a stock or currency all look alike when a market chart is enlarged or reduced so that it fits the same time and price scale. An observer then cannot tell which of the data concern prices that change from week to week, day to day or hour to hour. This quality defines the charts as fractal curves and makes available many powerful tools of mathematical and computer analysis.
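One way to get a feel for what Mandelbrot means is to simulate a price chart and compare its fluctuations across time scales. A minimal sketch using an ordinary random walk (a much simpler model than Mandelbrot’s multifractals, but enough to show scale-invariance):

```python
import random

random.seed(42)

# A toy "price" series: each tick the price moves up or down by 1.
ticks = [random.choice([-1, 1]) for _ in range(100_000)]

def typical_move(series, window):
    """Average absolute price change over non-overlapping windows."""
    moves = [abs(sum(series[i:i + window]))
             for i in range(0, len(series) - window, window)]
    return sum(moves) / len(moves)

# For a self-similar curve, quadrupling the window roughly doubles the move:
# rescale both axes and the hourly, daily, and weekly charts look alike.
for window in (16, 64, 256, 1024):
    print(f"window {window:4d}: typical move ~ {typical_move(ticks, window):.1f}")
```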

In a talk, Mandelbrot held up his coffee and declared that predicting its temperature in a minute is impossible, but in an hour is perfectly possible. He applied the same concept to markets that change in dramatic ways in the short term. Even if a long-term pattern can be deduced, it has little use for those who trade on a shorter timescale.

Mandelbrot explains how his fractals can be used to create a more useful model of the chaotic nature of the economy:

Instead, multifractals can be put to work to “stress-test” a portfolio. In this technique, the rules underlying multifractals attempt to create the same patterns of variability as do the unknown rules that govern actual markets. Multifractals describe accurately the relation between the shape of the generator and the patterns of up-and-down swings of prices to be found on charts of real market data… They provide estimates of the probability of what the market might do and allow one to prepare for inevitable sea changes. The new modeling techniques are designed to cast a light of order into the seemingly impenetrable thicket of the financial markets. They also recognize the mariner’s warning that, as recent events demonstrate, deserves to be heeded: On even the calmest sea, a gale may be just over the horizon.

In The Misbehaviour of Markets, Mandelbrot and Richard Hudson expand upon the topic of financial chaos. They begin with a discussion of the infamous 2008 crash and its implications:

The worldwide market crash of autumn 2008 had many causes: greedy bankers, lax regulators and gullible investors, to name a few. But there is also a less-obvious cause: our all-too-limited understanding of how markets work, how prices move and how risks evolve. …

Markets are complex, and treacherous. The bone-chilling fall of September 29, 2008—a 7 percent, 777 point plunge in the Dow Jones Industrial Average—was, in historical terms, just a particularly dramatic demonstration of that fact. In just a few hours, more than $1.6 trillion was wiped off the value of American industry—$5 trillion worldwide.

Mandelbrot and Hudson believe that the 2008 credit crisis can be attributed in part to the increasing confidence in financial predictions. People who created computer models designed to guess the future failed to take into account the butterfly effect. No matter how complex the models became, they could not create a perfect picture of initial conditions or account for the compounding impact of small changes. Just as people believed they could predict and therefore control the weather before Lorenz published his work, people thought they could do the same for markets until the 2008 crash proved otherwise. Wall Street banks trusted their models of the future so much that they felt safe borrowing growing sums of money for what was, in essence, gambling. After all, their predictions said such a crash was impossible. Impossible or not, it happened.

According to Mandelbrot and Hudson, predictive models view markets as “a risky but ultimately … manageable world.” As with meteorology, economic predictions are based on approximate ideas of initial conditions—ideas that, as we know, are close to useless. As Mandelbrot and Hudson write:

[C]auses are usually obscure. … The precise market mechanism that links news to price, cause to effect, is mysterious and seems inconsistent. Threat of war: Dollar falls. Threat of war: Dollar rises. Which of the two will actually happen? After the fact, it seems obvious; in hindsight, fundamental analysis can be reconstituted and is always brilliant. But before the fact, both outcomes may seem equally likely.

In the same way that apparently similar weather conditions can create drastically different outcomes, apparently similar market conditions can create drastically different outcomes. We cannot see the extent to which the economy is interconnected and we cannot identify where the butterfly lies. Mandelbrot and Hudson disagree with the view of the economy as separate from other parts of our world. Everything connects:

No one is alone in this world. No act is without consequences for others. It is a tenet of chaos theory that, in dynamical systems, the outcome of any process is sensitive to its starting point—or in the famous cliché, the flap of a butterfly’s wings in the Amazon can cause a tornado in Texas. I do not assert that markets are chaotic…. But clearly, the global economy is an unfathomably complicated machine. To all the complexity of the physical world… you add the psychological complexity of men acting on their fleeting expectations….

Why do people prefer to blame crashes (such as the 2008 credit crisis) on the folly of those in the financial industry? Jonathan Cainer provides a succinct explanation:

Why do we love the idea that people might be secretly working together to control and organise the world? Because we do not like to face the fact that our world runs on a combination of chaos, incompetence, and confusion.

Historic Examples of the Butterfly Effect

“A very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say the effect is due to chance. If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of that same universe at a succeeding moment. But even if it were the case that the natural laws had no longer any secret for us, we could still only know the initial situation approximately. If that enabled us to predict the succeeding situation with the same approximation, that is all we require, and we should say that the phenomenon had been predicted, that it is governed by laws. But it is not always so; it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon.”

— Jules Henri Poincaré (1854–1912)

***

Many examples exist of instances where a tiny detail led to a dramatic change. In each case, the world we live in might look very different had that detail gone the other way. Here are some examples of how the butterfly effect has shaped our lives.

  • The bombing of Nagasaki. The US initially intended to drop the bomb on the Japanese city of Kokura, with its munitions arsenal as the target. On the day of the planned attack, cloud cover prevented the crew from seeing the target as they flew overhead. The airplane passed over the city three times before the pilots gave up. Locals huddled in shelters heard the hum of the airplane preparing to drop the nuclear bomb and prepared for their destruction. Except Kokura was never bombed. Military personnel decided on Nagasaki as the target due to improved visibility. The implications of that split-second decision were monumental. We cannot even begin to comprehend how different history might have been if that day had not been cloudy. Kokura is sometimes referred to as the luckiest city in Japan, and those who lived there during the war are still shaken by the near miss.
  • The Academy of Fine Arts in Vienna rejecting Adolf Hitler’s application, twice. In the early 1900s, a young Hitler applied for art school and was rejected, possibly by a Jewish professor. By his own estimation and that of scholars, this rejection went on to shape his metamorphosis from a bohemian aspiring artist into the human manifestation of evil. We can only speculate as to how history would have been different. But it is safe to assume that a great deal of tragedy could have been avoided if Hitler had applied himself to watercolors, not to genocide.
  • The assassination of Archduke Franz Ferdinand. A little-known fact about the event considered to be the catalyst for both world wars is that it almost didn’t happen. On the 28th of June, 1914, a teenage Bosnian-Serb named Gavrilo Princip went to Sarajevo with two other nationalists in order to assassinate the Archduke. The initial assassination attempt failed; a bomb or grenade exploded beneath the car behind the Archduke’s and wounded its occupants. The route was supposed to have been changed after that, but the Archduke’s driver didn’t get the message. Had he actually taken the alternate route, Princip would not have been on the same street as the car and would not have had the chance to shoot the Archduke and his wife that day. Were it not for a failure of communication, both world wars might never have happened.
  • The Chernobyl disaster. In 1986, a test at the Chernobyl nuclear plant went awry and released 400 times the radiation produced by the bombing of Hiroshima. One hundred fifteen thousand people were evacuated from the area, with many deaths and birth defects resulting from the radiation. Even today, some areas remain too dangerous to visit. However, it could have been much worse. After the initial explosion, three plant workers volunteered to turn off the underwater valves to prevent a second explosion. It has long been believed that the trio died as a result, although there is now some evidence this may not have been the case. Regardless, diving into a dark basement flooded with radioactive water was a heroic act. Had they failed to turn off the valve, it has been claimed, a second explosion could have rendered half of Europe uninhabitable for half a million years, and made much of Russia and Ukraine, including Kiev, unfit for human habitation. Whether they lived or not, the three men—Alexei Ananenko, Valeri Bezpalov and Boris Baranov—stilled the wings of a deadly butterfly. Indeed, the entire Chernobyl disaster was the result of poor design and the ineptitude of staff. The long-term result (in addition to the impact on residents of the area) was widespread anxiety about nuclear plants and a bias against nuclear power, leading to a preference for fossil fuels. Some people have speculated that Chernobyl is responsible for accelerating global warming, as countries became unduly slow to adopt nuclear power.
  • The Cuban Missile Crisis. We all may owe our lives to a single Russian Navy officer named Vasili Arkhipov, who has been called “the man who saved the world.” During the Cuban Missile Crisis, Arkhipov was stationed on a nuclear-armed submarine near Cuba. American aircraft and ships began using depth charges to signal the submarine that it should surface so it could be identified. With the submarine submerged too deep to monitor radio signals, the crew had no idea what was going on in the world above. The captain, Savitsky, decided the signal meant that war had broken out, and he prepared to launch a nuclear torpedo. Everyone agreed with him—except Arkhipov. Had the torpedo launched, it has been claimed, nuclear clouds could have hit Moscow, London, East Anglia and Germany, wiping out half of the British population. The result could have been a worldwide nuclear holocaust, as countries retaliated and the conflict spread. Yet within an overheated underwater room, Arkhipov exercised his veto power and prevented a launch. Without the courage of one man, our world could be unimaginably different.

From this handful of examples, it is clear how fragile the world is, and how dramatic the effects of tiny events acting on starting conditions can be.

We like to think we can predict the future and exercise a degree of control over powerful systems such as the weather and the economy. Yet the butterfly effect shows that we cannot. The systems around us are chaotic and entropic, prone to sudden change. For some kinds of systems, we can try to create favorable starting conditions and be mindful of the kinds of catalysts that might act on those conditions – but that’s as far as our power extends. If we think that we can identify every catalyst and control or predict outcomes, we are only setting ourselves up for a fall.


Richard Feynman on Teaching Math to Kids and the Lessons of Knowledge

Legendary scientist Richard Feynman was famous for his penetrating insight and clarity of thought. He was known not only for the work that garnered him a Nobel Prize, but also for the lucidity of his explanations of ordinary things: why trains stay on the tracks as they go around a curve, how we look for new laws of science, how rubber bands work, and the beauty of the natural world.

Feynman knew the difference between knowing the name of something and knowing something. He was often prone to telling the emperor he had no clothes, as this illuminating example from James Gleick's book Genius: The Life and Science of Richard Feynman shows.

Educating his children gave him pause as to how the elements of teaching should be employed. By the time his son Carl was four, Feynman was “actively lobbying against a first-grade science book proposed for California schools.”

It began with pictures of a mechanical wind-up dog, a real dog, and a motorcycle, and for each the same question: “What makes it move?” The proposed answer—“Energy makes it move”—enraged him.

That was tautology, he argued—empty definition. Feynman, having made a career of understanding the deep abstractions of energy, said it would be better to begin a science course by taking apart a toy dog, revealing the cleverness of the gears and ratchets. To tell a first-grader that “energy makes it move” would be no more helpful, he said, than saying “God makes it move” or “moveability makes it move.”

Feynman proposed a simple test for whether one is teaching ideas or mere definitions: “Without using the new word which you have just learned, try to rephrase what you have just learned in your own language. Without using the word energy, tell me what you know now about the dog’s motion.”

The other standard explanations were equally horrible: gravity makes it fall, or friction makes it wear out. You didn't get a pass on learning because you were a first-grader. Feynman's explanations not only captured the attention of his audience—from Nobel winners to first-graders—but also offered true knowledge. “Shoe leather wears out because it rubs against the sidewalk and the little notches and bumps on the sidewalk grab pieces and pull them off.” That is knowledge. “To simply say, ‘It is because of friction,’ is sad, because it’s not science.”


Choosing Textbooks for Grade Schools

In 1964 Feynman made the rare decision to serve on a public commission for choosing mathematics textbooks for California's grade schools. As Gleick describes it:

Traditionally this commissionership was a sinecure that brought various small perquisites under the table from textbook publishers. Few commissioners—as Feynman discovered—read many textbooks, but he determined to read them all, and had scores of them delivered to his house.

This was the era of new math in children's textbooks: introducing high-level concepts, such as set theory and nondecimal number systems, into grade school.

Feynman was skeptical of this approach but rather than simply let it go, he popped the balloon.

He argued to his fellow commissioners that sets, as presented in the reformers’ textbooks, were an example of the most insidious pedantry: new definitions for the sake of definition, a perfect case of introducing words without introducing ideas.

A proposed primer instructed first-graders: “Find out if the set of the lollipops is equal in number to the set of the girls.”

To Feynman this was a disease. It confused without adding precision to the normal sentence: “Find out if there are just enough lollipops for the girls.”

According to Feynman, specialized language should wait until it is needed. (In case you're wondering, he argued that the peculiar language of set theory is rarely, if ever, needed—only in understanding different degrees of infinity—which certainly wasn't necessary at a grade-school level.)

Feynman convincingly argued this was knowledge of words without actual knowledge. He wrote:

It is an example of the use of words, new definitions of new words, but in this particular case a most extreme example because no facts whatever are given…. It will perhaps surprise most people who have studied this textbook to discover that the symbol ∪ or ∩ representing union and intersection of sets … all the elaborate notation for sets that is given in these books, almost never appear in any writings in theoretical physics, in engineering, business, arithmetic, computer design, or other places where mathematics is being used.

The point became philosophical.

It was crucial, he argued, to distinguish clear language from precise language. The textbooks placed a new emphasis on precise language: distinguishing “number” from “numeral,” for example, and separating the symbol from the real object in the modern critical fashion—pilpul for schoolchildren, it seemed to Feynman. He objected to a book that tried to teach a distinction between a ball and a picture of a ball—the book insisting on such language as “color the picture of the ball red.”

“I doubt that any child would make an error in this particular direction,” Feynman said, adding:

As a matter of fact, it is impossible to be precise … whereas before there was no difficulty. The picture of a ball includes a circle and includes a background. Should we color the entire square area in which the ball image appears all red? … Precision has only been pedantically increased in one particular corner when there was originally no doubt and no difficulty in the idea.

In the real world, absolute precision can never be reached, and the search for degrees of precision that are not possible (but are desirable) causes a lot of folly.

Feynman had his own ideas for teaching children mathematics.

***

Process vs. Outcome

Feynman proposed that first-graders learn to add and subtract more or less the way he worked out complicated integrals—free to select any method that seems suitable for the problem at hand. A modern-sounding notion was that the answer isn't what matters, so long as you use the right method. To Feynman no educational philosophy could have been more wrong. The answer is all that does matter, he said. He listed some of the techniques available to a child making the transition from being able to count to being able to add. A child can combine two groups into one and simply count the combined group: to add 5 ducks and 3 ducks, one counts 8 ducks. The child can use fingers or count mentally: 6, 7, 8. One can memorize the standard combinations. Larger numbers can be handled by making piles—one groups pennies into fives, for example—and counting the piles. One can mark numbers on a line and count off the spaces—a method that becomes useful, Feynman noted, in understanding measurement and fractions. One can write larger numbers in columns and carry sums larger than 10.
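In the spirit of that list, here is a toy sketch (ours, not Feynman's) of a few of those strategies; each one reaches the same answer by a different route:

```python
def add_by_counting_all(a, b):
    """Combine two groups into one and count the combined group."""
    ducks = ["duck"] * a + ["duck"] * b
    return len(ducks)

def add_by_counting_on(a, b):
    """Start from the first number and count upward: 6, 7, 8."""
    total = a
    for _ in range(b):
        total += 1
    return total

# "Memorize the standard combinations": a lookup table of small sums.
MEMORIZED = {(a, b): a + b for a in range(10) for b in range(10)}

def add_by_memory(a, b):
    return MEMORIZED[(a, b)]

# Every route arrives at the same place, which for Feynman was the point.
for method in (add_by_counting_all, add_by_counting_on, add_by_memory):
    print(method.__name__, "->", method(5, 3))
```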

To Feynman the standard texts were flawed. The problem

29
+3

was considered a third-grade problem because it involved the concept of carrying. However, Feynman pointed out that most first-graders could easily solve this problem by counting: 30, 31, 32.

He proposed that kids be given simple algebra problems (2 times what plus 3 is 7) and be encouraged to solve them through the scientific method, which is tantamount to trial and error. This, he argued, is what real scientists do.
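A guess-and-check version of his example (2 times what plus 3 is 7) might look like this sketch:

```python
def solve_by_trial(target):
    """Feynman's schoolroom algebra: 2 times what, plus 3, equals target?
    Try candidate answers and keep the one that checks out."""
    for guess in range(100):
        if 2 * guess + 3 == target:
            return guess
    return None  # no whole-number answer among the guesses

print(solve_by_trial(7))  # -> 2
```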

“We must,” Feynman said, “remove the rigidity of thought.” He continued: “We must leave freedom for the mind to wander about in trying to solve the problems…. The successful user of mathematics is practically an inventor of new ways of obtaining answers in given situations. Even if the ways are well known, it is usually much easier for him to invent his own way—a new way or an old way—than it is to try to find it by looking it up.”

It was better in the end to have a bag of tricks at your disposal that could be used to solve problems than one orthodox method. Indeed, part of Feynman's genius was his ability to solve problems that were baffling others because they were using the standard method to try and solve them. He would come along and approach the problem with a different tool, which often led to simple and beautiful solutions.

***

If you give some thought to how Farnam Street helps you, one of the ways is by adding to your bag of tricks so that you can pull them out when you need them to solve problems. We call these tricks mental models, and they work kinda like Lego — interconnecting and reinforcing one another. The more pieces you have, the more things you can build.

Complement this post with Feynman's excellent advice on how to learn anything.

Claude Shannon: The Man Who Turned Paper Into Pixels

"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning."— Claude Shannon (1948)
“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning.”— Claude Shannon (1948)

Claude Shannon is the most important man you've probably never heard of. If Alan Turing is to be considered the father of modern computing, then the American mathematician Claude Shannon is the architect of the Information Age.

The video, created by the British filmmaker Adam Westbrook, echoes the thoughts of Nassim Taleb that boosting the signal does not mean you remove the noise; in fact, just the opposite: you amplify it.

Any time you try to send a message from one place to another, something always gets in the way. The original signal is always distorted. Wherever there is signal, there is also noise.

So what do you do? Well, the best anyone could do back then was to boost the signal. But then all you do is boost the noise.

Thing is, we were thinking about information all wrong. We were obsessed with what a message meant.

A Renoir and a receipt? They’re different, right? Was there a way to think of them in the same way? Like so many breakthroughs, the answer came from an unexpected place: a brilliant mathematician with a flair for blackjack.

***

The transistor was invented in 1948, at Bell Telephone Laboratories. This remarkable achievement, however, “was only the second most significant development of that year,” writes James Gleick in his fascinating book The Information: A History, a Theory, a Flood. The most important development of 1948, and the one that still underpins modern technology, is the bit.

An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release. It carried a title both simple and grand—“A Mathematical Theory of Communication”—and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon. The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure.

But measuring what? “A unit for measuring information,” Shannon wrote, as though there were such a thing, measurable and quantifiable, as information.

[…]

Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. It led to compact discs and fax machines, computers and cyberspace, Moore’s law and all the world’s Silicon Alleys. Information processing was born, along with information storage and information retrieval. People began to name a successor to the Iron Age and the Steam Age.
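As a rough illustration of what “measuring information” means in Shannon's terms, here is a minimal sketch (our example, not Shannon's own notation) that computes the entropy of a text in bits per character:

```python
import math
from collections import Counter

def entropy_bits_per_char(text):
    """Shannon entropy: H = -sum(p * log2(p)) over symbol frequencies."""
    counts, n = Counter(text), len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A predictable message carries less information per symbol than a varied one.
print(entropy_bits_per_char("aaaaaaaaab"))  # mostly one symbol: ~0.47 bits/char
print(entropy_bits_per_char("abcdefghij"))  # ten equally likely symbols: ~3.32 bits/char
```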

Gleick also recounts the relationship between Turing and Shannon:

In 1943 the English mathematician and code breaker Alan Turing visited Bell Labs on a cryptographic mission and met Shannon sometimes over lunch, where they traded speculation on the future of artificial thinking machines. (“Shannon wants to feed not just data to a Brain, but cultural things!” Turing exclaimed. “He wants to play music to it!”)

Commenting on the vitality of information, Gleick writes:

[Information] pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. … Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions.… If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.

The bit is the very core of the information age.

The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence.

In the words of John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, information gives rise to “every it— every particle, every field of force, even the spacetime continuum itself.”

This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine.

The greatest gift of Prometheus to humanity was not fire after all: “Numbers, too, chiefest of sciences, I invented for them, and the combining of letters, creative mother of the Muses’ arts, with which to hold all things in memory.”

Information technologies are both relative to the time in which they were created and absolute in terms of their significance. Gleick writes:

The alphabet was a founding technology of information. The telephone, the fax machine, the calculator, and, ultimately, the computer are only the latest innovations devised for saving, manipulating, and communicating knowledge. Our culture has absorbed a working vocabulary for these useful inventions. We speak of compressing data, aware that this is quite different from compressing a gas. We know about streaming information, parsing it, sorting it, matching it, and filtering it. Our furniture includes iPods and plasma displays, our skills include texting and Googling, we are endowed, we are expert, so we see information in the foreground. But it has always been there. It pervaded our ancestors’ world, too, taking forms from solid to ethereal, granite gravestones and the whispers of courtiers. The punched card, the cash register, the nineteenth-century Difference Engine, the wires of telegraphy all played their parts in weaving the spiderweb of information to which we cling. Each new information technology, in its own time, set off blooms in storage and transmission. From the printing press came new species of information organizers: dictionaries, cyclopaedias, almanacs—compendiums of words, classifiers of facts, trees of knowledge. Hardly any information technology goes obsolete. Each new one throws its predecessors into relief. Thus Thomas Hobbes, in the seventeenth century, resisted his era’s new-media hype: “The invention of printing, though ingenious, compared with the invention of letters is no great matter.” Up to a point, he was right. Every new medium transforms the nature of human thought. In the long run, history is the story of information becoming aware of itself.

The Information: A History, a Theory, a Flood is a fascinating read.


The Information: A History, A Theory, A Flood


“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning.”

— Claude Shannon (1948)

***

“When information is cheap, attention becomes expensive.” Information is something we are all curious about, but how accurately can we predict the future if we fail to understand the past? This is part of what noted science writer James Gleick explores in The Information: A History, a Theory, a Flood.

It is not the amount of knowledge that makes a brain. It is not even the distribution of knowledge. It is the interconnectedness.


The “history” explores African drum languages, writing and lexicography, the story of Morse code, and the telegraph and telephone, and brings us into computing with our desire to increase the efficiency with which we communicate language. The “theory” touches on Claude Shannon, Norbert Wiener, and Alan Turing, among others who laid the foundation. The “flood” explains how biology uses genetics as a mechanism for information exchange, and how memes self-replicate.

For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague — force, mass, motion, and even time — and gave them new meanings. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematicized it, giving energy its fundamental place in the physicists’ view of nature.

It was the same with information. A rite of purification became necessary.

And then, when it was made simple, distilled, counted in bits, information was found to be everywhere. Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. It led to compact discs and fax machines, computers and cyberspace, Moore’s law and all the world’s Silicon Alleys. Information processing was born, along with information storage and information retrieval. People began to name a successor to the Iron Age and the Steam Age. “Man the food-gatherer reappears incongruously as information-gatherer,” remarked Marshall McLuhan in 1967. He wrote this an instant too soon, in the first dawn of computation and cyberspace.

We can see now that information is what our world runs on: the blood and the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level — an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions. . . . If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.

In an interview with Publishers Weekly, Gleick answers the deceptively simple question: What is information?

My first inclination is to define information by listing all the forms it takes—words, music, visual images, and all the ways we store and transmit our knowledge of the world. But in 1948 engineers came up with a more technical definition. At its most fundamental, information is a binary choice. In other words, a single bit of information is one yes-or-no choice. This is a very powerful concept that has made a lot of modern technology possible. But as empowering as this definition is, it is also desiccating, because it strips away any notion of meaning, usefulness, knowledge, or wisdom. By the technical definition, all information has a certain value, regardless of whether the message it conveys is true or false. A message could be complete nonsense, for example, and still take 1,000 bits. So while the technical definition has helped us become powerful users of information, it also instantly put us on thin ice, because everything we care about involves meaning, truth, and, ultimately, something like wisdom. And as we now flood the world with information, it becomes harder and harder to find meaning. That paradox is the final tension in my book.
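To make the engineers' definition concrete, here is a minimal sketch (our illustration, not from the interview) showing that the bit count of a message depends only on how many equally likely alternatives it selects among, never on what it means:

```python
import math

def bits_needed(n_alternatives):
    """Bits required to select one message out of N equally likely ones."""
    return math.log2(n_alternatives)

print(bits_needed(2))     # a single yes-or-no choice -> 1.0 bit
print(bits_needed(1024))  # one message out of 1,024  -> 10.0 bits

# Meaning is irrelevant to the count: sense and nonsense drawn from the same
# 27-symbol alphabet (26 letters plus space) cost the same per character.
sentence = "information is a binary choice"
gibberish = "xq zvtkpw djrh sfml cbgyn aoei"
for text in (sentence, gibberish):
    print(f"{text!r}: about {len(text) * math.log2(27):.0f} bits")
```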

In the age of print, scarcity was the issue. In the digital age, it is abundance. What are the implications of that shift?

There are two keys to cope with the information flood: searching and filtering. Think about how many times you are having a conversation with a group of people, and the most interesting feature of the conversation is some dispute over something you can't quite remember. Now, any one of us has the power to pull out their iPhone and do a Google search—it's just a matter of who is going to be rude enough to do it first [laughs]. We are now like gods in our ability to search for and find information.

But where we remain all too mortal is in our ability to process it, to make sense of it, and to filter and find the information we want. That's where the real challenges lie. Take, for example, writing a nonfiction book. The tools at my disposal now compared to just 10 years ago are extraordinary. A sentence that once might have required a day of library work now might require no more than a few minutes on the Internet. That is a good thing. Information is everywhere, and facts are astoundingly accessible. But it's also a challenge, because authors today must pay more attention than ever to where we add value. And I can tell you this, the value we add is not in the few minutes of work it takes to dig up some factoid, because any reader can now dig up the same factoid in the same few minutes.

In The Information, Gleick neatly captures today's reality. “We know about streaming information, parsing it, sorting it, matching it, and filtering it. Our furniture includes iPods and plasma screens, our skills include texting and Googling, we are endowed, we are expert, so we see information in the foreground,” he writes. “But it has always been there.”

We have met the Devil of Information Overload and his impish underlings, the computer virus, the busy signal, the dead link, and the PowerPoint presentation.

Still curious? See The Filter Bubble.

We’re in the (bad) Habit of Associating Value with Scarcity

James Gleick, author of The Information: A History, A Theory, A Flood, says:

We’re in the habit of associating value with scarcity, but the digital world unlinks them. You can be the sole owner of a Jackson Pollock or a Blue Mauritius but not of a piece of information — not for long, anyway. Nor is obscurity a virtue. A hidden parchment page enters the light when it molts into a digital simulacrum. It was never the parchment that mattered.