Fooled By Randomness


I don’t want you to make the same mistake I did.

I waited too long before reading Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Taleb. He wrote it before The Black Swan and Antifragile, the books that propelled him into intellectual celebrity. Interestingly, Fooled by Randomness already contains semi-explored gems of the ideas that would later grow into those best-sellers.

Hindsight Bias

Part of the argument that Fooled by Randomness presents is that when we look back at things that have happened we see them as less random than they actually were.

It is as if there were two planets: the one in which we actually live and the one, considerably more deterministic, on which people are convinced we live. It is as simple as that: Past events will always look less random than they were (it is called the hindsight bias). I would listen to someone’s discussion of his own past realizing that much of what he was saying was just backfit explanations concocted ex post by his deluded mind.

The Courage of Montaigne

Writing on Montaigne as the role model for the modern thinker, Taleb also addresses his courage:

It certainly takes bravery to remain skeptical; it takes inordinate courage to introspect, to confront oneself, to accept one’s limitations— scientists are seeing more and more evidence that we are specifically designed by mother nature to fool ourselves.


Fooled by Randomness is about probability, not in a mathematical way but as skepticism.

In this book probability is principally a branch of applied skepticism, not an engineering discipline. …

Probability is not a mere computation of odds on the dice or more complicated variants; it is the acceptance of the lack of certainty in our knowledge and the development of methods for dealing with our ignorance. Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem or a brain teaser. Mother nature does not tell you how many holes there are on the roulette table, nor does she deliver problems in a textbook way (in the real world one has to guess the problem more than the solution).

“Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem,” which is fascinating given how we tend to solve problems. In decisions under uncertainty, I discussed how risk and uncertainty are different things, which creates two types of ignorance.

Most decisions are not risk-based, they are uncertainty-based and you either know you are ignorant or you have no idea you are ignorant. There is a big distinction between the two. Trust me, you’d rather know you are ignorant.

Randomness Disguised as Non-Randomness

The core of the book is about luck that we understand as skill or “randomness disguised as non-randomness (that is determinism).”

This problem manifests itself most frequently in the lucky fool, “defined as a person who benefited from a disproportionate share of luck but attributes his success to some other, generally very precise, reason.”

Such confusion crops up in the most unexpected areas, even science, though not in such an accentuated and obvious manner as it does in the world of business. It is endemic in politics, as it can be encountered in the shape of a country’s president discoursing on the jobs that “he” created, “his” recovery, and “his predecessor’s” inflation.

These lucky fools are often fragilistas — they have no idea they are lucky fools. For example:

[W]e often have the mistaken impression that a strategy is an excellent strategy, or an entrepreneur a person endowed with “vision,” or a trader a talented trader, only to realize that 99.9% of their past performance is attributable to chance, and chance alone. Ask a profitable investor to explain the reasons for his success; he will offer some deep and convincing interpretation of the results. Frequently, these delusions are intentional and deserve to bear the name “charlatanism.”

This does not mean that all success is luck or randomness. There is a difference between “it is more random than we think” and “it is all random.”

Let me make it clear here: Of course chance favors the prepared! Hard work, showing up on time, wearing a clean (preferably white) shirt, using deodorant, and some such conventional things contribute to success— they are certainly necessary but may be insufficient as they do not cause success. The same applies to the conventional values of persistence, doggedness and perseverance: necessary, very necessary. One needs to go out and buy a lottery ticket in order to win. Does it mean that the work involved in the trip to the store caused the winning? Of course skills count, but they do count less in highly random environments than they do in dentistry.

No, I am not saying that what your grandmother told you about the value of work ethics is wrong! Furthermore, as most successes are caused by very few “windows of opportunity,” failing to grab one can be deadly for one’s career. Take your luck!

That last paragraph connects to something Charlie Munger once said: “Really good investment opportunities aren’t going to come along too often and won’t last too long, so you’ve got to be ready to act. Have a prepared mind.”

Taleb thinks of success in terms of degrees, so mild success might be explained by skill and labour, but outrageous success “is attributable to variance.”

Luck Makes You Fragile

One thing Taleb hits on that really stuck with me is that “that which came with the help of luck could be taken away by luck (and often rapidly and unexpectedly at that). The flipside, which deserves to be considered as well (in fact it is even more of our concern), is that things that come with little help from luck are more resistant to randomness.” How Antifragile.

Taleb argues this is the problem of induction: “it does not matter how frequently something succeeds if failure is too costly to bear.”

Noise and Signal

We are confused between noise and signal.

…the literary mind can be intentionally prone to the confusion between noise and meaning, that is, between a randomly constructed arrangement and a precisely intended message. However, this causes little harm; few claim that art is a tool of investigation of the Truth— rather than an attempt to escape it or make it more palatable. Symbolism is the child of our inability and unwillingness to accept randomness; we give meaning to all manner of shapes; we detect human figures in inkblots.

All my life I have suffered the conflict between my love of literature and poetry and my profound allergy to most teachers of literature and “critics.” The French thinker and poet Paul Valery was surprised to listen to a commentary of his poems that found meanings that had until then escaped him (of course, it was pointed out to him that these were intended by his subconscious).

If we’re concerned about situations where randomness is confused with non-randomness, should we also be concerned about situations where non-randomness is mistaken for randomness, which would result in signal being ignored?

First, I am not overly worried about the existence of undetected patterns. We have been reading lengthy and complex messages in just about any manifestation of nature that presents jaggedness (such as the palm of a hand, the residues at the bottom of Turkish coffee cups, etc.). Armed with home supercomputers and chained processors, and helped by complexity and “chaos” theories, the scientists, semiscientists, and pseudoscientists will be able to find portents. Second, we need to take into account the costs of mistakes; in my opinion, mistaking the right column for the left one is not as costly as an error in the opposite direction. Even popular opinion warns that bad information is worse than no information at all.

If you haven’t yet, pick up a copy of Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. Don’t make the same mistake I did and wait to read this important book.


The Decision-Maker: A Tool For a Lifetime

Seymour Schulich

The first chapter in Seymour Schulich’s book, Get Smarter: Life and Business Lessons, offers a decision tool that adds to the simple pro-and-con list that many of us have used to make decisions. Schulich, a self-made billionaire, is one of Canada’s richest and best-known businessmen.

I learned this tool in a practical mathematics course more than fifty years ago and have used it for virtually every major decision of my adult life. It has never let me down and it will serve you well, too.

You all know the simple pro-and-con list? The one where you divide the page in two and simply list out all the pros and cons. Well, the Decision-Maker adds a twist to that. Here’s how it works.

On one sheet of paper, list all the positive things you can about the issue in question, then give each one a score from zero to ten—the higher the score, the more important it is to you.

On another sheet, list the negative points, and score them from zero to ten—only this time, ten means it’s a major drawback. Suppose you are thinking of buying a house, and you tour one that’s in your price range, except the owners have painted every room to look like a giant banana. If you really hate yellow and can’t stand the thought of lifting a paint brush, you might give “ugly yellow house” a ten, and if it’s not that big a deal, maybe a two or a three.

Now add up the scores. But here’s the rule.

If the positive score is at least double the negative score, you should do it—whatever “it” is. But if the positives don’t outweigh the negatives by that two-to-one ratio, don’t do it, or at least think twice about it.

Yes, that sounds simple. I agree. But I also don’t think that things need to be complicated in order to be effective.

The Decision-Maker is designed not to allow one or two factors to sway a major life decision in a disproportionate way. It forces you to strip away the emotion and really examine the relative importance of each point—which, of course, is why it works so well.
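Schulich's rule is simple enough to sketch in a few lines of Python. The item names and scores below are made up purely for illustration; the only part taken from the book is the scoring scheme and the two-to-one threshold:

```python
def decision_maker(pros, cons):
    """Schulich's Decision-Maker: each pro and con is scored 0-10.

    Returns "do it" only when the positive total is at least
    double the negative total; otherwise, think twice.
    """
    positive = sum(pros.values())
    negative = sum(cons.values())
    if positive >= 2 * negative:
        return "do it"
    return "don't do it (or think twice)"


# Hypothetical house purchase; scores are illustrative only.
pros = {"in price range": 8, "good location": 7, "big garden": 5}
cons = {"ugly yellow house": 3, "long commute": 6}
print(decision_maker(pros, cons))  # pros = 20 vs cons = 9 -> "do it"
```

Note that the scores encode relative importance, not just counts: one ten-point drawback can outweigh several minor positives, which is exactly the disproportionate sway the tool is meant to surface.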

This tool works for groups too.

When we were considering whether to sell our royalty company, Franco-Nevada, to Newmont Mining, Franco’s executive team produced a collective Decision-Maker. We listed all the pros and cons, then the top four executives assigned their own point scores to each. We averaged them, the positives far outweighed the negatives, and we sold the company.

The Difference Between Seeing and Observing

In A Scandal in Bohemia, Sherlock Holmes teaches Watson the difference between seeing and observing:

“When I hear you give your reasons,” I remarked, “the thing always appears to me to be so ridiculously simple that I could easily do it myself, though at each successive instance of your reasoning, I am baffled until you explain your process. And yet I believe that my eyes are as good as yours.”

“Quite so,” he answered, lighting a cigarette, and throwing himself down into an armchair. “You see, but you do not observe. The distinction is clear. For example, you have frequently seen the steps which lead up from the hall to this room.”

“Frequently.”

“How often?”

“Well, some hundreds of times.”

“Then how many are there?”

“How many? I don’t know.”

“Quite so! You have not observed. And yet you have seen. That is just my point. Now, I know that there are seventeen steps, because I have both seen and observed.”

The difference between seeing and observing is fundamental to many aspects of life. Indeed, we can learn a lot from how Sherlock Holmes thinks. Even Nassim Taleb has chimed in on noticing, with his discussion of noise and signal.

In the video below, Harvard Business School Professor Max Bazerman, author of The Power of Noticing: What the Best Leaders See, discusses how important it is not just to be able to focus, but to be a good noticer as well. What he’s really talking about is observation.

A number of years ago I had an opportunity to notice and I failed to do so and it’s been an obsession with me ever since. On March 10, 2005 I was hired by the U.S. Department of Justice in a landmark case that they were fighting against the tobacco industry. I was hired as a remedy witness. That is, I was hired to provide recommendations to the court about what the penalty would be if, in fact, the Department of Justice succeeded in its trial against the tobacco industry. I had spent a couple hundred hours working for the Department of Justice including submitting my written direct testimony which had been submitted to the court.

I was scheduled to be on the stand on May 4 where the tobacco industry attorneys would be asking me a series of questions. On April 30, a number of days before my May 4 testimony I was in Washington D.C. to meet with the Department of Justice attorneys to prepare for my time on the stand. When the day started the Department of Justice attorney that I had been working with said to me, “Professor Bazerman.” This occurred long after he had learned to call me Max. He said, “Professor Bazerman, the Department of Justice requests that you amend your testimony to note that the testimony would not be relevant if any of the following four conditions existed.”

He then read to me four legal conditions that I didn’t understand. When he was done talking I said to him, “Why would you ask me to amend my testimony when you know that I didn’t understand what you just said to me.” And his response was because if you don’t agree, there’s a significant chance that senior leadership in the Department of Justice will remove you from the case before you are on the stand on May 4. To which I said, “Okay, I don’t agree to those changes.” And his response was, “Good. Let’s continue with your preparation.” I was jarred by the fact that something very strange had occurred. But I was overwhelmed in life. I was trying to help this case and I didn’t quite know what had occurred. But to this day I’m critical of the fact that I took no action. I did appear on trial on May 4 and the trial ended in early June.

But on June 17 I woke up in a hotel room in London. I was working with another client at the time. And I woke up early at 5:00 a.m. and I opened up The New York Times web edition and I read a story about Matt Myers, the president of Tobacco Free Kids, who had come forward to the media with evidence about how Robert McCallum, the number two official in the Department of Justice, was involved in attempting to get him to change his testimony. And I then read basically the same account that I had experienced back on April 30. Matt Myers had the insight to know that he should do something with this information about what had occurred in terms of the attempt to tamper with this testimony. And at that point it was straightforward to come forward to the media to speak to congressional representatives about what happened. And my own role received media attention as well.

But to this day I’m still struck by the fact that I didn’t come forward on April 30 when, in the back of my mind I knew something had occurred. The reason I tell you this story is because I think a lot of our failure to notice happens when we’re busy. It happens when we don’t know exactly what’s happening. But I think it’s our job as executives, as leaders, as professionals to act when we’re pretty sure that there’s something wrong. It’s our job to notice and to not simply be passive when we can’t quite figure out the evidence. It’s our job to take steps to figure out what’s going on and to act on critical information.


Follow your curiosity and learn about why your decision making environment matters.

The Relationship Between Design and Planning


While I’m not all that interested in military doctrine and tactics in and of themselves, I am interested in complex systems, how the weak win wars, and the lessons military leaders offer (for example, see the lessons of William McRaven and Stanley McChrystal).

This is how I found myself flipping through The U.S. Army / Marine Corps Counterinsurgency Field Manual, which was written to facilitate a common understanding of the problems inherent in counterinsurgency campaigns.

There was a fascinating section on the difference between designing and planning that caused me to pause and reflect.

While both activities seek to formulate ways to bring about preferable futures, they are cognitively different. Planning applies established procedures to solve a largely understood problem within an accepted framework. Design inquires into the nature of a problem to conceive a framework for solving that problem. In general, planning is problem solving, while design is problem setting. Where planning focuses on generating a plan—a series of executable actions—design focuses on learning about the nature of an unfamiliar problem.

When situations do not conform to established frames of reference — when the hardest part of the problem is figuring out what the problem is—planning alone is inadequate and design becomes essential. In these situations, absent a design process to engage the problem’s essential nature, planners default to doctrinal norms; they develop plans based on the familiar rather than an understanding of the real situation. Design provides a means to conceptualize and hypothesize about the underlying causes and dynamics that explain an unfamiliar problem. Design provides a means to gain understanding of a complex problem and insights towards achieving a workable solution.

To better understand the multifaceted problems many of us face today it helps to talk with people who have different perspectives. This helps achieve better situational understanding. At best this can point the way to solutions and at worst this should help with learning what to avoid.

Often we skip the information-gathering phase because it’s a lot of work and a lot of conversations. However, this process helps us become informed, rather than just opinionated.

The underlying premise is this: when participants achieve a level of understanding such that the situation no longer appears complex, they can exercise logic and intuition effectively. As a result, design focuses on framing the problem rather than developing courses of action.

Just as you can never step in the same river twice, design is not something you do once and walk away from. It’s an ongoing inquiry into the nature of problems and the various factors and relationships that help improve understanding. Constantly assessing the situation from a design perspective helps gauge the effectiveness of the planning and subsequent actions. If you don’t periodically reassess the situation, you might be solving a problem that no longer exists.

The U.S. Army / Marine Corps Counterinsurgency Field Manual is full of other thought-provoking content.


How We Can Improve Our Decisions

I gave a keynote speech at the Pender Investment Conference in early November. This was a great venue to talk about some of the things I’ll be discussing in more detail at Re:Think Decision Making this February in San Francisco.

One of the most common things people ask me is how we can improve our decisions. Better decisions are a function of two things that sometimes conflict: making fewer mistakes and having better insights.

Pender was kind enough to put together a summary of the speech that I wanted to share with you.

Relevant Thought Processes


When Charlie Munger was asked the secret to his success, he responded “I’m rational.” But what does it mean to be rational?

In investing as in life, there is perhaps nothing more useful than developing thought processes that help determine what is true and what is best to do, and that lead to rational action and belief.

In order to take actions that fulfill our goals, we must base those actions on beliefs that are properly calibrated to the world. We all want our beliefs to align with the way the world actually works and we also want to maximize the chances of achieving our goals, based on the resources at our disposal.


We tend to think of emotion as the enemy of rational behavior and that removing emotional biases from our decision making process will make us more rational. However, while emotions can impede rational decisions, they can also facilitate them. Emotions serve a valuable role because they help us focus.

“… emotions stop the combinatorial explosion of possibilities that would occur if an intelligent system tried to calculate the utility of all possible future outcomes. … In short, emotions get us in the right ballpark of the correct response.” — Keith Stanovich


But we often have patterns of behavior that cause us to fall short on rational thinking.

One way of looking at this is that our brains have two ways of reacting. One is an automatic mode. This is when our minds, without thinking, have an immediate reaction to a situation with little or no conscious effort.

This is how we evolved. This is our instinct. I don’t have to think very hard when I see a lion coming at me. I quickly respond to the situation in order to maximize my odds of safety. Academics call this System One thinking.

System Two thinking is when we allocate our scarce resources of attention and effort to mental activities in order to (hopefully) make better decisions. We must pay attention or we are likely to end up behaving in a way that is not rational. That is, in a way that fails to acknowledge how the world works, or that fails to use the tools at our disposal in the best way possible to achieve our aim.

If we accept the premises that mental resources are scarce and that decisions require them, it stands to reason that, when searching for a reason to do something, our brains shut off as soon as we see an idea that seems to fit. We stop thinking because thinking is resource intensive and our bodies are designed to conserve resources.

So this has some interesting implications for decision making. Rather than trying to eliminate these innate biases, a better way is to develop mental models.

Decision Environments

Most of us make decisions in an environment where it is very hard for us to behave rationally.

I’m hard-pressed to imagine an environment less conducive to rational decision making than that of the modern office worker; it is hard to make rational decisions under that kind of pressure.

Now picture Warren Buffett sitting at his desk with his feet up and reading. Look at his empty calendar. He has no meetings and he doesn’t even have a computer in his office. He’s in Omaha for a reason.

“It’s very easy to think clearly here. You’re undisturbed by irrelevant factors and the noise generally of business investments. If you can’t think clearly in Omaha, you’re not going to think clearly anyplace.” — Warren Buffett

Buffett is smart enough to structure his environment to reduce decisions and interruptions and to encourage focus.

Our routines influence our decisions as well. One of the objectives of good decision-making is to ensure that decisions are not influenced by irrelevant information.

“The most successful people don’t have super-strong willpower when making decisions. Rather, they conserve their willpower by developing habits and routines, so they reduce the amount of stress in their lives. … The more choices you make, the harder they become. To save energy your brain starts to look for shortcuts. One shortcut is to be reckless and act impulsively (rather than rationally). The other shortcut is to do nothing, which saves as much energy as possible (and often creates bigger problems in the long run).” — Roy Baumeister

If you agree with that, then it impacts how you organize your day. You may decide to meet clients in the morning because you know that later in the day, they (and you) are likely to be at a point of “decision fatigue”.

Circle of Competence

“You don’t have to be an expert on every company, or even many. You only have to be able to evaluate companies within your circle of competence. The size of that circle is not very important; knowing its boundaries, however, is vital.” — Warren Buffett

It’s a simple concept: Each of us, through experience or study, has built up useful knowledge on certain areas of the market. We do not necessarily need to understand the more esoteric areas to invest capital. Far more important is to honestly define what we do know and to stick to those areas. The circle can be widened, but only slowly and over time. Mistakes are most often made when straying from this discipline.

If you cannot understand the variables that govern a given situation you are probably outside of your circle of competence. Ideally you’d pass this decision off to someone for whom this is inside their circle of competence. But what can you do if you are outside your circle of competence and have to make a decision quickly?


Carl Gustav Jacob Jacobi, the German mathematician, said, “Invert, always invert,” recommending that “many hard problems are best solved when they are addressed backward.”

This model is one of the most powerful thinking habits we can adopt. “Indeed,” says Charlie Munger, “many problems can’t be solved forward.”

Think about what makes life good. Now invert the process and think about what would make life bad. Knowing what would make life bad gives you a shortlist of what to sidestep. Both thinking forward and thinking backward can result in action. However, despite your best intentions, thinking forward can in fact increase the odds that you’ll cause harm, while thinking backward is less likely to do so – call it the avoiding-stupidity filter.

Decision Journals

As participants in the financial industry, offering guidance, advice, and opinion to clients, your product is decisions. So you should care enormously whether you’re providing good advice or bad advice, and continuously look for opportunities to learn.

The key to providing good advice is to have a good process. Ultimately, the process matters more than the outcome because the outcome doesn’t always tell you if you made a good decision or a poor one.

If you have a good process and a good outcome that’s deserved success. If, however, you have a good process and a bad outcome, that’s what I call a bad break. If, on the other hand, you have a bad process and a good outcome, that’s just dumb luck. And finally a bad process with a bad outcome is really just poetic justice.
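The four combinations above form a simple two-by-two matrix, which can be written out directly (the function name is my own; the four labels come straight from the text):

```python
def judge(process_good, outcome_good):
    """Label a decision by its process and its outcome."""
    if process_good and outcome_good:
        return "deserved success"
    if process_good and not outcome_good:
        return "bad break"
    if not process_good and outcome_good:
        return "dumb luck"
    return "poetic justice"


print(judge(process_good=False, outcome_good=True))  # dumb luck
```

The point of spelling it out is that only one of the four cells tells you anything reliable about skill: a good outcome alone is consistent with both a good process and dumb luck.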

When asked how we can improve decisions, Daniel Kahneman, one of the preeminent psychologists in the world and a Nobel laureate in economics, said without hesitation: buy a very cheap notebook and start keeping track of your decisions. A decision journal.

The basic idea is that whenever you’re making a decision of consequence you should write down what you decided and why. Perhaps you want to do this with your clients, ask them these questions as they are making an investment. It could facilitate great future discussions, regardless of whether the investment is a success or failure.

Conceptually, this is pretty easy but it requires some discipline and humility to implement and maintain.
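Kahneman's suggestion is literally just a notebook recording what you decided and why, but a structured entry makes the later review easier. The extra fields below (expected outcome, a scheduled review date) are my own additions for illustration, not part of his recommendation:

```python
import datetime


def journal_entry(decision, reasoning, expected_outcome, review_in_days=180):
    """A minimal decision-journal entry.

    Records what was decided and why at the time of the decision, plus a
    date on which to come back and compare the outcome against the
    expectation. All fields beyond decision/reasoning are illustrative.
    """
    today = datetime.date.today()
    return {
        "date": today.isoformat(),
        "decision": decision,
        "reasoning": reasoning,
        "expected_outcome": expected_outcome,
        "review_on": (today + datetime.timedelta(days=review_in_days)).isoformat(),
    }


# Hypothetical example entry.
entry = journal_entry(
    decision="Shift 10% of the portfolio into bonds",
    reasoning="Equity valuations look stretched relative to history",
    expected_outcome="Lower volatility with a modest drag on returns",
)
```

Writing the expectation down before the outcome is known is what protects the journal from hindsight bias: you can no longer "backfit" an explanation after the fact.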


The long and short of it is: In your decision making, spend less time trying to be brilliant and more time trying to avoid obvious stupidity. Avoiding stupidity is easier than seeking brilliance.

The History of Cognitive Overload


The Organized Mind: Thinking Straight in the Age of Information Overload, a book by Daniel Levitin, has an interesting section on cognitive overload.

Each day we are confronted with hundreds, probably thousands, of decisions, most of which are insignificant, unimportant, or both. Do we really need a whole aisle for toothpaste?

In response to all of these decisions most of us adopt a strategy of satisficing, a term coined by Nobel Prize winner Herbert Simon to describe something that is perhaps not the best but good enough. For things that don’t matter, this is a good approach. You don’t know which pizza place is the best but you know which ones are good enough.

Satisficing is one of the foundations of productive human behavior; it prevails when we don’t waste time on decisions that don’t matter, or more accurately, when we don’t waste time trying to find improvements that are not going to make a significant difference in our happiness or satisfaction.

All of us, Levitin argues, engage in satisficing every time we clean our homes.

If we got down on the floor with a toothbrush every day to clean the grout, if we scrubbed the windows and walls every single day, the house would be spotless. But few of us go to this much trouble even on a weekly basis (and when we do, we’re likely to be labeled obsessive-compulsive). For most of us, we clean our houses until they are clean enough, reaching a kind of equilibrium between effort and benefit. It is this cost-benefits analysis that is at the heart of satisficing.

The easiest way to be happy is to want what you already have. “Happy people engage in satisficing all the time, even if they don’t know it.”

Satisficing is a tool that allows you not to waste time on things that don’t really matter. Who cares if you pick Colgate or Crest? For other decisions, “the old-fashioned pursuit of excellence remains the right strategy.”

We now spend an unusual amount of time and energy ignoring and filtering. Consider the supermarket.

In 1976, the average supermarket stocked 9,000 unique products; today that number has ballooned to 40,000 of them, yet the average person gets 80%– 85% of their needs in only 150 different supermarket items. That means that we need to ignore 39,850 items in the store.

This comes with a cost.

Neuroscientists have discovered that unproductivity and loss of drive can result from decision overload. Although most of us have no trouble ranking the importance of decisions if asked to do so, our brains don’t automatically do this.

We can make only so many decisions in a day. Once we’ve hit that limit, it doesn’t matter how important the remaining ones are.

The decision-making network in our brain doesn’t prioritize.

Our world has exploded. Information is abundant. I didn’t think we could process it all, but Levitin argues that we can, at a cost.

We can have trouble separating the trivial from the important, and all this information processing makes us tired. Neurons are living cells with a metabolism; they need oxygen and glucose to survive and when they’ve been working hard, we experience fatigue. Every status update you read on Facebook, every tweet or text message you get from a friend, is competing for resources in your brain with important things like whether to put your savings in stocks or bonds, where you left your passport, or how best to reconcile with a close friend you just had an argument with.

The processing capacity of the conscious mind has been estimated at 120 bits per second. That bandwidth, or window, is the speed limit for the traffic of information we can pay conscious attention to at any one time. While a great deal occurs below the threshold of our awareness, and this has an impact on how we feel and what our life is going to be like, in order for something to become encoded as part of your experience, you need to have paid conscious attention to it.

What does this mean?

In order to understand one person speaking to us, we need to process 60 bits of information per second. With a processing limit of 120 bits per second, this means you can barely understand two people talking to you at the same time. Under most circumstances, you will not be able to understand three people talking at the same time. …

With such attentional restrictions, it’s clear why many of us feel overwhelmed by managing some of the most basic aspects of life. Part of the reason is that our brains evolved to help us deal with life during the hunter-gatherer phase of human history, a time when we might encounter no more than a thousand people across the entire span of our lifetime. Walking around midtown Manhattan, you’ll pass that number of people in half an hour.
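The arithmetic in the passage above is worth making explicit. Using Levitin's figures of roughly 120 bits per second of conscious bandwidth and roughly 60 bits per second to follow a single speaker:

```python
# Levitin's estimates: conscious processing is capped at about
# 120 bits/second, and following one speaker consumes about 60.
PROCESSING_LIMIT = 120   # bits per second
BITS_PER_SPEAKER = 60    # bits per second

# The ceiling on simultaneous conversations we can follow.
max_speakers = PROCESSING_LIMIT // BITS_PER_SPEAKER
print(max_speakers)  # 2
```

Two speakers saturate the channel entirely, which is why a third voice in the room turns into noise rather than a message.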

Attention is the most essential mental resource for any organism. It determines which aspects of the environment we deal with, and most of the time, various automatic, subconscious processes make the correct choice about what gets passed through to our conscious awareness. For this to happen, millions of neurons are constantly monitoring the environment to select the most important things for us to focus on. These neurons are collectively the attentional filter. They work largely in the background, outside of our conscious awareness. This is why most of the perceptual detritus of our daily lives doesn’t register, or why, when you’ve been driving on the freeway for several hours at a stretch, you don’t remember much of the scenery that has whizzed by: Your attentional system “protects” you from registering it because it isn’t deemed important. This unconscious filter follows certain principles about what it will let through to your conscious awareness.

The attentional filter is one of evolution’s greatest achievements. In nonhumans, it ensures that they don’t get distracted by irrelevancies. Squirrels are interested in nuts and predators, and not much else. Dogs, whose olfactory sense is one million times more sensitive than ours, use smell to gather information about the world more than they use sound, and their attentional filter has evolved to make that so. If you’ve ever tried to call your dog while he is smelling something interesting, you know that it is very difficult to grab his attention with sound— smell trumps sound in the dog brain. No one has yet worked out all of the hierarchies and trumping factors in the human attentional filter, but we’ve learned a great deal about it. When our protohuman ancestors left the cover of the trees to seek new sources of food, they simultaneously opened up a vast range of new possibilities for nourishment and exposed themselves to a wide range of new predators. Being alert and vigilant to threatening sounds and visual cues is what allowed them to survive; this meant allowing an increasing amount of information through the attentional filter.

Levitin points out an interesting fact about how highly successful people (HSPs) differ from the rest of us when it comes to attentional filters.

Successful people— or people who can afford it— employ layers of people whose job it is to narrow the attentional filter. That is, corporate heads, political leaders, spoiled movie stars, and others whose time and attention are especially valuable have a staff of people around them who are effectively extensions of their own brains, replicating and refining the functions of the prefrontal cortex’s attentional filter.

These highly successful persons have many of the daily distractions of life handled for them, allowing them to devote all of their attention to whatever is immediately before them. They seem to live completely in the moment. Their staff handle correspondence, make appointments, interrupt those appointments when a more important one is waiting, and help to plan their days for maximum efficiency (including naps!). Their bills are paid on time, their car is serviced when required, they’re given reminders of projects due, and their assistants send suitable gifts to the HSP’s loved ones on birthdays and anniversaries. Their ultimate prize if it all works? A Zen-like focus.

Levitin argues that if we organize our minds and our lives “following the new neuroscience of attention and memory, we can all deal with the world in ways that provide the sense of freedom that these highly successful people enjoy.”

To do that, however, we need to understand the architecture of our attentional system. “To better organize our mind, we need to know how it has organized itself.”

Change and importance are two crucial principles used by our attentional filter.

The brain’s change detector is at work all the time, whether you know it or not. If a close friend or relative calls on the phone, you might detect that her voice sounds different and ask if she’s congested or sick with the flu. When your brain detects the change, this information is sent to your consciousness, but your brain doesn’t explicitly send a message when there is no change. If your friend calls and her voice sounds normal, you don’t immediately think, “Oh, her voice is the same as always.” Again, this is the attentional filter doing its job, detecting change, not constancy.

Importance can also filter information, but it’s not objective or absolute importance; it’s something personal and relevant to you.

If you’re driving, a billboard for your favorite music group might catch your eye (really, we should say catch your mind) while other billboards go ignored. If you’re in a crowded room, at a party for instance, certain words to which you attach high importance might suddenly catch your attention, even if spoken from across the room. If someone says “fire” or “sex” or your own name, you’ll find that you’re suddenly following a conversation far away from where you’re standing, with no awareness of what those people were talking about before your attention was captured.
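As a toy analogy (mine, not anything the book specifies), the two principles could be caricatured as a filter that passes a stimulus only when it has changed from baseline or matches a personally important cue; the labels and cues below are invented for illustration:

```python
# Hypothetical caricature of the attentional filter's two principles,
# change and importance. Labels and cues are invented for illustration.
def attentional_filter(stimuli, baseline, important_cues):
    """Yield only stimuli that changed from baseline or contain a
    personally important cue; the rest never reach awareness."""
    for label, value in stimuli:
        changed = label in baseline and baseline[label] != value
        important = any(cue in value for cue in important_cues)
        if changed or important:
            yield label, value

baseline = {"friend's voice": "normal"}
stimuli = [
    ("friend's voice", "congested"),      # changed -> noticed
    ("billboard", "your favorite band"),  # important -> noticed
    ("billboard", "insurance ad"),        # neither -> filtered out
]
noticed = list(attentional_filter(stimuli, baseline, ["favorite", "fire"]))
print(noticed)  # only the first two stimuli get through
```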

The attentional filter lets us live on autopilot most of the time, pulling us out of it only when we need to. In so doing, we “do not register the complexities, nuances, and often the beauty of what is right in front of us.”

A great number of failures of attention occur because we are not using these two principles to our advantage.

Simply put, attention is limited.

A critical point that bears repeating is that attention is a limited-capacity resource— there are definite limits to the number of things we can attend to at once. We see this in everyday activities. If you’re driving, under most circumstances, you can play the radio or carry on a conversation with someone else in the car. But if you’re looking for a particular street to turn onto, you instinctively turn down the radio or ask your friend to hang on for a moment, to stop talking. This is because you’ve reached the limits of your attention in trying to do these three things. The limits show up whenever we try to do too many things at once.

Our brain hides things from us.

The human brain has evolved to hide from us those things we are not paying attention to. In other words, we often have a cognitive blind spot: We don’t know what we’re missing because our brain can completely ignore things that are not its priority at the moment— even if they are right in front of our eyes. Cognitive psychologists have called this blind spot various names, including inattentional blindness.

One of the most famous demonstrations of this is the basketball video (for more, see The Invisible Gorilla: How Our Intuitions Deceive Us).

A lot of instances of losing things like car keys, passports, money, receipts, and so on occur because our attentional systems are overloaded and they simply can’t keep track of everything. The average American owns thousands of times more possessions than the average hunter-gatherer. In a real biological sense, we have more things to keep track of than our brains were designed to handle. Even towering intellectuals such as Kant and Wordsworth complained of information excess and sheer mental exhaustion induced by too much sensory input or mental overload.

But we need not fear this cognitive overload, Levitin argues. “More than ever, effective external systems are available for organizing, categorizing, and keeping track of things.”

Information Overload, Then and Now

We’ve been around a long time. For most of that time we didn’t do much of anything other than “procreate and survive.” Then we discovered farming and irrigation and gave up our fairly nomadic lifestyle. Farming allowed us to specialize. I could grow potatoes and you could grow tomatoes and we could trade. This created a dependency on each other and markets for trading. All of this trading, in turn, required an accounting system to keep tabs on inventory and trades. This was the birthplace of writing.

With the growth of trade, cities, and writing, people soon discovered architecture, government, and the other refinements of being that collectively add up to what we think of as civilization. The appearance of writing some 5,000 years ago was not met with unbridled enthusiasm; many contemporaries saw it as technology gone too far, a demonic invention that would rot the mind and needed to be stopped. Then, as now, printed words were promiscuous— it was impossible to control where they went or who would receive them, and they could circulate easily without the author’s knowledge or control. Lacking the opportunity to hear information directly from a speaker’s mouth, the antiwriting contingent complained that it would be impossible to verify the accuracy of the writer’s claims, or to ask follow-up questions. Plato was among those who voiced these fears; his King Thamus decried that the dependence on written words would “weaken men’s characters and create forgetfulness in their souls.” Such externalization of facts and stories meant people would no longer need to mentally retain large quantities of information themselves and would come to rely on stories and facts as conveyed, in written form, by others. Thamus, king of Egypt, argued that the written word would infect the Egyptian people with fake knowledge. The Greek poet Callimachus said books are “a great evil.” The Roman philosopher Seneca the Younger (tutor to Nero) complained that his peers were wasting time and money accumulating too many books, admonishing that “the abundance of books is a distraction.” Instead, Seneca recommended focusing on a limited number of good books, to be read thoroughly and repeatedly. Too much information could be harmful to your mental health.

Cue the printing press, which allowed for the rapid copying of books. This further complicated intellectual life.

The printing press was introduced in the mid-1400s, allowing for the more rapid proliferation of writing, replacing laborious (and error-prone) hand copying. Yet again, many complained that intellectual life as we knew it was done for. Erasmus, in 1525, went on a tirade against the “swarms of new books,” which he considered a serious impediment to learning. He blamed printers whose profit motive sought to fill the world with books that were “foolish, ignorant, malignant, libelous, mad, impious and subversive.” Leibniz complained about “that horrible mass of books that keeps on growing” and that would ultimately end in nothing less than a “return to barbarism.” Descartes famously recommended ignoring the accumulated stock of texts and instead relying on one’s own observations. Presaging what many say today, Descartes complained that “even if all knowledge could be found in books, where it is mixed in with so many useless things and confusingly heaped in such large volumes, it would take longer to read those books than we have to live in this life and more effort to select the useful things than to find them oneself.”

A steady flow of complaints about the proliferation of books reverberated into the late 1600s. Intellectuals warned that people would stop talking to each other, burying themselves in books, polluting their minds with useless, fatuous ideas.

There is an argument that this generation is at the same crossroads — our Gutenberg moment.

iPhones and iPads, email, and Twitter are the new revolution.

Each was decried as an addiction, an unnecessary distraction, a sign of weak character, feeding an inability to engage with real people and the real-time exchange of ideas.

The industrial revolution brought with it a rapid rise in discovery and advancement. Scientific information increased at a staggering clip.

Today, someone with a PhD in biology can’t even know all that is known about the nervous system of the squid! Google Scholar reports 30,000 research articles on that topic, with the number increasing exponentially. By the time you read this, the number will have increased by at least 3,000. The amount of scientific information we’ve discovered in the last twenty years is more than all the discoveries up to that point, from the beginning of language.

This is taxing all of us as we filter what we need to know from what we don’t. This ties in nicely with Tyler Cowen’s argument that the future of work is changing and we will need to add value to computers.

To cope with information overload, we create to-do lists and email ourselves reminders. I have lists of lists. Right now there are over 800 unread emails in my inbox. Many of these are reminders to myself to look into something or to do something, links that I need to go back and read, or books I want to add to my wishlist. I see those emails and think, yes, I want to do that, but not right now. So they sit in my inbox. Occasionally I’ll create a to-do list, which starts off with the best intentions and rapidly becomes a brain dump. Eventually I remember the 18-minute plan for managing your day and I refocus, scheduling time for the most important things. No matter what I do, I always feel like I’m on the border between order and chaos.

A large part of this feeling of being overwhelmed can be traced back to our evolutionarily outdated attentional system. I mentioned earlier the two principles of the attentional filter: change and importance. There is a third principle of attention— not specific to the attentional filter— that is relevant now more than ever. It has to do with the difficulty of attentional switching. We can state the principle this way: Switching attention comes with a high cost.

Our brains evolved to focus on one thing at a time. This enabled our ancestors to hunt animals, to create and fashion tools, to protect their clan from predators and invading neighbors. The attentional filter evolved to help us to stay on task, letting through only information that was important enough to deserve disrupting our train of thought. But a funny thing happened on the way to the twenty-first century: The plethora of information and the technologies that serve it changed the way we use our brains. Multitasking is the enemy of a focused attentional system. Increasingly, we demand that our attentional system try to focus on several things at once, something that it was not evolved to do. We talk on the phone while we’re driving, listening to the radio, looking for a parking place, planning our mom’s birthday party, trying to avoid the road construction signs, and thinking about what’s for lunch. We can’t truly think about or attend to all these things at once, so our brains flit from one to the other, each time with a neurobiological switching cost. The system does not function well that way. Once on a task, our brains function best if we stick to that task.

When you pay attention to something it means you don’t see something else. David Foster Wallace hit upon this in his speech, The Truth With A Whole Lot Of Rhetorical Bullshit Pared Away. He said:

Learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience. Because if you cannot exercise this kind of choice in adult life, you will be totally hosed. Think of the old cliché about the mind being an excellent servant but a terrible master. This, like many clichés, so lame and unexciting on the surface, actually expresses a great and terrible truth.

And Winifred Gallagher, author of the book Rapt: Attention and the Focused Life, wrote:

That your experience largely depends on the material objects and mental subjects that you choose to pay attention to or ignore is not an imaginative notion, but a physiological fact. When you focus on a stop sign or a sonnet, a waft of perfume or a stock-market tip, your brain registers that “target,” which enables it to affect your behavior. In contrast, the things that you don’t attend to in a sense don’t exist, at least for you.

All day long, you are selectively paying attention to something, and much more often than you may suspect, you can take charge of this process to good effect. Indeed, your ability to focus on this and suppress that is the key to controlling your experience and, ultimately, your well-being.

When you walk into the front door of your house after a long day of work to screaming kids and a ringing phone you’re not thinking about where you left your car keys.

Attention is created by networks of neurons in the prefrontal cortex (just behind your forehead) that are sensitive only to dopamine. When dopamine is released, it unlocks them, like a key in your front door, and they start firing tiny electrical impulses that stimulate other neurons in their network. But what causes that initial release of dopamine? Typically, one of two different triggers:

1. Something can grab your attention automatically, usually something that is salient to your survival, with evolutionary origins. This vigilance system incorporating the attentional filter is always at work, even when you’re asleep, monitoring the environment for important events. This can be a loud sound or bright light (the startle reflex), something moving quickly (that might indicate a predator), a beverage when you’re thirsty, or an attractively shaped potential sexual partner.

2. You effectively will yourself to focus only on that which is relevant to a search or scan of the environment. This deliberate filtering has been shown in the laboratory to actually change the sensitivity of neurons in the brain. If you’re trying to find your lost daughter at the state fair, your visual system reconfigures to look only for things of about her height, hair color, and body build, filtering everything else out. Simultaneously, your auditory system retunes itself to hear only frequencies in that band where her voice registers. You could call it the Where’s Waldo? filtering network.

It all comes back to Waldo.

If it has red in it, our red-sensitive neurons are involved in the imagining. They then automatically tune themselves, and inhibit other neurons (the ones for the colors you’re not interested in) to facilitate the search. Where’s Waldo? trains children to set and exercise their visual attentional filters to locate increasingly subtle cues in the environment, much as our ancestors might have trained their children to track animals through the forest, starting with easy-to-see and easy-to-differentiate animals and working up to camouflaging animals that are more difficult to pick out from the surrounding environment. The system also works for auditory filtering— if we are expecting a particular pitch or timbre in a sound, our auditory neurons become selectively tuned to those characteristics.

When we willfully retune sensory neurons in this way, our brains engage in top-down processing, originating in a higher, more advanced part of the brain than sensory processing.
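This willful retuning can be caricatured in code: build a match predicate from the target’s features, then let everything else fall away. This is a loose analogy of mine, and the feature names are invented for illustration:

```python
# Hypothetical "Where's Waldo?" sketch of top-down filtering: retune a
# predicate to the target's features; non-matching items are ignored.
def retune(target_features):
    """Return a matcher tuned to the given feature dict."""
    def matches(candidate):
        return all(candidate.get(k) == v for k, v in target_features.items())
    return matches

crowd = [
    {"height": "tall", "shirt": "blue"},
    {"height": "short", "shirt": "red-striped"},
    {"height": "tall", "shirt": "red-striped"},
]
looks_like_waldo = retune({"height": "tall", "shirt": "red-striped"})
found = [person for person in crowd if looks_like_waldo(person)]
print(len(found))  # 1: only the tall, red-striped figure survives
```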

But if we have an effective attention filter, why do we find it so hard to filter out distractions? Cue technology.

For one thing, we’re doing more work than ever before. The promise of a computerized society, we were told, was that it would relegate to machines all of the repetitive drudgery of work, allowing us humans to pursue loftier purposes and to have more leisure time. It didn’t work out this way. Instead of more time, most of us have less. Companies large and small have off-loaded work onto the backs of consumers. Things that used to be done for us, as part of the value-added service of working with a company, we are now expected to do ourselves. With air travel, we’re now expected to complete our own reservations and check-in, jobs that used to be done by airline employees or travel agents. At the grocery store, we’re expected to bag our own groceries and, in some supermarkets, to scan our own purchases. We pump our own gas at filling stations. Telephone operators used to look up numbers for us. Some companies no longer send out bills for their services— we’re expected to log in to their website, access our account, retrieve our bill, and initiate an electronic payment; in effect, do the job of the company for them. Collectively, this is known as shadow work— it represents a kind of parallel, shadow economy in which a lot of the service we expect from companies has been transferred to the customer. Each of us is doing the work of others and not getting paid for it. It is responsible for taking away a great deal of the leisure time we thought we would all have in the twenty-first century.

Beyond doing more work, we are dealing with more changes in information technology than our parents did, and more as adults than we did as children. The average American replaces her cell phone every two years, and that often means learning new software, new buttons, new menus. We change our computer operating systems every three years, and that requires learning new icons and procedures, and learning new locations for old menu items.

It’s not a coincidence that highly successful people tend to offload these tasks to others, allowing them to focus.

As knowledge becomes more available— and decentralized through the Internet— the notions of accuracy and authoritativeness have become clouded. Conflicting viewpoints are more readily available than ever, and in many cases they are disseminated by people who have no regard for facts or truth. Many of us find we don’t know whom to believe, what is true, what has been modified, and what has been vetted.


My teacher, the Stanford cognitive psychologist Amos Tversky, encapsulates this in “the Volvo story.” A colleague was shopping for a new car and had done a great deal of research. Consumer Reports showed through independent tests that Volvos were among the best built and most reliable cars in their class. Customer satisfaction surveys showed that Volvo owners were far happier with their purchase after several years. The surveys were based on tens of thousands of customers. The sheer number of people polled meant that any anomaly— like a specific vehicle that was either exceptionally good or exceptionally bad— would be drowned out by all the other reports. In other words, a survey such as this has statistical and scientific legitimacy and should be weighted accordingly when one makes a decision. It represents a stable summary of the average experience, and the most likely best guess as to what your own experience will be (if you’ve got nothing else to go on, your best guess is that your experience will be most like the average).

Amos ran into his colleague at a party and asked him how his automobile purchase was going. The colleague had decided against the Volvo in favor of a different, lower-rated car. Amos asked him what made him change his mind after all that research pointed to the Volvo. Was it that he didn’t like the price? The color options? The styling? No, it was none of those reasons, the colleague said. Instead, the colleague said, he found out that his brother-in-law had owned a Volvo and that it was always in the shop.

From a strictly logical point of view, the colleague is being irrational. The brother-in-law’s bad Volvo experience is a single data point swamped by tens of thousands of good experiences— it’s an unusual outlier. But we are social creatures. We are easily swayed by first-person stories and vivid accounts of a single experience. Although this is statistically wrong and we should learn to overcome the bias, most of us don’t. Advertisers know this, and this is why we see so many first-person testimonial advertisements on TV. “I lost twenty pounds in two weeks by eating this new yogurt— and it was delicious, too!” Or “I had a headache that wouldn’t go away. I was barking at the dog and snapping at my loved ones. Then I took this new medication and I was back to my normal self.” Our brains focus on vivid, social accounts more than dry, boring, statistical accounts.

So not only is knowledge easier to access than ever before (frictionless), but as it becomes more abundant, our brains must cope with it, and they cope in ways that magnify our pre-existing cognitive biases.


In Roger Shepard’s version of the famous “Ponzo illusion,” the monster at the top seems larger than the one at the bottom, but a ruler will show that they’re the same size. In the Ebbinghaus illusion below it, the white circle on the left seems larger than the white circle on the right, but they’re the same size. We say that our eyes are playing tricks on us, but in fact, our eyes aren’t playing tricks on us, our brain is. The visual system uses heuristics or shortcuts to piece together an understanding of the world, and it sometimes gets things wrong.

We are prone to cognitive illusions when we make decisions. The same types of shortcuts are at play.

The Organized Mind: Thinking Straight in the Age of Information Overload is a wholly fascinating look at our minds.