
The History of Cognitive Overload

The Organized Mind

The Organized Mind: Thinking Straight in the Age of Information Overload, a book by Daniel Levitin, has an interesting section on cognitive overload.

Each day we are confronted with hundreds, probably thousands, of decisions, most of which are insignificant, unimportant, or both. Do we really need a whole aisle for toothpaste?

In response to all of these decisions most of us adopt a strategy of satisficing, a term coined by Nobel Prize winner Herbert Simon to describe something that is perhaps not the best but good enough. For things that don’t matter, this is a good approach. You don’t know which pizza place is the best but you know which ones are good enough.

Satisficing is one of the foundations of productive human behavior; it prevails when we don’t waste time on decisions that don’t matter, or more accurately, when we don’t waste time trying to find improvements that are not going to make a significant difference in our happiness or satisfaction.
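Simon's distinction can be sketched in a few lines of code. Everything here is invented for illustration: the pizzerias and their quality scores are hypothetical and the "good enough" threshold is arbitrary; the point is only that the satisficer stops at the first acceptable option, while the maximizer must examine them all.

```python
def maximize(options, score):
    """Examine every option and return the best one."""
    return max(options, key=score)

def satisfice(options, score, good_enough):
    """Return the first option whose score clears the threshold."""
    for option in options:
        if score(option) >= good_enough:
            return option
    return max(options, key=score)  # fall back to the best if nothing clears the bar

# Hypothetical pizza places with made-up quality scores.
pizzerias = {"Gino's": 6.1, "Slice House": 8.3, "Napoli": 9.4, "Corner Pie": 7.8}
names = list(pizzerias)

best = maximize(names, pizzerias.get)        # examines all four options
good = satisfice(names, pizzerias.get, 8.0)  # stops at the first score >= 8.0

print(best)  # Napoli
print(good)  # Slice House
```

For decisions that don't matter, the second strategy saves the search cost at a negligible loss in quality, which is exactly the trade satisficing makes.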

All of us, Levitin argues, engage in satisficing every time we clean our homes.

If we got down on the floor with a toothbrush every day to clean the grout, if we scrubbed the windows and walls every single day, the house would be spotless. But few of us go to this much trouble even on a weekly basis (and when we do, we’re likely to be labeled obsessive-compulsive). For most of us, we clean our houses until they are clean enough, reaching a kind of equilibrium between effort and benefit. It is this cost-benefits analysis that is at the heart of satisficing.

The easiest way to be happy is to want what you already have. “Happy people engage in satisficing all the time, even if they don’t know it.”

Satisficing is a tool that allows you not to waste time on things that don’t really matter. Who cares if you pick Colgate or Crest? For other decisions, “the old-fashioned pursuit of excellence remains the right strategy.”

We now spend an unusual amount of time and energy ignoring and filtering. Consider the supermarket.

In 1976, the average supermarket stocked 9,000 unique products; today that number has ballooned to 40,000, yet the average person gets 80%-85% of their needs from only 150 different supermarket items. That means that we need to ignore 39,850 items in the store.

This comes with a cost.

Neuroscientists have discovered that unproductivity and loss of drive can result from decision overload. Although most of us have no trouble ranking the importance of decisions if asked to do so, our brains don’t automatically do this.

We can make only so many decisions in a day. Once we've hit that limit, it doesn't matter how important the remaining ones are.

The decision-making network in our brain doesn’t prioritize.

Our world has exploded; information is more abundant than ever. I didn't think we could process it all, but Levitin argues that we can, at a cost.

We can have trouble separating the trivial from the important, and all this information processing makes us tired. Neurons are living cells with a metabolism; they need oxygen and glucose to survive and when they’ve been working hard, we experience fatigue. Every status update you read on Facebook, every tweet or text message you get from a friend, is competing for resources in your brain with important things like whether to put your savings in stocks or bonds, where you left your passport, or how best to reconcile with a close friend you just had an argument with.

The processing capacity of the conscious mind has been estimated at 120 bits per second. That bandwidth, or window, is the speed limit for the traffic of information we can pay conscious attention to at any one time. While a great deal occurs below the threshold of our awareness, and this has an impact on how we feel and what our life is going to be like, in order for something to become encoded as part of your experience, you need to have paid conscious attention to it.

What does this mean?

In order to understand one person speaking to us, we need to process 60 bits of information per second. With a processing limit of 120 bits per second, this means you can barely understand two people talking to you at the same time. Under most circumstances, you will not be able to understand three people talking at the same time. …
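Taking Levitin's 120 and 60 bits-per-second estimates at face value, the back-of-the-envelope arithmetic is trivial:

```python
CONSCIOUS_BANDWIDTH = 120  # bits per second, per Levitin's estimate
BITS_PER_SPEAKER = 60      # bits per second needed to follow one person talking

def speakers_we_can_follow(bandwidth=CONSCIOUS_BANDWIDTH, per_speaker=BITS_PER_SPEAKER):
    """How many simultaneous speakers fit inside the conscious bandwidth."""
    return bandwidth // per_speaker

print(speakers_we_can_follow())  # 2
```

Two speakers saturate the channel entirely, which is why a third voice in the mix becomes noise rather than language.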

With such attentional restrictions, it’s clear why many of us feel overwhelmed by managing some of the most basic aspects of life. Part of the reason is that our brains evolved to help us deal with life during the hunter-gatherer phase of human history, a time when we might encounter no more than a thousand people across the entire span of our lifetime. Walking around midtown Manhattan, you’ll pass that number of people in half an hour.

Attention is the most essential mental resource for any organism. It determines which aspects of the environment we deal with, and most of the time, various automatic, subconscious processes make the correct choice about what gets passed through to our conscious awareness. For this to happen, millions of neurons are constantly monitoring the environment to select the most important things for us to focus on. These neurons are collectively the attentional filter. They work largely in the background, outside of our conscious awareness. This is why most of the perceptual detritus of our daily lives doesn’t register, or why, when you’ve been driving on the freeway for several hours at a stretch, you don’t remember much of the scenery that has whizzed by: Your attentional system “protects” you from registering it because it isn’t deemed important. This unconscious filter follows certain principles about what it will let through to your conscious awareness.

The attentional filter is one of evolution’s greatest achievements. In nonhumans, it ensures that they don’t get distracted by irrelevancies. Squirrels are interested in nuts and predators, and not much else. Dogs, whose olfactory sense is one million times more sensitive than ours, use smell to gather information about the world more than they use sound, and their attentional filter has evolved to make that so. If you’ve ever tried to call your dog while he is smelling something interesting, you know that it is very difficult to grab his attention with sound— smell trumps sound in the dog brain. No one has yet worked out all of the hierarchies and trumping factors in the human attentional filter, but we’ve learned a great deal about it. When our protohuman ancestors left the cover of the trees to seek new sources of food, they simultaneously opened up a vast range of new possibilities for nourishment and exposed themselves to a wide range of new predators. Being alert and vigilant to threatening sounds and visual cues is what allowed them to survive; this meant allowing an increasing amount of information through the attentional filter.

Levitin points out an interesting difference between highly successful people (HSPs) and the rest of us when it comes to attentional filters.

Successful people— or people who can afford it— employ layers of people whose job it is to narrow the attentional filter. That is, corporate heads, political leaders, spoiled movie stars, and others whose time and attention are especially valuable have a staff of people around them who are effectively extensions of their own brains, replicating and refining the functions of the prefrontal cortex’s attentional filter.

These highly successful persons have many of the daily distractions of life handled for them, allowing them to devote all of their attention to whatever is immediately before them. They seem to live completely in the moment. Their staff handle correspondence, make appointments, interrupt those appointments when a more important one is waiting, and help to plan their days for maximum efficiency (including naps!). Their bills are paid on time, their car is serviced when required, they’re given reminders of projects due, and their assistants send suitable gifts to the HSP’s loved ones on birthdays and anniversaries. Their ultimate prize if it all works? A Zen-like focus.

Levitin argues that if we organize our minds and our lives “following the new neuroscience of attention and memory, we can all deal with the world in ways that provide the sense of freedom that these highly successful people enjoy.”

To do that, however, we need to understand the architecture of our attentional system. “To better organize our mind, we need to know how it has organized itself.”

Change and importance are two crucial principles used by our attentional filter.

The brain’s change detector is at work all the time, whether you know it or not. If a close friend or relative calls on the phone, you might detect that her voice sounds different and ask if she’s congested or sick with the flu. When your brain detects the change, this information is sent to your consciousness, but your brain doesn’t explicitly send a message when there is no change. If your friend calls and her voice sounds normal, you don’t immediately think, “Oh, her voice is the same as always.” Again, this is the attentional filter doing its job, detecting change, not constancy.

Importance can also filter information, but it's not objective or absolute importance that matters; it's importance that is personal and relevant to you.

If you’re driving, a billboard for your favorite music group might catch your eye (really, we should say catch your mind) while other billboards go ignored. If you’re in a crowded room, at a party for instance, certain words to which you attach high importance might suddenly catch your attention, even if spoken from across the room. If someone says “fire” or “sex” or your own name, you’ll find that you’re suddenly following a conversation far away from where you’re standing, with no awareness of what those people were talking about before your attention was captured.
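These two principles, change and importance, can be caricatured in a toy filter. The "events" and the keyword list below are invented for illustration; real attentional filtering is vastly more complex, but the shape is the same: constancy is discarded, while change and personally salient signals pass through.

```python
def attentional_filter(stream, keywords):
    """Toy filter: an event reaches 'awareness' only if it differs from the
    event before it (change) or mentions something personally salient (importance)."""
    noticed = []
    previous = None
    for event in stream:
        changed = previous is not None and event != previous
        salient = any(word in event.lower() for word in keywords)
        if changed or salient:
            noticed.append(event)
        previous = event
    return noticed

# A steady background hum is filtered out; your own name cuts through.
party_chatter = ["hum of voices"] * 3 + ["someone says your name"]

print(attentional_filter(party_chatter, keywords=["your name", "fire"]))
# ['someone says your name']
```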

The attentional filter lets us live on autopilot most of the time, pulling us out only when we need it to. In so doing, we “do not register the complexities, nuances, and often the beauty of what is right in front of us.”

A great number of failures of attention occur because we are not using these two principles to our advantage.

Simply put, attention is limited.

A critical point that bears repeating is that attention is a limited-capacity resource— there are definite limits to the number of things we can attend to at once. We see this in everyday activities. If you’re driving, under most circumstances, you can play the radio or carry on a conversation with someone else in the car. But if you’re looking for a particular street to turn onto, you instinctively turn down the radio or ask your friend to hang on for a moment, to stop talking. This is because you’ve reached the limits of your attention in trying to do these three things. The limits show up whenever we try to do too many things at once.

Our brain hides things from us.

The human brain has evolved to hide from us those things we are not paying attention to. In other words, we often have a cognitive blind spot: We don’t know what we’re missing because our brain can completely ignore things that are not its priority at the moment— even if they are right in front of our eyes. Cognitive psychologists have called this blind spot various names, including inattentional blindness.

One of the most famous demonstrations of this is the basketball video (for more, see The Invisible Gorilla: How Our Intuitions Deceive Us).

A lot of instances of losing things like car keys, passports, money, receipts, and so on occur because our attentional systems are overloaded and they simply can’t keep track of everything. The average American owns thousands of times more possessions than the average hunter-gatherer. In a real biological sense, we have more things to keep track of than our brains were designed to handle. Even towering intellectuals such as Kant and Wordsworth complained of information excess and sheer mental exhaustion induced by too much sensory input or mental overload.

But we need not fear this cognitive overload, Levitin argues. “More than ever, effective external systems are available for organizing, categorizing, and keeping track of things.”

Information Overload, Then and Now

We've been around a long time. For most of that time we didn't do much of anything other than “procreate and survive.” Then we discovered farming and irrigation and gave up our fairly nomadic lifestyle. Farming allowed us to specialize: I could grow potatoes and you could grow tomatoes, and we could trade. This created a dependency on each other and markets for trading. All of this trading, in turn, required an accounting system to keep tabs on inventory and trades. This was the birthplace of writing.

With the growth of trade, cities, and writing, people soon discovered architecture, government, and the other refinements of being that collectively add up to what we think of as civilization. The appearance of writing some 5,000 years ago was not met with unbridled enthusiasm; many contemporaries saw it as technology gone too far, a demonic invention that would rot the mind and needed to be stopped. Then, as now, printed words were promiscuous— it was impossible to control where they went or who would receive them, and they could circulate easily without the author’s knowledge or control. Lacking the opportunity to hear information directly from a speaker’s mouth, the antiwriting contingent complained that it would be impossible to verify the accuracy of the writer’s claims, or to ask follow-up questions. Plato was among those who voiced these fears; his King Thamus decried that the dependence on written words would “weaken men’s characters and create forgetfulness in their souls.” Such externalization of facts and stories meant people would no longer need to mentally retain large quantities of information themselves and would come to rely on stories and facts as conveyed, in written form, by others. Thamus, king of Egypt, argued that the written word would infect the Egyptian people with fake knowledge. The Greek poet Callimachus said books are “a great evil.” The Roman philosopher Seneca the Younger ( tutor to Nero) complained that his peers were wasting time and money accumulating too many books, admonishing that “the abundance of books is a distraction.” Instead, Seneca recommended focusing on a limited number of good books, to be read thoroughly and repeatedly. Too much information could be harmful to your mental health.

Cue the printing press, which allowed for the rapid copying of books. This further complicated intellectual life.

The printing press was introduced in the mid-1400s, allowing for the more rapid proliferation of writing, replacing laborious (and error-prone) hand copying. Yet again, many complained that intellectual life as we knew it was done for. Erasmus, in 1525, went on a tirade against the “swarms of new books,” which he considered a serious impediment to learning. He blamed printers whose profit motive sought to fill the world with books that were “foolish, ignorant, malignant, libelous, mad, impious and subversive.” Leibniz complained about “that horrible mass of books that keeps on growing” and that would ultimately end in nothing less than a “return to barbarism.” Descartes famously recommended ignoring the accumulated stock of texts and instead relying on one’s own observations. Presaging what many say today, Descartes complained that “even if all knowledge could be found in books, where it is mixed in with so many useless things and confusingly heaped in such large volumes, it would take longer to read those books than we have to live in this life and more effort to select the useful things than to find them oneself.”

A steady flow of complaints about the proliferation of books reverberated into the late 1600s. Intellectuals warned that people would stop talking to each other, burying themselves in books, polluting their minds with useless, fatuous ideas.

There is an argument that this generation is at the same crossroads: our Gutenberg moment.

iPhones and iPads, email, and Twitter are the new revolution.

Each was decried as an addiction, an unnecessary distraction, a sign of weak character, feeding an inability to engage with real people and the real-time exchange of ideas.

The industrial revolution brought with it a rapid rise in discovery and advancement. Scientific information increased at a staggering clip.

Today, someone with a PhD in biology can’t even know all that is known about the nervous system of the squid! Google Scholar reports 30,000 research articles on that topic, with the number increasing exponentially. By the time you read this, the number will have increased by at least 3,000. The amount of scientific information we’ve discovered in the last twenty years is more than all the discoveries up to that point, from the beginning of language.

This taxes all of us as we filter what we need to know from what we don't. It ties in nicely with Tyler Cowen's argument that the future of work is changing and that we will need to add value working alongside computers.

To cope with information overload we create to-do lists and email ourselves reminders. I have lists of lists. Right now there are over 800 unread emails in my inbox. Many of these are reminders to myself to look into something or to do something, links that I need to go back and read, or books I want to add to my wishlist. I see those emails and think, yes, I want to do that, but not right now. So they sit in my inbox. Occasionally I'll create a to-do list, which starts off with the best intentions and rapidly becomes a brain dump. Eventually I remember the 18-minute plan for managing your day and I refocus, scheduling time for the most important things. No matter what I do, I always feel like I'm on the border between order and chaos.

A large part of this feeling of being overwhelmed can be traced back to our evolutionarily outdated attentional system. I mentioned earlier the two principles of the attentional filter: change and importance. There is a third principle of attention— not specific to the attentional filter— that is relevant now more than ever. It has to do with the difficulty of attentional switching. We can state the principle this way: Switching attention comes with a high cost.

Our brains evolved to focus on one thing at a time. This enabled our ancestors to hunt animals, to create and fashion tools, to protect their clan from predators and invading neighbors. The attentional filter evolved to help us to stay on task, letting through only information that was important enough to deserve disrupting our train of thought. But a funny thing happened on the way to the twenty-first century: The plethora of information and the technologies that serve it changed the way we use our brains. Multitasking is the enemy of a focused attentional system. Increasingly, we demand that our attentional system try to focus on several things at once, something that it was not evolved to do. We talk on the phone while we’re driving, listening to the radio, looking for a parking place, planning our mom’s birthday party, trying to avoid the road construction signs, and thinking about what’s for lunch. We can’t truly think about or attend to all these things at once, so our brains flit from one to the other, each time with a neurobiological switching cost. The system does not function well that way. Once on a task, our brains function best if we stick to that task.

When you pay attention to something, it means you don't see something else. David Foster Wallace hit upon this in his commencement speech, The Truth With A Whole Lot Of Rhetorical Bullshit Pared Away (better known as This Is Water). He said:

Learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience. Because if you cannot exercise this kind of choice in adult life, you will be totally hosed. Think of the old cliché about the mind being an excellent servant but a terrible master. This, like many clichés, so lame and unexciting on the surface, actually expresses a great and terrible truth.

And Winifred Gallagher, author of the book Rapt: Attention and the Focused Life, wrote:

That your experience largely depends on the material objects and mental subjects that you choose to pay attention to or ignore is not an imaginative notion, but a physiological fact. When you focus on a stop sign or a sonnet, a waft of perfume or a stock-market tip, your brain registers that “target,” which enables it to affect your behavior. In contrast, the things that you don’t attend to in a sense don’t exist, at least for you.

All day long, you are selectively paying attention to something, and much more often than you may suspect, you can take charge of this process to good effect. Indeed, your ability to focus on this and suppress that is the key to controlling your experience and, ultimately, your well-being.

When you walk into the front door of your house after a long day of work to screaming kids and a ringing phone, you're not thinking about where you left your car keys.

Attention is created by networks of neurons in the prefrontal cortex (just behind your forehead) that are sensitive only to dopamine. When dopamine is released, it unlocks them, like a key in your front door, and they start firing tiny electrical impulses that stimulate other neurons in their network. But what causes that initial release of dopamine? Typically, one of two different triggers:

1. Something can grab your attention automatically, usually something that is salient to your survival, with evolutionary origins. This vigilance system incorporating the attentional filter is always at work, even when you’re asleep, monitoring the environment for important events. This can be a loud sound or bright light (the startle reflex), something moving quickly (that might indicate a predator), a beverage when you’re thirsty, or an attractively shaped potential sexual partner.

2. You effectively will yourself to focus only on that which is relevant to a search or scan of the environment. This deliberate filtering has been shown in the laboratory to actually change the sensitivity of neurons in the brain. If you’re trying to find your lost daughter at the state fair, your visual system reconfigures to look only for things of about her height, hair color, and body build, filtering everything else out. Simultaneously, your auditory system retunes itself to hear only frequencies in that band where her voice registers. You could call it the Where’s Waldo? filtering network.
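The Where's Waldo?-style retuning in the second trigger can be caricatured as a filter over a crowd. The crowd, the attributes, and the height tolerance below are all hypothetical; the idea is simply that a top-down search template suppresses everything that doesn't match it.

```python
# Hypothetical crowd: (height_cm, hair, shirt) tuples standing in for people.
crowd = [
    (172, "brown", "blue"),
    (104, "blonde", "green"),
    (101, "brown", "red"),   # the lost daughter
    (183, "black", "red"),
    (99,  "brown", "red"),
]

def retune(crowd, height_cm, hair, tolerance=5):
    """Top-down filtering: pass only people matching the search template
    (about the right height, the right hair color), ignore everyone else."""
    return [p for p in crowd
            if abs(p[0] - height_cm) <= tolerance and p[1] == hair]

matches = retune(crowd, height_cm=102, hair="brown")
print(matches)  # [(101, 'brown', 'red'), (99, 'brown', 'red')]
```

The filter doesn't find the daughter by itself; it shrinks a crowd of thousands to a handful of candidates that conscious attention can then check one by one.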

It all comes back to Waldo.

If it has red in it, our red-sensitive neurons are involved in the imagining. They then automatically tune themselves, and inhibit other neurons (the ones for the colors you’re not interested in) to facilitate the search. Where’s Waldo? trains children to set and exercise their visual attentional filters to locate increasingly subtle cues in the environment, much as our ancestors might have trained their children to track animals through the forest, starting with easy-to-see and easy-to-differentiate animals and working up to camouflaging animals that are more difficult to pick out from the surrounding environment. The system also works for auditory filtering— if we are expecting a particular pitch or timbre in a sound, our auditory neurons become selectively tuned to those characteristics.

When we willfully retune sensory neurons in this way, our brains engage in top-down processing, originating in a higher, more advanced part of the brain than sensory processing.

But if we have an effective attention filter, why do we find it so hard to filter out distractions? Cue technology.

For one thing, we’re doing more work than ever before. The promise of a computerized society, we were told, was that it would relegate to machines all of the repetitive drudgery of work, allowing us humans to pursue loftier purposes and to have more leisure time. It didn’t work out this way. Instead of more time, most of us have less. Companies large and small have off-loaded work onto the backs of consumers. Things that used to be done for us, as part of the value-added service of working with a company, we are now expected to do ourselves. With air travel, we’re now expected to complete our own reservations and check-in, jobs that used to be done by airline employees or travel agents. At the grocery store, we’re expected to bag our own groceries and, in some supermarkets, to scan our own purchases. We pump our own gas at filling stations. Telephone operators used to look up numbers for us. Some companies no longer send out bills for their services— we’re expected to log in to their website, access our account, retrieve our bill, and initiate an electronic payment; in effect, do the job of the company for them. Collectively, this is known as shadow work— it represents a kind of parallel, shadow economy in which a lot of the service we expect from companies has been transferred to the customer. Each of us is doing the work of others and not getting paid for it. It is responsible for taking away a great deal of the leisure time we thought we would all have in the twenty-first century.

Beyond doing more work, we are dealing with more changes in information technology than our parents did, and more as adults than we did as children. The average American replaces her cell phone every two years, and that often means learning new software, new buttons, new menus. We change our computer operating systems every three years, and that requires learning new icons and procedures, and learning new locations for old menu items.

It's not a coincidence that highly successful people tend to offload these tasks to others, allowing them to focus.

As knowledge becomes more available— and decentralized through the Internet— the notions of accuracy and authoritativeness have become clouded. Conflicting viewpoints are more readily available than ever, and in many cases they are disseminated by people who have no regard for facts or truth. Many of us find we don’t know whom to believe, what is true, what has been modified, and what has been vetted.


My teacher, the Stanford cognitive psychologist Amos Tversky, encapsulates this in “the Volvo story.” A colleague was shopping for a new car and had done a great deal of research. Consumer Reports showed through independent tests that Volvos were among the best built and most reliable cars in their class. Customer satisfaction surveys showed that Volvo owners were far happier with their purchase after several years. The surveys were based on tens of thousands of customers. The sheer number of people polled meant that any anomaly— like a specific vehicle that was either exceptionally good or exceptionally bad— would be drowned out by all the other reports. In other words, a survey such as this has statistical and scientific legitimacy and should be weighted accordingly when one makes a decision. It represents a stable summary of the average experience, and the most likely best guess as to what your own experience will be (if you’ve got nothing else to go on, your best guess is that your experience will be most like the average).

Amos ran into his colleague at a party and asked him how his automobile purchase was going. The colleague had decided against the Volvo in favor of a different, lower-rated car. Amos asked him what made him change his mind after all that research pointed to the Volvo. Was it that he didn’t like the price? The color options? The styling? No, it was none of those reasons, the colleague said. Instead, the colleague said, he found out that his brother-in-law had owned a Volvo and that it was always in the shop.

From a strictly logical point of view, the colleague is being irrational. The brother-in-law’s bad Volvo experience is a single data point swamped by tens of thousands of good experiences— it’s an unusual outlier. But we are social creatures. We are easily swayed by first-person stories and vivid accounts of a single experience. Although this is statistically wrong and we should learn to overcome the bias, most of us don’t. Advertisers know this, and this is why we see so many first-person testimonial advertisements on TV. “I lost twenty pounds in two weeks by eating this new yogurt— and it was delicious, too!” Or “I had a headache that wouldn’t go away. I was barking at the dog and snapping at my loved ones. Then I took this new medication and I was back to my normal self.” Our brains focus on vivid, social accounts more than dry, boring, statistical accounts.
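The arithmetic behind "a single data point swamped by tens of thousands" is easy to check. The survey size and ratings below are assumed numbers for illustration, not figures from the book:

```python
# Hypothetical reliability ratings (out of 10) from 10,000 surveyed Volvo owners.
n = 10_000
survey_mean = 8.6                 # assumed average rating across the survey
ratings_total = survey_mean * n

# Add one brother-in-law whose car "was always in the shop": rating 1.
new_mean = (ratings_total + 1) / (n + 1)

print(round(new_mean, 4))  # 8.5992
```

One vivid outlier moves the average by less than a thousandth of a point, yet it can outweigh the entire survey in our heads.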

So not only has knowledge become easier to access than ever before (frictionless), but as it becomes more abundant our brains must cope with it, and they often do so by magnifying our pre-existing cognitive biases.


In Roger Shepard’s version of the famous “Ponzo illusion,” the monster at the top seems larger than the one at the bottom, but a ruler will show that they’re the same size. In the Ebbinghaus illusion below it, the white circle on the left seems larger than the white circle on the right, but they’re the same size. We say that our eyes are playing tricks on us, but in fact, our eyes aren’t playing tricks on us, our brain is. The visual system uses heuristics or shortcuts to piece together an understanding of the world, and it sometimes gets things wrong.

We are prone to cognitive illusions when we make decisions. The same types of shortcuts are at play.

The Organized Mind: Thinking Straight in the Age of Information Overload is a wholly fascinating look at our minds.

Thinking Straight in the Age of Information Overload

The Organized Mind

The Organized Mind: Thinking Straight in the Age of Information Overload, a book by Daniel Levitin, explores “how humans have coped with information and organization from the beginning of civilization. … It’s also the story of how the most successful members of society—from successful artists, athletes, and warriors, to business executives and highly credentialed professionals—have learned to maximize their creativity, and efficiency, by organizing their lives so that they spend less time on the mundane, and more time on the inspiring, comforting, and rewarding things in life.”


Memory is fallible. More than just remembering things wrongly, “we don’t even know we’re remembering them wrongly.”

The first humans who figured out how to write things down around 5,000 years ago were in essence trying to increase the capacity of their hippocampus, part of the brain’s memory system. They effectively extended the natural limits of human memory by preserving some of their memories on clay tablets and cave walls, and later, papyrus and parchment. Later, we developed other mechanisms—such as calendars, filing cabinets, computers, and smartphones—to help us organize and store the information we’ve written down. When our computer or smartphone starts to run slowly, we might buy a larger memory card. That memory is both a metaphor and a physical reality. We are off-loading a great deal of the processing that our neurons would normally do to an external device that then becomes an extension of our own brains, a neural enhancer.

These external memory mechanisms are generally of two types, either following the brain’s own organizational system or reinventing it, sometimes overcoming its limitations. Knowing which is which can enhance the way we use these systems, and so improve our ability to cope with information overload.

And once memory became external (written down and stored) our attention systems “were freed up to focus on something else.”

But we need a place (and a system) to organize all of this information. Deciding where any one item belongs, however, is harder than it sounds; Levitin illustrates with the example of filing a report about a plant:

The indexing problem is that there are several possibilities about where you store this report, based on your needs: It could be stored with other writings about plants, or with writings about family history, or with writings about cooking, or with writings about how to poison an enemy.
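A standard software answer to the indexing problem is to store the item once and index it under every category it belongs to, sidestepping the "which folder?" dilemma entirely. A minimal sketch (the tags mirror Levitin's example; the class and names are mine):

```python
from collections import defaultdict

class Index:
    """Store each document once, but index it under every relevant tag."""
    def __init__(self):
        self.by_tag = defaultdict(set)

    def add(self, doc, tags):
        for tag in tags:
            self.by_tag[tag].add(doc)

    def lookup(self, tag):
        return self.by_tag[tag]

idx = Index()
idx.add("plant report", {"plants", "family history", "cooking", "poisons"})

print(idx.lookup("cooking"))  # {'plant report'}
print(idx.lookup("poisons"))  # {'plant report'}
```

The same report is reachable from any of its categories without being duplicated, which is roughly what the brain's associative access gives us for free.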

This brings us to two aspects of the human brain that are not given their due: richness and associative access.

Richness refers to the theory that a large number of the things you’ve ever thought or experienced are still in there, somewhere. Associative access means that your thoughts can be accessed in a number of different ways by semantic or perceptual associations—memories can be triggered by related words, by category names, by a smell, an old song or photograph, or even seemingly random neural firings that bring them up to consciousness.

Being able to access any memory regardless of where it is stored is what computer scientists call random access. DVDs and hard drives work this way; videotapes do not. You can jump to any spot in a movie on a DVD or hard drive by “pointing” at it. But to get to a particular point in a videotape, you need to go through every previous point first (sequential access). Our ability to randomly access our memory from multiple cues is especially powerful. Computer scientists call it relational memory. You may have heard of relational databases— that’s effectively what human memory is.
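The DVD-versus-videotape contrast maps directly onto two access patterns in computing. Here is a minimal sketch (my illustration, not from the book) using a plain Python list: jumping straight to an index models random access, while walking an iterator from the start models sequential access.

```python
# Illustrative sketch of random vs. sequential access (not from the book).

frames = [f"frame_{i}" for i in range(1_000_000)]  # the "movie"

def random_access(frames, i):
    """Jump straight to frame i -- O(1), like a DVD or hard drive."""
    return frames[i]

def sequential_access(frames, i):
    """Pass through every earlier frame first -- O(n), like a videotape."""
    iterator = iter(frames)
    frame = None
    for _ in range(i + 1):
        frame = next(iterator)
    return frame

# Both reach the same frame; only the cost of getting there differs.
assert random_access(frames, 750_000) == sequential_access(frames, 750_000)
```

The point of the analogy is the cost, not the result: human memory, like the list index, can jump straight to a target from any of several cues rather than replaying everything that came before.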


Having relational memory means that if I want to get you to think of a fire truck, I can induce the memory in many different ways. I might make the sound of a siren, or give you a verbal description (“a large red truck with ladders on the side that typically responds to a certain kind of emergency”).
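The many-cues-to-one-node idea can be sketched in code. The following toy class (my hypothetical illustration; the names `RelationalMemory`, `associate`, and `recall` are mine, not Levitin's) maps each cue to the set of concepts it activates, so the same node is reachable through a siren, a color, or a ladder, and combining cues narrows the result:

```python
# Hypothetical sketch of multi-cue, relational retrieval (not from the book).
from collections import defaultdict

class RelationalMemory:
    def __init__(self):
        # Each cue points at every concept node it can activate.
        self._cues = defaultdict(set)

    def associate(self, concept, *cues):
        """Link a concept node to any number of retrieval cues."""
        for cue in cues:
            self._cues[cue].add(concept)

    def recall(self, *cues):
        """Return the concepts activated by ALL of the given cues."""
        activated = [self._cues[cue] for cue in cues]
        return set.intersection(*activated) if activated else set()

mem = RelationalMemory()
mem.associate("fire truck", "siren", "red", "ladder", "emergency")
mem.associate("ambulance", "siren", "emergency")

assert mem.recall("siren") == {"fire truck", "ambulance"}   # ambiguous cue
assert mem.recall("siren", "ladder") == {"fire truck"}      # cues combine
```

A single ambiguous cue (the siren) activates several nodes at once, which also previews the "traffic jam" problem described below: more activation is not always better retrieval.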

We categorize objects in a seemingly infinite number of ways. Each of those ways “has its own route to the neural node that represents fire truck in your brain.” Take a look at one way we can think of a fire truck.


Thinking about one memory or association activates more. This can be both a strength and a weakness.

If you are trying to retrieve a particular memory, the flood of activations can cause competition among different nodes, leaving you with a traffic jam of neural nodes trying to get through to consciousness, and you end up with nothing.

Organizing our Lives

The ancient Greeks came up with memory palaces and the method of loci to improve memory. The Egyptians became experts at externalizing information, inventing perhaps the biggest pre-Google repository of knowledge: the library.

We don’t know why these simultaneous explosions of intellectual activity occurred when they did (perhaps daily human experience had hit a certain level of complexity). But the human need to organize our lives, our environment, even our thoughts, remains strong. This need isn’t simply learned, it is a biological imperative— animals organize their environments instinctively.

But the odd thing about the mind is that it doesn’t, on its own, organize things the way you might want it to. It's largely an unconscious process.

It comes preconfigured, and although it has enormous flexibility, it is built on a system that evolved over hundreds of thousands of years to deal with different kinds and different amounts of information than we have today. To be more specific: The brain isn’t organized the way you might set up your home office or bathroom medicine cabinet. You can’t just put things anywhere you want to. The evolved architecture of the brain is haphazard and disjointed, and incorporates multiple systems, each of which has a mind of its own (so to speak). Evolution doesn’t design things and it doesn’t build systems— it settles on systems that, historically, conveyed a survival benefit (and if a better way comes along, it will adopt that). There is no overarching, grand planner engineering the systems so that they work harmoniously together. The brain is more like a big, old house with piecemeal renovations done on every floor, and less like new construction.

Consider this, then, as an analogy: You have an old house and everything is a bit outdated, but you’re satisfied. You add a room air conditioner during one particularly hot summer. A few years later, when you have more money, you decide to add a central air-conditioning system. But you don’t remove that room unit in the bedroom—why would you? It might come in handy and it’s already there, bolted to the wall. Then a few years later, you have a catastrophic plumbing problem—pipes burst in the walls. The plumbers need to break open the walls and run new pipes, but your central air-conditioning system is now in the way, where some of their pipes would ideally go. So they run the pipes through the attic, the long way around. This works fine until one particularly cold winter when your uninsulated attic causes your pipes to freeze. These pipes wouldn’t have frozen if you had run them through the walls, which you couldn’t do because of the central air-conditioning. If you had planned all this from the start, you would have done things differently, but you didn’t—you added things one at a time, as and when you needed them.

Or you can use Sherlock Holmes’ analogy of a memory attic. As Holmes tells Watson, “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose.”

Levitin argues that we should learn “how our brain organizes information so that we can use what we have, rather than fight against it.” We do this primarily through the key processes of encoding and retrieval.

(Our brains are) built as a hodgepodge of different systems, each one solving a particular adaptive problem. Occasionally they work together, occasionally they’re in conflict, and occasionally they aren’t even talking to one another. Two of the key ways that we can control and improve the process are to pay special attention to the way we enter information into our memory— encoding—and the way we pull it out— retrieval.

We’re busier than ever. That’s not to say we necessarily suffer from information overload; there are arguments that it doesn’t exist. But our internal to-do list is never satisfied. We’re overwhelmed with things disguised as wisdom, or even as information, and we’re forced to sort through the nonsense. Levitin suggests that one consequence is that we’re losing things. Our keys. Our driver’s licenses. Our iPhones. And it’s not just physical things. We “also forget things we were supposed to remember, important things like the password to our e-mail or a website, the PIN for our cash cards—the cognitive equivalent of losing our keys.”

These are important, hard-to-replace things.

We don’t tend to have general memory failures; we have specific, temporary memory failures for one or two things. During those frantic few minutes when you’re searching for your lost keys, you (probably) still remember your name and address, where your television set is, and what you had for breakfast—it’s just this one memory that has been aggravatingly lost. There is evidence that some things are typically lost far more often than others: We tend to lose our car keys but not our car, we lose our wallet or cell phone more often than the stapler on our desk or soup spoons in the kitchen, we lose track of coats and sweaters and shoes more often than pants. Understanding how the brain’s attentional and memory systems interact can go a long way toward minimizing memory lapses.

These simple facts about the kinds of things we tend to lose and those that we don’t can tell us a lot about how our brains work, and a lot about why things go wrong.

The way this works is fascinating. Levitin also hits on a topic that has long interested me. “Companies,” he writes, “are like expanded brains, with individual workers functioning something like neurons.”

Companies tend to be collections of individuals united to a common set of goals, with each worker performing a specialized function. Businesses typically do better than individuals at day-to-day tasks because of distributed processing. In a large business, there is a department for paying bills on time (accounts payable), and another for keeping track of keys (physical plant or security). Although the individual workers are fallible, systems and redundancies are usually in place, or should be, to ensure that no one person’s momentary distraction or lack of organization brings everything to a grinding halt. Of course, business organizations are not always perfectly organized, and occasionally, through the same cognitive blocks that cause us to lose our car keys, businesses lose things, too—profits, clients, competitive positions in the marketplace.

In today’s world it’s hard to keep up. We have PINs, phone numbers, email addresses, multiple to-do lists, small physical objects to keep track of, kids to pick up, books to read, videos to watch, nearly infinite websites to browse, and so on. Most of us, however, are still organizing and maintaining all of this knowledge with systems that were put into place in an era of far less information.

The Organized Mind: Thinking Straight in the Age of Information Overload shows us how to organize our time better, “not just so we can be more efficient but so we can find more time for fun, for play, for meaningful relationships, and for creativity.”