
Is Our Faulty Memory Really So Bad?

“Though the behaviors…seem perverse, they reflect reliance on a type of navigation that serves the animals quite well in most situations.”
— Daniel Schacter

***

The Harvard psychologist Daniel Schacter has some brilliant insights into the human memory.

His wonderful book The Seven Sins of Memory presents the case that our memories fail us in regular, repeated, and predictable ways. We forget things we think we should know; we think we saw things we didn’t see; we can’t remember where we left our keys; we can’t remember _____’s name; we think Susan told us something that Steven did.

It’s easy to get a little down on our poor brains. Between cognitive biases, memory problems, emotional control, drug addiction, and brain disease, it’s natural to wonder how the hell our species has been so successful.

Not so fast. Schacter argues that we shouldn’t be so dismissive of the imperfect system we’ve been endowed with:

The very pervasiveness of memory’s imperfections, amply illustrated in the preceding pages, can easily lead to the conclusion that Mother Nature committed colossal blunders in burdening us with such a dysfunctional system. John Anderson, a cognitive psychologist at Carnegie-Mellon University, summarizes the prevailing perception that memory’s sins reflect poorly on its design: “over the years we have participated in many talks with artificial intelligence researchers about the prospects of using human models to guide the development of artificial intelligence programs. Invariably, the remark is made, “Well, of course, we would not want our system to have something so unreliable as human memory.”

It is tempting to agree with this characterization, especially if you’ve just wasted valuable time looking for misplaced keys, read the statistics on wrongful imprisonment resulting from eyewitness misidentification, or woken up in the middle of the night persistently recalling a slip-up at work. But along with Anderson, I believe that this view is misguided: It is a mistake to conceive of the seven sins as design flaws that expose memory as a fundamentally defective system. To the contrary, I suggest that the seven sins are by-products of otherwise adaptive features of memory, a price we pay for processes and functions that serve us well in many respects.

Schacter starts by pointing out that all creatures have systems running on autopilot, which researchers love to exploit:

For instance, train a rat to navigate a maze to find a food reward at the end, and then place a pile of food halfway into the maze. The rat will run right past the pile of food as if it did not even exist, continuing to the end, where it seeks its just reward! Why not stop at the halfway point and enjoy the reward then? Hauser suggests that the rat is operating in this situation on the basis of “dead reckoning” — a method of navigating in which the animal keeps a literal record of where it has gone by constantly updating the speed, distance, and direction it has traveled.

A similarly comical error occurs when a pup is taken from a gerbil nest containing several other pups and is placed in a nearby cup. The mother searches for her lost baby, and while she is away, the nest is displaced a short distance. When the mother and lost pup return, she uses dead reckoning to head straight for the nest’s old location. Ignoring the screams and smells of the other pups just a short distance away, she searches for them at the old location. Hauser contends that the mother is driven by signals from her spatial system.

The reason for this bizarre behavior is that, in general, it works! Natural selection is pretty crafty and makes one simple value judgement: Does the thing provide a reproductive advantage to the individual (or group) or doesn’t it? In nature, a gerbil will rarely see its nest moved like that — it’s the artifice of the lab experiment that exposes the “auto-pilot” nature of the gerbil’s action.
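
Dead reckoning is simple enough to write down, and doing so shows exactly why the gerbil fails. Here is a minimal Python sketch (the scenario and numbers are my own, purely illustrative, not from Schacter or Hauser): the animal integrates only its own speed, heading, and elapsed time, so nothing in the computation can register that the nest has moved.

```python
import math

def dead_reckon(start, moves):
    """Estimate position by integrating speed and heading over time.

    `moves` is a list of (speed, heading_radians, duration) tuples:
    the only inputs dead reckoning uses. External landmarks are ignored.
    """
    x, y = start
    for speed, heading, duration in moves:
        x += speed * math.cos(heading) * duration
        y += speed * math.sin(heading) * duration
    return (x, y)

# The mother leaves the nest at (0, 0), travels to the cup, and retraces
# her path. Her estimate lands on the nest's *old* location, no matter
# where the nest actually is now.
outbound = [(1.0, 0.0, 5.0)]          # 5 units east, toward the cup
inbound = [(1.0, math.pi, 5.0)]       # reverse the same path
at_cup = dead_reckon((0.0, 0.0), outbound)
print(dead_reckon(at_cup, inbound))   # ~(0.0, 0.0): the old nest site
```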

It works the same way with us. The main thing to remember is that our mental systems are, by and large, working to our advantage. If we had memories that could recall all instances of the past with perfect precision, we’d be so inundated with information that we’d be paralyzed:

Consider the following experiment. Try to recall an episode from your life that involves a table. What do you remember, and how long did it take to come up with the memory? You probably had little difficulty coming up with a specific incident — perhaps a conversation at the dinner table last night, or a discussion at the conference table this morning. Now imagine that the cue “table” brought forth all the memories that you have stored away involving a table. There are probably hundreds or thousands of such incidents. What if they all sprang to mind within seconds of considering the cue? A system that operated in this manner would likely result in mass confusion produced by an incessant coming to mind of numerous competing traces. It would be a bit like using an Internet search engine, typing in a word that has many matches in a worldwide database, and then sorting through the thousands of entries that the query elicits. We wouldn’t want a memory system that produces this kind of data overload. Robert and Elizabeth Bjork have argued persuasively that the operation of inhibitory processes helps to protect us from such chaos.
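
Schacter’s search-engine analogy is easy to make concrete. Here is a toy Python sketch (my own illustration, not a model from the book) in which each stored trace carries an activation level and retrieval surfaces only the strongest match, with the competing traces effectively inhibited:

```python
def recall(cue, traces):
    """Toy cue-based retrieval: return the single best-matching trace.

    `traces` maps a memory description to an activation level (higher
    means more recent, frequent, or distinctive). Rather than flooding
    the mind with every match, all but the strongest are suppressed.
    """
    matches = {memory: level for memory, level in traces.items() if cue in memory}
    return max(matches, key=matches.get) if matches else None

memories = {
    "conversation at the dinner table last night": 0.9,
    "discussion at the conference table this morning": 0.8,
    "bumped my knee on a coffee table years ago": 0.1,
    # ...thousands more low-activation "table" traces...
}

print(recall("table", memories))
# -> "conversation at the dinner table last night"
```

Replace `max` with a full sort and you get the overload Schacter describes: every table memory you own, all at once.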

The same goes for emotional experiences. We often lament that we take intensely emotional experiences hard; that we’re unable to shake the feeling certain situations imprint on us. PTSD is a particularly acute case of intense experience causing long-lasting mental harm. Yet on average, this same system probably serves our survival well:

Although intrusive recollections of trauma can be disabling, it is critically important that emotionally arousing experiences, which sometimes occur in response to life-threatening dangers, persist over time. The amygdala and related structures contribute to the persistence of such experiences by modulating memory formation, sometimes resulting in memories we wish we could forget. But this system boosts the likelihood that we will recall easily and quickly information about threatening or traumatic events whose recollection may one day be crucial for survival. Remembering life-threatening events persistently — where the incident occurred, who or what was responsible for it — boosts our chances of avoiding future recurrences.

Our brain has limitations, and with those limitations come trade-offs. One of the trade-offs our brain makes is to prioritize which information to hold on to and which to let go of. It must do this — as stated above, we’d be overloaded with information without this ability. The brain has evolved to prioritize information that is (see the sketch after this list):

  1. Used frequently
  2. Used recently
  3. Likely to be needed
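
To a software engineer, this looks exactly like a cache-eviction policy: keep what scores high, evict what scores low. Here is a rough, purely illustrative Python scoring function (the weights are arbitrary, chosen only to make the point):

```python
import time

def retention_score(item, now=None):
    """Score an item for retention on the brain's three priorities:
    frequent use, recent use, and likely future need."""
    now = now or time.time()
    recency = 1.0 / (1.0 + (now - item["last_used"]))  # decays with age
    return (
        1.0 * item["use_count"]          # 1. used frequently
        + 100.0 * recency                # 2. used recently
        + 50.0 * item["predicted_need"]  # 3. likely to be needed (0..1)
    )

items = [
    {"name": "your own phone number", "use_count": 500,
     "last_used": time.time() - 60, "predicted_need": 0.9},
    {"name": "a stranger's shirt color", "use_count": 1,
     "last_used": time.time() - 86_400, "predicted_need": 0.01},
]

# Whatever scores lowest is the first to fade, i.e., to be forgotten.
for item in sorted(items, key=retention_score, reverse=True):
    print(f'{item["name"]}: {retention_score(item):.2f}')
```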

Thus, we do forget things. The phenomenon of eyewitness testimony being unreliable can at least partially be explained by the fact that, when the event occurred, the witness probably did not know they’d need to remember it. There was no reason, in the moment, for that information to make an imprint. We have trouble recalling details of things that have not imprinted very deeply.

There are cases where people do have elements of what might seem like a “more optimal” memory system, and generally such people do not function well in the real world. Schacter gives us two examples in his book. The first is the famous mnemonist Shereshevski:

But what if all events were registered in elaborate detail, regardless of the level or type of processing to which they were subjected? The result would be a potentially overwhelming clutter of useless details, as happened in the famous case of the mnemonist Shereshevski. Described by Russian neuropsychologist Alexander Luria, who studied him for years, Shereshevski formed and retained highly detailed memories of virtually everything that happened to him — both the important and the trivial. Yet he was unable to function at an abstract level because he was inundated with unimportant details of his experiences — details that are best denied entry to the system in the first place. An elaboration-dependent system ensures that only those events that are important enough to warrant extensive encoding have a high likelihood of subsequent recollection.

The other case comes from more severely autistic individuals. When tested, autistic individuals make fewer conflations of the type that normally functioning individuals make; they are less likely to misremember hearing sweet when the word was actually candy, or stool when it was actually chair. These little misattributions are our brain working as it should, remembering the “gist” of things when the literal detail isn’t terribly important.

One symptom of autism is difficulty “generalizing” the way others are able to; difficulty developing the “gist” of situations and categories that, generally speaking, is so helpful to a normally functioning individual. Instead, autism can cause many to take things extremely literally and to have a great memory for rote factual information. (Picture Raymond Babbitt in Rain Man.) The trade-off is probably not desirable for most people — our system tends to serve us pretty well on the whole.

***

There’s at least one other way our system “saves us from ourselves” on average — our overestimation of self. Social psychologists love to demonstrate cases where humans overestimate their ability to drive, invest, make love, and so on. It even has an official name: overconfidence.

Yet without some measure of “overconfidence,” most of us would be quite depressed. In fact, when depressed individuals are studied, a tendency toward extreme realism is one of the things researchers frequently find:

On the face of it, these biases would appear to loosen our grasp on reality and thus represent a worrisome, even dangerous tendency. After all, good mental health is usually associated with accurate perceptions of reality, whereas mental disorders and madness are associated with distorted perceptions of reality.

But as the social psychologist Shelley Taylor has argued in her work on “positive illusions,” overly optimistic views of the self appear to promote mental health rather than undermine it. Far from functioning in an impaired or suboptimal manner, people who are most susceptible to positive illusions generally do well in many aspects of their lives. Depressed patients, in contrast, tend to lack the positive illusions that are characteristic of non-depressed individuals.

Remembering the past in an overly positive manner may encourage us to meet new challenges by promoting an overly optimistic view of the future, whereas remembering the past more accurately or negatively can leave us discouraged. Clearly there must be limits to such effects, because wildly distorted optimistic biases would eventually lead to trouble. But as Taylor points out, positive illusions are generally mild and are important contributors to our sense of well-being. To the extent memory bias promotes satisfaction with our lives, it can be considered an adaptive component of the cognitive system.

So here’s to the human brain: Flawed, certainly, but we must not forget that it does a pretty good job of getting us through the day alive and (mostly) well.

***

Still Interested? Check out Daniel Schacter’s fabulous The Seven Sins of Memory.

The Art of Thinking Clearly

(Update: While the book will likely make you smarter, there is some question as to where some of the ideas came from.)


Rolf Dobelli’s book, The Art of Thinking Clearly, is a compendium of systematic errors in decision making. While the list of fallacies is not complete, it’s a great launching pad into the best of what others have already figured out.

To avoid frivolous gambles with the wealth I had accumulated over the course of my literary career, I began to put together a list of … systematic cognitive errors, complete with notes and personal anecdotes — with no intention of ever publishing them. The list was originally designed to be used by me alone. Some of these thinking errors have been known for centuries; others have been discovered in the last few years. Some came with two or three names attached to them. … Soon I realized that such a compilation of pitfalls was not only useful for making investing decisions but also for business and personal matters. Once I had prepared the list, I felt calmer and more levelheaded. I began to recognize my own errors sooner and was able to change course before any lasting damage was done. And, for the first time in my life, I was able to recognize when others might be in the thrall of these very same systematic errors. Armed with my list, I could now resist their pull — and perhaps even gain an upper hand in my dealings.

Dobelli’s goal is to learn to recognize and evade the biggest errors in thinking. In so doing, he believes we might “experience a leap in prosperity. We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.”

Let’s take a look at some of the content.

Guarding Against Survivorship Bias

People systematically overestimate their chances of success. Guard against it by frequently visiting the graves of once-promising projects, investments, and careers. It is a sad walk but one that should clear your mind.

Pattern Recognition

When it comes to pattern recognition, we are oversensitive. Regain your scepticism. If you think you have discovered a pattern, first consider it pure chance. If it seems too good to be true, find a mathematician and have the data tested statistically.

Fighting Against Confirmation Bias

[T]ry writing down your beliefs — whether in terms of worldview, investments, marriage, health care, diet, career strategies — and set out to find disconfirming evidence. Axing beliefs that feel like old friends is hard work but imperative.

Dating Advice and Contrast

If you are seeking a partner, never go out in the company of your supermodel friends. People will find you less attractive than you really are. Go alone or, better yet, take two ugly friends.

Think Different

Fend off the availability bias by spending time with people who think differently than you do—people whose experiences and expertise are different from yours.

Guard Against Chauffeur Knowledge

Be on the lookout for chauffeur knowledge. Do not confuse the company spokesperson, the ringmaster, the newscaster, the schmoozer, the verbiage vendor, or the cliche generator with those who possess true knowledge. How do you recognize the difference? There is a clear indicator: True experts recognize the limits of what they know and what they do not know.

The Swimmer’s Body Illusion

Professional swimmers don’t have perfect bodies because they train extensively. Rather, they are good swimmers because of their physiques. How their bodies are designed is a factor for selection and not the result of their activities. … Whenever we confuse selection factors with results, we fall prey to what Taleb calls the swimmer’s body illusion. Without this illusion, half of advertising campaigns would not work. But this bias has to do with more than just the pursuit of chiseled cheekbones and chests. For example, Harvard has the reputation of being a top university. Many highly successful people have studied there. Does this mean that Harvard is a good school? We don’t know. Perhaps the school is terrible, and it simply recruits the brightest students around.

Peer Pressure

A simple experiment, carried out in the 1950s by legendary psychologist Solomon Asch, shows how peer pressure can warp common sense. A subject is shown a line drawn on paper, and next to it three lines—numbered 1, 2, and 3—one shorter, one longer, and one the same length as the original one. He or she must indicate which of the three lines corresponds to the original one. If the person is alone in the room, he gives correct answers because the task is really quite simple. Now five other people enter the room; they are all actors, which the subject does not know. One after another, they give wrong answers, saying “number 1,” although it’s very clear that number 3 is the correct answer. Then it is the subject’s turn again. In one-third of cases, he will answer incorrectly to match the other people’s responses.

Rational Decision Making and The Sunk Cost Fallacy

The sunk cost fallacy is most dangerous when we have invested a lot of time, money, energy, or love in something. This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes. … Rational decision making requires you to forget about the costs incurred to date. No matter how much you have already invested, only your assessment of the future costs and benefits counts.
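
Dobelli’s rule translates directly into code, and writing it out makes the point vivid: a rational continue-or-quit function simply has no parameter for what you have already spent. A hypothetical sketch (mine, not from the book):

```python
def should_continue(expected_future_benefit, expected_future_cost):
    """Decide whether to carry on with a project.

    Deliberately takes no `already_invested` argument: time, money,
    energy, and love spent to date are sunk either way and cannot be
    recovered by continuing.
    """
    return expected_future_benefit > expected_future_cost

# Whatever has already been spent never enters the calculation; only
# the road ahead counts.
print(should_continue(expected_future_benefit=50_000,
                      expected_future_cost=200_000))   # False: walk away
```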

Avoid Negative Black Swans

But even if you feel compelled to continue as such, avoid surroundings where negative Black Swans thrive. This means: Stay out of debt, invest your savings as conservatively as possible, and get used to a modest standard of living—no matter whether your big breakthrough comes or not.

Disconfirming Evidence


The confirmation bias is alive and well in the business world. One example: An executive team decides on a new strategy. The team enthusiastically celebrates any sign that the strategy is a success. Everywhere the executives look, they see plenty of confirming evidence, while indications to the contrary remain unseen or are quickly dismissed as “exceptions” or “special cases.” They have become blind to disconfirming evidence.

Still curious? Read The Art of Thinking Clearly.

The Lost Genius of Irrationality

The design of better heuristics … seems to be an extraordinary area for improving social behavior.

Rory Sutherland gave a great talk at TEDxOxford that is well worth your time. Rory explains how some rules work for individuals but not groups, the effect of naming a behavior, the origins of the term “designated driver,” and the “football illusion.”

Some rules work from the assumption of individual human agency, but once you understand the extent to which our actions are actually influenced by the actions of others, that assumption completely falls down. Rory offers the example of a speed camera.

The person in the right-hand lane (left-hand lane in North America) should at all times go a bit faster than the person in the lane to the left of them. And so what the speed camera does is it tries to encourage everybody to drive at the same speed in both lanes, which is deeply inimical to us instinctively because everything about us says the right lane must go a bit faster. And there are good reasons for this. If you don’t have a difference in speed between the lanes it’s impossible for people to change lanes when they need to.

How to Make Better Decisions In Life And Work

You’re probably not as effective at making decisions as you could be.

Don’t worry. I’m going to show you how you can make better decisions in work and life.

We’re going to explore Chip and Dan Heath’s new book, Decisive. It’s going to help us make better decisions, both as individuals and in groups.

But before we get into that, you should think about a tough decision you’re grappling with right now. Having a real decision in mind as you read this post will help make the advice here concrete.

Ok, let’s dig in.

“A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.”
— Daniel Kahneman, Thinking, Fast and Slow

We’re quick to jump to conclusions because we give too much weight to the information in front of us and fail to search for new information that might disprove our thoughts.

Nobel Prize-winning psychologist Daniel Kahneman called this tendency “what you see is all there is.” But that’s not the only reason we don’t make good decisions — there are many others. We’re overconfident. We look for information that fits our thoughts and ignore information that doesn’t. We are overly influenced by authority. We choose the short term over the long term. Once we’ve made a decision, we find it hard to change our mind. In short, our brains are flawed. I could go on.

Knowing about these and other biases isn’t enough; it doesn’t help us fix the problem. We need a framework for making decisions. In Decisive, the Heaths introduce a four-step process designed to counteract many biases.

In keeping with Kahneman’s visual metaphor, the Heaths refer to the tendency to see only what’s in front of us as a “spotlight” effect.

And that, in essence, is the core difficulty of decision making. What’s in the spotlight will rarely be everything we need to make good decisions, but we won’t always remember to shift the light.

Most of us rarely use a process for thinking about things. If we do use one it’s likely to be the pros-and-cons list. While better than nothing, this approach is still deeply flawed because it doesn’t really account for biases.

The Four Villains of Decision Making

  1. Narrow Framing: “… the tendency to define our choices too narrowly, to see them in binary terms. We ask, “Should I break up with my partner or not?” instead of “What are the ways I could make this relationship better?”
  2. Confirmation Bias: “When people have the opportunity to collect information from the world, they are more likely to select information that supports their preexisting attitudes, beliefs, and actions.” We pretend we want the truth, yet all we really want is reassurance.
  3. Short-term Emotion: “When we’ve got a difficult decision to make, our feelings churn. We replay the same arguments in our head. We agonize about our circumstances. We change our minds from day to day. If our decision was represented on a spreadsheet, none of the numbers would be changing—there’s no new information being added—but it doesn’t feel that way in our heads.”
  4. Overconfidence: “People think they know more than they do about how the future will unfold.”

The Heaths came up with a process to help us overcome these villains and make better choices. “We can’t deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you’ll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

They call this WRAP. “At its core, the WRAP model urges you to switch from ‘auto spotlight’ to manual spotlight.”


All in all, this was a great book. We tend to focus our efforts on analysis: if a decision turns out badly, the analysis must have been the problem. Not only does this ignore the fact that you can have ‘bad outcomes’ with good decisions, but it also places your spotlight on the analysis at the cost of the process by which the decision was made. More to come on this …

Read this next: What Matters More in Decisions: Analysis or Process?


What Matters More in Decisions: Analysis or Process?

Think of the last major decision your company made.

Maybe it was an acquisition, a large purchase, or perhaps it was whether to launch a new product.

Odds are three things went into that decision: (1) It probably relied on the insights of a few key executives; (2) it involved some sort of fact gathering and analysis; and (3) it was likely enveloped in some sort of decision process—whether formal or informal—that translated the analysis into a decision.

Now how would you rate the quality of your organization’s strategic decisions?

If you’re like most executives, the answer wouldn’t be positive:

In a recent McKinsey Quarterly survey of 2,207 executives, only 28 percent said that the quality of strategic decisions in their companies was generally good, 60 percent thought that bad decisions were about as frequent as good ones, and the remaining 12 percent thought good decisions were altogether infrequent.

How could it be otherwise?

Product launches are frequently behind schedule and over budget. Strategic plans often ignore even the anticipated response of competitors. Mergers routinely fail to live up to the promises made in press releases. The persistence of these problems across time and organizations, both large and small, would indicate that we can make better decisions.

Looking at how organizations make decisions is a good place to start if we’re trying to improve the quality of decisions and remove cognitive biases.

While we often make decisions with our gut, those decisions leave us susceptible to biases. To counter gut decisions, many organizations gather data and analyze their choices. The widespread belief is that analysis reduces biases.

But is putting your faith in analysis any better than using your gut? What does the evidence say? Is there a better way?

Decisions: Analysis or Process

Dan Lovallo and Olivier Sibony set out to find out. Lovallo is a professor at the University of Sydney and Sibony is a director at McKinsey & Company. Together they studied 1,048 “major” business decisions over five years.

What they discovered will surprise you.

Most of the decisions were based not on gut calls but on rigorous analysis. In short, most people did all the legwork we think we’re supposed to do: they delivered large quantities of detailed analysis. Yet this wasn’t enough. “Our research indicates that, contrary to what one might assume, good analysis in the hands of managers who have good judgment won’t naturally yield good decisions.”

These two quotes by Warren Buffett and Charlie Munger explain how analysis can easily go astray.

I have no use whatsoever for projections or forecasts. They create an illusion of apparent precision. The more meticulous they are, the more concerned you should be. We never look at projections … — Warren Buffett

[Projections] are put together by people who have an interest in a particular outcome, have a subconscious bias, and its apparent precision makes it fallacious. They remind me of Mark Twain’s saying, ‘A mine is a hole in the ground owned by a liar.’ Projections in America are often a lie, although not an intentional one, but the worst kind because the forecaster often believes them himself. — Charlie Munger

But Lovallo and Sibony didn’t only look at analysis, they also asked executives about the process. Did they, for example, “explicitly explore and discuss major uncertainties or discuss viewpoints that contradicted the senior leader’s?”

So what matters more, process or analysis? After comparing the results, they determined that “process mattered more than analysis—by a factor of six.”

This finding does not mean that analysis is unimportant, as a closer look at the data reveals: almost no decisions in our sample made through a very strong process were backed by very poor analysis. Why? Because one of the things an unbiased decision-making process will do is ferret out poor analysis. The reverse is not true; superb analysis is useless unless the decision process gives it a fair hearing.

To illustrate the weakness of how most organizations make decisions, Sibony used an interesting analogy: the legal system.

Imagine walking into a courtroom where the trial consists of a prosecutor presenting PowerPoint slides. In 20 pretty compelling charts, he demonstrates why the defendant is guilty. The judge then challenges some of the facts of the presentation, but the prosecutor has a good answer to every objection. So the judge decides, and the accused man is sentenced.

That wouldn’t be due process, right? So if you would find this process shocking in a courtroom, why is it acceptable when you make an investment decision? Now of course, this is an oversimplification, but this process is essentially the one most companies follow to make a decision. They have a team arguing only one side of the case. The team has a choice of what points it wants to make and what way it wants to make them. And it falls to the final decision maker to be both the challenger and the ultimate judge. Building a good decision-making process is largely ensuring that these flaws don’t happen.

Understanding biases doesn’t make you immune to them. A disciplined decision process is the best place to improve the quality of decisions and guard against common decision-making biases.

Still curious? Read this next: A process to make better decisions.

The inspiration for this post comes from Chip and Dan Heath in Decisive.

Searching Google, And Finding Ourselves

Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains, speaks to how the digital age is transforming what it means to ‘search.’

When we talk about “searching” these days, we’re almost always talking about using Google to find something online.

That’s a big change for a word that long carried existential connotations — a word that had been bound up in our sense of what it meant to be human. We didn’t just search for car keys or missing socks. We searched for truth, for meaning, for transcendence. Searching was an act of exploration that took us out into the world, beyond ourselves, in order to know the world, and ourselves, more fully.

In its original form, the Google search engine did just that. It transported us out into a messy and confusing world — the world of the web — with the intent of helping us make sense of it.

But that’s less true now. Google’s big goal is no longer to read the web. It’s to read us.

… These days, Google’s search engine doesn’t push us outward so much as turn us inward. It gives us information that fits the pattern of behavior and thinking we’ve displayed in the past. It reinforces our biases rather than challenging them, and subverts the act of searching in its most meaningful sense.

As Eli Pariser writes in The Filter Bubble: “When technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens.”

(h/t Annie)

Still curious? Check out The Filter Bubble — What the Internet is Hiding From You and DuckDuckGo.