Tag Archives: Decision Making

How Analogies Reveal Connections, Spark Innovation, and Sell Our Greatest Ideas

Image source: xkcd.com

John Pollack is a former presidential speechwriter. If anyone knows the power of words to move people to action, shape arguments, and persuade, it is he.

In Shortcut: How Analogies Reveal Connections, Spark Innovation, and Sell Our Greatest Ideas, he explores the powerful role of analogy in persuasion and creativity.

While they often operate unnoticed, analogies aren’t accidents; they’re arguments—arguments that, like icebergs, conceal most of their mass and power beneath the surface. And whoever has the best argument wins.

But analogies do more than just persuade others — they also play a role in innovation and decision making.

From the bloody Chicago slaughterhouse that inspired Henry Ford’s first moving assembly line, to the “domino theory” that led America into the Vietnam War, to the “bicycle for the mind” that Steve Jobs envisioned as a Macintosh computer, analogies have played a dynamic role in shaping the world around us.

Despite their importance, many people have only a vague sense of the definition.

What is an Analogy?

In broad terms, an analogy is simply a comparison that asserts a parallel—explicit or implicit—between two distinct things, based on the perception of a shared property or relation. In everyday use, analogies appear in many forms: metaphors, similes, political slogans, legal arguments, marketing taglines, mathematical formulas, biblical parables, logos, TV ads, euphemisms, proverbs, fables, and sports clichés.

Because they are so well disguised, they play a bigger role than we consciously realize. Analogies don’t just make effective arguments; they also trigger emotions. And emotions make it hard to make rational decisions.

While we take analogies for granted, the ideas they convey are notably complex.

All day, every day, we make or evaluate one analogy after another, because such comparisons are the only practical way to sort a flood of incoming data, place it within the context of our experience, and make decisions accordingly.

Remember the powerful metaphor that argument is war. It shapes a wide variety of expressions: “your claims are indefensible,” “attacking the weak points,” and “You disagree? OK, shoot.”

Or consider the Map and the Territory: analogies give people the map but explain nothing of the territory.

Warren Buffett is one of the best at using analogies to communicate effectively. One of my favorites is his observation that “You never know who’s swimming naked until the tide goes out.” In other words, when times are good everyone looks amazing. When times suck, hidden weaknesses are exposed. The same could be said of analogies:

We never know what assumptions, deceptions, or brilliant insights they might be hiding until we look beneath the surface.

Most people underestimate the importance of a good analogy. As with many things in life, this lack of awareness comes at a cost. Ignorance is expensive.

Evidence suggests that people who tend to overlook or underestimate analogy’s influence often find themselves struggling to make their arguments or achieve their goals. The converse is also true. Those who construct the clearest, most resonant and apt analogies are usually the most successful in reaching the outcomes they seek.

The key to all of this is figuring out why analogies function so effectively and how they work. Once we know that, we should be able to craft better ones.

Don’t Think of an Elephant

Effective, persuasive analogies frame situations and arguments, often so subtly that we don’t even realize there is a frame, let alone one that might not work in our favor. Such conceptual frames, like picture frames, include some ideas, images, and emotions and exclude others. By setting a frame, a person or organization can, for better or worse, exert remarkable influence on the direction of their own thinking and that of others.

He who holds the pen frames the story. The first person to frame the story controls the narrative, and it takes a massive amount of energy to change its direction. Sometimes even the way people come across information shapes it — stories that would be non-events if disclosed proactively become front-page news because someone found out.

In Don’t Think of an Elephant, George Lakoff explores the issue of framing. The book famously begins with the instruction “Don’t think of an elephant.”

What’s the first thing we all do? Think of an elephant, of course. It’s almost impossible not to. When we stop consciously thinking about it, it floats away and we move on to other topics — like the new email that just arrived. But sooner or later it pops back into consciousness and brings some friends — associated ideas, other exotic animals, or even thoughts of the GOP.

“Every word, like elephant, evokes a frame, which can be an image or other kinds of knowledge,” Lakoff writes. This is why we want to control the frame rather than be controlled by it.

In Shortcut, Pollack recounts Lakoff’s analysis of an analogy President George W. Bush used in the 2004 State of the Union address, in which he argued the Iraq war was necessary despite international criticism. Before we go on, take Bush’s side and think about how you would argue this point: how would you defend it?

In the speech, Bush proclaimed that “America will never seek a permission slip to defend the security of our people.”

As Lakoff notes, Bush could have said, “We won’t ask permission.” But he didn’t. Instead he intentionally used the analogy of permission slip and in so doing framed the issue in terms that would “trigger strong, more negative emotional associations that endured in people’s memories of childhood rules and restrictions.”

Commenting on this, Pollack writes:

Through structure mapping, we correlate the role of the United States to that of a young student who must appeal to their teacher for permission to do anything outside the classroom, even going down the hall to use the toilet.

But is seeking diplomatic consensus to avoid or end a war actually analogous to a child asking their teacher for permission to use the toilet? Not at all. Yet once this analogy has been stated (Farnam Street editorial: and tweeted), the debate has been framed. Those who would reject a unilateral, my-way-or-the-highway approach to foreign policy suddenly find themselves battling not just political opposition but people’s deeply ingrained resentment of childhood’s seemingly petty regulations and restrictions. On an even subtler level, the idea of not asking for a permission slip also frames the issue in terms of sidestepping bureaucratic paperwork, and who likes bureaucracy or paperwork?

Deconstructing Analogies

By deconstructing analogies, we can see how they function so effectively. Pollack argues that they meet five essential criteria.

  1. Use the highly familiar to explain something less familiar.
  2. Highlight similarities and obscure differences.
  3. Identify useful abstractions.
  4. Tell a coherent story.
  5. Resonate emotionally.

Let’s explore how these work in greater detail, using the example of master thief Bruce Reynolds, who described the Great Train Robbery as his Sistine Chapel.

The Great Train Robbery

In the dark early hours of August 8, 1963, an intrepid gang of robbers hot-wired a six-volt battery to a railroad signal not far from the town of Leighton Buzzard, some forty miles north of London. Shortly, the engineer of an approaching mail train, spotting the red light ahead, slowed his train to a halt and sent one of his crew down the track, on foot, to investigate. Within minutes, the gang overpowered the train’s crew and, in less than twenty minutes, made off with the equivalent of more than $60 million in cash.

Years later, Bruce Reynolds, the mastermind of what quickly became known as the Great Train Robbery, described the spectacular heist as “my Sistine Chapel.”

Use the familiar to explain something less familiar

Reynolds exploits the public’s basic familiarity with the famous chapel in Vatican City, which, after Leonardo da Vinci’s Mona Lisa, is perhaps the best-known work of Renaissance art in the world. Millions of people, even those who aren’t art connoisseurs, would likely share the cultural opinion that the paintings in the chapel represent “great art” (as compared to the smaller subset of people who might feel the same way about Jackson Pollock’s drip paintings or Marcel Duchamp’s upturned urinal).

Highlight similarities and obscure differences

Reynolds’s analogy highlights, through implication, similarities between the heist and the chapel—both took meticulous planning and masterful execution. After all, stopping a train and stealing the equivalent of $60 million—and doing it without guns—does require a certain artistry. At the same time, the analogy obscures important differences. By invoking the image of a holy sanctuary, Reynolds triggers a host of associations in the audience’s mind—God, faith, morality, and forgiveness, among others—that camouflage the fact that he’s describing an action few would consider morally commendable, even if the artistry involved in robbing that train was admirable.

Identify useful abstractions

The analogy offers a subtle but useful abstraction: Genius is genius and art is art, no matter what the medium. The logic? If we believe that genius and artistry can transcend genre, we must concede that Reynolds, whose artful, ingenious theft netted millions, is an artist.

Tell a coherent story

The analogy offers a coherent narrative. Calling the Great Train Robbery his Sistine Chapel offers the audience a simple story that, at least on the surface, makes sense: Just as Michelangelo was called by God, the pope, and history to create his greatest work, so too was Bruce Reynolds called by destiny to pull off the greatest robbery in history. And if the Sistine Chapel endures as an expression of genius, so too must the Great Train Robbery. Yes, robbing the train was wrong. But the public perceived it as a largely victimless crime, committed by renegades who were nothing if not audacious. And who but the most audacious in history ever create great art? Ergo, according to this narrative, Reynolds is an audacious genius, master of his chosen endeavor, and an artist to be admired in public.

There is an important point here. The narrative need not be accurate. It is the feelings and ideas the analogy evokes that make it powerful. Within the structure of the analogy, the argument rings true. The framing is enough to establish it succinctly and subtly. That’s what makes it so powerful.

Resonate emotionally

The analogy resonates emotionally. To many people, mere mention of the Sistine Chapel brings an image to mind, perhaps the finger of Adam reaching out toward the finger of God, or perhaps just that of a lesser chapel with which they are personally familiar. Generally speaking, chapels are considered beautiful, and beauty is an idea that tends to evoke positive emotions. Such positive emotions, in turn, reinforce the argument that Reynolds is making—that there’s little difference between his work and that of a great artist.

Jumping to Conclusions

Daniel Kahneman describes the two systems that govern the way we think: System 1 and System 2. In his book Thinking, Fast and Slow, he writes, “Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake are acceptable, and if the jump saves much time and effort.”

“A good analogy serves as an intellectual springboard that helps us jump to conclusions,” Pollack writes. He continues:

And once we’re in midair, flying through assumptions that reinforce our preconceptions and preferences, we’re well on our way to a phenomenon known as confirmation bias. When we encounter a statement and seek to understand it, we evaluate it by first assuming it is true and exploring the implications that result. We don’t even consider dismissing the statement as untrue unless enough of its implications don’t add up. And consider is the operative word. Studies suggest that most people seek out only information that confirms the beliefs they currently hold and often dismiss any contradictory evidence they encounter.

The ongoing battle between fact and fiction commonly takes place in our subconscious systems. In The Political Brain: The Role of Emotion in Deciding the Fate of the Nation, Drew Westen, an Emory University psychologist, writes: “Our brains have a remarkable capacity to find their way toward convenient truths—even if they are not all true.”

This also helps explain why getting promoted has almost nothing to do with your performance.

Remember Apollo Robbins? He’s a professional pickpocket. While he has unique skills, he succeeds largely through the choreography of people’s attention. “Attention,” he says, “is like water. It flows. It’s liquid. You create channels to divert it, and you hope that it flows the right way.”

“Pickpocketing and analogies are in a sense the same,” Pollack concludes, “as the misleading analogy picks a listener’s mental pocket.”

And this is true whether someone else diverts our attention through a resonant but misleading analogy—“Judges are like umpires”—or we simply choose the wrong analogy all by ourselves.

Reasoning by Analogy

We rarely stop to see how much of our reasoning is done by analogy. In a 2005 study published in the Harvard Business Review, Giovanni Gavetti and Jan Rivkin wrote: “Leaders tend to be so immersed in the specifics of strategy that they rarely stop to think how much of their reasoning is done by analogy.” As a result, they miss things. They make connections that don’t exist. They don’t check assumptions. They miss useful insights. By contrast, “Managers who pay attention to their own analogical thinking will make better strategic decisions and fewer mistakes.”


Shortcut goes on to explore when to use analogies and how to craft them to maximize persuasion.

Map and Territory

“(History) offers a ridiculous spectacle of a fragment expounding the whole.”
— Will Durant in Our Oriental Heritage


In 1931, in New Orleans, Louisiana, mathematician Alfred Korzybski presented a paper on mathematical semantics. To the non-technical reader, most of the paper reads like an abstruse argument on the relationship of mathematics to human language, and of both to physical reality. Important stuff certainly, but not necessarily immediately useful for the layperson.

However, in his string of arguments on the structure of language, Korzybski introduced and popularized the idea that the map is not the territory. In other words, the description of the thing is not the thing itself. The model is not reality. The abstraction is not the abstracted. This has enormous practical consequences.



Why Early Decisions Have the Greatest Impact and Why Growing Too Much Is a Bad Thing

I never went to engineering school. My undergrad is in computer science. Despite that, I’ve always wanted to learn more about engineering.

John Kuprenas and Matthew Frederick have put together a book, 101 Things I Learned in Engineering School, which contains some of the big ideas.

In the author’s note, Kuprenas writes:

(This book) introduces engineering largely through its context, by emphasizing the common sense behind some of its fundamental concepts, the themes intertwined among its many specialities, and the simple abstract principles that can be derived from real-world circumstances. It presents, I believe, some clear glimpses of the forest as well as the trees within it.

Here are three (of the many) things I noted in the book.


#8 An object receives a force, experiences stress, and exhibits strain.


Force, stress, and strain are used somewhat interchangeably in the lay world and may even be used with less than ideal rigor by engineers. However, they have different meanings.

A force, sometimes called “load,” exists external to and acts upon a body, causing it to change speed, direction, or shape. Examples of forces include water pressure on a submarine hull, snow loads on a bridge, and wind loads on the sides of a skyscraper.

Stress is the “experience” of a body—its internal resistance to an external force acting on it. Stress is force divided by unit area and is expressed in units such as pounds per square inch.

Strain is a product of stress. It is the measurable percentage of deformation or change in an object, such as a change in length.
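
To make the distinctions concrete, here is a minimal sketch in Python with made-up numbers (the book gives none): a load acts on a body, stress is that load divided by the cross-sectional area it acts on, and strain is the resulting percentage deformation.

```python
# Force -> stress -> strain, with made-up illustrative numbers.

def stress_psi(force_lbs: float, area_sq_in: float) -> float:
    """Stress: internal resistance, expressed as force per unit area (psi)."""
    return force_lbs / area_sq_in

def strain_pct(delta_length_in: float, original_length_in: float) -> float:
    """Strain: deformation as a percentage of the original dimension."""
    return 100.0 * delta_length_in / original_length_in

# A 10,000 lb load on a column with a 4 sq in cross section...
print(stress_psi(10_000, 4.0))   # 2500.0 psi of internal resistance
# ...that shortens the 120 in column by 0.06 in.
print(strain_pct(0.06, 120.0))   # 0.05 percent strain
```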

#48 Early decisions have the greatest impact.


Decisions made just days or weeks into a project—assumptions of end-user needs, commitments to a schedule, the size and shape of a building footprint, and so on—have the most significant impact on design, feasibility, and cost. As decisions are made later and later in the design process, their influence decreases. Minor cost savings sometimes can be realized through value engineering in the later stages of design, but the biggest cost factors are embedded at the outset in a project’s DNA.

Everyone seems to understand this point on the surface, and yet few people consider the implications. I know a lot of people who have made their careers out of cleaning up their own messes: they make a poor initial decision and then work extra hours, running around in stress and panic, to clean it up. In the worst organizations, these people are promoted for doing an exceptional job.

Proper management of early decisions produces more free time and lower stress.

#75 A successful system won’t necessarily work at a different scale.


An imaginary team of engineers sought to build a “super-horse” that would be twice as tall as a normal horse. When they created it, they discovered it to be a troubled, inefficient beast. Not only was it two times the height of a normal horse, it was twice as wide and twice as long, resulting in an overall mass eight times greater than normal. But the cross-sectional area of its veins and arteries was only four times that of a normal horse, calling for its heart to work twice as hard. The surface area of its feet was four times that of a normal horse, but each foot had to support twice the weight per unit of surface area compared to a normal horse. Ultimately, the sickly animal had to be put down.
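
The arithmetic behind the parable is the square-cube law: scale every linear dimension by a factor k, and areas grow as k^2 while volumes (and therefore mass) grow as k^3. A minimal sketch:

```python
# The square-cube law behind the super-horse parable.
k = 2  # linear scale factor: twice as tall, twice as wide, twice as long

mass_factor = k ** 3         # mass tracks volume: 8x a normal horse
artery_area_factor = k ** 2  # cross sections track area: only 4x
foot_area_factor = k ** 2    # foot surface area: only 4x

# Load per unit of capacity doubles in both cases:
print(mass_factor / artery_area_factor)  # 2.0 -> the heart works twice as hard
print(mass_factor / foot_area_factor)    # 2.0 -> twice the weight per unit of foot area
```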

This becomes interesting when you think about the ideal size for things and how we, as well-intentioned humans, often make things worse. This has a name: iatrogenics.

Let us briefly put an organizational lens on this. Inside organizations, resources are scarce. Generally, the more people you have under you, the more influence and authority you have inside the organization. Unless there is a proper culture and incentive system in place, your incentive is to grow, not shrink. In fact, in all the meetings I’ve ever been in with senior management, I can’t recall anyone who ran a division saying, “I have too many resources.” It’s a derivative of Parkinson’s Law — only work isn’t expanding to fill the time available. Instead, work is expanding to fill the number of people.

Contrast that with Berkshire Hathaway, run by Warren Buffett. In his 2010 letter to shareholders, he wrote:

Our flexibility in respect to capital allocation has accounted for much of our progress to date. We have been able to take money we earn from, say, See’s Candies or Business Wire (two of our best-run businesses, but also two offering limited reinvestment opportunities) and use it as part of the stake we needed to buy BNSF.

In the 2014 letter he wrote:

To date, See’s has earned $1.9 billion pre-tax, with its growth having required added investment of only $40 million. See’s has thus been able to distribute huge sums that have helped Berkshire buy other businesses that, in turn, have themselves produced large distributable profits. (Envision rabbits breeding.) Additionally, through watching See’s in action, I gained a business education about the value of powerful brands that opened my eyes to many other profitable investments.

There is an optimal size to See’s. Had it retained the $1.9 billion in earnings it distributed to Berkshire, the CEO and management team might have had a claim to bigger paychecks (they’d be managing roughly $2 billion in assets instead of $40 million), but the result would have been very suboptimal.
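
To see why, here is a minimal sketch of the arithmetic, using the figures quoted above from the 2014 letter; framing it as return on incremental capital is my gloss, not Buffett’s.

```python
# Figures quoted above from Buffett's 2014 letter.
cumulative_pretax_earnings = 1.9e9  # See's pre-tax earnings to date
added_investment = 40e6             # the only growth capital See's required

# Each reinvested dollar produced roughly $47.50 of cumulative pre-tax profit.
print(cumulative_pretax_earnings / added_investment)  # 47.5

# Had See's retained those earnings, management would oversee ~$1.94B of
# assets instead of $40M -- about 48x the capital -- with no sign the candy
# business could redeploy it at anything like the same return.
print((added_investment + cumulative_pretax_earnings) / added_investment)  # 48.5
```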

Our pursuit of growth beyond a certain point often ensures that one of the biggest forces in the world, time, is working against us. “What is missing,” writes Jeff Stibel in BreakPoint, “is that the unit of measure for progress isn’t size, it’s time.”


Other books in the series:
101 Things I Learned in Culinary School
101 Things I Learned in Business School
101 Things I Learned in Law School
101 Things I Learned in Film School

Bias from Self-Interest — Self Deception and Denial to Reduce Pain or Increase Pleasure; Regret Avoidance (Tolstoy effect)

We can ignore reality, but we cannot ignore the consequences of reality.

Bias from self-interest affects everything from how we see and filter information to how we avoid pain. It affects our self-preservation instincts and helps us rationalize our choices. In short, it permeates everything.


Our Self-Esteem

Our self-esteem can be a very important aspect of personal well-being, adjustment, and happiness. People with higher self-esteem are reportedly happier with their lives, have fewer interpersonal problems, achieve at a higher and more consistent level, and give in less to peer pressure.

The strong motivation to preserve a positive and consistent self-image is more than evident in our lives.

We attribute success to our own abilities and failures to environmental factors and we continuously rate ourselves as better than average on any subjective measure – ethics, beauty and ability to get along with others.

Look around – these positive illusions appear to be the rule rather than the exception in well-adjusted people.

However, sometimes life is harsh on us and gives few if any reasons for self-love.

We get fired, a relationship ends, or we make decisions that are not well aligned with our inner selves. And so we come up with ways to repair our damaged self-image.

Under the influence of bias from self-interest, we may find ourselves drifting away from the facts and spinning them to the point where they become acceptable. While the tendency is mostly harmless and episodic, there are cases when it grows extreme.

The imperfect and confusing realities of our lives can activate strong responses that help us preserve our fragile self-images. Usually amplified by love, death, or chemical dependency, a strong self-serving bias may leave a person with little capacity to assess the situation objectively.

In his speech The Psychology of Human Misjudgment, Charlie Munger reflects on the extreme tendencies that serious criminals display in Tolstoy’s novels and beyond. Their defense mechanisms divide into two distinct types: they either deny committing the crime at all, or they think the crime is justifiable in light of their hardships.

Munger calls these two cases the Tolstoy effect.

Avoiding Reality by Denying It

Denial occurs when we encounter a serious thought about reality but decide to ignore it.

Imagine one day you notice a strange, dark spot on your skin. You feel a sudden sense of anxiety, but soon go on with your day and forget about it. Weeks later, it has not gone away and has slowly become darker, and you finally decide to take action and visit the doctor.

In such cases, small doses of denial might serve us well. We have time to absorb the information slowly and figure out the next steps, in case our darkest fears come true. However, once denial becomes a prolonged way of coping with troubling matters, causing our problems to amplify, we are bound to suffer the consequences.

The consequences vary. The mildest is a simple inability to move on with our lives.

Charlie Munger was startled to see a case of persistent denial in a family friend:

This first really hit me between the eyes when a friend of our family had a super-athlete, super-student son who flew off a carrier in the north Atlantic and never came back, and his mother, who was a very sane woman, just never believed that he was dead.

The case made him realize that denial is often amplified by intense feelings surrounding love and death. We deny to avoid pain.

While denial of the death of someone close is usually harmless and understandable, it can become a significant problem when we deny an issue that is detrimental to ourselves and others.

A good example of such an issue is physical dependency, such as alcoholism or drug addiction.

Munger advises staying away from any opportunity to slip into addiction, since the psychological effects are the most damaging. The reality distortion in the minds of addicts leads them to believe they remain in respectable condition, with reasonable prospects, even as their situation keeps deteriorating.

Rationalizing Our Choices

A less severe case of distortion, but no less foolish, is our tendency to rationalize the choices we have made.

Most of us have a positive concept of ourselves and we believe ourselves to be competent, moral and smart.

We can go to great lengths to preserve this self-image. No doubt we have all engaged in behaviors that are less than consistent with our inner self-image and then used phrases such as “not telling the truth is not lying,” “I didn’t have the time,” and “others are even worse” to justify our less-than-ideal actions.

This tendency can be explained in part by the engine that drives self-justification: cognitive dissonance. It is the state of tension that occurs whenever we hold two opposing thoughts in our heads, such as “smoking is bad” and “I smoke two packs a day.”

Dissonance bothers us under any circumstances, but it becomes particularly unbearable when our self-concept is threatened by it. After all, we spend our lives trying to be consistent and meaningful. This drive to “save face” is so powerful that it often overrules the pure effects of rewards and punishments as assumed by economic theory or observed in simple animal behavioral research.

The most obvious way to quiet the dissonance is to quit. However, a smoker who has tried to quit and failed can also quiet it by revising the other belief, deciding that smoking is not all that bad. That is the simple, failure-free option: it lets her feel good about herself and requires hardly any effort. And having suspended our moral compass once, finding rationales for bad but fixable choices, we give ourselves permission to repeat them, continuing the vicious cycle.

The Vicious Cycle of Self-Justification


Carol Tavris and Elliot Aronson in their book Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts explain the vicious cycle of choices with an analogy of a pyramid.

Consider the case of two reasonably honest students at the beginning of the term. They face the temptation to cheat on an important test. One of them gives in and the other does not. How do you think they will feel about cheating a week later?

Most likely, their initially torn opinions will have polarized in light of their choices. Now take this effect and amplify it over the term. By the time they are through with it, two things will have happened:
1) They will be very far apart in their beliefs.
2) They will be convinced that they have always felt strongly about the issue and their side of the argument.

Just like those students, we often stand at the top of the choice pyramid, facing a decision whose consequences are morally ambiguous. The first choice then starts a process of entrapment (action, justification, further action) that increases the intensity of our commitment.


Over time our choices reinforce themselves, and toward the bottom of the pyramid we find ourselves rolling toward increasingly extreme views.

Consider the famous Stanley Milgram experiment, in which two-thirds of the 3,000 subjects administered a life-threatening level of electric shock to another person. While this study is often used to illustrate our obedience to authority, it also demonstrates the effects of self-justification.

Simply imagine someone asking you, as a favor, to inflict 500V of potentially deadly and incredibly painful shock on another person for the sake of science. Chances are most of us would refuse under any circumstances.

Now suppose the researcher tells you he is interested in the effects of punishment on learning, and that you will only have to inflict hardly noticeable electric impulses on another person. You are even encouraged to try the lowest level of 10V yourself to confirm that the pain is barely noticeable.

Once you have gone along, the experimenter asks you to increase the shock to 20V, which seems like a small increase, so you agree without thinking much. Then the cascade continues: if you gave a 20V shock, what is the harm in giving 30V? Suddenly you find yourself unable to draw the line, so you simply follow the instructions.

When people are asked in advance whether they would administer a shock above 450V, nearly nobody believes they would. Yet when facing the choice under pressing circumstances, two-thirds of them did!

The implications here are powerful – if we don’t actively draw the line ourselves, our habits and circumstances will decide for us.

Making Smarter Choices

We will all do dumb things. We can’t help it. We are wired that way. However, we are not doomed to live in denial or keep striving to justify our actions. We always have the choice to correct our tendencies, once we recognize them.

A better understanding of our minds serves as the first step towards breaking the self-justification habit. It takes time, self-reflection and willingness to become more mindful about our behavior and reasons for our behavior, but it is well worth the effort.

The authors of Mistakes Were Made (But Not by Me) give the example of the conservative columnist William Safire, who wrote a column criticizing Hillary Clinton’s efforts to conceal the identity of her health care task force. A few years later, Dick Cheney, a Republican whom Safire admired, made a similar move by insisting on keeping his energy task force secret.

The alarm bell in Safire’s head rang, and he admitted that the temptation to rationalize the occasion and apply a double standard was enormous. However, he recognized the dissonance at work and ended up writing a similar column about Cheney.

We know that Safire’s ability to spot his own dissonance and do the fair thing is rare. People will bend over backward to reduce dissonance in a way that is favorable to them and their team. Resisting that urge is not easy to do, but it is much better than letting the natural psychological tendencies cripple the integrity of our behaviors. There are ways to make fairness easier.

Making Things Easier

On the personal level, Charlie Munger suggests we face two simple facts. First, fixable but unfixed bad performance is bad character, and it tends to create more of itself and cause more damage — a sort of Gresham’s Law. Second, in demanding places like athletic teams, excuses and bad behavior will not get us far.

On the institutional level, Munger advises building a fair, meritocratic, demanding culture, plus personnel-handling methods that build morale. His second piece of advice is severance of the worst offenders, when possible.

Munger expands on the second point by noting that we cannot, of course, sever our own children, so we must instead try to fix them to the best of our ability. He gives the real-life example of a child who had the habit of taking candy from the stock of his father’s employer, with the excuse that he had intended to replace it later. The father said words that never left the child:

“Son, it would be better for you to simply take all you want and call yourself a thief every time you do it.”

It turns out the child in this example grew up to be the dean of the University of Southern California Business School, where Munger delivered the speech.

If we are effective, the lessons we teach our children will serve them well throughout their lives.


There is so much more to touch on with bias from self-interest, including its relation to hierarchy, how it distorts information, how it feeds our desire for self-preservation and scarcity, how it affects group preservation, and its relationship to territory.

Bias From Self-Interest is part of the Farnam Street latticework of mental models.

The Four States of Mind

The truth is, you have control of your thoughts, reactions, and responses. And once you understand how powerful that choice can be, you’ll be able to change more aspects of your life than you can imagine.

We’re busier than ever. We’re often on autopilot.

We “go through the motions” without really paying attention to the decisions we’re making or the implications. This is often where we go in the wrong direction and our view becomes narrow – we miss the bigger opportunity.

Sebastian Bailey elaborates on this in Mind Gym: Achieve More by Thinking Differently.

We can focus internally or externally.

Internal Focus

When your focus is internal, it’s much like you’re having a conversation with yourself. Consider the voice you hear in your head as you read this book. Even while you’re reading our words, another dialogue might be asking if it’s worth continuing to read this chapter or if now is the time to have a cup of coffee. … When your focus is internal, you are conscious of the fact that you are thinking; you can hear and pay attention to the running commentary in your head.

External Focus
Where are you right now? What’s happening? What noises do you hear? Who is close to you?

External focus is an awareness of the things outside your own head. And when you focus in this way, you aren’t aware of what you’re thinking. Your attention is on what is going on, not on what you think about it, how to interpret it, or whether it could have an impact on your future.

When you are really caught up in something, whether it’s the thrill of a football game or the latest twist in your favorite reality show, you are externally focused. And when you find yourself thinking, Why am I wasting time watching this ridiculous reality show? you have returned to an internal focus.

Where should your focus be?

Your mind is always occupied in one of two places: what is going on inside your head or what is going on outside your head. It is impossible to focus at the same time on both what’s internal and what’s external, just as it is to focus on neither. What is possible, though, is to switch between them, which, with a little mental discipline, you can do pretty much whenever you want.

The truth is we need to alternate between being internally focused and being externally focused.

The Four States of Mind

When you combine the types of focus (internal and external) with the ways we focus (helpful and harmful), you get four distinct states of mind: autopilot, critical, thinking, and engaged.

We want to be in the helpful states. And we want to flip between thinking and engaged.

First things first, we need to recognize what state of mind we’re in.

Autopilot: Recognizing Habits of the Mind

Autopilot kicks in when you allow what was once exciting and challenging to become boring or mundane. You stop thinking about the situation and, instead, respond in preprogrammed ways.

This happens in several ways. What turns autopilot on (and turns the thinking mind off)?

The Familiarity Trap

We label things and experiences to help us understand how they fit with the world around us. For example, you see someone crying and automatically think, Crying equals sad; therefore, that person must be upset. Your automatic response prevents you from considering alternative explanations. The person crying could be acting, chopping onions, or laughing so hard that tears are streaming down his or her face. But when you are caught in the familiarity trap, you are unlikely to consider these alternatives. The familiarity trap explains, say, why security officials at the airport rotate roles. If a person looks at an X-ray screen for long enough, a nuclear bomb might go through without that person noticing. Some pianists learn their pieces away from a keyboard so they won’t become too familiar with it and fall into autopilot when they perform.

The Single View

Of course, we all see the world through our own eyes. My eyes are different from your eyes. But when we try to consider an issue or solve a problem, we tend to assume that the way we see the world is the right way to see it. Why wouldn’t we? And yet our view isn’t always the right one. Thinking creatively demands that you look at a familiar problem with fresh eyes— using a perspective different from your own. To actually achieve this, you need to recognize that your mind is functioning on autopilot, temporarily fixed by your worldview and your life experiences.


To demonstrate that pressure often leads us to behave in autopilot mode, psychologists John Darley and Daniel Batson asked a group of seminary students to prepare a talk on the Good Samaritan parable. With the parable at the forefront of their minds, the seminarians were then asked to walk to the location where they were expected to deliver their talk. So far, the task seems pretty straightforward. However, this is where the cunning psychologists made life difficult. They had arranged for the seminarians to come across someone lying in the road, coughing, spluttering, and calling for help. To make matters more difficult, the psychologists had told half the seminarians that they were late for their talk and the other half that they had plenty of time. How many would stop to help the injured person? And which ones? Of those who were told they had plenty of time to reach their destination, 61 percent stopped to help, but of those who were told they were late, only 10 percent stopped. According to the observations of the psychologists, some seminarians literally stepped over the actor pretending to be injured. The slight change of situation moved the rushed seminarians into autopilot, making them forget what had been on their minds just moments before.

There is nothing wrong with letting your autopilot direct mundane activities you have to do and have no desire to change, like mowing the lawn or folding laundry. But as the study just described shows, there are times when you must take control of your thinking or risk missing key opportunities (in the case of the seminarians, the opportunity to put into action the very message they were about to deliver at a lecture).

Thinking: Actively Analyzing Your Thoughts

You are in a thinking state of mind when you are assessing options, deciding on a course of action, working through a problem, estimating the likely consequences or chain of events, or simply organizing your thoughts to make more sense of them. When you’re at your best in this state, your thoughts feel clear, precise, and positive.

This is useful when: solving problems and making decisions, correcting mistakes, making sense of a situation, and reflecting on the past.

One of the most effective ways of improving yourself is to learn from your past experiences, consider what you did well, and decide what you could do better in the future if you were in a similar situation.

So what does it mean to have an engaged state of mind?

An engaged state of mind exists when your focus is external, on something in your immediate environment, and when you’re performing at your best. If you can drive, you might recall the moment when you first drove somewhere on your own without thinking, Check mirror, change gear, right blinker, but instead your attention was completely on the road ahead and the other motorists while you sang along to the radio. Or you might recall the first time you skied to the bottom of a slope and you were not quite sure how you got there, but it felt great.


When you are absorbed by what you are doing, you are engaged and totally present. By not judging yourself, you interfere less with the task at hand and allow your potential to take over.

This is what Mihaly Csikszentmihalyi calls flow.

Turning the Autopilot Off

Look for something new.

Practice scanning your environment, consciously looking for what is new, different, and unusual. Ask yourself questions, like How has this street changed since the last time I walked down it? What are the differences between the people on the train? What do I notice today about my colleagues? These questions might seem silly, but they force you to live in, think about, and focus on the present— to become aware of your surroundings and not slip back into autopilot.

Learn that “always” isn’t absolute.

One of the reasons why all of us can get caught in autopilot is that we tend to see the world as a set of absolutes. You are apt to believe that such and such will always happen, because so far it always has. This is a mental shortcut, which saves you from having to think about it again. As a result, your thinking falls into patterns of your own making and you are, in effect, switching on the autopilot.

Accept other people’s perspectives.

Have you ever had a boss or colleague you thought was overbearing, dogmatic, aggressive, or rude? Do you think they saw themselves in that way? Surprisingly enough, they might not. If they were asked to describe themselves, they might say they were assertive, direct, honest, and candid. One of the reasons why conflicts can get so ugly is that it’s easy to fall into a state of autopilot and respond to others without thinking or without considering others’ perspectives. By staying alert to other people’s perspectives, you can move out of autopilot and into a more constructive state of awareness.

Another tip: build reflection into your routine. Check out this guide on meditation to get started. You also want to focus on process, not outcome.

By focusing on the steps you need to take to get where you want to go, rather than on the eventual outcome, your mind switches from critical noise to being engaged.

The Ideal State of Mind

An ideal state of mind fluctuates between thinking and engaged— whatever a current situation demands of you. There isn’t a formula that dictates when you should be in one state and when you should be in the other, but much like dancing, you need to find a rhythm and delicately move as the situation (or music) requires.

Try listening to your thoughts without critiquing them. Attempt to stay neutral. Once you’ve mastered that, try to consciously notice more; make an effort to practice being present in the moment.


Mind Gym: Achieve More by Thinking Differently is full of mind-expanding content.

The Psychology of Risk and Reward


An excerpt from The Aspirational Investor: Taming the Markets to Achieve Your Life’s Goals that I think you’d enjoy.

Most of us have a healthy understanding of risk in the short term.

When crossing the street, for example, you would no doubt speed up to avoid an oncoming car that suddenly rounds the corner.

Humans are wired to survive: it’s a basic instinct that takes command almost instantly, enabling our brains to resolve ambiguity quickly so that we can take decisive action in the face of a threat.

The impulse to resolve ambiguity manifests itself in many ways and in many contexts, even those less fraught with danger. Glance at the (above) picture for no more than a couple of seconds. What do you see?

Some observers perceive the profile of a young woman with flowing hair, an elegant dress, and a bonnet. Others see the image of a woman stooped in old age with a wart on her large nose. Still others—in the gifted minority—are able to see both of the images simultaneously.

What is interesting about this illusion is that our brains instantly decide what image we are looking at, based on our first glance. If your initial glance was toward the vertical profile on the left-hand side, you were all but destined to see the image of the elegant young woman: it was just a matter of your brain interpreting every line in the picture according to the mental image you had already formed, even though each line can be interpreted in two different ways. Conversely, if your first glance fell on the central dark horizontal line that emphasizes the mouth and chin, your brain quickly formed an image of the older woman.

Regardless of your interpretation, your brain wasn’t confused. It simply decided what the picture was and filled in the missing pieces. Your brain resolved ambiguity and extracted order from conflicting information.

What does this have to do with decision making? Every bit of information can be interpreted differently according to our perspective. Ashvin Chhabra applies this to investing; I suggest you reframe it in the context of decision making in general.

Every trade has a seller and a buyer: your state of mind is paramount. If you are in a risk-averse mental framework, then you are likely to interpret a further fall in stocks as additional confirmation of your sell bias. If instead your framework is positive, you will interpret the same event as a buying opportunity.

The challenge of investing is compounded by the fact that our brains, which excel at resolving ambiguity in the face of a threat, are less well equipped to navigate the long term intelligently. Since none of us can predict the future, successful investing requires planning and discipline.

Unfortunately, when reason is in apparent conflict with our instincts—about markets or a “hot stock,” for example—it is our instincts that typically prevail. Our “reptilian brain” wins out over our “rational brain,” as it so often does in other facets of our lives. And as we have seen, investors trade too frequently, and often at the wrong time.

One way our brains resolve conflicting information is to seek out safety in numbers. In the animal kingdom, this is called “moving with the herd,” and it serves a very important purpose: helping to ensure survival. Just as a buffalo will try to stay with the herd in order to minimize its individual vulnerability to predators, we tend to feel safer and more confident investing alongside equally bullish investors in a rising market, and we tend to sell when everyone around us is doing the same. Even the so-called smart money falls prey to a herd mentality: one study, aptly titled “Thy Neighbor’s Portfolio,” found that professional mutual fund managers were more likely to buy or sell a particular stock if other managers in the same city were also buying or selling.

This comfort is costly. The surge in buying activity and the resulting bullish sentiment are self-reinforcing, propelling markets to react even faster. That leads to overvaluation and an inevitable crash when sentiment reverses. As we shall see, such booms and busts are characteristic of all financial markets, regardless of size, location, or even the era in which they exist.

Even though the role of instinct and human emotions in driving speculative bubbles has been well documented in popular books, newspapers, and magazines for hundreds of years, these factors were virtually ignored in conventional financial and economic models until the 1970s.

This is especially surprising given that, in 1952, a young PhD student at the University of Chicago, Harry Markowitz, published two very important papers. The first, entitled “Portfolio Selection” and published in the Journal of Finance, led to the creation of what we call modern portfolio theory, together with the widespread adoption of its important ideas such as asset allocation and diversification. It earned Harry Markowitz a Nobel Prize in Economics.

The second paper, entitled “The Utility of Wealth” and published in the prestigious Journal of Political Economy, was about the propensity of people to hold insurance (safety) and to buy lottery tickets at the same time. It delved deeper into the psychological aspects of investing but was largely forgotten for decades.

The field of behavioral finance really came into its own through the pioneering work of two academic psychologists, Amos Tversky and Daniel Kahneman, who challenged conventional wisdom about how people make decisions involving risk. Their work garnered Kahneman the Nobel Prize in Economics in 2002. Behavioral finance and neuroeconomics are relatively new fields of study that seek to identify and understand human behavior and decision making with regard to choices involving trade-offs between risk and reward. Of particular interest are the human biases that prevent individuals from making fully rational financial decisions in the face of uncertainty.

As behavioral economists have documented, our propensity for herd behavior is just the tip of the iceberg. Kahneman and Tversky, for example, showed that people asked to choose between a certain loss and a gamble, in which they could either lose more money or break even, tend to choose the gamble (that is, they double down to avoid the prospect of losses), a behavior the authors called “loss aversion.” Building on this work, Hersh Shefrin and Meir Statman, professors at Santa Clara University’s Leavey School of Business, linked the propensity for loss aversion to investors’ tendency to hold losing investments too long and to sell winners too soon. They called this bias the disposition effect.
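
To make the loss-aversion finding concrete, here is a minimal sketch with illustrative numbers of my own (the excerpt gives none): the two options have identical expected values, yet most subjects take the gamble.

```python
import random

CERTAIN_LOSS = -750.0  # Option A: a certain loss of $750

def gamble() -> float:
    """Option B: 75% chance of losing $1,000, 25% chance of breaking even."""
    return -1000.0 if random.random() < 0.75 else 0.0

# Simulate to confirm the two options have the same expected value.
trials = 100_000
avg_gamble = sum(gamble() for _ in range(trials)) / trials
print(CERTAIN_LOSS, round(avg_gamble, 1))  # both come out near -750

# Loss aversion: despite equal expected values, most subjects take the
# gamble -- the 25% chance of avoiding any loss outweighs the risk of
# losing even more.
```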

The lengthy list of behaviorally driven market effects often converges in an investor’s tale of woe. Overconfidence causes investors to hold concentrated portfolios and to trade excessively, behaviors that can destroy wealth. The illusion of control causes investors to overestimate the probability of success and to underestimate risk because of familiarity—for example, holding too much employer stock in their 401(k) plans, resulting in under-diversification. Cognitive dissonance causes us to ignore evidence that is contrary to our opinions, leading to myopic investing behavior. And the representativeness bias leads investors to assess risk and return based on superficial characteristics—for example, by assuming that shares of companies that make products you like are good investments.

Several other key behavioral biases come into play in the realm of investing. Framing can cause investors to make a decision based on how the question is worded and the choices presented. Anchoring often leads investors to unconsciously create a reference point, say for securities prices, and then adjust decisions or expectations with respect to that anchor. This bias might impede your ability to sell a losing stock, for example, in the false hope that you can earn your money back. Similarly, the endowment bias might lead you to overvalue a stock that you own and thus hold on to the position too long. And regret aversion may lead you to avoid taking a tough action for fear that it will turn out badly. This can lead to decision paralysis in the wake of a market crash, even though, statistically, it is a good buying opportunity.

Behavioral finance has generated plenty of debate. Some observers have hailed the field as revolutionary; others bemoan the discipline’s seeming lack of a transcendent, unifying theory. This much is clear: behavioral finance treats biases as mistakes that, in academic parlance, prevent investors from thinking “rationally” and cause them to hold “suboptimal” portfolios.

But is that really true? In investing, as in life, the answer is more complex than it appears. Effective decision making requires us to balance our “reptilian brain,” which governs instinctive thinking, with our “rational brain,” which is responsible for strategic thinking. Instinct must integrate with experience.

Put another way, behavioral biases are nothing more than a series of complex trade-offs between risk and reward. When the stock market is taking off, for example, a failure to rebalance by selling winners is considered a mistake. The same goes for a failure to add to a position in a plummeting market. That’s because conventional finance theory assumes markets to be inherently stable, or “mean-reverting,” so most deviations from the historical rate of return are viewed as fluctuations that will revert to the mean, or self-correct, over time.

But what if a precipitous market drop is slicing into your peace of mind, affecting your sleep, your relationships, and your professional life? What if that assumption about markets reverting to the mean doesn’t hold true and you cannot afford to hold on for an extended period of time? In both cases, it might just be “rational” to sell and accept your losses precisely when investment theory says you should be buying. A concentrated bet might also make sense, if you possess the skill or knowledge to exploit an opportunity that others might not see, even if it flies in the face of conventional diversification principles.

Of course, the time to create decision rules for extreme market scenarios and concentrated bets is when you are building your investment strategy, not in the middle of a market crisis or at the moment a high-risk, high-reward opportunity from a former business partner lands on your desk and gives you an adrenaline jolt. A disciplined process for managing risk in relation to a clear set of goals will enable you to use the insights offered by behavioral finance to your advantage, rather than fall prey to the common pitfalls. This is one of the central insights of the Wealth Allocation Framework. But before we can put these insights to practical use, we need to understand the true nature of financial markets.