The 10 Qualities of Creative Leaders

David Ogilvy was an advertising legend and perhaps the original “Mad Man.”

He’s offered advice on how to create advertising that sells and ten amazing tips on writing.

The Unpublished David Ogilvy offers a remarkably candid glimpse of the private man behind the public image.

Ogilvy was fond of lists. This one outlines the ten qualities of creative leaders.

The qualifications I look for in our (creative) leaders are these:

  1. High standards of personal ethics.
  2. Big people, without pettiness.
  3. Guts under pressure, resilience in defeat.
  4. Brilliant brains — not safe plodders.
  5. A capacity for hard work and midnight oil.
  6. Charisma — charm and persuasiveness.
  7. A streak of unorthodoxy — creative innovators.
  8. The courage to make tough decisions.
  9. Inspiring enthusiasts — with trust and gusto.
  10. A sense of humor.


Still curious about Ogilvy? Read why he considered education a priceless opportunity to furnish your mind, and his thoughts on scientific advertising.

Albert Bandura on Acquiring Self-Efficacy and Personal Agency

Psychologist Albert Bandura is famous for his social learning theory, which is really more of a model than a theory.

He stresses the importance of observational learning. Who you spend time with matters. “Learning would be exceedingly laborious, not to mention hazardous, if people had to rely solely on the effects of their own actions to inform them what to do,” Bandura explains.

There is an excerpt in Stronger: Develop the Resilience You Need to Succeed that explains how we can acquire and maintain the factors of personal resilience.

1. Seek to successfully demonstrate and repeatedly practice each of our five factors of personal resilience. Success is a powerful learning tool—Just do it! If the challenge is too large or complex at first, start by taking small steps in the desired direction. Don’t try to achieve too much at first. And keep trying until you succeed. The first success is the hardest.

2. Observe resilient people. Use them as role models. Human beings learn largely by observation. Frequent venues where you can watch people exhibiting the skills you wish to acquire. Read books about people who have overcome obstacles similar to those you face. Call or write them. Ask them to share their lessons learned. Their successes will be contagious.

3. Vigorously pursue the encouragement and support of others. Affiliate with supportive and compassionate people who are willing to give of themselves to be supportive of you.

4. Practice self-control. In highly stressful times, myriad physiological and behavioral reactions occur. Physiologically, people experience the fight-or-flight response we mentioned in Chapter One. This cascade of hormones such as adrenalin better prepares you to fight or to flee a threat. They increase your heart rate, muscle strength, and tension. They dramatically improve your memory for certain things while decreasing your ability to remember others, and they cause your blood vessels to shift their priorities. This often results in headaches, cold hands and feet, and even an upset gastrointestinal system. The most significant problem, however, is that this very basic survival mechanism also tends to interfere with rational judgment and problem solving.

According to Bandura, we need to control the stress around us so that it doesn’t become excessive, in part because we often act without thinking in stressful situations.

People often act impulsively in reaction to stressful events, sometimes running away from them. Remember the 1999 movie Runaway Bride, starring Richard Gere and Julia Roberts? It was the fictional story of a woman who had a penchant for falling in love and getting engaged, then developing cold feet and leaving her fiancés at the altar. On a more somber note, after the conclusion of the Vietnam War, many veterans chose to retreat to lives of isolation and solitude. The stress of war and the lack of social support motivated many to simply withdraw from society.

Similarly, over many years of clinical practice, we have seen individuals who have great difficulty establishing meaningful relationships after surviving a traumatic or vitriolic divorce. It’s hard for them to trust another person after having been “betrayed.” They exhibit approach-avoidance behaviors—engaging in a relationship initially but backing away when it intensifies.

Contrary to these patterns of escape and avoidance, sometimes people will impulsively act aggressively in response to stressful situations. Chronic irritability is often an early warning sign of subsequent escalating aggressive behavior. Rarely, although sometimes catastrophically, people will choose to lie, cheat, or steal in highly stressful situations. For years, psychologists have tried to predict dishonesty using psychological testing. The results have been uninspiring. The reason is that the best predictor of dishonesty is finding oneself in a highly stressful situation. So in highly stressful times, resist the impulsive urges to take the easy way out.

Also, remember to take care of yourself, physically as well as psychologically. Maladaptive self-medication is a common pattern of behavior for people who find themselves in the abyss. Alcohol has long been observed as a chemical crutch. Others that have only recently emerged are the myriad energy drinks on the market. Both of these crutches have been linked to numerous physical ailments and even deaths. If you are looking for the best single physical mechanism to aid you in your ascent from the abyss, it’s establishing healthy patterns of rest and sleep.

But note the distinction between controlling and suppressing. Often controlling is impossible, so we suppress and fool ourselves into thinking we’re controlling. And suppressing volatility is often a horrible idea, especially in the long run.

Instead of what’s intended, we create a coiled spring that most often leads to negative emergent effects when it finally lets go. In the end this moves us toward fragility and away from robustness and resilience.


If you’re still curious, The Hour Between Dog and Wolf: How Risk Taking Transforms Us, Body and Mind discusses a bit of this topic as well.

Rendez-Vous with Art: The Pleasures and Pitfalls of Art

Philippe de Montebello was the longest-serving Director of the Metropolitan Museum of Art in New York (1977–2008). Martin Gayford is an acclaimed art critic. Their book, Rendez-Vous with Art, is structured around the conversations they had in churches, museums, and art galleries around the world. It’s an intimate look into the pleasures and pitfalls of art.

Starting with a fragment that’s left of the face of an Egyptian woman who lived 3,000 years ago, de Montebello and Gayford’s book confronts the elusive questions of how and why we look at art. That is a large subject we will leave you to explore, but there are two parts of this book we wish to draw your attention to.


“If,” they write, “we stand in front of a work of art twice, at least one party — the viewer or the object — will be transformed on the second occasion. Works of art mutate through time, albeit slowly, as they are cleaned or ‘conserved’, or as their constituent materials age.”

As for the object, Van Gogh’s Irises and Roses series comes to mind. Van Gogh employed bright pigments that were prone to losing their vibrancy, anticipating that “time will only soften them too much.” The contrast between the originals and those we witness today is stark — color, like so much else, fades over time.

But even more important than the physical evolution of the pieces is the evolution happening within us.

Gayford writes:

Inevitably, we all inhabit a world of dissolving perspectives and ever-shifting views. The present is always moving, so from that vantage point the past constantly changes in appearance. That is on the grand, historical scale; but the same is true of our personal encounters with art, from the day to day. You can stand in front of Velázquez’s Las Meninas a thousand times, and every time it will be different because you will be altered: tired or full of energy, or dissimilar from your previous self in a multitude of ways.

… Our idea was to make a book that was neither art history nor art criticism but an experiment in shared appreciation. It is, in other words, an attempt to get at not history or theory but the actual experience of looking at art: what it feels like on a particular occasion, which is of course the only way any of us can ever look at anything.

This brings to mind the famous fragment of Heraclitus: “You cannot step in the same river twice.”


There is undeniably a curatorial aspect to art. De Montebello notes the contrast between addition and subtraction when it comes to selection. He writes:

In Europe, one often has a sense that a selection has been made by paring down a lot of inherited dynastic objects or spoils of colonization or war. Then a curatorial mind has built on that base. In the USA, you start ab initio. American museums large and small tend to be encyclopedic, whether you are in Toledo, Minneapolis, or elsewhere, because they started from nothing, and from the premise that they’d like to buy a little bit of everything: a couple of Chinese things, a few medieval things, and so on.

While there are differences in what’s on display at American museums, de Montebello also alludes to the “sameness in their governing principles and the criteria used for acquisitions.” The great museums, he argues, “are organisms, constantly changing, and mainly expanding. The collections grow, move in new directions, and, on rare occasion, get sold off. The buildings are adapted and frequently enlarged.” A visit to a museum in itself is part of the learning process.

De Montebello writes:

I have found that when I have forced myself — often with the help of curators — to look at things about which I was indifferent or that even repelled me, I discovered that, with a little knowledge, what had been hidden from me became manifest. I’ll give you an example: for a long time I approached galleries of Greek vases with a sense of dread; whether black- or red-figured, the vases all looked alike to me. Museums were often culpable as they tended to show far too many. So I’d walk into one of those rooms, take one look and dash for the exit. But a curator at the Met, Joan Mertens, told me once to go to the vitrines where only fragments, or shards, were shown. She stood beside me and said, look at one of them as if it were a drawing on paper.

I found I was able to look at it this way, forgetting that it was a fragment of a vessel, a three-dimensional utilitarian object. I could focus on the drawing itself, the line, the composition, and how marvellous it was. But the epiphany came when I was able to put surface decoration and vessel shape together, and look at them as one. It is the only correct way, incidentally.

Fragments are a representation of the whole—to appreciate them we have to engage beyond the instant gratification we so often seek. It often takes setting aside our ego and asking for help to truly see a piece or an exhibit, much like an adult who takes classes to appreciate Shakespeare.

De Montebello concludes:

[O]ne can be taught, and needs to be taught, how to look, how to push aside one’s prejudices, one’s overly hasty negative reactions. For me, it was a long learning process, and I have to imagine that for the majority of visitors it can’t be easy either. … The appreciation of art requires an engagement that is wholly different from the instant gratification provided by most forms of popular culture, and museums have a responsibility to help visitors achieve this.

This strikes a familiar note, as we have often called Farnam Street “curated interestingness” for the very same reason: We feel it’s our job to help you find and appreciate the best wisdom the world has to offer.


Rendez-Vous with Art adds to our expanding library on art, sitting next to The Power of Art and The Story of Art.

Richard Feynman On Fooling Ourselves Into Believing What Isn’t True

Richard Feynman has long been one of my favorites — for both his wisdom and heart.

Reproduced below you can find the entirety of his 1974 commencement address at Caltech entitled Cargo Cult Science.

The entire speech requires about 10 minutes to read, which is time well invested if you ask me. If you’re pressed for time, however, there are two sections I wish to draw to your attention.

In the South Seas there is a Cargo Cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they’ve arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas—he’s the controller—and they wait for the airplanes to land. They’re doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn’t work. No airplanes land. So I call these things Cargo Cult Science, because they follow all the apparent precepts and forms of scientific investigation, but they’re missing something essential, because the planes don’t land.

You’re probably chuckling at this point. Yet many of us are no better. This is all around us. Thinking is hard, and we fool ourselves in part because fooling ourselves is easy. That’s Feynman’s point.

The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.

Your job is to find the current cargo cults.

When we start a project without determining what success looks like … when we mistake the map for the territory … when we look at outcomes without looking at process … when we blindly copy what others have done … when we confuse correlation and causation, we find ourselves on the runway.


Cargo Cult Science

During the Middle Ages there were all kinds of crazy ideas, such as that a piece of rhinoceros horn would increase potency. (Another crazy idea of the Middle Ages is these hats we have on today—which is too loose in my case.) Then a method was discovered for separating the ideas—which was to try one to see if it worked, and if it didn’t work, to eliminate it. This method became organized, of course, into science. And it developed very well, so that we are now in the scientific age. It is such a scientific age, in fact, that we have difficulty in understanding how witch doctors could ever have existed, when nothing that they proposed ever really worked—or very little of it did.

But even today I meet lots of people who sooner or later get me into a conversation about UFOs, or astrology, or some form of mysticism, expanded consciousness, new types of awareness, ESP, and so forth. And I’ve concluded that it’s not a scientific world.

Most people believe so many wonderful things that I decided to investigate why they did. And what has been referred to as my curiosity for investigation has landed me in a difficulty where I found so much junk to talk about that I can’t do it in this talk. I’m overwhelmed. First I started out by investigating various ideas of mysticism, and mystic experiences. I went into isolation tanks (they’re dark and quiet and you float in Epsom salts) and got many hours of hallucinations, so I know something about that. Then I went to Esalen, which is a hotbed of this kind of thought (it’s a wonderful place; you should go visit there). Then I became overwhelmed. I didn’t realize how much there was.

I was sitting, for example, in a hot bath and there’s another guy and a girl in the bath. He says to the girl, “I’m learning massage and I wonder if I could practice on you?” She says OK, so she gets up on a table and he starts off on her foot—working on her big toe and pushing it around. Then he turns to what is apparently his instructor, and says, “I feel a kind of dent. Is that the pituitary?” And she says, “No, that’s not the way it feels.” I say, “You’re a hell of a long way from the pituitary, man.” And they both looked at me—I had blown my cover, you see—and she said, “It’s reflexology.” So I closed my eyes and appeared to be meditating.

That’s just an example of the kind of things that overwhelm me. I also looked into extrasensory perception and PSI phenomena, and the latest craze there was Uri Geller, a man who is supposed to be able to bend keys by rubbing them with his finger. So I went to his hotel room, on his invitation, to see a demonstration of both mind reading and bending keys. He didn’t do any mind reading that succeeded; nobody can read my mind, I guess. And my boy held a key and Geller rubbed it, and nothing happened. Then he told us it works better under water, and so you can picture all of us standing in the bathroom with the water turned on and the key under it, and him rubbing the key with his finger. Nothing happened. So I was unable to investigate that phenomenon.

But then I began to think, what else is there that we believe? (And I thought then about the witch doctors, and how easy it would have been to check on them by noticing that nothing really worked.) So I found things that even more people believe, such as that we have some knowledge of how to educate. There are big schools of reading methods and mathematics methods, and so forth, but if you notice, you’ll see the reading scores keep going down—or hardly going up—in spite of the fact that we continually use these same people to improve the methods. There’s a witch doctor remedy that doesn’t work. It ought to be looked into: how do they know that their method should work? Another example is how to treat criminals. We obviously have made no progress—lots of theory, but no progress—in decreasing the amount of crime by the method that we use to handle criminals.

Yet these things are said to be scientific. We study them. And I think ordinary people with commonsense ideas are intimidated by this pseudoscience. A teacher who has some good idea of how to teach her children to read is forced by the school system to do it some other way—or is even fooled by the school system into thinking that her method is not necessarily a good one. Or a parent of bad boys, after disciplining them in one way or another, feels guilty for the rest of her life because she didn’t do “the right thing,” according to the experts.

So we really ought to look into theories that don’t work, and science that isn’t science.

I tried to find a principle for discovering more of these kinds of things, and came up with the following system. Any time you find yourself in a conversation at a cocktail party—in which you do not feel uncomfortable that the hostess might come around and say, “Why are you fellows talking shop?” or that your wife will come around and say, “Why are you flirting again?”—then you can be sure you are talking about something about which nobody knows anything.

Using this method, I discovered a few more topics that I had forgotten—among them the efficacy of various forms of psychotherapy. So I began to investigate through the library, and so on, and I have so much to tell you that I can’t do it at all. I will have to limit myself to just a few little things. I’ll concentrate on the things more people believe in. Maybe I will give a series of speeches next year on all these subjects. It will take a long time.

I think the educational and psychological studies I mentioned are examples of what I would like to call Cargo Cult Science. In the South Seas there is a Cargo Cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they’ve arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas—he’s the controller—and they wait for the airplanes to land. They’re doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn’t work. No airplanes land. So I call these things Cargo Cult Science, because they follow all the apparent precepts and forms of scientific investigation, but they’re missing something essential, because the planes don’t land.

Now it behooves me, of course, to tell you what they’re missing. But it would be just about as difficult to explain to the South Sea Islanders how they have to arrange things so that they get some wealth in their system. It is not something simple like telling them how to improve the shapes of the earphones. But there is one feature I notice that is generally missing in Cargo Cult Science. That is the idea that we all hope you have learned in studying science in school—we never explicitly say what this is, but just hope that you catch on by all the examples of scientific investigation. It is interesting, therefore, to bring it out now and speak of it explicitly. It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.

Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can—if you know anything at all wrong, or possibly wrong—to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.

In summary, the idea is to try to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgment in one particular direction or another.

The easiest way to explain this idea is to contrast it, for example, with advertising. Last night I heard that Wesson Oil doesn’t soak through food. Well, that’s true. It’s not dishonest; but the thing I’m talking about is not just a matter of not being dishonest, it’s a matter of scientific integrity, which is another level. The fact that should be added to that advertising statement is that no oils soak through food, if operated at a certain temperature. If operated at another temperature, they all will—including Wesson Oil. So it’s the implication which has been conveyed, not the fact, which is true, and the difference is what we have to deal with.

We’ve learned from experience that the truth will out. Other experimenters will repeat your experiment and find out whether you were wrong or right. Nature’s phenomena will agree or they’ll disagree with your theory. And, although you may gain some temporary fame and excitement, you will not gain a good reputation as a scientist if you haven’t tried to be very careful in this kind of work. And it’s this type of integrity, this kind of care not to fool yourself, that is missing to a large extent in much of the research in Cargo Cult Science.

A great deal of their difficulty is, of course, the difficulty of the subject and the inapplicability of the scientific method to the subject. Nevertheless, it should be remarked that this is not the only difficulty. That’s why the planes don’t land—but they don’t land.

We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops and got an answer which we now know not to be quite right. It’s a little bit off, because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of the electron, after Millikan. If you plot them as a function of time, you find that one is a little bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.

Why didn’t they discover that the new number was higher right away? It’s a thing that scientists are ashamed of—this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number closer to Millikan’s value they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that. We’ve learned those tricks nowadays, and now we don’t have that kind of a disease.

But this long history of learning how to not fool ourselves—of having utter scientific integrity—is, I’m sorry to say, something that we haven’t specifically included in any particular course that I know of. We just hope you’ve caught on by osmosis.

The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.

I would like to add something that’s not essential to the science, but something I kind of believe, which is that you should not fool the layman when you’re talking as a scientist. I’m not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you’re not trying to be a scientist, but just trying to be an ordinary human being. We’ll leave those problems up to you and your rabbi. I’m talking about a specific, extra type of integrity that is not lying, but bending over backwards to show how you’re maybe wrong, that you ought to do when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen.

For example, I was a little surprised when I was talking to a friend who was going to go on the radio. He does work on cosmology and astronomy, and he wondered how he would explain what the applications of this work were. “Well,” I said, “there aren’t any.” He said, “Yes, but then we won’t get support for more research of this kind.” I think that’s kind of dishonest. If you’re representing yourself as a scientist, then you should explain to the layman what you’re doing—and if they don’t want to support you under those circumstances, then that’s their decision.

One example of the principle is this: If you’ve made up your mind to test a theory, or you want to explain some idea, you should always decide to publish it whichever way it comes out. If we only publish results of a certain kind, we can make the argument look good. We must publish both kinds of result. For example—let’s take advertising again—suppose some particular cigarette has some particular property, like low nicotine. It’s published widely by the company that this means it is good for you—they don’t say, for instance, that the tars are a different proportion, or that something else is the matter with the cigarette. In other words, publication probability depends upon the answer. That should not be done.

I say that’s also important in giving certain types of government advice. Supposing a senator asked you for advice about whether drilling a hole should be done in his state; and you decide it would be better in some other state. If you don’t publish such a result, it seems to me you’re not giving scientific advice. You’re being used. If your answer happens to come out in the direction the government or the politicians like, they can use it as an argument in their favor; if it comes out the other way, they don’t publish it at all. That’s not giving scientific advice.

Other kinds of errors are more characteristic of poor science. When I was at Cornell, I often talked to the people in the psychology department. One of the students told me she wanted to do an experiment that went something like this—I don’t remember it in detail, but it had been found by others that under certain circumstances, X, rats did something, A. She was curious as to whether, if she changed the circumstances to Y, they would still do A. So her proposal was to do the experiment under circumstances Y and see if they still did A.

I explained to her that it was necessary first to repeat in her laboratory the experiment of the other person—to do it under condition X to see if she could also get result A—and then change to Y and see if A changed. Then she would know that the real difference was the thing she thought she had under control.

She was very delighted with this new idea, and went to her professor. And his reply was, no, you cannot do that, because the experiment has already been done and you would be wasting time. This was in about 1935 or so, and it seems to have been the general policy then to not try to repeat psychological experiments, but only to change the conditions and see what happens.

Nowadays there’s a certain danger of the same thing happening, even in the famous field of physics. I was shocked to hear of an experiment done at the big accelerator at the National Accelerator Laboratory, where a person used deuterium. In order to compare his heavy hydrogen results to what might happen to light hydrogen he had to use data from someone else’s experiment on light hydrogen, which was done on different apparatus. When asked he said it was because he couldn’t get time on the program (because there’s so little time and it’s such expensive apparatus) to do the experiment with light hydrogen on this apparatus because there wouldn’t be any new result. And so the men in charge of programs at NAL are so anxious for new results, in order to get more money to keep the thing going for public relations purposes, they are destroying—possibly—the value of the experiments themselves, which is the whole purpose of the thing. It is often hard for the experimenters there to complete their work as their scientific integrity demands.

All experiments in psychology are not of this type, however. For example, there have been many experiments running rats through all kinds of mazes, and so on—with little clear result. But in 1937 a man named Young did a very interesting one. He had a long corridor with doors all along one side where the rats came in, and doors along the other side where the food was. He wanted to see if he could train the rats to go in at the third door down from wherever he started them off. No. The rats went immediately to the door where the food had been the time before.

The question was, how did the rats know, because the corridor was so beautifully built and so uniform, that this was the same door as before? Obviously there was something about the door that was different from the other doors. So he painted the doors very carefully, arranging the textures on the faces of the doors exactly the same. Still the rats could tell. Then he thought maybe the rats were smelling the food, so he used chemicals to change the smell after each run. Still the rats could tell. Then he realized the rats might be able to tell by seeing the lights and the arrangement in the laboratory like any commonsense person. So he covered the corridor, and, still the rats could tell.

He finally found that they could tell by the way the floor sounded when they ran over it. And he could only fix that by putting his corridor in sand. So he covered one after another of all possible clues and finally was able to fool the rats so that they had to learn to go in the third door. If he relaxed any of his conditions, the rats could tell.

Now, from a scientific standpoint, that is an A‑Number‑1 experiment. That is the experiment that makes rat‑running experiments sensible, because it uncovers the clues that the rat is really using—not what you think it’s using. And that is the experiment that tells exactly what conditions you have to use in order to be careful and control everything in an experiment with rat‑running.

I looked into the subsequent history of this research. The subsequent experiment, and the one after that, never referred to Mr. Young. They never used any of his criteria of putting the corridor on sand, or being very careful. They just went right on running rats in the same old way, and paid no attention to the great discoveries of Mr. Young, and his papers are not referred to, because he didn’t discover anything about the rats. In fact, he discovered all the things you have to do to discover something about rats. But not paying attention to experiments like that is a characteristic of Cargo Cult Science.

Another example is the ESP experiments of Mr. Rhine, and other people. As various people have made criticisms—and they themselves have made criticisms of their own experiments—they improve the techniques so that the effects are smaller, and smaller, and smaller until they gradually disappear. All the parapsychologists are looking for some experiment that can be repeated—that you can do again and get the same effect—statistically, even. They run a million rats—no, it’s people this time—they do a lot of things and get a certain statistical effect. Next time they try it they don’t get it any more. And now you find a man saying that it is an irrelevant demand to expect a repeatable experiment. This is science?

This man also speaks about a new institution, in a talk in which he was resigning as Director of the Institute of Parapsychology. And, in telling people what to do next, he says that one of the things they have to do is be sure they only train students who have shown their ability to get PSI results to an acceptable extent—not to waste their time on those ambitious and interested students who get only chance results. It is very dangerous to have such a policy in teaching—to teach students only how to get certain results, rather than how to do an experiment with scientific integrity.

So I wish to you—I have no more time, so I have just one wish for you—the good luck to be somewhere where you are free to maintain the kind of integrity I have described, and where you do not feel forced by a need to maintain your position in the organization, or financial support, or so on, to lose your integrity. May you have that freedom. May I also give you one last bit of advice: Never say that you’ll give a talk unless you know clearly what you’re going to talk about and more or less what you’re going to say.

How Analogies Reveal Connections, Spark Innovation, and Sell Our Greatest Ideas

John Pollack is a former Presidential Speechwriter. If anyone knows the power of words to move people to action, shape arguments, and persuade, it is he.

In Shortcut: How Analogies Reveal Connections, Spark Innovation, and Sell Our Greatest Ideas, he explores the powerful role of analogy in persuasion and creativity.

One of the key tools he relied on as a speechwriter is analogy.

While they often operate unnoticed, analogies aren’t accidents; they’re arguments—arguments that, like icebergs, conceal most of their mass and power beneath the surface. And whoever has the best argument wins.

But analogies do more than just persuade others — they also play a role in innovation and decision making.

From the bloody Chicago slaughterhouse that inspired Henry Ford’s first moving assembly line, to the “domino theory” that led America into the Vietnam War, to the “bicycle for the mind” that Steve Jobs envisioned as a Macintosh computer, analogies have played a dynamic role in shaping the world around us.

Despite their importance, many people have only a vague sense of what an analogy actually is.

What is an Analogy?

In broad terms, an analogy is simply a comparison that asserts a parallel—explicit or implicit—between two distinct things, based on the perception of a shared property or relation. In everyday use, analogies actually appear in many forms. Some of these include metaphors, similes, political slogans, legal arguments, marketing taglines, mathematical formulas, biblical parables, logos, TV ads, euphemisms, proverbs, fables and sports clichés.

Because they are so well disguised, they play a bigger role than we consciously realize. Not only do analogies effectively make arguments, they also trigger emotions. And emotions make it hard to make rational decisions.

While we take analogies for granted, the ideas they convey are notably complex.

All day every day, in fact, we make or evaluate one analogy after the other, because some comparisons are the only practical way to sort a flood of incoming data, place it within the context of our experience, and make decisions accordingly.

Remember the powerful metaphor that arguments are war. It shapes a wide variety of expressions, like “your claims are indefensible,” “attacking the weak points,” and “you disagree? OK, shoot.”

Or consider the Map and the Territory: analogies give people the map but explain nothing of the territory.

Warren Buffett is one of the best at using analogies to communicate effectively. One of my favorites is his observation that “You never know who’s swimming naked until the tide goes out.” In other words, when times are good everyone looks amazing. When times suck, hidden weaknesses are exposed. The same could be said for analogies:

We never know what assumptions, deceptions, or brilliant insights they might be hiding until we look beneath the surface.

Most people underestimate the importance of a good analogy. As with many things in life, this lack of awareness comes at a cost. Ignorance is expensive.

Evidence suggests that people who tend to overlook or underestimate analogy’s influence often find themselves struggling to make their arguments or achieve their goals. The converse is also true. Those who construct the clearest, most resonant and apt analogies are usually the most successful in reaching the outcomes they seek.

The key to all of this is figuring out why analogies function so effectively and how they work. Once we know that, we should be able to craft better ones.

Don’t Think of an Elephant

Effective, persuasive analogies frame situations and arguments, often so subtly that we don’t even realize there is a frame, let alone one that might not work in our favor. Such conceptual frames, like picture frames, include some ideas, images, and emotions and exclude others. By setting a frame, a person or organization can, for better or worse, exert remarkable influence on the direction of their own thinking and that of others.

He who holds the pen frames the story. The first person to frame the story controls the narrative, and it takes a massive amount of energy to change its direction. Sometimes even the way people come across information shapes it — stories that would be non-events if disclosed proactively become front-page stories because someone found out.

In Don’t Think of an Elephant, George Lakoff explores the issue of framing. The book famously begins with the instruction “Don’t think of an elephant.”

What’s the first thing we all do? Think of an elephant, of course. It’s almost impossible not to think of an elephant. When we stop consciously thinking about it, it floats away and we move on to other topics — like the new email that just arrived. But then again it will pop back into consciousness and bring some friends — associated ideas, other exotic animals, or even thoughts of the GOP.

“Every word, like elephant, evokes a frame, which can be an image or other kinds of knowledge,” Lakoff writes. This is why we want to control the frame rather than be controlled by it.

In Shortcut, Pollack tells of Lakoff discussing an analogy that President George W. Bush used in the 2004 State of the Union address, in which he argued the Iraq war was necessary despite international criticism. Before we go on, take Bush’s side here and think about how you would argue this point – how would you defend it?

In the speech, Bush proclaimed that “America will never seek a permission slip to defend the security of our people.”

As Lakoff notes, Bush could have said, “We won’t ask permission.” But he didn’t. Instead he intentionally used the analogy of a permission slip and in so doing framed the issue in terms that would “trigger strong, more negative emotional associations that endured in people’s memories of childhood rules and restrictions.”

Commenting on this, Pollack writes:

Through structure mapping, we correlate the role of the United States to that of a young student who must appeal to their teacher for permission to do anything outside the classroom, even going down the hall to use the toilet.

But is seeking diplomatic consensus to avoid or end a war actually analogous to a child asking their teacher for permission to use the toilet? Not at all. Yet once this analogy has been stated (Farnam Street editorial: and tweeted), the debate has been framed. Those who would reject a unilateral, my-way-or-the-highway approach to foreign policy suddenly find themselves battling not just political opposition but people’s deeply ingrained resentment of childhood’s seemingly petty regulations and restrictions. On an even subtler level, the idea of not asking for a permission slip also frames the issue in terms of sidestepping bureaucratic paperwork, and who likes bureaucracy or paperwork?

Deconstructing Analogies

By deconstructing analogies, we can find out how they function so effectively. Pollack argues they meet five essential criteria.

  1. Use the highly familiar to explain something less familiar.
  2. Highlight similarities and obscure differences.
  3. Identify useful abstractions.
  4. Tell a coherent story.
  5. Resonate emotionally.

Let’s explore how these work in greater detail, using the example of master thief Bruce Reynolds, who described the Great Train Robbery as his Sistine Chapel.

The Great Train Robbery

In the dark early hours of August 8, 1963, an intrepid gang of robbers hot-wired a six-volt battery to a railroad signal not far from the town of Leighton Buzzard, some forty miles north of London. Shortly, the engineer of an approaching mail train, spotting the red light ahead, slowed his train to a halt and sent one of his crew down the track, on foot, to investigate. Within minutes, the gang overpowered the train’s crew and, in less than twenty minutes, made off with the equivalent of more than $60 million in cash.

Years later, Bruce Reynolds, the mastermind of what quickly became known as the Great Train Robbery, described the spectacular heist as “my Sistine Chapel.”

Use the familiar to explain something less familiar

Reynolds exploits the public’s basic familiarity with the famous chapel in the Vatican City, which after Leonardo da Vinci’s Mona Lisa is perhaps the best-known work of Renaissance art in the world. Millions of people, even those who aren’t art connoisseurs, would likely share the cultural opinion that the paintings in the chapel represent “great art” (as compared to a smaller subset of people who might feel the same way about Jackson Pollock’s drip paintings, or Marcel Duchamp’s upturned urinal).

Highlight similarities and obscure differences

Reynolds’s analogy highlights, through implication, similarities between the heist and the chapel—both took meticulous planning and masterful execution. After all, stopping a train and stealing the equivalent of $60 million—and doing it without guns—does require a certain artistry. At the same time, the analogy obscures important differences. By invoking the image of a holy sanctuary, Reynolds triggers a host of associations in the audience’s mind—God, faith, morality, and forgiveness, among others—that camouflage the fact that he’s describing an action few would consider morally commendable, even if the artistry involved in robbing that train was admirable.

Identify useful abstractions

The analogy offers a subtle but useful abstraction: Genius is genius and art is art, no matter what the medium. The logic? If we believe that genius and artistry can transcend genre, we must concede that Reynolds, whose artful, ingenious theft netted millions, is an artist.

Tell a coherent story

The analogy offers a coherent narrative. Calling the Great Train Robbery his Sistine Chapel offers the audience a simple story that, at least on the surface, makes sense: Just as Michelangelo was called by God, the pope, and history to create his greatest work, so too was Bruce Reynolds called by destiny to pull off the greatest robbery in history. And if the Sistine Chapel endures as an expression of genius, so too must the Great Train Robbery. Yes, robbing the train was wrong. But the public perceived it as largely a victimless crime, committed by renegades who were nothing if not audacious. And who but the most audacious in history ever create great art? Ergo, according to this narrative, Reynolds is an audacious genius, master of his chosen endeavor, and an artist to be admired in public.

There is an important point here. The narrative need not be accurate. It is the feelings and ideas the analogy evokes that make it powerful. Within the structure of the analogy, the argument rings true. The framing is enough to establish it succinctly and subtly. That’s what makes it so powerful.

Resonate emotionally

The analogy resonates emotionally. To many people, mere mention of the Sistine Chapel brings an image to mind, perhaps the finger of Adam reaching out toward the finger of God, or perhaps just that of a lesser chapel with which they are personally familiar. Generally speaking, chapels are considered beautiful, and beauty is an idea that tends to evoke positive emotions. Such positive emotions, in turn, reinforce the argument that Reynolds is making—that there’s little difference between his work and that of a great artist.

Jumping to Conclusions

Daniel Kahneman explains the two systems that govern the way we think: System 1 and System 2. In his book, Thinking, Fast and Slow, he writes, “Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake are acceptable, and if the jump saves much time and effort.”

“A good analogy serves as an intellectual springboard that helps us jump to conclusions,” Pollack writes. He continues:

And once we’re in midair, flying through assumptions that reinforce our preconceptions and preferences, we’re well on our way to a phenomenon known as confirmation bias. When we encounter a statement and seek to understand it, we evaluate it by first assuming it is true and exploring the implications that result. We don’t even consider dismissing the statement as untrue unless enough of its implications don’t add up. And consider is the operative word. Studies suggest that most people seek out only information that confirms the beliefs they currently hold and often dismiss any contradictory evidence they encounter.

The ongoing battle between fact and fiction commonly takes place in our subconscious systems. In The Political Brain: The Role of Emotion in Deciding the Fate of the Nation, Drew Westen, an Emory University psychologist, writes: “Our brains have a remarkable capacity to find their way toward convenient truths—even if they are not all true.”

This also helps explain why getting promoted has almost nothing to do with your performance.

Remember Apollo Robbins? He’s a professional pickpocket. While he has unique skills, he succeeds largely through the choreography of people’s attention. “Attention,” he says “is like water. It flows. It’s liquid. You create channels to divert it, and you hope that it flows the right way.”

“Pickpocketing and analogies are in a sense the same,” Pollack concludes, “as the misleading analogy picks a listener’s mental pocket.”

And this is true whether someone else diverts our attention through a resonant but misleading analogy—“Judges are like umpires”—or we simply choose the wrong analogy all by ourselves.

Reasoning by Analogy

We rarely stop to see how much of our reasoning is done by analogy. In a 2005 article published in the Harvard Business Review, Giovanni Gavetti and Jan Rivkin wrote: “Leaders tend to be so immersed in the specifics of strategy that they rarely stop to think how much of their reasoning is done by analogy.” As a result, they miss things. They make connections that don’t exist. They don’t check assumptions. They miss useful insights. By contrast, “Managers who pay attention to their own analogical thinking will make better strategic decisions and fewer mistakes.”


Shortcut goes on to explore when to use analogies and how to craft them to maximize persuasion.

The Single Best Interview Question You Can Ask

In Peter Thiel’s book, Zero to One: Notes on Startups, or How to Build the Future — more of an exercise in thinking about the questions you must ask to move from zero to one — there is a great section on the single best interview question you can ask someone.

Whenever Peter Thiel interviews someone he likes to ask the following question: “What important truth do very few people agree with you on?”

This question sounds easy because it’s straightforward. Actually, it’s very hard to answer. It’s intellectually difficult because the knowledge that everyone is taught in school is by definition agreed upon. And it’s psychologically difficult because anyone trying to answer must say something she knows to be unpopular. Brilliant thinking is rare, but courage is in even shorter supply than genius.

The most common answers, according to Thiel, are “Our educational system is broken and urgently needs to be fixed.” “America is exceptional.” “There is no God.”

These are bad answers.

The first and the second statements might be true, but many people already agree with them. The third statement simply takes one side in a familiar debate. A good answer takes the following form: “Most people believe in x, but the truth is the opposite of x.”


What does this contrarian question have to do with the future? In the most minimal sense, the future is simply the set of all moments yet to come.

We hope for progress when we think about the future. To Thiel, that progress takes place in two ways.

Horizontal or extensive progress means copying things that work— going from 1 to n. Horizontal progress is easy to imagine because we already know what it looks like. Vertical or intensive progress means doing new things— going from 0 to 1. Vertical progress is harder to imagine because it requires doing something nobody else has ever done. If you take one typewriter and build 100, you have made horizontal progress. If you have a typewriter and build a word processor, you have made vertical progress.

At the macro level, the single word for horizontal progress is globalization— taking things that work somewhere and making them work everywhere. … The single word for vertical, 0 to 1 progress, is technology. … Because globalization and technology are different modes of progress, it’s possible to have both, either, or neither at the same time.

Here is Thiel’s answer to his own question:

My own answer to the contrarian question is that most people think the future of the world will be defined by globalization, but the truth is that technology matters more. Without technological change, if China doubles its energy production over the next two decades, it will also double its air pollution. If every one of India’s hundreds of millions of households were to live the way Americans already do— using only today’s tools— the result would be environmentally catastrophic. Spreading old ways to create wealth around the world will result in devastation, not riches. In a world of scarce resources, globalization without new technology is unsustainable.