
Tag Archives: Language

Words Like Loaded Pistols: Wartime Rhetoric

Rhetoric, or the art of persuasion, is an ancient topic that’s no less relevant today. We are in a golden age of information sharing, which means you are swimming in a pool of rhetoric every day, whether you realize it or not.

The book Words Like Loaded Pistols: Rhetoric from Aristotle to Obama by Sam Leith is one tool to help navigate our choppy waters. Leith does an impressive job of unpacking rhetorical concepts while also providing all the knowledge and nuance required to be a powerful speaker.

The book is laid out beautifully, with a section entitled ‘Champions of Rhetoric,’ in which Leith dissects the work of some of the most famous orators. The chapter comparing Adolf Hitler to Winston Churchill is particularly interesting.


Churchill was a prolific speaker: Between 1900 and 1955 he averaged one speech a week. (That’s 2,860 speeches, for those who like math.) And they were not just speeches; they carried some of the most famous phrases produced in the twentieth century:

Among the phrases he minted were ‘blood, toil, tears, and sweat,’ ‘their finest hour,’ ‘the few,’ ‘the end of the beginning,’ ‘business as usual,’ ‘iron curtain,’ ‘summit meeting,’ and ‘peaceful coexistence.’

While this impressive résumé and history solidified his place on the throne of oratorical excellence, it’s important to note that he wasn’t a “born speaker” — in fact, he made many mistakes. And he learned from them.

Like many of us, Churchill would even get nervous to the point of nausea before addressing the public. To counter this he engaged in deliberate practice: He would rehearse his speeches in the mirror, modify them as needed, and scribble meticulous notes, including pauses and stage directions. In other words, one of history’s great orators painfully engaged in a process of trial, error, and practice.

To shape himself as an orator he learned by heart the speeches of Disraeli, Gladstone, Cromwell, Burke, and Pitt. Churchill combined their example with his father Randolph’s gift for invective. But he added something of his own – and it was this that helped tether his high style to something more conversational. He was a master of the sudden change of register – a joke, or a phrase of unexpected intimacy.

Stylistically, Churchill was known for building up to a great crescendo and then suddenly becoming gentle and quiet. Students of rhetoric recognize this as a device to keep your audience engaged, to surprise it. The joking and intimacy showed his prowess with another important rhetorical device, ethos.

Ethos is about establishing a connection with your audience. A joke can help with this because humor is often based on joint assumptions and beliefs; sharing a laugh with someone tends to make us feel closer to them. It’s human nature to gravitate towards those people who are like us (see the principles of influence). 

Yet, for all the aspects of the ethos appeal which Churchill got right, on more than one occasion he didn’t judge his audience well and was unable to persuade them.

When he was an MP in 1935, his colleague Herbert Samuel reported, ‘The House always crowds in to hear him. It listens and admires. It laughs when he would have it laugh, and it trembles when he would have it tremble… but it remains unconvinced, and in the end it votes against him.’

Much like today, in Churchill’s time Parliament was designed for a call-and-response dialogue, not the grand soapbox speech he was so fond of.

Leith argues that if it weren’t for the war, Churchill might never have found his audience and surely would have been remembered much differently, if at all.

The thing about Churchill was that, like the stopped clock that’s right twice a day, he occupied one position and waited for the world to come to him. He spent much of his political career predicting the imminent end of Western civilization — and it was only by the damnedest good luck that it happened to be on his watch that it suddenly appeared to be coming about. If not, he might have been remembered as a self-aggrandizing windbag with an old-fashioned speaking style and a love of the sound of his own voice.

But when the country really was under threat, Churchill’s fierce certainties were what an anxious audience wanted, while his style — steeped in the language of the previous centuries — seemed to encapsulate the very traditions that he was exhorting them to fight for. What at another time might have been faults became rhetorical strengths. That, you could say, is kairos writ large.

What does that last word, “kairos,” mean? It’s all about timing and fit:

As a rhetorical concept, decorum encompasses not only the more obvious features of style, but kairos, or the timeliness of a speech, the tone and physical comportment of the speaker, the commonplaces and topics of argument chosen, and so on. It is a giant umbrella concept meaning no more nor less than the fitting of a speech to the temper and expectations of its audience.

You could argue that the war needed Churchill and that Churchill needed the war. And unlike leaders in conflicts of the past, he also had access to the public like no one before him. You didn’t need to crowd into a square to hear Churchill speak; you only needed to turn on the radio.

One of the virtues of Churchill’s wartime rhetoric, however, was that whatever his peers in the House of Commons thought, he was able to speak — as politicians a generation before had not been able to — directly to the public through the wireless.

After delivering many of his key speeches in the Commons, Churchill read them out on the radio. Here, that presidential style — all that gruffness and avuncularity, all those rumbling climaxes — was able to take full effect without being interrupted by rustling order papers and barracking Opposition MPs. He was pure voice.

Churchill indeed was pure of voice, but there was another loud voice in this conflict: Adolf Hitler. When it came to speaking, the two had much in common, but their differences were just as noticeable.


Hitler understood the power of words: He saw them as a tool which he needed to master if he wanted to achieve his goals. He had a strong vision which he believed in passionately and he knew that he needed his people to share that passion if he was to succeed.

From Mein Kampf:

The power which has always started the greatest religious and political avalanches in history has been, from time immemorial, none but the magic power of the word, and that alone. Particularly the broad masses of the people can be moved only by the power of speech… Only a storm of hot passion can turn the destinies of peoples, and he alone can arouse passion who bears it within himself.

It would seem that Hitler associated passion with anger; his speeches were known to peak in shouting that resembled rage. Even when writing his speeches he would work himself up into a frenzy.

Traudl Junge, the young secretary whose memoir of the last days in the Fuhrerbunker formed the basis for the film Downfall, recalled him composing the speech he gave to mark the tenth anniversary of his dictatorship. He started out mumbling almost inaudibly, and pacing up and down, but by the time his speech reached its crescendo he had his back to her and was yelling at the wall.

Like Churchill, Hitler would often practice in front of a mirror and choreograph the whole performance, but he would take it much further. With an eye for theatrics, he would pay close attention to the acoustics of the venue to accent both his booming voice and the martial music that would accompany him. He was particular about the visuals, with his dramatic lights and placement of flags.

Hitler also used pauses to his advantage. While Churchill would use them mid-speech to maintain an audience’s attention or ‘reel them in,’ Hitler would use them at the beginning.

It could go on for anything up to half a minute, which is (you’ll know if you’ve tried it) a very, very long time to stand on a stage without saying or doing anything. When he started – which he’d typically do while the applause was still fading out, causing the audience to prick up its ears the more — he would do so at a slow pace and in a deep voice. The ranting was something he built up to, taking the audience with him.

Hitler liked to control every aspect of his performance and paid close attention to those details that others dismissed, specifically the time of day that he gave his speeches (a lesson infomercials learned).

He preferred to speak in the evening, believing that ‘in the morning and during the day it seems that the power of the human will rebel with its strongest energy against any attempt to impose upon it the will or opinion of another. On the other hand, in the evening it easily succumbs to the domination of a stronger will.’

Hitler had a keen interest in, and insight into, human nature. He knew what he needed from the German people and knew the psychological devices to use to sway the masses. He was even cognizant of how his attire would resonate with the population.

While other senior Nazis went about festooned with ribbons and medals, Hitler always dressed in a plain uniform, the only adornment being the Iron Cross First Class that he had won in 1914. That medal, let it be noted, is a token of bravery, not of rank.

This was a calculated move, an appeal to ethos: I am one of you. It was a tricky balance, because he needed to seem like one of the people but also to portray an air of exceptionality. Why else would people follow him if he wasn’t the only one who could tend to Germany in its time of need?

As a wartime leader, you need to make yourself both of and above your audience. You need to stress the identity of their interests with yours, to create unity in a common purpose. You need, therefore, to cast yourself as the ideal exemplar of all that is best and most determined and most courageous in your people.

As expected, the same type of thing happens in modern politics, which is especially amplified during election time. Everyone is scrambling to seem like a leader of the people and to establish trust while still setting themselves apart from the crowd, convincing us that they are the only person fit for the job.

If you look closely, many of the rhetorical devices examined in Words Like Loaded Pistols are in heavy use today. Leith discusses a speechwriter for Reagan, and one of his Champions of Rhetoric is Obama; these sections of the book are just as interesting as the piece on Churchill and Hitler.


Still Interested? If you have a fascination with politics and/or rhetoric (or just want someone to skillfully distill the considerable amounts of information in the Ad Herennium and Aristotle’s Rhetoric), then we highly recommend you pick up the book.

Steven Pinker Tells Us Why Our Professional Writing Sucks (And What to Do)

Harvard’s cognitive psychology giant Steven Pinker has had no shortage of big, interesting topics to write about so far.

Starting in 1994 with his first book aimed at popular audiences, The Language Instinct, Pinker has discussed not only the origins of language, but the nature of human beings, the nature of our minds, the nature of human violence, and a host of related topics.

His most recent book, The Sense of Style, zeroes in on how to write well while continuing to showcase his brilliantly synthetic mind. It’s a 21st-century version of Strunk & White, a book that aims to help us understand why our writing often sucks, and how we might make it suck a little less.

His deep background in linguistics and cognitive psychology allows him to discuss language and writing more deeply than your average style guide; it’s also funny as hell in parts, which can’t be said of nearly any style guide.


Please No More “Ese”

In the third chapter, Pinker addresses the familiar problem of academese, legalese, professionalese…all the eses that make one want to throw a book, paper, or article in the trash rather than finish it. What causes them? Is it because we seek to obfuscate, as is commonly thought? Sometimes yes — especially when the author is trying to sell the reader something, be it a product or an idea.

But Pinker’s not convinced that concealment is driving most of our frustration with professional writing:

I have long been skeptical of the bamboozlement theory, because in my experience it does not ring true. I know many scholars who have nothing to hide and no need to impress. They do groundbreaking work on important subjects, reason well about clear ideas, and are honest, down-to-earth people, the kind you’d enjoy having a beer with. Still, their writing stinks.

So, if it’s not that we’re trying to mislead, what’s the problem?


Pinker first calls attention to the Curse of Knowledge — the inability to put ourselves in the shoes of a less informed reader.

The curse of knowledge is the single best explanation I know of why good people write bad prose. It simply doesn’t occur to the writer that her readers don’t know what she knows — that they haven’t mastered the patois of her guild, can’t divine the missing steps that seem too obvious to mention, have no way to visualize a scene that to her is as clear as day. And so she doesn’t bother to explain the jargon, or spell out the logic, or supply the necessary detail.

The first, simple, way this manifests itself is one we all encounter too frequently: Over-Abbreviation. It’s when we’re told to look up the date of the SALT conference for MLA sourcing on the HELMET system after our STEM meeting. (I only made one of those up.) Pinker’s easy way out is to recommend we always spell out acronyms the first time we use them, unless we’re absolutely sure readers will know what they mean. (And still maybe even then.)

The second obvious manifestation is our overuse of technical terms which the reader may or may not have encountered before. A simple fix is to add a few words of exposition the first time you use the term, as in “Arabidopsis, a flowering mustard plant.” Don’t assume the reader knows all of your jargon.

In addition, the use of examples is so powerful that we might call them a necessary component of persuasive writing. If I give you a long rhetorical argument in favor of some action or another without anchoring it on a concrete example, it’s as if I haven’t explained it at all. Something like: “Reading a source of information that contradicts your existing beliefs is a useful practice, as in the case of a Democrat spending time reading Op-Eds written by Republicans.” The example makes the point far stronger.

Another, deeper part of the problem is a little less obvious and a lot more interesting. Pinker ascribes a big source of messy writing to a mental process called chunking, in which we package groups of concepts into ever-higher levels of abstraction in order to save space in our brains. Here’s a great example of chunking:

As children we see one person hand a cookie to another, and we remember it as an act of giving. One person gives another one a cookie in exchange for a banana; we chunk the two acts of giving together and think of the sequence as trading. Person 1 trades a banana to Person 2 for a shiny piece of metal, because he knows he can trade it to Person 3 for a cookie; we think of it as selling. Lots of people buying and selling make up a market. Activity aggregated over many markets gets chunked into the economy. The economy can now be thought of as an entity which responds to action by central banks; we call that monetary policy. One kind of monetary policy, which involves the central bank buying private assets, is chunked as quantitative easing.

As we read and learn, we master a vast number of these abstractions, and each becomes a mental unit which we can bring to mind in an instant and share with others by uttering its name.

Chunking is an amazing and useful component of higher intelligence, but it gets us in trouble when we write because we assume our readers’ chunks are just like our own. They’re not.

A second issue is something he terms functional fixity. This compounds the problem induced by chunking:

Sometimes wording is maddeningly opaque without being composed of technical terminology from a private clique. Even among cognitive scientists, a “poststimulus event” is not a standard way to refer to a tap on the arm. A financial customer might be reasonably familiar with the world of investments and still have to puzzle over what a company brochure means by “capital charges and rights.” A computer-savvy user trying to maintain his Web site might be mystified by instructions on the maintenance page which refer to “nodes,” “content type” and “attachments.” And heaven help the sleepy traveler trying to set the alarm clock in his hotel room who must interpret “alarm function” and “second display mode.”

Why do writers invent such confusing terminology? I believe the answer lies in another way in which expertise can make our thoughts more idiosyncratic and thus harder to share: as we become familiar with something, we think about it more in terms of the use we put it to and less in terms of what it looks like and what it is made of. This transition, another staple of the cognitive psychology curriculum, is called functional fixity (sometimes functional fixedness).

The opposite of functional fixity would be familiar to those who have bought their dog or cat a toy only to be puzzled to see them playing with the packaging it came in. The animal hasn’t fixated on the function of the objects; to him, an object is just an object. The toy and the packaging are not categorized as “toy” and “thing the toy comes in” the way they are for us. In this case, we have functional fixity and they do not.

And so Pinker continues:

Now, if you combine functional fixity with chunking, and stir in the curse that hides each one from our awareness, you get an explanation of why specialists use so much idiosyncratic terminology, together with abstractions, metaconcepts, and zombie nouns. They are not trying to bamboozle us, that’s just the way they think.


In a similar way, writers stop thinking — and thus stop writing — about tangible objects and instead refer to them by the role those objects play in their daily travails. Recall the example from chapter 2 in which a psychologist showed people sentences, followed by the label TRUE or FALSE. He explained what he did as “the subsequent presentation of an assessment word,” referring to the [true/false] label as an “assessment word” because that’s why he put it there — so that the participants in the experiment could assess whether it applied to the preceding sentence. Unfortunately, he left it up to us to figure out what an “assessment word” is, while saving no characters and being less rather than more scientifically precise.

In the same way, a tap on the wrist became a “stimulus” and a [subsequent] tap on the elbow became a “post-stimulus event,” because the writer cared about the fact that one event came after the other and no longer cared about the fact that the events were taps on the arm.

As we get deeper into our expertise, we swap concrete, useful, everyday imagery for abstract, technical fluff that brings nothing to the mind’s eye of a lay reader. We use metaconcepts like levels, issues, contexts, frameworks, and perspectives instead of describing the actual thing in plain language. (Thus does a book become a “tangible thinking framework.”)


How do we solve the problem, then? Pinker partially defuses the obvious solution — remembering the reader over your shoulder while you write — because he feels it doesn’t always work. Even when we’re made aware that we need to simplify and clarify for our audience, we find it hard to regress our minds to a time when our professional knowledge was more primitive.

Pinker’s prescription has a few parts:

  1. Get rid of abstractions: use concrete nouns that refer to concrete things. Who did what to whom? Read over your sentences, look for nouns that refer to meta-abstractions, and ask yourself whether there’s a way to put a tangible, everyday object or concept in their place. “The phrase ‘on the aspirational level’ adds nothing to ‘aspire,’ nor is a ‘prejudice reduction model’ any more sophisticated than ‘reducing prejudice.'”
  2. When in doubt, assume the reader knows a fair bit less than you about your topic. Clarity is not condescension. You don’t need to prove how smart you are — the reader won’t be impressed. “The key is to assume that your readers are as intelligent and sophisticated as you are, but that they happen not to know something you know.” 
  3. Get someone intelligent and part of your intended audience to read over your work and see if they understand it. You shouldn’t take every last suggestion, but do take seriously when they tell you certain sections are muddy or confusing. “The form in which thoughts occur to a writer is rarely the same as the form in which they can be absorbed by the reader.”
  4. Put your first draft down for enough time that, when you come back to it, you no longer feel deep familiarity with it. In this way, you become your intended audience. Your own fresh eyes will see the text in a new way. Don’t forget to read aloud, even if just under your breath.

Still interested? Check out Pinker’s The Sense of Style for a lot more on good writing, and check out his thoughts on what a broad education should entail.

Yuval Noah Harari on Why Humans Dominate the Earth: Myth-Making

“Ants and bees can also work together in huge numbers, but they do so in a very rigid manner and only with close relatives. Wolves and chimpanzees cooperate far more flexibly than ants, but they can do so only with small numbers of other individuals that they know intimately. Sapiens can cooperate in extremely flexible ways with countless numbers of strangers. That’s why Sapiens rule the world, whereas ants eat our leftovers and chimps are locked up in zoos and research laboratories.” —Yuval Noah Harari, Sapiens 


Yuval Noah Harari‘s Sapiens is one of those uniquely breathtaking books that comes along very rarely. It’s broad, yet scientific. It’s written for a popular audience but never feels dumbed down. It’s new and fresh, but is not based on any brand new primary research. Near and dear to our heart, Sapiens is pure synthesis.

An immediate influence that comes to mind is Jared Diamond, author of Guns, Germs, and Steel, The Third Chimpanzee, and other broad-yet-scientific works with vast synthesis and explanatory power. And of course, Harari, a history professor at the Hebrew University of Jerusalem, has noted that key influence and what it means to how he works:

(Harari) credits author Jared Diamond with encouraging him to take a much broader view—his Guns, Germs and Steel was an enormous influence. Harari says: “It made me realise that you can ask the biggest questions about history and try to give them scientific answers. But in order to do so, you have to give up the most cherished tools of historians. I was taught that if you’re going to study something, you must understand it deeply and be familiar with primary sources. But if you write a history of the whole world you can’t do this. That’s the trade-off.”

With this working model in mind, Harari sought to understand the history of humankind’s domination of the earth and its development of complex modern societies. His synthesis draws on evolutionary theory, forensic anthropology, genetics, and the basic tools of the historian to generate a new conception of our past: Man’s success was due to his ability to create and sustain grand, collaborative myths.

Harari uses a smart trick to make his narrative more palatable and sensible: He uses the term Sapiens to refer to human beings. With this bit of depersonalization, Harari can go on to make some extremely bold statements about the history of humanity. We’re just another animal, Homo sapiens, and our history can be described just like that of any other species. Our successes, failures, flaws and credits are part of the makeup of the Sapiens. (This biological approach to history is one we’ve looked at before with the work of Will and Ariel Durant.)


Sapiens was, of course, just one of many animals on the savannah if we go back about 100,000 years.

There were humans long before there was history. Animals much like modern humans first appeared about 2.5 million years ago. But for countless generations they did not stand out from the myriad other organisms with which they shared their habitats….

These archaic humans loved, played, formed close friendships and competed for status and power, but so did chimpanzees, baboons, and elephants. There was nothing special about humans. Nobody, least of all humans themselves, had any inkling that their descendants would one day walk on the moon, split the atom, fathom the genetic code and write history books. The most important thing to know about prehistoric humans is that they were insignificant animals with no more impact on their environment than gorillas, fireflies or jellyfish.

We like to think we have been a privileged species right from the start; that through a divine spark, we had the ability to dominate our environment and the lesser mammals we cohabited with. But that was not so, at least not at first. We were simply another smart, social ape trying to survive in the wild. We had cousins: Homo neanderthalensis, Homo erectus, Homo rudolfensis…all considered human and with similar traits. If chimps and bonobos were our second cousins, these were our first cousins.

Eventually things changed. About 70,000 or so years ago, our DNA showed a mutation (Harari claims we’re not sure why — I don’t know the research well enough to disagree) which allowed us to make a leap that no other species, human or otherwise, was able to make: Cooperating flexibly in large groups with a unique and complex language. Harari calls this the “Cognitive Revolution.”

What was the Sapiens’ secret of success? How did we manage to settle so rapidly in so many distant and ecologically different habitats? How did we push all other human species into oblivion? Why couldn’t even the strong, brainy, cold-proof Neanderthals survive our onslaught? The debate continues to rage. The most likely answer is the very thing that makes the debate possible: Homo sapiens conquered the world thanks above all to its unique language.

Our newfound language had many attributes that couldn’t be found in our cousins’ languages, or in the communication of any other species, from ants to whales.

Firstly, we could give detailed explanations of events that had transpired. I saw a large lion in the forest three days back, with three companions, near the closest tree to the left bank of the river and I think, but am not totally sure, they were hunting us. Why don’t we ask for help from a neighboring tribe so we don’t all end up as lion meat?

Secondly, and maybe more importantly, we could also gossip about each other. I noticed Frank and Steve have not contributed to the hunt in about three weeks. They are not holding up their end of the bargain, and I don’t think we should include them in distributing the proceeds of our next major slaughter. Hey, does this headdress make me look fat?

As important as both of these abilities were to the development of Sapiens, they are probably not Harari’s major insight. Steven Pinker has written about The Language Instinct and where it got us over time, as have others.

Harari’s insight is that the above are not the most important reasons why our “uniquely supple” language gave us a massive, exponential survival advantage: It was because we could talk about things that were not real.

As far as we know, only Sapiens can talk about entire kinds of entities that they have never seen, touched, or smelled. Legends, myths, gods, and religions appeared for the first time with the Cognitive Revolution. Many animals and human species could previously say, ‘Careful! A lion!’ Thanks to the Cognitive Revolution, Homo sapiens acquired the ability to say, ‘The lion is the guardian spirit of our tribe.’ This ability to speak about fictions is the most unique feature of Sapiens language…You could never convince a monkey to give you a banana by promising him limitless bananas after death in monkey heaven.

This is the core of Harari’s provocative thesis: It is our collected fictions that define us. Predictably, he mentions religion as one of the important fictions. But other fictions are just as important: the limited liability corporation, the nation-state, the concept of human “rights” deliverable at birth, the concept of money itself. All of these inventions allow us to do the thing that other species cannot: Cooperate effectively and flexibly in large groups.

Ants and bees can also work together in huge numbers, but they do so in a very rigid manner and only with close relatives. Wolves and chimpanzees cooperate far more flexibly than ants, but they can do so only with small numbers of other individuals that they know intimately. Sapiens can cooperate in extremely flexible ways with countless numbers of strangers. That’s why Sapiens rule the world, whereas ants eat our leftovers and chimps are locked up in zoos and research laboratories.

Our success is intimately linked to scale, which we have discussed before. In many systems and in all species but ours, as far as we know, there are hard limits to the number of individuals that can cooperate in groups in a flexible way. (Ants can cooperate in great numbers with their relatives, but only based on simple algorithms. Munger has mentioned in The Psychology of Human Misjudgment that ants’ rules are so simplistic that if a group of ants start walking in a circle, their “follow-the-leader” algorithm can cause them to literally march until their collective death.)

Sapiens diverged when it discovered an ability to generate a collective myth, and there was almost no limit to the number of cooperating, believing individuals who could belong to a belief-group. And thus we see extremely different results in human culture than in whale culture, dolphin culture, or bonobo culture. It’s a lollapalooza result from a combination of critical elements.

Any large-scale human cooperation — whether a modern state, a medieval church, an ancient city, or an archaic tribe — is rooted in common myths that exist only in people’s collective imagination. Churches are rooted in common religious myths. Two Catholics who have never met can nevertheless go together on crusade or pool funds to build a hospital because they both believe God was incarnated in human flesh and allowed Himself to be crucified to redeem our sins. States are rooted in common national myths. Two Serbs who have never met might risk their lives to save one another because both believe in the existence of the Serbian nation, the Serbian homeland and the Serbian flag. Judicial systems are rooted in common legal myths. Two lawyers who have never met can nevertheless combine efforts to defend a complete stranger because they both believe in the existence of laws, justice, human rights, and money paid out in fees.

Harari is quick to point out that these aren’t lies. We truly believe them, and we believe in them as a collective. They have literal truth in the sense that if I trust that you believe in money as much as I do, we can use it as an exchange of value. But just as you can’t get a chimpanzee to forgo a banana today for infinite bananas in heaven, you also can’t get him to accept 3 apples today with the idea that if he invests them in a chimp business wisely, he’ll get 6 bananas from it in five years, no matter how many compound interest tables you show him. This type of collaborative and complex fiction is uniquely human, and capitalism is as much of a collective myth as religion.

Of course, this leads to a fascinating result of human culture: If we collectively decide to alter the myths, we can alter population behavior dramatically and quickly. We can decide slavery, one of the oldest institutions in human history, is no longer acceptable. We can declare monarchy an outdated form of governance. We can decide women should have as much power as men, reversing the pattern of history. (Of course, we can also decide all Sapiens must worship the same religious text and devote ourselves to slaughtering the resisters.)

There is no parallel I’m aware of in other species for these quick, large-scale shifts. General behavior patterns in dogs or fish or ants change due to a change in environment or broad genetic evolution over a period of time. Lions will never sign a Declaration of Lion Rights and decide to banish the idea of an alpha male lion; their hierarchies are rigid.

But humans can collectively change the narrative in a period of a few years and begin acting very differently, with the same DNA and the same physical environments. And thus, says Harari: “The Cognitive Revolution is accordingly the point when history declared its independence from biology.” These ever-shifting alliances, beliefs, myths, and ultimately cultures define what we call human history.

For now we will leave it here, but a thorough reading of Sapiens is recommended to understand where Professor Harari takes this idea, from the earliest humans to the fate of our descendants.

How Analogies Reveal Connections, Spark Innovation, and Sell Our Greatest Ideas

Image source: xkcd.com


John Pollack is a former presidential speechwriter. If anyone knows the power of words to move people to action, shape arguments, and persuade, it is he.

In Shortcut: How Analogies Reveal Connections, Spark Innovation, and Sell Our Greatest Ideas, he explores the powerful role of analogy in persuasion and creativity.

While they often operate unnoticed, analogies aren’t accidents; they’re arguments. And like icebergs, they conceal most of their mass and power beneath the surface. In a contest of arguments, the best one wins.

But analogies do more than just persuade others — they also play a role in innovation and decision making.

From the bloody Chicago slaughterhouse that inspired Henry Ford’s first moving assembly line, to the “domino theory” that led America into the Vietnam War, to the “bicycle for the mind” that Steve Jobs envisioned as a Macintosh computer, analogies have played a dynamic role in shaping the world around us.

Despite their importance, many people have only a vague sense of the definition.

What is an Analogy?

In broad terms, an analogy is simply a comparison that asserts a parallel—explicit or implicit—between two distinct things, based on the perception of a shared property or relation. In everyday use, analogies appear in many forms: metaphors, similes, political slogans, legal arguments, marketing taglines, mathematical formulas, biblical parables, logos, TV ads, euphemisms, proverbs, fables, and sports clichés.

Because they are so well disguised, they play a bigger role than we consciously realize. Analogies don’t just make arguments effectively; they trigger emotions. And emotions make it hard to make rational decisions.

While we take analogies for granted, the ideas they convey are notably complex.

All day, every day, in fact, we make or evaluate one analogy after another, because such comparisons are the only practical way to sort a flood of incoming data, place it within the context of our experience, and make decisions accordingly.

Remember the powerful metaphor that argument is war. It shapes a wide variety of expressions like “your claims are indefensible,” “attacking the weak points,” and “You disagree? OK, shoot.”

Or consider the Map and the Territory — Analogies give people the map but explain nothing of the territory.

Warren Buffett is one of the best at using analogies to communicate effectively. One of my favorites is his observation that “You never know who’s swimming naked until the tide goes out.” In other words, when times are good everyone looks amazing; when times turn, hidden weaknesses are exposed. The same could be said of analogies:

We never know what assumptions, deceptions, or brilliant insights they might be hiding until we look beneath the surface.

Most people underestimate the importance of a good analogy. As with many things in life, this lack of awareness comes at a cost. Ignorance is expensive.

Evidence suggests that people who tend to overlook or underestimate analogy’s influence often find themselves struggling to make their arguments or achieve their goals. The converse is also true. Those who construct the clearest, most resonant and apt analogies are usually the most successful in reaching the outcomes they seek.

The key to all of this is figuring out why analogies function so effectively and how they work. Once we know that, we should be able to craft better ones.

Don’t Think of an Elephant

Effective, persuasive analogies frame situations and arguments, often so subtly that we don’t even realize there is a frame, let alone one that might not work in our favor. Such conceptual frames, like picture frames, include some ideas, images, and emotions and exclude others. By setting a frame, a person or organization can, for better or worse, exert remarkable influence on the direction of their own thinking and that of others.

He who holds the pen frames the story. The first person to frame the story controls the narrative, and it takes a massive amount of energy to change its direction. Sometimes even the way people come across information shapes it — stories that would be non-events if disclosed proactively become front-page stories because someone found out.

In Don’t Think of an Elephant, George Lakoff explores the issue of framing. The book famously begins with the instruction “Don’t think of an elephant.”

What’s the first thing we all do? Think of an elephant, of course. It’s almost impossible not to think of an elephant. When we stop consciously thinking about it, it floats away and we move on to other topics — like the new email that just arrived. But then again it will pop back into consciousness and bring some friends — associated ideas, other exotic animals, or even thoughts of the GOP.

“Every word, like elephant, evokes a frame, which can be an image or other kinds of knowledge,” Lakoff writes. This is why we want to control the frame rather than be controlled by it.

In Shortcut, Pollack recounts Lakoff’s analysis of an analogy President George W. Bush used in the 2004 State of the Union address, in which he argued the Iraq war was necessary despite international criticism. Before we go on, take Bush’s side here and think about how you would defend this point.

In the speech, Bush proclaimed that “America will never seek a permission slip to defend the security of our people.”

As Lakoff notes, Bush could have said, “We won’t ask permission.” But he didn’t. Instead he intentionally used the analogy of a permission slip and in so doing framed the issue in terms that would “trigger stronger, more negative emotional associations that endured in people’s memories of childhood rules and restrictions.”

Commenting on this, Pollack writes:

Through structure mapping, we correlate the role of the United States to that of a young student who must appeal to their teacher for permission to do anything outside the classroom, even going down the hall to use the toilet.

But is seeking diplomatic consensus to avoid or end a war actually analogous to a child asking their teacher for permission to use the toilet? Not at all. Yet once this analogy has been stated (Farnam Street editorial: and tweeted), the debate has been framed. Those who would reject a unilateral, my-way-or-the-highway approach to foreign policy suddenly find themselves battling not just political opposition but people’s deeply ingrained resentment of childhood’s seemingly petty regulations and restrictions. On an even subtler level, the idea of not asking for a permission slip also frames the issue in terms of sidestepping bureaucratic paperwork, and who likes bureaucracy or paperwork?

Deconstructing Analogies

By deconstructing analogies, we can see how they function so effectively. Pollack argues that they meet five essential criteria.

  1. Use the highly familiar to explain something less familiar.
  2. Highlight similarities and obscure differences.
  3. Identify useful abstractions.
  4. Tell a coherent story.
  5. Resonate emotionally.

Let’s explore how these work in greater detail, using the example of master thief Bruce Reynolds, who described the Great Train Robbery as his Sistine Chapel.

The Great Train Robbery

In the dark early hours of August 8, 1963, an intrepid gang of robbers hot-wired a six-volt battery to a railroad signal not far from the town of Leighton Buzzard, some forty miles north of London. Shortly, the engineer of an approaching mail train, spotting the red light ahead, slowed his train to a halt and sent one of his crew down the track, on foot, to investigate. Within minutes, the gang overpowered the train’s crew and, in less than twenty minutes, made off with the equivalent of more than $60 million in cash.

Years later, Bruce Reynolds, the mastermind of what quickly became known as the Great Train Robbery, described the spectacular heist as “my Sistine Chapel.”

Use the familiar to explain something less familiar

Reynolds exploits the public’s basic familiarity with the famous chapel in the Vatican City, which after Leonardo da Vinci’s Mona Lisa is perhaps the best-known work of Renaissance art in the world. Millions of people, even those who aren’t art connoisseurs, would likely share the cultural opinion that the paintings in the chapel represent “great art” (as compared to a smaller subset of people who might feel the same way about Jackson Pollock’s drip paintings, or Marcel Duchamp’s upturned urinal).

Highlight similarities and obscure differences

Reynolds’s analogy highlights, through implication, similarities between the heist and the chapel—both took meticulous planning and masterful execution. After all, stopping a train and stealing the equivalent of $60m—and doing it without guns—does require a certain artistry. At the same time, the analogy obscures important differences. By invoking the image of a holy sanctuary, Reynolds triggers a host of associations in the audience’s mind—God, faith, morality, and forgiveness, among others—that camouflage the fact that he’s describing an action few would consider morally commendable, even if the artistry involved in robbing that train was admirable.

Identify useful abstractions

The analogy offers a subtle but useful abstraction: Genius is genius and art is art, no matter what the medium. The logic? If we believe that genius and artistry can transcend genre, we must concede that Reynolds, whose artful, ingenious theft netted millions, is an artist.

Tell a coherent story

The analogy offers a coherent narrative. Calling the Great Train Robbery his Sistine Chapel offers the audience a simple story that, at least on the surface, makes sense: Just as Michelangelo was called by God, the pope, and history to create his greatest work, so too was Bruce Reynolds called by destiny to pull off the greatest robbery in history. And if the Sistine Chapel endures as an expression of genius, so too must the Great Train Robbery. Yes, robbing the train was wrong. But the public perceived it as largely a victimless crime, committed by renegades who were nothing if not audacious. And who but the most audacious in history ever create great art? Ergo, according to this narrative, Reynolds is an audacious genius, master of his chosen endeavor, and an artist to be admired in public.

There is an important point here: The narrative need not be accurate. It is the feelings and ideas the analogy evokes that make it powerful. Within the structure of the analogy, the argument rings true. The framing is enough to establish it, succinctly and subtly. That’s what makes it so powerful.

Resonate emotionally

The analogy resonates emotionally. To many people, mere mention of the Sistine Chapel brings an image to mind, perhaps the finger of Adam reaching out toward the finger of God, or perhaps just that of a lesser chapel with which they are personally familiar. Generally speaking, chapels are considered beautiful, and beauty is an idea that tends to evoke positive emotions. Such positive emotions, in turn, reinforce the argument that Reynolds is making—that there’s little difference between his work and that of a great artist.

Jumping to Conclusions

Daniel Kahneman explains the two thinking structures that govern the way we think: System 1 and System 2. In his book Thinking, Fast and Slow, he writes: “Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake are acceptable, and if the jump saves much time and effort.”

“A good analogy serves as an intellectual springboard that helps us jump to conclusions,” Pollack writes. He continues:

And once we’re in midair, flying through assumptions that reinforce our preconceptions and preferences, we’re well on our way to a phenomenon known as confirmation bias. When we encounter a statement and seek to understand it, we evaluate it by first assuming it is true and exploring the implications that result. We don’t even consider dismissing the statement as untrue unless enough of its implications don’t add up. And consider is the operative word. Studies suggest that most people seek out only information that confirms the beliefs they currently hold and often dismiss any contradictory evidence they encounter.

The ongoing battle between fact and fiction commonly takes place in our subconscious systems. In The Political Brain: The Role of Emotion in Deciding the Fate of the Nation, Drew Westen, an Emory University psychologist, writes: “Our brains have a remarkable capacity to find their way toward convenient truths—even if they are not all true.”

This also helps explain why getting promoted has almost nothing to do with your performance.

Remember Apollo Robbins? He’s a professional pickpocket. While he has unique skills, he succeeds largely through the choreography of people’s attention. “Attention,” he says, “is like water. It flows. It’s liquid. You create channels to divert it, and you hope that it flows the right way.”

“Pickpocketing and analogies are in a sense the same,” Pollack concludes, “as the misleading analogy picks a listener’s mental pocket.”

And this is true whether someone else diverts our attention through a resonant but misleading analogy—“Judges are like umpires”—or we simply choose the wrong analogy all by ourselves.

Reasoning by Analogy

We rarely stop to see how much of our reasoning is done by analogy. In a 2005 article published in the Harvard Business Review, Giovanni Gavetti and Jan Rivkin wrote: “Leaders tend to be so immersed in the specifics of strategy that they rarely stop to think how much of their reasoning is done by analogy.” As a result, they miss things. They make connections that don’t exist. They don’t check assumptions. They miss useful insights. By contrast, “managers who pay attention to their own analogical thinking will make better strategic decisions and fewer mistakes.”


Shortcut goes on to explore when to use analogies and how to craft them to maximize persuasion.

Wired for Culture


What makes us human? In part, argues evolutionary biologist Mark Pagel in Wired for Culture: Origins of the Human Social Mind, language is one of the keys to our evolutionary success, especially in the context of culture.

Humans had acquired the ability to learn from others, and to copy, imitate and improve upon their actions. This meant that elements of culture themselves— ideas, languages, beliefs, songs, art, technologies— could act like genes, capable of being transmitted to others and reproduced. But unlike genes, these elements of culture could jump directly from one mind to another, shortcutting the normal genetic routes of transmission. And so our cultures came to define a second great system of inheritance, able to transmit knowledge down the generations.

To be human at some point came to mean access to a growing and shared repository of “information, technologies, wisdom, and good luck.”

Our cultural inheritance is something we take for granted today, but its invention forever altered the course of evolution and our world. This is because knowledge could accumulate as good ideas were retained, combined, and improved upon, and others were discarded. And, being able to jump from mind to mind granted the elements of culture a pace of change that stood in relation to genetical evolution something like an animal’s behavior does to the more leisurely movement of a plant. Where you are stuck from birth with a sample of the genes that made your parents, you can sample throughout your life from a sea of evolving ideas. Not surprisingly, then, our cultures quickly came to take over the running of our day-to-day affairs as they outstripped our genes in providing solutions to the problems of our existence. Having culture means we are the only species that acquires the rules of its daily living from the accumulated knowledge of our ancestors rather than from the genes they pass to us. Our cultures and not our genes supply the solutions we use to survive and prosper in the society of our birth; they provide the instructions for what we eat, how we live, the gods we believe in, the tools we make and use, the language we speak, the people we cooperate with and marry, and whom we fight or even kill in a war.

Culture evolved primarily through language, which became the foundation of social learning: the best ideas could be passed on without having to be reinvented.

Pagel’s take on social learning is fascinating. “Theft” became part of our culture and part of what propelled us forward with such ferocity.

Social learning is really visual theft, and in a species that has it, it would become positively advantageous for you to hide your best ideas from others, lest they steal them. This not only would bring cumulative cultural adaptation to a halt, but our societies might have collapsed as we strained under the weight of suspicion and rancor.

So, beginning about 200,000 years ago, our fledgling species, newly equipped with the capacity for social learning had to confront two options for managing the conflicts of interest social learning would bring. One is that these new human societies could have fragmented into small family groups so that the benefits of any knowledge would flow only to one’s relatives. Had we adopted this solution we might still be living like the Neanderthals, and the world might not be so different from the way it was 40,000 years ago, when our species first entered Europe. This is because these smaller family groups would have produced fewer ideas to copy and they would have been more vulnerable to chance and bad luck. The other option was for our species to acquire a system of cooperation that could make our knowledge available to other members of our tribe or society even though they might be people we are not closely related to — in short, to work out the rules that made it possible for us to share goods and ideas cooperatively. Taking this option would mean that a vastly greater fund of accumulated wisdom and talent would become available than any one individual or even family could ever hope to produce.

This is the path we chose, and our world is the result.

The Psychology of We


Two categories of people can be hard to have a conversation with: good friends and people who have worked together for a long time. Sometimes it’s as if they are speaking their own language — and they are. But these connections transcend conversation and touch every part of life.

In Powers of Two: Finding the Essence of Innovation in Creative Pairs, Joshua Shenk explores how the identity of pairs resembles that of a mosaic, “a series of pieces that connect to one another.”

A good place to begin is with ritual, since this is often the foundation of creative practice. Igor Stravinsky came into his studio and, first thing, sat down and played a Bach fugue. When he was writing The End of the Affair, Graham Greene produced five hundred words every day, and only five hundred, even if it meant stopping in the middle of a scene. The choreographer Twyla Tharp rises every morning at 5:30, puts on her workout clothes, and catches a taxi to the Pumping Iron gym at Ninety-First Street and First Avenue in Manhattan. “The ritual,” she writes in The Creative Habit, “is not the stretching and weight training I put my body through each morning at the gym; the ritual is the cab. The moment I tell the driver where to go I have completed the ritual.”

Tharp’s point is that ritual emerges from the smallest, most concrete action. For pairs, the most basic thing is a regular meeting time. James Watson and Francis Crick had lunch most days at the Eagle pub in Cambridge. Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg begin and end every week with hourlong private meetings. After they began to exchange their work, J.R.R. Tolkien and C. S. Lewis set aside Mondays to meet at a pub and later met with a group, the Inklings, every Thursday night at Lewis’s apartment.

Meeting rituals may be tied to moments in time — as when partners like Buffett and Munger begin every day with a call—or to a physical space, as when Lennon and McCartney met at Paul’s house to write. Watson and Crick ended up sharing an office at the Cavendish Laboratory in Cambridge because the other scientists in the lab couldn’t stand their incessant chatter.

Moving towards each other as people often means leaving the rest of the world behind. “Every real friendship is a sort of secession, even a rebellion,” C. S. Lewis writes in The Four Loves.

In the midst of the feverish and entwined six-year collaboration between Braque and Picasso that led to cubism, both artists signed the back of each of their canvases; only they would know who did what. “People always ask Ulay and me the same questions,” Marina Abramovic told me. “‘Whose idea was it?’ or ‘How was this done?’ … But we never specify. Everything was interrelated and interdependent.”

Partnerships often form barriers to those trying to look in. Outsiders are not part of the club; they are not doing the work; they don’t have the shared understanding, the common goals, the …

This is one reason many epic partnerships end up as historical footnotes or become entirely effaced: “Things were said with Picasso during those years,” Braque said, “that no one will ever say again, things that no one could ever say any more, that no one could ever understand… things that would be incomprehensible and which gave us such joy.” This was one of the very few lines either man ever spoke about the relationship that helped give birth to modern art.

In addition to the physical gestures that a pair can share, there is also an unmistakable private language. This is the key to high-bandwidth communication.

Many pairs have what we could fairly call a private language. Tom Hanks described the communication between director Ron Howard and producer Brian Grazer as “some gestalt Vulcan.” Akio Morita and Masaru Ibuka, the cofounders of Sony, “would sit there talking to each other,” Morita’s son Hideo said, “and we would listen but we had no idea what they were saying … It was gibberish to us, but they were understanding each other, and interrupting them for any reason was forbidden.”

Private language emerges organically from constant exchange. Intimate pairs talk fluidly and naturally, having let go of what psychologists call “self-monitoring”—the process of watching impulses and protean thoughts, censoring some, allowing others to pass one’s lips. … The psychologist Daniel Kahneman makes the same point. “Like most people, I am somewhat cautious about exposing tentative thoughts to others,” he said. But after a while with Amos Tversky, “this caution was completely absent.”

“You just get so high-bandwidth,” Bill Gates said about talking to Steve Ballmer, his longtime deputy at Microsoft (and eventual successor). “Steve and I would just be going from talking to meeting to talking to meeting, and then I’d stay up late at night, and write him five e-mails. He’d get up early in the morning and maybe not necessarily respond to them, but start thinking about them. And the minute I see him, he’s [at the office whiteboard] saying we could move this guy over here and do this thing here.” Facebook’s CEO Mark Zuckerberg used that same term, high-bandwidth, to describe his exchanges with his COO Sheryl Sandberg. “We can talk for 30 seconds and have more meaning be exchanged than in a lot of meetings that I have for an hour,” he said.

Beyond a shared language, pairs develop shared rhythms and syntactical structures of speech.

This is due in part to the astonishing power of mimicry, which psychologists call “social contagion.” Just by being near each other, the psychologist Elaine Hatfield has shown, people come to match accents, speech rates, vocal intensity, vocal frequency, pauses, and quickness to respond.

Psychologists used to think that people imitated each other in a deliberate attempt to be liked, but mimicry is far more pervasive than this — and largely nonconscious. Intimate partners share physical postures and breathing patterns too. They use the same muscles so often, the psychologist Robert Zajonc and colleagues found in a study of spouses, that they even come to look alike. Warren Buffett has said that he and Charlie Munger are “Siamese twins, practically.” In addition to wearing the same gray suits, the same Clark Kent glasses, and the same comb-overs, writes Buffett biographer Alice Schroeder, they also share a “lurching, awkward gait” and a flickering intensity in their eyes. Whether or not this is due to what Zajonc calls “repeated empathic mimicry,” we can’t be sure, but one does wonder.

The larger point about any physical convergence is that it reflects what psychologists call a “shared coordinative structure.” Shared mannerisms, like similar walking gaits, often come along with shared emotions and ideas. Just as physical qualities are “highly communicable,” write psychologists Molly Ireland and James Pennebaker, so are behaviors, affective states, and beliefs.

Language is an unusually potent mechanism for psychic convergence, because it is so closely tied to thinking. “Linguistic coordination,” Ireland and Pennebaker explain, leads to “the cultivation of common ground (i.e., matching cognitive frameworks in which conversants adopt shared assumptions, linguistic referents, and knowledge).”

Eventually, of course, this verges on telepathy.

Barry Sonnenfeld, who has directed photography on several films for the Coen brothers, remembers Ethan saying, after a take, “Hey, Joel, you know what?” And Joel replying: “Yeah, I know, I’m going to tell him.” When the writer David Zax visited The Daily Show to profile Steve Bodow, Jon Stewart’s head writer at the time, Zax could understand only a small fraction of their exchanges, given the dominance of “workplace argot and quasi-telepathy.” “If you work with Jon for any length of time, you learn to interpret the shorthand,” Bodow said. For example, Stewart might say: “Cut the thing and bring the thing around and do the thing.” “‘Cut the thing’: You know what thing needs to be cut,” Bodow explained. “‘Bring the thing around’: There’s a thing that works, but it needs to move up in order to set up the ‘do the thing’ thing, which is probably the ‘blow,’ the big joke at the end. It takes time and repetition and patience and frustration, and suddenly you know how to bring the thing around and do the thing.”