
How to Win an Argument


We spend a lot of our lives trying to persuade others.

This is one of the reasons Daniel Pink says we're all in sales.

Some of you, no doubt, are selling in the literal sense— convincing existing customers and fresh prospects to buy casualty insurance or consulting services or homemade pies at a farmers’ market. But all of you are likely spending more time than you realize selling in a broader sense—pitching colleagues, persuading funders, cajoling kids. Like it or not, we’re all in sales now.

There are many ways to change minds. We often try to convince people.

On the difference between persuading and convincing, Seth Godin writes:

Marketers don’t convince. Engineers convince. Marketers persuade. Persuasion appeals to the emotions and to fear and to the imagination. Convincing requires a spreadsheet or some other rational device.

It’s much easier to persuade someone if they’re already convinced, if they already know the facts. But it’s impossible to change someone’s mind merely by convincing them of your point.

But what do we do when this doesn't work?

Kathryn Schulz, in her book Being Wrong: Adventures in the Margin of Error, explains:

… The first thing we usually do when someone disagrees with us is that we just assume they are ignorant. You know, they don’t have access to the same information we do and when we generously share that information with them, they are going to see the light and come on over to our team.

When that doesn’t work. When it turns out those people have all the same information and they still don’t agree with us we move onto a second assumption. They’re idiots …

This is what we normally do. We try to convince them that we're right and they are wrong. (Most people, however, are not idiots.)

In many cases this is just us being overconfident about what we think we know — the illusion of explanatory depth. We really believe that we understand how something works when we don't.

In a study published about a decade ago, Yale professors Leonid Rozenblit and Frank Keil asked students to explain how simple things work: a flushing toilet, a sewing machine, piano keys, a zipper, a cylinder lock. It turns out we're not nearly as smart as we think.

When the students' knowledge was put to the test, their familiarity with these things turned out to have bred an unwarranted overconfidence about how they worked.

Most of the time people don't put us to the test. When they do, the results don't match our confidence. (Interestingly, one of the best ways to really learn how something works is to flip this around. It's called the Feynman Technique.)

***
The Era of Fake Knowledge

It's never been easier to fake what you know, both to yourself and to others.

It's about energy conservation. Why put in the effort to learn something if we can get by most of the time without learning it? Why read the entire document when you can just skim the executive summary?

Unable to discern between what we know and what we pretend to know, we ultimately become victims of our own laziness and intellectual dishonesty. We end up fooling ourselves.

In a lecture at the Galileo Symposium in Italy in 1964, future Nobel laureate Richard Feynman said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.”

***
How to Win an Argument

Research published last year and brought to my attention by Mind Hacks shows how this effect might help you convince people they are wrong.

Mind Hacks summarizes the work:

One group was asked to give their opinion and then provide reasons for why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.

Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.

The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues.

***

This simple technique is one to add to our tool belt.

If you want to win an argument, ask the person trying to convince you of something to explain how it would work.

Odds are they have not done the work required to hold an opinion. If they can explain why they are correct and how things would work, you'll learn something. If they can't, you'll soften their views, perhaps nudging them ever so slightly toward your own.

It is worth bearing in mind, however, that someone might do the same to you.

Being Wrong: Adventures in the Margin of Error

"It infuriates me to be wrong when I know I’m right." — Molière
“It infuriates me to be wrong when I know I’m right.” — Molière

“Why is it so fun to be right?”

That's the opening line from Kathryn Schulz's excellent book Being Wrong: Adventures in the Margin of Error.

As pleasures go, it is, after all, a second-order one at best. Unlike many of life’s other delights—chocolate, surfing, kissing—it does not enjoy any mainline access to our biochemistry: to our appetites, our adrenal glands, our limbic systems, our swoony hearts. And yet, the thrill of being right is undeniable, universal, and (perhaps most oddly) almost entirely undiscriminating.

While we take pleasure in being right, we take as much, if not more, in feeling we are right.

A whole lot of us go through life assuming that we are basically right, basically all the time, about basically everything: about our political and intellectual convictions, our religious and moral beliefs, our assessment of other people, our memories, our grasp of facts. As absurd as it sounds when we stop to think about it, our steady state seems to be one of unconsciously assuming that we are very close to omniscient.

Schulz argues this makes sense. We're right most of the time, and in these moments we affirm “our sense of being smart.” But Being Wrong is about … well, being wrong.

If we relish being right and regard it as our natural state, you can imagine how we feel about being wrong. For one thing, we tend to view it as rare and bizarre—an inexplicable aberration in the normal order of things. For another, it leaves us feeling idiotic and ashamed.

In our collective imagination, error is associated not just with shame and stupidity but also with ignorance, indolence, psychopathology, and moral degeneracy. This set of associations was nicely summed up by the Italian cognitive scientist Massimo Piattelli-Palmarini, who noted that we err because of (among other things) “inattention, distraction, lack of interest, poor preparation, genuine stupidity, timidity, braggadocio, emotional imbalance,…ideological, racial, social or chauvinistic prejudices, as well as aggressive or prevaricatory instincts.” In this rather despairing view—and it is the common one—our errors are evidence of our gravest social, intellectual, and moral failings.

But of all the things we are wrong about, “this idea of error might well top the list.”

It is our meta-mistake: we are wrong about what it means to be wrong. Far from being a sign of intellectual inferiority, the capacity to err is crucial to human cognition. Far from being a moral flaw, it is inextricable from some of our most humane and honorable qualities: empathy, optimism, imagination, conviction, and courage. And far from being a mark of indifference or intolerance, wrongness is a vital part of how we learn and change. Thanks to error, we can revise our understanding of ourselves and amend our ideas about the world.

“As with dying,” Schulz writes, “we recognize erring as something that happens to everyone, without feeling that it is either plausible or desirable that it will happen to us.”

Culturally, we have a hard time admitting we were wrong.

As a culture, we haven’t even mastered the basic skill of saying “I was wrong.” This is a startling deficiency, given the simplicity of the phrase, the ubiquity of error, and the tremendous public service that acknowledging it can provide. Instead, what we have mastered are two alternatives to admitting our mistakes that serve to highlight exactly how bad we are at doing so. The first involves a small but strategic addendum: “I was wrong, but…”—a blank we then fill in with wonderfully imaginative explanations for why we weren’t so wrong after all. The second (infamously deployed by, among others, Richard Nixon regarding Watergate and Ronald Reagan regarding the Iran-Contra affair) is even more telling: we say, “mistakes were made.” As that evergreen locution so concisely demonstrates, all we really know how to do with our errors is not acknowledge them as our own.

Being wrong feels a lot like being right.

This is the problem of error-blindness. Whatever falsehoods each of us currently believes are necessarily invisible to us. Think about the telling fact that error literally doesn’t exist in the first person present tense: the sentence “I am wrong” describes a logical impossibility. As soon as we know that we are wrong, we aren’t wrong anymore, since to recognize a belief as false is to stop believing it. Thus we can only say “I was wrong.” Call it the Heisenberg Uncertainty Principle of Error: we can be wrong, or we can know it, but we can’t do both at the same time.

Error-blindness goes some way toward explaining our persistent difficulty with imagining that we could be wrong. It’s easy to ascribe this difficulty to various psychological factors—arrogance, insecurity, and so forth—and these plainly play a role. But error-blindness suggests that another, more structural issue might be at work as well. If it is literally impossible to feel wrong—if our current mistakes remain imperceptible to us even when we scrutinize our innermost being for signs of them—then it makes sense for us to conclude that we are right.

If our current mistakes are necessarily invisible to us, our past errors have an oddly slippery status as well. Generally speaking, they are either impossible to remember or impossible to forget. This wouldn’t be particularly strange if we consistently forgot our trivial mistakes and consistently remembered the momentous ones, but the situation isn’t quite that simple.

It’s hard to say which is stranger: the complete amnesia for the massive error, or the perfect recall for the trivial one. On the whole, though, our ability to forget our mistakes seems keener than our ability to remember them.

Part of what’s going on here is, in essence, a database-design flaw. Most of us don’t have a mental category called “Mistakes I Have Made.”

Like our inability to say “I was wrong,” this lack of a category called “error” is a communal as well as an individual problem. As someone who tried to review the literature on wrongness, I can tell you that, first, it is vast; and, second, almost none of it is filed under classifications having anything to do with error. Instead, it is distributed across an extremely diverse set of disciplines: philosophy, psychology, behavioral economics, law, medicine, technology, neuroscience, political science, and the history of science, to name just a few. So too with the errors in our own lives. We file them under a range of headings—“embarrassing moments,” “lessons I’ve learned,” “stuff I used to believe”—but very seldom does an event live inside us with the simple designation “wrong.”

This category problem is only one reason why our past mistakes can be so elusive. Another is that (as we’ll see in more detail later) realizing that we are wrong about a belief almost always involves acquiring a replacement belief at the same time: something else instantly becomes the new right.

What with error-blindness, our amnesia for our mistakes, the lack of a category called “error,” and our tendency to instantly overwrite rejected beliefs, it’s no wonder we have so much trouble accepting that wrongness is a part of who we are. Because we don’t experience, remember, track, or retain mistakes as a feature of our inner landscape, wrongness always seems to come at us from left field—that is, from outside ourselves. But the reality could hardly be more different. Error is the ultimate inside job.

For us to learn from error, we have to see it differently. The goal of Being Wrong, then, is “to foster an intimacy with our own fallibility, to expand our vocabulary for and interest in talking about our mistakes, and to linger for a while inside the normally elusive and ephemeral experience of being wrong.”

Anne Lamott: Some Instructions on Writing and Life


A tweet from Kathryn Schulz set off my quest to find Anne Lamott's Bird by Bird: Some Instructions on Writing and Life. If you've ever wondered how I find things, this is a perfect example.

And what a find this turned out to be. Lamott's advice is down-to-earth, real, and devoid of pretentiousness. In fact, it's one of the best books I've ever come across on writing.

Getting started is often the hardest part of writing.

The very first thing I tell my new students … is that good writing is about telling the truth. We are a species that needs and wants to understand who we are. Sheep lice do not seem to share this longing, which is one reason they write so very little. But we do. We have so much we want to say and figure out. Year after year my students are bursting with stories to tell, and they start writing projects with excitement and maybe even joy— finally their voices will be heard, and they are going to get to devote themselves to this one thing they’ve longed to do since childhood. But after a few days at the desk, telling the truth in an interesting way turns out to be about as easy and pleasurable as bathing a cat.

Everyone wants to know how to write. Routines are common. Maya Angelou likes to work in dirty hotel rooms. Hemingway wrote standing up. But more to the point perhaps is the advice of Philip Roth, who said every writer needs “the ability to sit still in the deeply uneventful business,” a comment that Lamott echoes.

“There is ecstasy in paying attention.” — Anne Lamott

You sit down, I say. You try to sit down at approximately the same time every day. This is how you train your unconscious to kick in for you creatively. So you sit down at, say, nine every morning, or ten every night. You put a piece of paper in the typewriter, or you turn on your computer and bring up the right file, and then you stare at it for an hour or so. You begin rocking, just a little at first, and then like a huge autistic child. You look at the ceiling, and over at the clock, yawn, and stare at the paper again. Then, with your fingers poised on the keyboard, you squint at an image that is forming in your mind— a scene, a locale, a character, whatever— and you try to quiet your mind so you can hear what that landscape or character has to say above the other voices in your mind. The other voices are banshees and drunken monkeys. They are the voices of anxiety, judgment, doom, guilt. Also, severe hypochondria. There may be a Nurse Ratched– like listing of things that must be done right this moment: foods that must come out of the freezer, appointments that must be canceled or made, hairs that must be tweezed. But you hold an imaginary gun to your head and make yourself stay at the desk.

Yet somehow in the face of all this, you clear a space for the writing voice, hacking away at the others with machetes, and you begin to compose sentences. You begin to string words together like beads to tell a story. You are desperate to communicate, to edify or entertain, to preserve moments of grace or joy or transcendence, to make real or imagined events come alive.

To be a good writer you need reverence and awe.

In order to be a writer, you have to learn to be reverent. If not, why are you writing? Why are you here?

Let's think of reverence as awe, as presence in and openness to the world. Think of those times when you’ve read prose or poetry that is presented in such a way that you have a fleeting sense of being startled by beauty or insight, by a glimpse into someone’s soul. All of a sudden everything seems to fit together or at least to have some meaning for a moment. This is our goal as writers, I think; to help others have this sense of — please forgive me — wonder, of seeing things anew, things that can catch us off guard, that break in on our small, bordered worlds. When this happens, everything feels more spacious.

There is ecstasy in paying attention. You can get into a kind of Wordsworthian openness to the world, where you see in everything the essence of holiness.

Drama is how the writer holds the attention of the reader through an arc.

The basic formula for drama is setup, buildup, payoff—just like a joke. The setup tells us what the game is. The buildup is where you put in all the moves, the forward motion, where you get all the meat off the turkey. The payoff answers the question, Why are we here anyway? What is it that you’ve been trying to give? Drama must move forward and upward, or the seats on which the audience is sitting will become very hard and uncomfortable. So, in fact, will the audience. And eventually the audience will become impatient, disappointed, and unhappy. There must be movement.

You need to be moving your characters forward, even if they only go slowly. Imagine moving them across a lily pond. If each lily pad is beautifully, carefully written, the reader will stay with you as you move toward the other side of the pond, needing only the barest of connections— such as rhythm, tone, or mood.

Commenting on Alice Adams' short story formula of ABDCE, for Action, Background, Development, Climax, and Ending, Lamott writes:

You begin with action that is compelling enough to draw us in, make us want to know more. Background is where you let us see and know who these people are, how they’ve come to be together, what was going on before the opening of the story. Then you develop these people, so that we learn what they care most about. The plot— the drama, the actions, the tension— will grow out of that. You move them along until everything comes together in the climax, after which things are different for the main characters, different in some real way. And then there is the ending: what is our sense of who these people are now, what are they left with, what happened, and what did it mean?

A formula can be a great way to get started. And it feels so great finally to dive into the water; maybe you splash around and flail for a while, but at least you’re in. Then you start doing whatever stroke you can remember how to do, and you get this scared feeling inside you— of how hard it is and how far there is to go— but still you’re in, and you’re afloat, and you’re moving.

The act of writing is its own reward.

But I still encourage anyone who feels at all compelled to write to do so. I just try to warn people who hope to get published that publication is not all that it is cracked up to be. But writing is. Writing has so much to give, so much to teach, so many surprises. That thing you had to force yourself to do— the actual act of writing— turns out to be the best part. It’s like discovering that while you thought you needed the tea ceremony for the caffeine, what you really needed was the tea ceremony.

I tell my students that the odds of their getting published and of it bringing them financial security, peace of mind, and even joy are probably not that great. Ruin, hysteria, bad skin, unsightly tics, ugly financial problems, maybe; but probably not peace of mind. I tell them that I think they ought to write anyway.

Lamott uses index cards not only to help her remember things in an otherwise busy world but also as an aid to creativity.

I like to think that Henry James said his classic line, “A writer is someone on whom nothing is lost,” while looking for his glasses, and that they were on top of his head. We have so much to remember these days. So we make all these lists, filled with hope that they will remind us of all the important things to do and buy and mail, all the important calls we need to make, all the ideas we have for short stories or articles. And yet by the time you get around to everything on any one list, you’re already behind on another. Still, I believe in lists and I believe in taking notes, and I believe in index cards for doing both.

Now, I have a number of writer friends who do not take notes out there in the world, who say it’s like not taking notes in class but listening instead. I think that if you have the kind of mind that retains important and creative thoughts— that is, if your mind still works— you’re very lucky and you should not be surprised if the rest of us do not want to be around you. I actually have one writer friend— whom I think I will probably be getting rid of soon— who said to me recently that if you don’t remember it when you get home, it probably wasn’t that important. And I felt eight years old again, with something important to say that had suddenly hopped down one of the rabbit holes in my mind, while an adult nearby was saying priggishly, “Well ! It must not have been very important then.”

So you have to decide how you feel about this. You may have a perfectly good memory and be able to remember three hours later what you came up with while walking on the mountain or waiting at the dentist’s. And then again, you may not.

My index-card life is not efficient or well organized. Hostile, aggressive students insist on asking what I do with all my index cards. And all I can say is that I have them, I took notes on them, and the act of having written something down gives me a fifty-fifty shot at having it filed away now in my memory. If I’m working on a book or an article, and I’ve taken some notes on index cards, I keep them with that material, paperclip them to a page of rough draft where that idea or image might bring things to life. Or I stack them on my desk along with the pages for the particular chapter or article I’m working on, so I can look at them. When I get stuck or lost or the jungle drums start beating in my head, proclaiming that the jig is about to be up and I don’t know what I’m doing and the well has run dry, I’ll look through my index cards. I try to see if there’s a short assignment on any of them that will get me writing again, give me a small sense of confidence, help me put down one damn word after another, which is, let’s face it, what writing finally boils down to.

She cautions that perfectionism is not only counterproductive but also blocks playfulness and thus creativity.

Perfectionism is the voice of the oppressor, the enemy of the people. It will keep you cramped and insane your whole life, and it is the main obstacle between you and a shitty first draft. I think perfectionism is based on the obsessive belief that if you run carefully enough, hitting each stepping-stone just right, you won’t have to die. The truth is that you will die anyway and that a lot of people who aren’t even looking at their feet are going to do a whole lot better than you, and have a lot more fun while they’re doing it.

Besides, perfectionism will ruin your writing, blocking inventiveness and playfulness and life force (these are words we are allowed to use in California). Perfectionism means that you try desperately not to leave so much mess to clean up. But clutter and mess show us that life is being lived. Clutter is wonderfully fertile ground— you can still discover new treasures under all those piles, clean things up, edit things out, fix things, get a grip. Tidiness suggests that something is as good as it’s going to get. Tidiness makes me think of held breath, of suspended animation, while writing needs to breathe and move.

Writing is about “learning to pay attention and to communicate what is going on.”

Now, if you ask me, what’s going on is that we’re all up to here in it, and probably the most important thing is that we not yell at one another. Otherwise we’d all just be barking away like Pekingese: “Ah! Stuck in the shit! And it’s your fault, you did this …” Writing involves seeing people suffer and, as Robert Stone once put it, finding some meaning therein. But you can’t do that if you’re not respectful. If you look at people and just see sloppy clothes or rich clothes, you’re going to get them wrong.

And, in the end, writing makes you a better reader.

One reads with a deeper appreciation and concentration, knowing now how hard writing is, especially how hard it is to make it look effortless. You begin to read with a writer’s eyes. You focus in a new way. You study how someone portrays his or her version of things in a way that is new and bold and original. You notice how a writer paints in a mesmerizing character or era for you, without your having the sense of being given a whole lot of information, and when you realize how artfully this has happened, you may actually put the book down for a moment and savor it, just taste it.

She concludes that “writing and reading decrease our sense of isolation.”

They deepen and widen and expand our sense of life: they feed the soul. When writers make us shake our heads with the exactness of their prose and their truths, and even make us laugh about ourselves or life, our buoyancy is restored. We are given a shot at dancing with, or at least clapping along with, the absurdity of life, instead of being squashed by it over and over again. It’s like singing on a boat during a terrible storm at sea. You can’t stop the raging storm, but singing can change the hearts and spirits of the people who are together on that ship.

Bird by Bird: Some Instructions on Writing and Life is a must-read.


The Relativity of Wrong

“The basic trouble, you see, is that people think that ‘right’ and ‘wrong’ are absolute; that everything that isn't perfectly and completely right is totally and equally wrong.”
— Isaac Asimov

Isaac Asimov received a letter one day from a fellow who wanted to argue with one of his essays.

Asimov used this short essay to highlight the nuances of being wrong.

It seemed that in one of my innumerable essays, I had expressed a certain gladness at living in a century in which we finally got the basis of the universe straight.

I didn't go into detail in the matter, but what I meant was that we now know the basic rules governing the universe, together with the gravitational interrelationships of its gross components, as shown in the theory of relativity worked out between 1905 and 1916. We also know the basic rules governing the subatomic particles and their interrelationships, since these are very neatly described by the quantum theory worked out between 1900 and 1930. What's more, we have found that the galaxies and clusters of galaxies are the basic units of the physical universe, as discovered between 1920 and 1930.

These are all twentieth-century discoveries, you see.

The young specialist in English Lit, having quoted me, went on to lecture me severely on the fact that in every century people have thought they understood the universe at last, and in every century they were proved to be wrong. It follows that the one thing we can say about our modern “knowledge” is that it is wrong. The young man then quoted with approval what Socrates had said on learning that the Delphic oracle had proclaimed him the wisest man in Greece. “If I am the wisest man,” said Socrates, “it is because I alone know that I know nothing.” The implication was that I was very foolish because I was under the impression I knew a great deal.

My answer to him was, “John, when people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.”

The basic trouble, you see, is that people think that “right” and “wrong” are absolute; that everything that isn't perfectly and completely right is totally and equally wrong.

However, I don't think that's so. It seems to me that right and wrong are fuzzy concepts, and I will devote this essay to an explanation of why I think so.

When my friend the English literature expert tells me that in every century scientists think they have worked out the universe and are always wrong, what I want to know is how wrong are they? Are they always wrong to the same degree?

Asimov's friend, working from a mental frame of absolute rights and wrongs, believed that all theories are wrong because they are eventually proven incorrect. But he ignored the degree of incorrectness: there is an important distinction between being slightly wrong and being completely wrong.

What actually happens is that once scientists get hold of a good concept they gradually refine and extend it with greater and greater subtlety as their instruments of measurement improve. Theories are not so much wrong as incomplete.

This can be pointed out in many cases other than just the shape of the earth. Even when a new theory seems to represent a revolution, it usually arises out of small refinements. If something more than a small refinement were needed, then the old theory would never have endured.

Copernicus switched from an earth-centered planetary system to a sun-centered one. In doing so, he switched from something that was obvious to something that was apparently ridiculous. However, it was a matter of finding better ways of calculating the motion of the planets in the sky, and eventually the geocentric theory was just left behind. It was precisely because the old theory gave results that were fairly good by the measurement standards of the time that kept it in being so long.

Again, it is because the geological formations of the earth change so slowly and the living things upon it evolve so slowly that it seemed reasonable at first to suppose that there was no change and that the earth and life always existed as they do today. If that were so, it would make no difference whether the earth and life were billions of years old or thousands. Thousands were easier to grasp.

But when careful observation showed that the earth and life were changing at a rate that was very tiny but not zero, then it became clear that the earth and life had to be very old. Modern geology came into being, and so did the notion of biological evolution.

If the rate of change were more rapid, geology and evolution would have reached their modern state in ancient times. It is only because the difference between the rate of change in a static universe and the rate of change in an evolutionary one is that between zero and very nearly zero that the creationists can continue propagating their folly.

Since the refinements in theory grow smaller and smaller, even quite ancient theories must have been sufficiently right to allow advances to be made; advances that were not wiped out by subsequent refinements.

The Greeks introduced the notion of latitude and longitude, for instance, and made reasonable maps of the Mediterranean basin even without taking sphericity into account, and we still use latitude and longitude today.

The Sumerians were probably the first to establish the principle that planetary movements in the sky exhibit regularity and can be predicted, and they proceeded to work out ways of doing so even though they assumed the earth to be the center of the universe. Their measurements have been enormously refined but the principle remains.

Naturally, the theories we now have might be considered wrong in the simplistic sense of my English Lit correspondent, but in a much truer and subtler sense, they need only be considered incomplete.

Kathryn Schulz writes about something very similar:

Because so many scientific theories from bygone eras have turned out to be wrong, we must assume that most of today’s theories will eventually prove incorrect as well. And what goes for science goes in general. Politics, economics, technology, law, religion, medicine, child-rearing, education: no matter the domain of life, one generation’s verities so often become the next generation’s falsehoods that we might as well have a Pessimistic Meta-Induction from the History of Everything.

Good scientists understand this. They recognize that they are part of a long process of approximation. They know that they are constructing models rather than revealing reality…

Still curious? Read Kathryn Schulz's explanation of how we feel when people disagree with us. Also check out why old knowledge isn't necessarily a waste.

Kathryn Schulz on Why Knowledge Collapses as Often as It Accretes


Kathryn Schulz comments on the fantasy that knowledge is static in This Will Make You Smarter: New Scientific Concepts to Improve Your Thinking.

Because so many scientific theories from bygone eras have turned out to be wrong, we must assume that most of today’s theories will eventually prove incorrect as well. And what goes for science goes in general. Politics, economics, technology, law, religion, medicine, child-rearing, education: no matter the domain of life, one generation’s verities so often become the next generation’s falsehoods that we might as well have a Pessimistic Meta-Induction from the History of Everything.

Good scientists understand this. They recognize that they are part of a long process of approximation. They know that they are constructing models rather than revealing reality…

The rest of us, by contrast, often engage in a kind of tacit chronological exceptionalism. Unlike all those suckers who fell for the flat earth or the geocentric universe or cold fusion or the cosmological constant, we ourselves have the great good luck to be alive during the very apex of accurate human thought. The literary critic Harry Levin put this nicely: “The habit of equating one’s age with the apogee of civilization, one’s town with the hub of the universe, one’s horizons with the limits of human awareness, is paradoxically widespread.” At best, we nurture the fantasy that knowledge is always cumulative, and therefore concede that future eras will know more than we do. But we ignore or resist the fact that knowledge collapses as often as it accretes, that our own most cherished beliefs might appear patently false to posterity.

That fact is the essence of the meta-induction — and yet, despite its name, this idea is not pessimistic. Or rather, it is only pessimistic if you hate being wrong. If, by contrast, you think that uncovering your mistakes is one of the best ways to revise and improve your understanding of the world, then this is actually a highly optimistic insight.

The book is a curated collection of answers to the question: What scientific concept would improve everybody’s cognitive toolkit? All of the answers are available online in their entirety.

Still curious? Knowledge, like milk, has an expiry date.

What We Do When Someone Disagrees With Us


Kathryn Schulz, author of Being Wrong: Adventures in the Margin of Error, gave an excellent talk at TED this past year.

There is a moment in her talk, worth pondering, when she summarizes what we do when someone disagrees with us.

…The first thing we usually do when someone disagrees with us is that we just assume they are ignorant. You know, they don't have access to the same information we do and when we generously share that information with them, they are going to see the light and come on over to our team.

When that doesn't work. When it turns out those people have all the same information and they still don't agree with us we move onto a second assumption. They're idiots. They have all the right pieces of the puzzle and they are too moronic to put them together.

And when that doesn't work. When it turns out that people have all the same facts that we do and they are pretty smart we move onto a third assumption. They know the truth and they are deliberately distorting it for their own malevolent purposes.

So this is a catastrophe: our attachment to our own rightness. It prevents us from preventing mistakes when we need to and causes us to treat each other terribly.
