We spend a lot of our lives trying to persuade others.
This is one of the reasons Daniel Pink says that we’re all in sales:
Some of you, no doubt, are selling in the literal sense— convincing existing customers and fresh prospects to buy casualty insurance or consulting services or homemade pies at a farmers’ market. But all of you are likely spending more time than you realize selling in a broader sense—pitching colleagues, persuading funders, cajoling kids. Like it or not, we’re all in sales now.
There are many ways to change minds. Most often, we try to convince people. On the difference between persuading and convincing, Seth Godin writes:
Marketers don’t convince. Engineers convince. Marketers persuade. Persuasion appeals to the emotions and to fear and to the imagination. Convincing requires a spreadsheet or some other rational device.
It’s much easier to persuade someone if they’re already convinced, if they already know the facts. But it’s impossible to change someone’s mind merely by convincing them of your point.
But what do we do when this doesn’t work?
Kathryn Schulz, in her book Being Wrong: Adventures in the Margin of Error, explains:
… The first thing we usually do when someone disagrees with us is that we just assume they are ignorant. You know, they don’t have access to the same information we do and when we generously share that information with them, they are going to see the light and come on over to our team.
When that doesn’t work. When it turns out those people have all the same information and they still don’t agree with us we move onto a second assumption. They’re idiots …
This is what we normally do. We try to convince them that we’re right and they are wrong. (Most people, however, are not idiots.)
In many cases, this is simply overconfidence about our own understanding, what psychologists call the illusion of explanatory depth. We really believe we understand how something works when we don’t.
In a study about a decade ago, Yale professors Leonid Rozenblit and Frank Keil asked students to explain how simple things work: a flushing toilet, a sewing machine, piano keys, a zipper, and a cylinder lock. It turns out we’re not nearly as smart as we think.
When the students’ knowledge was put to the test, their familiarity with these everyday things led to an unwarranted overconfidence about how they worked.
Most of the time, people don’t put us to the test. When they do, the results don’t match our confidence. (Interestingly, one of the best ways to really learn how something works is to flip this around. It’s called the Feynman Technique.)
The Era of Fake Knowledge
It’s never been easier to fake what you know, both to yourself and to others.
It’s about energy conservation. Why put in the effort to learn something if we can get by most of the time without learning it? Why read the entire document when you can just skim the executive summary?
Unable to discern between what we know and what we pretend to know, we become victims of our own laziness and intellectual dishonesty. In the end, we fool ourselves.
In a lecture at the Galileo Symposium in Italy in 1964, future Nobel laureate Richard Feynman said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.”
How to Win an Argument
Research published last year and brought to my attention by Mind Hacks shows how this effect might help you convince people they are wrong.
Mind Hacks summarizes the work:
One group was asked to give their opinion and then provide reasons for why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.
Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.
The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues.
This simple technique is one to add to our tool belt.
If you want to win an argument, ask the person trying to convince you of something to explain how it would work.
Odds are they haven’t done the work required to hold an opinion. If they can explain why they are correct and how things would work, you’ll learn something. If they can’t, you’ll soften their views, perhaps nudging them ever so gently toward your own.
It is worth bearing in mind, however, that someone might do the same to you.