Three years ago, Dan Ariely, a psychology and behavioral economics professor at Duke, put out a book called The (Honest) Truth About Dishonesty: How We Lie to Everyone–Especially Ourselves. I read it shortly after it was released, and I recently revisited it to see how it held up to my initial impressions.
It was even better. In fact, this is one of the most useful books I have ever come across, and my copy is now marked, flagged, and underlined. Let's dig in.
We're Cheaters All
Dan is both an astute researcher and a good writer; he knows how to get to the point, and his points matter. His books, which include Predictably Irrational and The Upside of Irrationality, are not filled with fluff. We've mentioned his demonstrations of pluralistic ignorance here before.
In The Honest Truth, Ariely doesn't just explore where cheating comes from; he digs into which situations make us more likely to cheat than others. Those discussions are what make the book eminently practical rather than just a meditation on cheating. It's a how-to guide on our own dishonesty.
Ariely was led down that path because of a friend of his who had worked with Enron:
It was, of course, possible that John and everyone else involved with Enron was deeply corrupt, but I began to think that there may have been a different type of dishonesty at work–one that relates more to wishful blindness and is practiced by people like John, you, and me. I started wondering if the problem of dishonesty goes deeper than just a few bad apples and if this kind of wishful blindness takes place in other companies as well. I also wondered if my friends and I would have behaved similarly if we had been the ones consulting for Enron.
This is a beautiful setup, and it led him to a lot of interesting conclusions in his years of subsequent research. Here's (some of) what Dan found.
- Cheating was standard, but only a little. Ariely and his co-researchers ran the same experiment in many different variations and with many different topics to investigate. Nearly every time, he found evidence of a standard level of cheating; a little cheating was everywhere. People generally did not grab all they could, but only as much as they could justify psychologically.
- Increasing the cheating reward or moderately altering the risk of being caught didn't affect the outcomes much. Across Ariely's experiments, the cheating stayed steady: a little bit of stretching every time.
- The more abstracted from the cheating we are, the more we cheat. This was an interesting one–it turns out the less “connected” we feel to our dishonesty, the more we're willing to do it. This ranges from being more willing to cheat to earn tokens exchangeable for real money than to earn actual money, to being more willing to “tap” a golf ball to improve its lie than actually pick it up and move it with our hands.
- A nudge not to cheat works better before we cheat than after. In other words, we need to strengthen our morals just before we're tempted to cheat, not after. And even more interesting, when Ariely took his findings to the IRS and other organizations who could benefit from being cheated less, they barely let him in the door! The incentives in organizations are interesting.
- We think we're more honest than everyone else. Ariely showed this pretty conclusively by studying golfers and asking them how much they thought others cheated and how much they thought they cheated themselves. It was a rout: They consistently underestimated their own dishonesty versus others'. I wasn't surprised by this finding.
- We underestimate how blinded we can become to incentives. In a brilliant chapter called “Blinded by our Motivations,” Ariely discusses how incentives skew our judgment and our moral compass. He shows how pharma reps are masters of this game–and yet we allow it to continue. If we take Ariely seriously, the laws against conflicts of interest need to be stronger.
- Related to (6), disclosure does not seem to decrease incentive-caused bias. This reminds me of Charlie Munger's statement, “I think I've been in the top 5% of my age cohort all my life in understanding the power of incentives, and all my life I've underestimated it. Never a year passes that I don't get some surprise that pushes my limit a little farther.” Ariely has discussed incentive-caused bias in teacher evaluation before.
- We cheat more when our willpower is depleted. This doesn't come as a total surprise: Ariely found that when we're tired and have exerted a lot of mental or physical energy, especially in resisting other temptations, we tend to increase our cheating. (Or perhaps more accurately, decrease our non-cheating.)
- We cheat ourselves, even if we have direct incentive not to. Ariely was able to demonstrate that even with a strong financial incentive to honestly assess our own abilities, we still think we cheat less than we do, and we hurt ourselves in the process.
- Related to (9), we can delude ourselves into believing we were honest all along. This goes to show that our cheating can damage ourselves as much as it damages others. Ariely also discusses how good we are at pounding our own conclusions into our brain even if no one else is being persuaded, as Munger has mentioned before.
- We cheat more when we believe the world “owes us one.” This section of the book should feel disturbingly familiar to most readers. When we feel like we've been cheated or wronged “over here,” we let the universe make it up to us “over there.” (By cheating, of course.) Think about the last time you got cut off in traffic, stiffed on proper change, and then unloaded on by your boss. Didn't you feel more comfortable reaching for what wasn't yours afterwards? Only fair, right?
- Unsurprisingly, cheating has a social contagion aspect. If we see someone we identify with, and whose group we feel we belong to, cheating, it makes us (much) more likely to cheat. This has wide-ranging social implications.
- Finally, nudging helps us cheat less. If we're made more aware of our moral compass through specific types of reminders and nudges, we can decrease our own cheating. Perhaps most important is to keep ourselves out of situations where we'll be tempted to cheat or act dishonestly, and to take pre-emptive action if it's unavoidable.
There's much more in the book, and we highly recommend you read it for that as well as Dan's general theory on cheating. The final chapter on the steps that old religions have taken to decrease dishonesty among their followers is a fascinating bonus. (Reminded me of Nassim Taleb's retort that heavy critics of religion, like Dawkins, take it too literally and under-appreciate the social value of its rules and customs. It's also been argued that religion has an evolutionary basis.)
Check out the book, and while you're at it, pick up his other two: Predictably Irrational and The Upside of Irrationality.