
Life Changing Books (New Guy Edition)

Back in 2013, I posted the Books that Changed my Life. In doing so, I was responding to a reader request to share the books that “literally changed my life.”

Now that we have Jeff on board, I've asked him to do the same. Here are his choices, presented in roughly chronological order. As always, these lists leave off a lot of important books in the name of brevity.

Rich Dad, Poor Dad – Robert Kiyosaki

Before I get hanged for apostasy, let me explain. The list is about books that changed my life, and this one absolutely did. I pulled it off my father's shelf and read it in high school, and it kicked off a lifelong interest in investments, business, and the magic of compound interest. That eventually led me to Warren Buffett and Charlie Munger, affecting the path of my life considerably. With that said, I would probably not recommend you start here. I haven't re-read the book since high school, and what I've learned about Kiyosaki since doesn't make me want to recommend anything of his to you. But for better or worse, this book had an impact. Another one that probably holds up better is The Millionaire Next Door, which my father also recommended when I was in high school and which stuck with me for a long time, too.

Buffett: The Making of an American Capitalist / Buffett's Letters to Shareholders – Roger Lowenstein, Warren Buffett

These two and the next book are duplicates off Shane's list, but they are also probably the reason we know each other. Learning about Warren Buffett took the kid who liked “Rich Dad, Poor Dad” and watched The Apprentice, a kid who might have been headed for highly leveraged real estate speculation and who knows what else, and put him on a sounder path. I read this biography many times in college and decided I wanted to emulate some of Buffett's qualities. (I actually now prefer The Snowball, by Alice Schroeder, but Lowenstein's came first and changed my life more.) Although I have a business degree, I learned a lot more from reading and applying the collected Letters to Shareholders.

Poor Charlie's Almanack – Peter Kaufman, Charlie Munger et al.

The Almanack is the greatest book I have ever read, and I knew it from the first time I read it. As Charlie says in the book, there is no going back from the multi-disciplinary approach. It would feel like cutting off your hands. I re-read this book every year in whole or in part, and so far, 8 years on, I haven't failed to pick up a meaningful new insight. Like any great book, it grows as you grow. I like to think I understand about 40% of it on a deep level now, and I hope to add a few percent every year. I literally cannot conceive of a world in which I hadn't read it.

The Nurture Assumption – Judith Rich Harris

This book affected my thinking considerably. I noticed in the Almanack that Munger recommended this book and another, No Two Alike, towards the end. Once I read it, I could see why. It is a monument to clear and careful thinking. Munger calls the author Judith Rich Harris a combination of Darwin and Sherlock Holmes, and he's right. If this book doesn't change how you think about parenting, social development, peer pressure, education, and a number of other topics, then re-read it.

Filters Against Folly/Living within Limits – Garrett Hardin

Like The Nurture Assumption, these two books are brilliantly thought through. Pillars of careful thought. It wasn't until years after I read them that I realized Garrett Hardin was friends with, and in fact funded by, Charlie Munger. The ideas about overpopulation in Living within Limits made a deep impression on me, but the quality of the thinking in general hit me the hardest. Like the Almanack, these books made me want to become a better and more careful thinker.

The Black Swan – Nassim Taleb

Who has read this and not been affected by it? Like many, Nassim's books changed how I think about the world. The ideas from The Black Swan and Fooled by Randomness about the narrative fallacy and the ludic fallacy cannot be forgotten, as well as the central idea of the book itself that rare events are not predictable and yet dominate our landscape. Also, Nassim's writing style made me realize deep, practical writing didn't have to be dry and sanitized. Like him or not, he wears his soul on his sleeve.

Good Calories, Bad Calories / Why We Get Fat: And What to Do About It – Gary Taubes

I've been interested in nutrition since I was young, and these books made me realize most of what I knew was not very accurate. Gary Taubes is a science journalist of the highest order. Like Hardin, Munger, and Harris, he thinks much more carefully than most of his peers. Nutrition is a field that is still growing up, and the quality of its research and thought shows it. Taubes made me recognize that nutrition can be a real science if it's done more carefully, more Feynman-like. Hopefully his NuSI initiative will help shove the field in the right direction.

The (Honest) Truth about Dishonesty – Dan Ariely

This book by Ariely was a game-changer in that it helped me realize the extent to which we rationalize our behavior in a million little ways. I had a lot of nights thinking about my own propensity for dishonesty and cheating after I read this one, and I like to think I'm a pretty moral person to start with. I had never considered how situational dishonesty was, but now that I do, I see it constantly in myself and others. There are also good sections on incentive-caused bias and social pressure that made an impact.

Sapiens – Yuval Noah Harari

This one is fairly new, so I'm still digesting it, and I have a feeling that will take many years. But Sapiens has a lot of (for me) deep insights about humanity and how we got here. I think Yuval is a very good thinker and an excellent writer. A lot of the ideas in this book will set some people off, and not in a good way. But that doesn't mean they're not correct. Highly recommended if you're open-minded and want to learn.

***

At the end of the day, what gets me excited is my Antilibrary: all the books I have on my shelf or on my Amazon wish list that I haven't read yet. The prospect of reading another great book that changes my life the way these did keeps the quest exciting.

Dan Ariely on How and Why We Cheat

Three years ago, Dan Ariely, a psychology and behavioral economics professor at Duke, put out a book called The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves. I read the book around the time it was released, and I recently revisited it to see how it held up to my initial impressions.

It was even better. In fact, this is one of the most useful books I have ever come across, and my copy is now marked, flagged, and underlined. Let's dig in.

Why We Cheat

We're Cheaters All

Dan is both an astute researcher and a good writer; he knows how to get to the point, and his points matter. His books, which include Predictably Irrational and The Upside of Irrationality, are not filled with fluff. We've mentioned his demonstrations of pluralistic ignorance here before.

In The Honest Truth, Ariely doesn't just explore where cheating comes from; he digs into which situations make us more likely to cheat than others. Those discussions are what make the book eminently practical, not just a meditation on cheating. It's a how-to guide to our own dishonesty.

Ariely was led down that path because of a friend of his who had worked with Enron:

It was, of course, possible that John and everyone else involved with Enron was deeply corrupt, but I began to think that there may have been a different type of dishonesty at work–one that relates more to wishful blindness and is practiced by people like John, you, and me. I started wondering if the problem of dishonesty goes deeper than just a few bad apples and if this kind of wishful blindness takes place in other companies as well. I also wondered if my friends and I would have behaved similarly if we had been the ones consulting for Enron.

This is a beautiful setup that led him to a lot of interesting conclusions in his years of subsequent research. Here's (some of) what Dan found.

  1. Cheating was standard, but only a little. Ariely and his co-researchers ran the same experiment in many different variations and with many different topics to investigate. Nearly every time, he found evidence of a standard level of cheating, and in other experiments the outcome was the same: a little cheating was everywhere. People generally did not grab all they could, but only as much as they could justify psychologically.
  2. Increasing the cheating reward or moderately altering the risk of being caught didn't affect the outcomes much. In Ariely's experiments, the cheating stayed steady: A little bit of stretching every time.
  3. The more abstracted from the cheating we are, the more we cheat. This was an interesting one–it turns out the less “connected” we feel to our dishonesty, the more we're willing to do it. This ranges from being more willing to cheat to earn tokens exchangeable for real money than to earn actual money, to being more willing to “tap” a golf ball to improve its lie than actually pick it up and move it with our hands.
  4. A nudge not to cheat works better before we cheat than after. In other words, we need to strengthen our morals just before we're tempted to cheat, not after. And even more interesting, when Ariely took his findings to the IRS and other organizations who could benefit from being cheated less, they barely let him in the door! The incentives in organizations are interesting.
  5. We think we're more honest than everyone else. Ariely showed this pretty conclusively by studying golfers and asking them how much they thought others cheated and how much they thought they cheated themselves. It was a rout: They consistently underestimated their own dishonesty versus others'. I wasn't surprised by this finding.
  6. We underestimate how blinded we can become to incentives. In a brilliant chapter called “Blinded by our Motivations,” Ariely discusses how incentives skew our judgment and our moral compass. He shows how pharma reps are masters of this game–and yet we allow it to continue. If we take Ariely seriously, the laws against conflicts of interest need to be stronger.
  7. Related to (6), disclosure does not seem to decrease incentive-caused bias. This reminds me of Charlie Munger's statement, “I think I've been in the top 5% of my age cohort all my life in understanding the power of incentives, and all my life I've underestimated it. Never a year passes that I don't get some surprise that pushes my limit a little farther.” Ariely has discussed incentive-caused bias in teacher evaluation before.
  8. We cheat more when our willpower is depleted. This doesn't come as a total surprise: Ariely found that when we're tired and have exerted a lot of mental or physical energy, especially in resisting other temptations, we tend to increase our cheating. (Or perhaps more accurately, decrease our non-cheating.)
  9. We cheat ourselves, even if we have direct incentive not to. Ariely was able to demonstrate that even with a strong financial incentive to honestly assess our own abilities, we still think we cheat less than we do, and we hurt ourselves in the process.
  10. Related to (9), we can delude ourselves into believing we were honest all along. This goes to show the degree to which we can damage ourselves by our cheating as much as others. Ariely also discusses how good we are at pounding our own conclusions into our brain even if no one else is being persuaded, as Munger has mentioned before.
  11. We cheat more when we believe the world “owes us one.” This section of the book should feel disturbingly familiar to anyone. When we feel like we've been cheated or wronged “over here,” we let the universe make it up to us “over there.” (By cheating, of course.) Think about the last time you got cut off in traffic, stiffed on proper change, and then unloaded on by your boss. Didn't you feel more comfortable reaching for what wasn't yours afterwards? Only fair, right?
  12. Unsurprisingly, cheating has a social contagion aspect. If we see someone who we identify with and whose group we feel we belong to cheating, it makes us (much) more likely to cheat. This has wide-ranging social implications.
  13. Finally, nudging helps us cheat less. If we're made more aware of our moral compass through specific types of reminders and nudges, we can decrease our own cheating. Perhaps most important is to keep ourselves out of situations where we'll be tempted to cheat or act dishonestly, and to take pre-emptive action if it's unavoidable.

There's much more in the book, and we highly recommend you read it for that as well as Dan's general theory on cheating. The final chapter on the steps that old religions have taken to decrease dishonesty among their followers is a fascinating bonus. (Reminded me of Nassim Taleb's retort that heavy critics of religion, like Dawkins, take it too literally and under-appreciate the social value of its rules and customs. It's also been argued that religion has an evolutionary basis.)

Check out the book, and while you're at it, pick up his other two: Predictably Irrational and The Upside of Irrationality.

Pluralistic Ignorance

“If everyone is thinking alike, then somebody isn't thinking.”
— George S. Patton

***

I bet you live this almost every day.

Imagine you're in a meeting with a lot of important people. The boss comes in, takes a seat, and starts talking about “strategic market knowledge” this and “leveraging competitive advantages” that.

To you, it all sounds like gibberish. For a second you think you're in the wrong meeting. Surely someone else must feel equally confused?

So you take a quick sanity check. You look around the room at your colleagues and … what? They are paying attention and nodding their heads in total agreement? How can this be?

They must know something you don't know.

You quickly determine the best option is to keep your mouth shut and say nothing, hiding what you think is your own ignorance. A wise career move, perhaps, but it makes for a pretty dull life.

This is pluralistic ignorance, a psychological state characterized by the belief that one's private thoughts are different from those of others. The term was coined in 1932 by psychologists Daniel Katz and Floyd Allport and describes the common group situation where we privately believe one thing, but feel everyone else in the group believes something else.

In the case above, pluralistic ignorance means that rather than interrupting the meeting to ask for a clarification, we'll sit tight and nod like everyone else.

It's a real-life version of The Emperor’s New Clothes, the fairy tale in which everyone pretends the emperor is wearing clothes until a child points out that he isn't wearing anything at all.

Dan Ariely, in this short video, explains and demonstrates pluralistic ignorance better than I can. Make sure you watch the whole thing; the kicker is at the end.

Basically we look toward others for cues about how to act when we really should take a page out of Richard Feynman's book: What Do You Care What Other People Think?

The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves

“Essentially, we cheat up to the level that allows us to
retain our self-image as reasonably honest individuals.”

— Dan Ariely

***

In his book, The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves, Dan Ariely attempts to answer the question: “Is dishonesty largely restricted to a few bad apples, or is it a more widespread problem?”

He concludes that we're mostly honest as long as the conditions are right:

We are going to take things from each other if we have a chance … many people need controls around them for them to do the right thing. … [T]he locksmith told Peter that locks are on doors only to keep honest people honest. “One percent of people will always be honest and never steal,” the locksmith said. “Another one percent will always be dishonest and always try to pick your lock and steal your television. And the rest will be honest as long as the conditions are right—but if they are tempted enough, they’ll be dishonest too. Locks won’t protect you from the thieves, who can get in your house if they really want to. They will only protect you from the mostly honest people who might be tempted to try your door if it had no lock.”

We're OK with cheating, as long as it's just a little and goes unnoticed.

as long as we cheat by only a little bit, we can benefit from cheating and still view ourselves as marvelous human beings. This balancing act is the process of rationalization, and it is the basis of what we’ll call the fudge factor theory.

Something that stood out for me was the chapter on the relationship between creativity and dishonesty. According to Ariely, the link between them is not as straightforward as we might think: the more creative we are, the better we are at rationalizing dishonest behavior.

We may not always know exactly why we do what we do, choose what we choose, or feel what we feel. But the obscurity of our real motivations doesn’t stop us from creating perfectly logical-sounding reasons for our actions, decisions, and feelings.

… We all want explanations for why we behave as we do and for the ways the world around us functions. Even when our feeble explanations have little to do with reality. We’re storytelling creatures by nature, and we tell ourselves story after story until we come up with an explanation that we like and that sounds reasonable enough to believe. And when the story portrays us in a more glowing and positive light, so much the better.

We don't make rational decisions. Our choices are (mostly) not based on explicit, thought-through preferences. Rather, we follow our intuition and use “mental gymnastics” to justify our actions. Conveniently, this allows us to get what we want and maintain our ego. We tell ourselves that we are acting rationally. The real difference Ariely found between more and less creative people is the creativity of the justifications. “The more creative we are,” he writes, “the more we are able to come up with good stories that help us justify our selfish interests.”

The idea that worries Ariely the most is the trend toward cashless payments. “From all the research I have done over the years,” he writes, “the idea that worries me the most is that the more cashless our society becomes, the more our moral compass slips.”

One factor that Ariely didn't contemplate, and that I think is important, is how our environment (abundance or scarcity) affects our moral compass. Intuitively, I think it's a lot easier to rationalize moral transgressions in an environment of scarcity than in one of abundance.


The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves is worth reading in its entirety.

The Difference Between Persuade, Convince, and Coerce

The difference is worth understanding.

In a recent Slate article, K.C. Cole writes:

Persuasion requires understanding. Coercion requires only power. We usually equate coercion with obvious force, but sometimes it’s far more subtle. If you want people to stop smoking, for example, you don’t need to make it illegal; you can simply make smoking expensive (raise taxes) or offer bribes (lower health insurance premiums). Both are still coercive in that the power to give or take away resides entirely in the hands of the “coercer.”

Persuasion is fundamentally different because it relies on understanding what smoking does to the human body. Someone who’s persuaded of its dangers has an incentive to stop that’s entirely independent of anyone else’s actions.

I agree that coercion involves the use of (or the threat of) force.

Where I disagree — and where this gets slightly murky — is that I don't think you need to fully understand something (at least at a conscious level) to be persuaded to act. That assumes persuasion is rational.

I think you are persuaded by appeals to the irrational — emotions, psychology, and imagination.

Understanding something (e.g., what smoking does to the human body) largely comes from facts or arguments that appeal to intellect. When I get you to do something based on facts and reason I'm convincing you to act, which is different from persuading you to act.

Seth Godin devised an interesting heuristic to think about this: “Engineers convince. Marketers persuade.”

Cole continues:

It’s a distinction I think about often in teaching. If I get students to do things a certain way for fear of getting an F or hopes of getting an A, it means I’ve influenced their behavior for the duration of the class. If I’ve managed to persuade them that my method has merit, I’ve likely made converts for life.

Cole argues that you can be coerced into doing something for the duration of class, yet persuaded by merit to do it for life. That's an appealing argument, but it's flawed.

If you're persuading someone to do something by merit then you're appealing to intellect and reason not emotions or imagination — that's not persuading them, it's convincing them.

While morally better than coercion, Cole's appeal to reason alone seems unlikely to create a lifelong change in her students. Such a successful outcome (changing behavior for life) would likely be the result of a confluence of factors, not just one.

If I'm trying to get you to do something, there are a number of possible end states (for simplicity, I'll remove coercion). You can be (1) convinced of something but take no action (e.g., I convince you that smoking is bad for you, yet you fail to quit); (2) convinced of something and take action (e.g., I convince you smoking is bad and you quit); (3) convinced and persuaded (e.g., maybe you were in camp #1, but now I've persuaded you to act); (4) unpersuaded and unconvinced; or (5) unconvinced yet persuaded to act.

I think Cole convinced but didn't persuade her students (#2).

I looked up ‘persuade/convince' in my copy of Garner's Modern American Usage. The entry reads:

persuade; convince. In the best usage, one persuades another to do something but convinces another of something.

Of course, coming from a usage dictionary, you also get usage instructions:

Avoid convince to—the phrasing *she convinced him to resign is traditionally viewed as less good than she persuaded him to resign.

But that means that you can never be convinced to do something – only persuaded. I don't agree.

I think Seth Godin is closer to the mark. He points out:

Persuasion appeals to the emotions and to fear and to the imagination. Convincing requires a spreadsheet or some other rational device.

You can convince someone to do something based on reason. You can coerce someone to do something under threat. The way to persuade someone, however, is to appeal to their emotions.

The hardest thing to do is convince someone they're wrong. If you find yourself in this circumstance, attempt to persuade them.

It's easier to persuade someone if you've convinced them, and it's easier to convince them if you've persuaded them.

Persuading > Convincing > Coercing

Ideally you want to convince and persuade.

Happy Holidays!


Everyone Lies. Dan Ariely Explains Why

Research shows that nearly everyone cheats a little if given the opportunity. Dan Ariely, author of the new book, “The (Honest) Truth About Dishonesty,” explains why.

Over the past decade or so, my colleagues and I have taken a close look at why people cheat, using a variety of experiments and looking at a panoply of unique data sets—from insurance claims to employment histories to the treatment records of doctors and dentists. What we have found, in a nutshell: Everybody has the capacity to be dishonest, and almost everybody cheats—just by a little. Except for a few outliers at the top and bottom, the behavior of almost everyone is driven by two opposing motivations. On the one hand, we want to benefit from cheating and get as much money and glory as possible; on the other hand, we want to view ourselves as honest, honorable people. Sadly, it is this kind of small-scale mass cheating, not the high-profile cases, that is most corrosive to society.

Knowing that most people cheat—but just by a little—the next logical question is what makes us cheat more or less.

One thing that increased cheating in our experiments was making the prospect of a monetary payoff more “distant,” in psychological terms.

and

Another thing that boosted cheating: Having another student in the room who was clearly cheating. … Other factors that increased the dishonesty of our test subjects included knowingly wearing knockoff fashions, being drained from the demands of a mentally difficult task and thinking that “teammates” would benefit from one's cheating in a group version of the matrix task. These factors have little to do with cost-benefit analysis and everything to do with the balancing act that we are constantly performing in our heads. If I am already wearing fake Gucci sunglasses, then maybe I am more comfortable pushing some other ethical limits (we call this the “What the hell” effect). If I am mentally depleted from sticking to a tough diet, how can you expect me to be scrupulously honest? (It's a lot of effort!) If it is my teammates who benefit from my fudging the numbers, surely that makes me a virtuous person!

What, then—if anything—pushes people toward greater honesty?

… simply being reminded of moral codes has a significant effect on how we view our own behavior.

Inspired by the thought, my colleagues and I ran an experiment at the University of California, Los Angeles. We took a group of 450 participants, split them into two groups and set them loose on our usual matrix task. We asked half of them to recall the Ten Commandments and the other half to recall 10 books that they had read in high school. Among the group who recalled the 10 books, we saw the typical widespread but moderate cheating. But in the group that was asked to recall the Ten Commandments, we observed no cheating whatsoever. We reran the experiment, reminding students of their schools' honor codes instead of the Ten Commandments, and we got the same result. We even reran the experiment on a group of self-declared atheists, asking them to swear on a Bible, and got the same no-cheating results yet again.

This experiment has obvious implications for the real world. While ethics lectures and training seem to have little to no effect on people, reminders of morality—right at the point where people are making a decision—appear to have an outsize effect on behavior.

One key takeaway:

All of this means that, although it is obviously important to pay attention to flagrant misbehaviors, it is probably even more important to discourage the small and more ubiquitous forms of dishonesty—the misbehavior that affects all of us, as both perpetrators and victims.

***

Still curious? This piece is adapted from his forthcoming book, The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves.
