
Max Bazerman — You Are Not As Ethical As You Think

Ethical infractions are rooted in the intricacies of human psychology rather than in a lack of integrity.

Max Bazerman’s book Blind Spots will certainly make you think about your own actions more objectively.

Briefly, here are some of my takeaways.

  • We engage in behavioral forecasting errors: we believe we will behave a certain way in a given situation, yet when actually faced with that situation we behave differently.
  • We are experts at deflecting blame and rationalizing our behavior in a positive light. A used car salesman can view himself as ethical despite selling someone a car that leaks oil, by noting the buyer failed to ask the right questions (bias from self-interest).
  • People often judge the ethicality of actions based on the outcome (outcome bias). We tend to be far more concerned, and show far more sympathy, when the actions taken affect “identifiable victims.”
  • Motivated blindness (when one party has an interest in overlooking the unethical behavior of another party) helps explain the financial crisis (bias from self-interest).
  • Research finds that cognitively busy people are more likely to cheat on a task than those who are less overloaded. Why? Because it takes cognitive effort to be reflective enough to resist the temptation to cheat. Our brains are predisposed to make quick decisions, and in the process they can fail to consider outside influences (such as ethical concerns). We also behave differently when facing a loss than when facing a gain: we’re more willing to cheat when we’re trying to avoid a loss.
  • Snap decisions are especially prone to unconscious bias. The less time we have to think, the more likely we are to default to in-group preference (racial stereotypes). In one study, participants instructed to shoot “criminals” but not unarmed citizens incorrectly shot more black men than white men.
  • Research shows that most people view their own input into a group, their division’s input to the overall organization, and their firm’s contributions to a strategic alliance as more important and substantial than reality can sustain. Overclaiming this credit is, at least in part, rooted in our bounded ethicality. That is, we exclude important and relevant information from our decisions by placing arbitrary and functional bounds around our definition of a problem (normally in a self-serving manner). This is part of the reason we fail to see eye to eye in disagreements: we pay attention to different data.
  • The difference in the way information is processed is often not intentional. Confirmation bias leads our minds to absorb information that agrees with our beliefs and to discount information that contradicts them. (We can’t remember our previous intentions either; see How Our Brains Make Memories.)
  • Egocentrism is dangerous when playing a Tragedy of the Commons game (a social dilemma), such as the ones we’re currently playing with debt and the environment, because it encourages us to overclaim resources.
  • In the end, the kindergarten rule of fairness applies: one person cuts the cookie and the other gets first pick of which half to eat.
  • In social dilemmas the easiest strategy is to defect.
  • A whole host of societal problems result from our tendency to apply an extremely high discount rate to the future. One result is that we save far too little for retirement. Over-discounting the future can also be immoral, as it robs future generations of opportunities and resources. (A stylized numeric illustration follows this list.)
  • Compliance programs often include sanctioning systems that attempt to discourage unethical behavior, typically through punishment. Yet these programs often have the reverse effect, encouraging the behavior they are supposed to discourage. Why? In short, because a sanction removes the ethical consideration and turns the choice into a business decision. (The number of late pickups at daycares increases when a fine is introduced.)
  • When your informal culture doesn’t line up with your formal culture, you have blind spots, and employees will follow the informal culture.
  • Of course, we’re overconfident, so informing us about our blind spots doesn’t seem to help us make better choices. We tend to believe that while others may fall prey to psychological biases, we don’t. Left to our own devices, we dramatically understate the degree to which our own behavior is affected by incentives and situational factors.
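
To make the discount-rate point concrete, here is a stylized back-of-the-envelope illustration (the 20% rate and the dollar amounts are my own assumptions, not from the book). The present value of a future amount is:

    PV = FV / (1 + r)^t
    PV = $10,000 / (1.20)^20 ≈ $260

At that steep a discount rate, $10,000 of retirement benefits 20 years away feels like roughly $260 today, which is why saving gets postponed and why costs pushed onto future generations barely register.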

***

Still curious? Check out Blind Spots. This book will help you see how your biases lead to your own immoral actions. And if you’re still curious, try Bounded Ethicality: The Perils of Loss Framing.