Ethical infractions are rooted in the intricacies of human psychology rather than in a lack of integrity. We are wired far more strongly for unethical behavior than for integrity.

Max Bazerman and Ann Tenbrunsel's book, Blind Spots, covers the ways in which we overestimate our ability to behave ethically.
Briefly, here are some of our takeaways.
- We engage in behavioral forecasting errors. We believe we will behave a certain way in a certain situation, yet when actually faced with that situation, we behave differently.
- When we behave in unethical ways, we are experts at deflecting blame and rationalizing our behavior. For instance, a used-car salesman may view selling a car that leaks oil as perfectly ethical because the buyer failed to ask (a bias from self-interest).
- People often judge the ethicality of actions based on the outcome (outcome bias). We tend to be far more concerned with and show more sympathy when the actions taken affect “identifiable victims”.
- We may experience motivated blindness when we have an incentive to overlook the consequences of an action, that is, when one party has an interest in overlooking the unethical behavior of another party.
- People who are mentally overloaded are more likely to cheat on a task than those who are not. Why? Because it takes cognitive effort to resist the temptation to cheat. Our brains are designed to make quick decisions, which means they can fail to consider outside influences, such as ethical concerns. We're also more willing to cheat to avoid a loss than to secure a gain.
- When making snap decisions, we are especially prone to unconscious bias. The less time we have to think, the more we default to in-group preferences along racial lines.
- Research shows that most people view their own input into a group, their division's input to the overall organization, and their firm's contributions to a strategic alliance as more important and substantial than reality can sustain. Over-claiming this credit is, at least in part, rooted in our bounded ethicality. That is, we exclude important and relevant information from our decisions by placing arbitrary and functional bounds around our definition of a problem (normally in a self-serving manner). This is part of the reason we fail to see eye to eye in disagreements: we pay attention to different data.
- The difference in the way information is processed is often not intentional. Confirmation bias helps our minds absorb information that agrees with our beliefs and discount information that may contradict them. (We can't remember our previous intentions either; see How Our Brains Make Memories.)
- Egocentrism is dangerous when playing a Tragedy of the Commons game (a social dilemma), such as the ones we're currently playing with debt and the environment, because it encourages us to over-claim resources.
- In the end, the kindergarten rule of fairness applies: one person cuts the cookie and the other has first pick of which half to eat.
- In social dilemmas, the easiest strategy is to defect.
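The pull toward defection in a social dilemma can be sketched with the classic Prisoner's Dilemma payoff structure. The payoff numbers below are illustrative assumptions, not figures from the book; the point is only that defecting is individually rational no matter what the other player does, even though mutual defection leaves everyone worse off than mutual cooperation.

```python
# A minimal sketch of a two-player social dilemma (Prisoner's Dilemma).
# Payoff numbers are illustrative assumptions: (my payoff, opponent's payoff).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def best_response(opponent_move):
    """Return the move that maximizes my payoff against a fixed opponent move."""
    return max(("cooperate", "defect"),
               key=lambda my_move: PAYOFFS[(my_move, opponent_move)][0])

# Defecting is the best response whatever the other player does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual defection (1, 1) is worse for both than mutual cooperation (3, 3).
```

This is the same logic driving the debt and environment examples above: each individual over-claimer does better, while the group does worse.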
- A whole host of societal problems result from our tendency to apply an extremely high discount rate to the future. One result is that we save far too little for retirement. Over-discounting the future can be immoral too, as it robs future generations of opportunities and resources.
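To see how corrosive a high discount rate is, compare the present value of the same far-off payoff under a modest and an extreme rate. The amounts and rates below are illustrative assumptions, not figures from the book; they use the standard present-value formula.

```python
# Illustrative present-value calculation (amounts and rates are assumptions).
def present_value(future_amount, annual_rate, years):
    """Value today of a payment received `years` from now, discounted at `annual_rate`."""
    return future_amount / (1 + annual_rate) ** years

# $100,000 received 30 years from now:
patient = present_value(100_000, 0.03, 30)    # modest 3% discount rate
impatient = present_value(100_000, 0.20, 30)  # extreme 20% discount rate

# At 3% the future payout is still worth roughly $41,000 today;
# at 20% it is worth only about $420, effectively nothing,
# which is why an extreme discount rate crowds out saving.
```

Someone who discounts the future that steeply sees almost no reason to save for retirement, or to preserve resources for the next generation.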
- Compliance programs often include sanctioning systems that attempt to discourage unethical behavior, typically through punishment. Yet these programs often have the reverse effect, encouraging the behavior they are supposed to discourage. Why? In short, because the sanction removes the ethical consideration and turns the choice into a business decision. (The number of late pick-ups at daycares increases when there is a fine.)
- When your informal culture doesn't line up with your formal culture, you have blind spots, and employees will follow the informal culture.
- Of course, we're overconfident, so informing us about our blind spots doesn't seem to help us make better choices. We tend to believe that while others may fall prey to psychological biases, we don't. Left to our own devices, we dramatically understate the degree to which our own behavior is affected by incentives and situational factors.
***
Still curious? Check out Blind Spots. This book will help you see how your biases lead to your own immoral actions. And if you're still curious, try Bounded Ethicality: The Perils of Loss Framing.