
Moral Hypocrisy

Here is an interesting study demonstrating that we can maintain a belief even while acting contrary to it (Moral Hypocrisy):

Across 3 small studies, 80 female undergraduates were confronted with the dilemma of deciding whom (themselves or another research participant) to assign to a positive consequences task, leaving the other to do a dull, boring task. In Study 1, where morality was not mentioned, 16 of 20 assigned themselves to the positive consequences task, even though in retrospect only 1 said this was moral. In Studies 2 and 3, a moral strategy was suggested: either flipping a coin or accepting task assignment by the experimenter. In Study 2, 10 of 20 participants flipped a coin, but of these, 9 assigned themselves the positive consequences task. In Study 3, participants were significantly more likely to accept the experimenter’s assignment when it gave them the positive consequences task. Overall, results suggested motivation to appear moral yet still benefit oneself. Such motivation is called moral hypocrisy.

Luckily, it appears our brains come equipped to relieve us of the cognitive dissonance these actions create. An interesting twist in all of this is an interpersonal phenomenon whereby individuals’ evaluations of their own moral transgressions differ substantially from their evaluations of the same transgressions committed by others (attribution error).

At a basic level, preservation of a positive self-image appears to trump the use of more objective moral principles.

The gap between who we see ourselves as and who we actually are is, according to Max Bazerman, related to bounded awareness. Bazerman defines bounded awareness as “an individual’s failure to see and use accessible and perceivable information while seeing and using other equally accessible and perceptible information.” In other words, we “exclude important and relevant information from our decisions by placing arbitrary and functional bounds around our definition of a problem.”

Bazerman is the author of Judgment in Managerial Decision Making and Blind Spots: Why We Fail to Do What’s Right and What to Do about It.