
Moral Hypocrisy

Here is an interesting study demonstrating that we can maintain a belief even while acting contrary to it (Moral Hypocrisy):

Across three small studies, 80 female undergraduates were confronted with the dilemma of deciding whom (themselves or another research participant) to assign to a positive consequences task, leaving the other to do a dull, boring task. In Study 1, where morality was not mentioned, 16 of 20 assigned themselves to the positive consequences task, even though in retrospect only 1 said this was moral. In Studies 2 and 3, a moral strategy was suggested: either flipping a coin or accepting task assignment by the experimenter. In Study 2, 10 of 20 participants flipped a coin, but of these, 9 assigned themselves the positive consequences task. In Study 3, participants were significantly more likely to accept the experimenter's assignment when it gave them the positive consequences task. Overall, the results suggested a motivation to appear moral while still benefiting oneself. Such motivation is called moral hypocrisy.

Luckily, it appears our brains come equipped to relieve us of the cognitive dissonance these actions create. An interesting twist in all of this is an interpersonal phenomenon whereby individuals evaluate their own moral transgressions quite differently from the same transgressions committed by others (a form of attribution error).

At a basic level, preservation of a positive self-image appears to trump the use of more objective moral principles.

The gap between who we see ourselves as and who we actually are is, according to Max Bazerman, related to bounded awareness. Bazerman defines bounded awareness as “an individual's failure to see and use accessible and perceivable information while seeing and using other equally accessible and perceptible information.” In other words, we “exclude important and relevant information from our decisions by placing arbitrary and functional bounds around our definition of a problem.”

Bazerman is the author of Judgment in Managerial Decision Making and Blind Spots: Why We Fail to Do What's Right and What to Do about It.

Can one person successfully play different roles that require different, and often competing, perspectives?

No, according to research by Max Bazerman, author of the best book on decision making I've ever read: Judgment in Managerial Decision Making.

Contrary to F. Scott Fitzgerald's famous quote, “the test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function,” evidence suggests that even the most intelligent find it difficult to sustain opposing beliefs without the two influencing each other.

Why?

One reason is a bias from incentives. Another is bounded awareness. The auditor who desperately wants to retain a client’s business may have trouble adopting the perspective of a dispassionate referee when it comes time to prepare a formal evaluation of the client’s accounting practices.

* * * * *

In many situations, professionals are called upon to play dual roles that require different perspectives. For example, attorneys embroiled in pretrial negotiations may exaggerate their chances of winning in court to extract concessions from the other side. But when it comes time to advise the client on whether to accept a settlement offer, the client needs objective advice.

Professors, likewise, have to evaluate the performance of graduate students and provide them with both encouragement and criticism. But public criticism is less helpful when faculty serve as their students’ advocates in the job market. And, although auditors have a legal responsibility to judge the accuracy of their clients’ financial accounting, the way to win a client’s business is not by stressing one’s legal obligation to independence, but by emphasizing the helpfulness and accommodation one can provide.

Are these dual roles psychologically feasible? That is, can one person successfully play different roles that require different, and often competing, perspectives? No.

Abstract

This paper explores the psychology of conflict of interest by investigating how conflicting interests affect both public statements and private judgments. The results suggest that judgments are easily influenced by affiliation with interested partisans, and that this influence extends to judgments made with clear incentives for objectivity. The consistency we observe between public and private judgments indicates that participants believed their biased assessments. Our results suggest that the psychology of conflict of interest is at odds with the way economists and policy makers routinely think about the problem. We conclude by exploring implications of this finding for professional conduct and public policy.

Full Paper (PDF)

