
Social Dilemmas

Social dilemmas arise when an individual receives a higher payoff for defecting than for cooperating while everyone else cooperates, yet when everyone defects, all are worse off. That is, each member has a clear and unambiguous incentive to make a choice which, if made by all members, produces a worse outcome for everyone.
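This payoff structure can be sketched with a tiny two-player example. The specific numbers below are illustrative assumptions, not taken from the text; they are chosen only so that defection dominates individually while mutual defection is collectively worse.

```python
# Hypothetical payoffs to one player, keyed by (my choice, other's choice).
# The numbers are assumptions chosen to exhibit the dilemma's structure.
payoff = {
    ("cooperate", "cooperate"): 3,
    ("cooperate", "defect"):    0,
    ("defect",   "cooperate"):  5,
    ("defect",   "defect"):     1,
}

# Defecting pays more no matter what the other player does...
assert payoff[("defect", "cooperate")] > payoff[("cooperate", "cooperate")]
assert payoff[("defect", "defect")]   > payoff[("cooperate", "defect")]

# ...yet mutual defection leaves each player worse off than mutual cooperation.
assert payoff[("defect", "defect")]   < payoff[("cooperate", "cooperate")]
```

The assertions make the tension explicit: defection is each individual's dominant choice, but universal defection produces the inferior outcome the definition describes.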

For a great example of a social dilemma, imagine yourself out with a group of your friends for dinner. Before the meal you all agree to share the cost equally. Looking at the menu, you see a lot of items that appeal to you but are outside your budget.

Pondering this, you realize that you're only on the hook for 1/(number of friends at the dinner) of the bill. Now you can enjoy yourself without having to pay the full cost.

But what if everyone at the table realized the same thing? My guess is you'd all be stunned by the bill: a small-scale tragedy of the commons.
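The arithmetic behind the shared-meal dilemma can be made concrete. The prices and table size below are hypothetical, chosen only to show how equal splitting shrinks the personal cost of ordering lavishly:

```python
# Hypothetical numbers: 6 diners splitting the bill equally, and two
# menu options at assumed prices of $15 and $30.
n = 6
modest, lavish = 15, 30

# If everyone else orders modestly, upgrading your own order adds the
# full price difference to the bill, but only 1/n of it to your share.
my_extra_cost = (lavish - modest) / n
print(f"My extra cost for ordering lavishly: ${my_extra_cost:.2f}")

# If everyone reasons the same way, each share rises by the whole
# difference: the equal split no longer shields anyone.
share_all_modest = modest
share_all_lavish = lavish
print(f"Share if all order modestly:  ${share_all_modest:.2f}")
print(f"Share if all order lavishly:  ${share_all_lavish:.2f}")
```

With six diners, the $15 upgrade feels like $2.50, so each person individually has reason to splurge; but when everyone does, every share rises by the full $15.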

This is a very simple example, but you can map it to the business world by thinking about healthcare and insurance.

If that sounds a lot like game theory, you’re on the right track.

I came across an excellent paper by Robyn Dawes and David Messick, which takes a closer look at social dilemmas.

A Psychological Analysis of Social Dilemmas

In the case of the public good, one strategy that has been employed is to create a moral sense of duty to support it—for instance, the public television station that one watches. The attempt is to reframe the decision as doing one’s duty rather than making a difference—again, in the wellbeing of the station watched. The injection of a moral element changes the calculation from “Will I make a difference” to “I must pay for the benefit I get.”

The final illustration, the shared meal and its more serious counterparts, requires yet another approach. Here there is no hierarchy, as in the organizational example, that can be relied upon to solve the problem. With the shared meal, all the diners need to be aware of the temptation that they have and there need to be mutually agreed-upon limits to constrain the diners. Alternatively, the rule needs to be changed so that everyone pays for what they ordered. The latter arrangement creates responsibility in that all know that they will pay for what they order. Such voluntary arrangements may be difficult to arrange in some cases. With the medical insurance, the insurance company may recognize the risk and insist on a principle of co-payments for medical services. This is a step in the direction of paying for one's own meal, but it allows part of the "meal" to be shared and part of it to be paid for by the one who ordered it.

The fishing version is more difficult. To make those harvesting the fish pay for some of the costs of the catch would require some sort of taxation to deter the unbridled exploitation of the fishery. Taxation, however, leads to tax avoidance or evasion. But those who harvest the fish would have no incentive to report their catches accurately or at all, especially if they were particularly successful, which simultaneously means particularly successful—compared to others at least—in contributing to the problem of a subsequently reduced yield. Voluntary self-restraint would be punished as those with less of that personal quality would thrive while those with more would suffer. Conscience, as Hardin (1968) noted, would be self-eliminating. …

Relatively minor changes in the social environment can induce major changes in decision making because these minor changes can change the perceived appropriateness of a situation. One variable that has been shown to make such a difference is whether the decision maker sees herself as an individual or as a part of a group.