We propose that the time has come to move the study of biases in judgment and decision making beyond description and toward the development of improvement strategies.
System 1 and System 2
System 1 refers to our intuitive system, which is typically fast, automatic, effortless, implicit, and emotional. System 2 refers to reasoning that is slower, conscious, effortful, explicit, and logical.
People often lack important information regarding a decision, fail to notice available information, face time and cost constraints, and maintain a relatively small amount of information in their usable memory. The busier people are, the more they have on their minds, and the more time constraints they face, the more likely they will be to rely on System 1 thinking. This is not always a mistake.
Though it does not always do so, System 1 can lead us down the wrong path:
… in the many situations where we know that decision biases are likely to plague us (e.g., when evaluating diverse job candidates, estimating our percent contribution to a group project, choosing between spending and saving), relying exclusively on System 1 thinking is likely to lead us to make costly errors.
How Can We Move From System 1 to System 2 When Appropriate?
One successful strategy for moving toward System 2 thinking relies on replacing intuition with formal analytic processes. For example, when data exist on past inputs to and outcomes from a particular decision-making process, decision makers can construct a linear model, or a formula that weights and sums the relevant predictor variables, to reach a quantitative forecast about the outcome. Researchers have found that linear models produce predictions that are superior to those of experts across an impressive array of domains.
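To make the idea concrete, here is a minimal sketch of such a linear model. The predictor names and weights are hypothetical assumptions for illustration, not taken from the research described above; in practice the weights would be estimated from data on past inputs and outcomes.

```python
# A minimal sketch of a linear model for forecasting a decision outcome.
# Predictor names and weights below are hypothetical, for illustration only.

def linear_forecast(predictors, weights):
    """Weight and sum the relevant predictor variables to produce a forecast."""
    return sum(weights[name] * value for name, value in predictors.items())

# Hypothetical weights, e.g. estimated from past hiring outcomes.
weights = {"work_sample": 0.5, "structured_interview": 0.3, "gpa": 0.2}

# A candidate's scores on each predictor (hypothetical).
candidate = {"work_sample": 8.0, "structured_interview": 6.0, "gpa": 7.0}

score = linear_forecast(candidate, weights)
print(round(score, 2))  # 0.5*8 + 0.3*6 + 0.2*7 = 7.2
```

Because the same weights are applied consistently to every case, the formula avoids the inconsistency and situational noise that plague intuitive expert judgment.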
Another System 2 strategy is to take an outsider's perspective:
involves taking an outsider’s perspective: trying to remove oneself mentally from a specific situation or to consider the class of decisions to which the current problem belongs. Taking an outsider’s perspective has been shown to reduce decision makers’ overconfidence about their knowledge, the time it would take them to complete a task, and their odds of entrepreneurial success. Decision makers may also be able to improve their judgments by asking a genuine outsider for his or her view regarding a decision.
Another approach is to consider the opposite:
simply encouraging people to “consider the opposite” of whatever decision they are about to make reduces errors in judgment due to several particularly robust decision biases: overconfidence, the hindsight bias, and anchoring.
Also, make decisions as groups, understand statistical reasoning, and hold people accountable:
Researchers have also successfully partially debiased errors in judgment typically classified as the result of biases and heuristics by having groups rather than individuals make decisions, by training individuals in statistical reasoning, and by making people accountable for their decisions.
And consider multiple options simultaneously:
people can move from suboptimal System 1 thinking toward improved System 2 thinking when they consider and choose between multiple options simultaneously rather than accepting or rejecting options separately. … people exhibit less willpower when they weigh choices separately rather than jointly.
Can We Leverage System 1 to Improve Decision Making?
Rather than trying to change a decision maker’s thinking from System 1 to System 2 in situations where System 1 processing is known to frequently result in biased decisions, this strategy tries to change the environment so that System 1 thinking will lead to good results. This type of improvement strategy, which Thaler and Sunstein discuss at length in their book Nudge, calls on those who design situations in which choices are made (whether they be the decision makers themselves or other “choice architects”) to maximize the odds that decision makers will make wise choices given known decision biases.