We owe thanks to the publishing industry. Their ability to take a concept and, shotgun-style, fill an entire category of books around it is the reason more people are talking about biases.
Unfortunately, talk alone will not eliminate them, but it is possible to take steps to counteract them. Reducing biases can make a huge difference in the quality of any decision, and it is easier than you think.
In a recent article for Harvard Business Review, Daniel Kahneman and his co-authors describe a simple way to detect bias and minimize its effects in the most common type of decision people make: determining whether to accept, reject, or pass on a recommendation.
The Munger two-step process for making decisions is a more complete framework, but Kahneman's approach is a good way to help reduce biases in our decision-making. If you're short on time, here is a simple checklist that will get you started on the path toward improving your decisions:
Preliminary Questions: Ask yourself
1. Check for Self-interested Biases
- Is there any reason to suspect the team making the recommendation of errors motivated by self-interest?
- Review the proposal with extra care, especially for overoptimism.
2. Check for the Affect Heuristic
- Has the team fallen in love with its proposal?
- Rigorously apply all the quality controls on the checklist.
3. Check for Groupthink
- Were there dissenting opinions within the team?
- Were they explored adequately?
- Solicit dissenting views, discreetly if necessary.
Challenge Questions: Ask the recommenders
4. Check for Saliency Bias
- Could the diagnosis be overly influenced by an analogy to a memorable success?
- Ask for more analogies, and rigorously analyze their similarity to the current situation.
5. Check for Confirmation Bias
- Are credible alternatives included along with the recommendation?
- Request additional options.
6. Check for Availability Bias
- If you had to make this decision again in a year’s time, what information would you want, and can you get more of it now?
- Use checklists of the data needed for each kind of decision.
7. Check for Anchoring Bias
- Do you know where the numbers came from? Can there be:
  - unsubstantiated numbers?
  - extrapolation from history?
  - a motivation to use a certain anchor?
- Reanchor with figures generated by other models or benchmarks, and request new analysis.
8. Check for Halo Effect
- Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another?
- Eliminate false inferences, and ask the team to seek additional comparable examples.
9. Check for Sunk-Cost Fallacy, Endowment Effect
- Are the recommenders overly attached to a history of past decisions?
- Consider the issue as if you were a new CEO.
Evaluation Questions: Ask about the proposal
10. Check for Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect
- Is the base case overly optimistic?
- Have the team build a case taking an outside view; use war games.
11. Check for Disaster Neglect
- Is the worst case bad enough?
- Have the team conduct a premortem: Imagine that the worst has happened, and develop a story about the causes.
12. Check for Loss Aversion
- Is the recommending team overly cautious?
- Realign incentives to share responsibility for the risk or to remove risk.
If you're looking to dramatically improve your decision-making, here is a great list of books to get started:
Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler and Cass R. Sunstein
Think Twice: Harnessing the Power of Counterintuition by Michael J. Mauboussin
Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You by Sydney Finkelstein, Jo Whitehead, and Andrew Campbell
Thinking, Fast and Slow by Daniel Kahneman
Judgment and Managerial Decision Making by Max Bazerman