Mental Models

Acquiring knowledge may seem like a daunting task. There is so much to know and time is precious. Luckily, we don’t have to master everything. To get the biggest bang for the buck we can study the big ideas from physics, biology, psychology, philosophy, literature, and sociology.

Our aim is not to remember facts and try to repeat them when asked. We’re going to try to hang these ideas on a latticework of mental models. Doing this puts them in a usable form and enables us to make better decisions.

A mental model is simply a representation of an external reality inside your head. Mental models help us understand and interpret the world.

Decisions are more likely to be correct when ideas from multiple disciplines all point towards the same conclusion.

Let’s explore the big ideas together.

Wisdom

Charlie Munger explains worldly wisdom:

Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ‘em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form.

You’ve got to have models in your head. And you’ve got to array your experience both vicarious and direct on this latticework of models. You may have noticed students who just try to remember and pound back what is remembered. Well, they fail in school and in life. You’ve got to hang experience on a latticework of models in your head.

What are the models? Well, the first rule is that you’ve got to have multiple models because if you just have one or two that you’re using, the nature of human psychology is such that you’ll torture reality so that it fits your models, or at least you’ll think it does…

It’s like the old saying, “To the man with only a hammer, every problem looks like a nail.” And of course, that’s the way the chiropractor goes about practicing medicine. But that’s a perfectly disastrous way to think and a perfectly disastrous way to operate in the world. So you’ve got to have multiple models.

And the models have to come from multiple disciplines because all the wisdom of the world is not to be found in one little academic department. That’s why poetry professors, by and large, are so unwise in a worldly sense. They don’t have enough models in their heads. So you’ve got to have models across a fair array of disciplines.

You may say, “My God, this is already getting way too tough.” But, fortunately, it isn’t that tough because 80 or 90 important models will carry about 90% of the freight in making you a worldly wise person. And, of those, only a mere handful really carry very heavy freight.(1)

John T. Reed, author of Succeeding, offers an important insight:

When you first start to study a field, it seems like you have to memorize a zillion things. You don’t. What you need is to identify the core principles – generally three to twelve of them – that govern the field. The million things you thought you had to memorize are simply various combinations of the core principles.

How do we think?

One answer is that we rely on mental models. Perception yields models of the world that lie outside us. An understanding of discourse yields models of the world that the speaker describes to us. Thinking, which enables us to anticipate the world and to choose a course of action, relies on internal manipulations of these mental models.(2)

Evolution

Wikipedia provides the following example of a mental model instilled in us by evolution.

A simple example is the mental model of a wild animal as dangerous: upon encountering a raccoon or a snake, one who holds this model will likely retreat from the animal as if by reflex. Retreat is the result of the application of the mental model, and would probably not be the immediate reaction of one whose mental model of wild animals was formed solely from experience with similar stuffed toy animals, or who had not yet formed any mental models about wild raccoons or snakes.

Mental Models

Some observations on mental models from Donald Norman, author of The Design of Everyday Things:

1. Mental models are incomplete.
2. People’s abilities to “run” their models are severely limited.
3. Mental models are unstable. People forget the details of the system they are using, especially when those details (or the whole system) have not been used for some time.
4. Mental models do not have firm boundaries: similar devices and operations get confused with one another.
5. Mental models are unscientific. People maintain “superstitious” behavior patterns even when they know they are unneeded because they cost little in physical effort and save mental effort.
6. Mental models are parsimonious: Often people do extra physical operations rather than the mental planning that would allow them to avoid those actions; they are willing to trade-off extra physical action for reduced mental complexity. This is especially true where the extra actions allow one simplified rule to apply to a variety of devices, thus minimizing the chances for confusion. (3)

Tim Wu writes about Chris Anderson, the Wired editor-in-chief, and what happens when you rely on the same mental model to solve everything.

Chris Anderson’s The Long Tail does something that only the best books do—uncovers a phenomenon that’s undeniably going on and makes clear sense of it. Anderson, the Wired editor-in-chief who first wrote about the Long Tail concept in 2004, had two moments of genius: He visualized the demand for certain products as a “power curve,” and he came up with a catchy phrase to go with his observation. Like most good ideas, the Long Tail attaches to your mind and gets stuck there. Everything you take in—cult blogs, alternative music, festival films—starts looking like the Long Tail in action. But that’s also the problem. The Long Tail theory is so catchy it can overgrow its useful boundaries. Unfortunately, Anderson’s book exacerbates this problem. When you put it down, there’s one question you won’t be able to answer: When, exactly, doesn’t the Long Tail matter?

This insight goes only so far, but like many business books, The Long Tail commits the sin of overreaching. The tagline on the book’s cover reads, “Why the Future of Business Is Selling Less of More,” which is certainly wrong or at least exaggerated. Inside we learn about “the Long Tail of Everything.” Anderson’s book, unlike his original Wired article, threatens to turn a great theory of inventory economics into a bad theory of life and the universe. He writes that “there are now Long Tail markets practically everywhere you look,” calling offshoring the “Long Tail of labor,” and online universities “the Long Tail of education.” He quotes approvingly an analysis that claims, improbably, that there’s a “Long Tail of national security” in which al-Qaida is a “supercharged niche supplier.” At times, the Long Tail becomes the proverbial theory hammer looking for nails to pound.

What the book doesn’t get at is the relationship between these standards-driven industries where the Long Tail doesn’t matter, and the content industries where it does. There aren’t Long Tails everywhere.
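
The “power curve” Anderson popularized is simply a power-law distribution of demand: a handful of hits sell heavily, sales fall off quickly as you move down the popularity ranking, and what remains is a long, thin tail of niche products. Here is a minimal sketch of that idea; the catalogue size, cutoff, and exponent are invented for illustration and are not figures from the book or the review.

```python
# Illustrative only: a power-law ("power curve") model of demand by popularity rank.
# Catalogue size, head/tail cutoff, and exponent are made-up numbers, not figures from the book.

def demand(rank, exponent=1.0):
    """Hypothetical sales for the product at a given popularity rank (~ 1 / rank**exponent)."""
    return 1.0 / rank ** exponent

n_products = 100_000   # hypothetical catalogue size
head_size = 1_000      # hypothetical cutoff separating the "hits" from the tail

sales = [demand(rank) for rank in range(1, n_products + 1)]
total = sum(sales)
head = sum(sales[:head_size])

print(f"Head (top {head_size:,}) share of sales: {head / total:.0%}")
print(f"Tail (remaining {n_products - head_size:,} products) share: {(total - head) / total:.0%}")
```

With these made-up numbers the tail adds up to nearly 40 percent of total sales, which is the intuition behind the theory; steepen the curve and the tail shrinks toward irrelevance, which is one way to frame Wu’s question about when, exactly, the Long Tail doesn’t matter.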

The Farnam Street Latticework of Mental Models

Psychology (misjudgments)

Biases emanating from the Availability Heuristic:
1. Ease of Recall
2. Retrievability

Biases emanating from the Representativeness Heuristic:
3. Bias from insensitivity to base rates
4. Bias from insensitivity to sample size
5. Misconceptions of chance
6. Regression to the mean
7. Bias from conjunction fallacy

Biases emanating from the Confirmation Heuristic:
8. Confirmation bias
9. Bias from anchoring
10. Conjunctive and disjunctive-events bias
11. Bias from over-confidence
12. Hindsight Bias

Others
13. Bias from incentives and reinforcement
14. Bias from self-interest — self deception and denial to reduce pain or increase pleasure; regret avoidance.
15. Bias from association
16. Bias from liking/loving
17. Bias from disliking/hating
18. Commitment and Consistency Bias
19. Bias from excessive fairness
20. Bias from envy and jealousy
21. Reciprocation bias
22. Over-influence from authority
23. Tendency to super-react to deprival; stronger reaction when something we have or almost have is (or threatens to be) taken away. Loss aversion?
24. Bias from contrast
25. Bias from stress-influence
26. Bias from emotional arousal
27. Bias from physical or psychological pain
28. Bias from mis-reading people; character
29. Attribution error; underestimating situational factors (including roles) when explaining behavior; one-to-one versus one-to-many relationships
30. Bias from the status quo
31. Do something tendency
32. Do nothing tendency
33. Over-influence from precision/models
34. Simplicity Bias
35. Uncertainty avoidance
36. Ideological bias
37. Not invented here bias — thinking that our own ideas are the best ones
38. Bias from over-weighting the short-term
39. Tendency to avoid extremes
40. Tendency to solve problems using only the field we know best / favored ideas. (Man with a hammer.)
41. Bias from social proof
42. Over-influence from framing effects
43. Lollapalooza (multiple biases acting in concert)

Other Mental Models:
– Asymmetric Information
– Occam’s Razor
– Deduction and Induction
– Basic Decision Making Process
– Scientific Method
– Process versus Outcome
– And then what?
– The Agency Problem
– 7 Deadly Sins
– Network Effect
– Gresham’s Law
– The Red Queen Effect

Business
– Ability to raise prices
– Scale
– Distribution
– Cost
– Brand
– Improving returns
– Porter’s Five Forces
– Decision trees
– Diminishing Returns

Investing
– Mr. Market
– Circle of competence

Ecology
– Complex adaptive systems
– Systems Thinking

Economics
– Utility
– Diminishing Utility
– Supply and Demand
– Scarcity
– Opportunity Cost
– Marginal Cost
– Comparative Advantage
– Trade-offs
– Price Discrimination
– Positive and Negative Externalities
– Sunk Costs
– Moral Hazard
– Game Theory
– Prisoners’ Dilemma
– Tragedy of the Commons
– Bottlenecks
– Time value of Money

Engineering
– Feedback loops
– Redundancy
– Margin of Safety
– Tight coupling
– Breakpoints

Mathematics
– Bayes’ Theorem
– Power Law
– Law of large numbers
– Compounding
– Permutations
– Combinations
– Variability
– Trend
– Inversion

Statistics
– Outliers and self-fulfilling prophecy
– Correlation versus Causation
– Mean, Median, Mode
– Distribution

Chemistry
– Thermodynamics
– Kinetics
– Autocatalytic reactions

Physics
– Newton’s Laws
– Momentum
– Quantum Mechanics
– Critical Mass
– Equilibrium

Biology
– Evolution

Sources:
1. Charlie Munger, “A Lesson on Elementary, Worldly Wisdom,” USC Business School, 1994.
2. The Cambridge Handbook of Thinking and Reasoning.
3. Donald Norman, “Some Observations on Mental Models,” in Mental Models, pp. 7–14.