Tag: Game Theory

Books Everyone Should Read on Psychology and Behavioral Economics

Psychology and Behavioral Economics Books

Earlier this year, a prominent friend of mine was tasked with compiling a list of behavioral economics book recommendations for the military leaders of a G7 country, and I was on the small email list asked for input.


While I read a lot and I’ve offered up books to sports teams and Fortune 100 management teams, I’ve never contributed to something as broad as educating a nation's military leaders. While I have a huge behavioral economics reading list, this wasn't where I started.

Not only did I want to contribute, but I wanted to choose books that these military leaders wouldn’t normally have come across in everyday life. Books they were unlikely to have read. Books that offered perspective.

Given that I couldn’t talk to them outright, I was really trying to answer the question ‘what would I like to communicate to military leaders through non-fiction books?’ There were no easy answers.

I needed to offer something timeless. Not so outside the box that they wouldn't approach it, and not so hard to find that those purchasing the books would give up and move on to the next one on the list. It couldn't be so big that the commitment to read it would intimidate. On top of that, the book needed to start strong because, in my experience of dealing with C-level executives, they stop paying attention after about 20 pages if it’s not relevant or challenging them in the right way.

In short, there is no one-size-fits-all, but to make the biggest impact you have to consider all of these factors.

While the justifications for why people chose the books below are confidential, I can tell you what books were on the final email that I saw. I left one book off the list, which I thought was a little too controversial to post.

These books have nothing to do with the military per se; rather, they deal with enduring concepts like ecology, intuition, game theory, strategy, biology, second-order thinking, and behavioral psychology. In short, these books would benefit most people who want to improve their ability to think, which is why I’m sharing them with you.

If you’re so inclined you can try to guess which ones I recommended in the comments. Read wisely.

In no order and with no attribution:

  1. Risk Savvy: How to Make Good Decisions by Gerd Gigerenzer
  2. The Righteous Mind: Why Good People Are Divided by Politics and Religion by Jonathan Haidt
  3. The Checklist Manifesto: How to Get Things Right by Atul Gawande
  4. The Darwin Economy: Liberty, Competition, and the Common Good by Robert H. Frank
  5. David and Goliath: Underdogs, Misfits, and the Art of Battling Giants by Malcolm Gladwell
  6. Predictably Irrational, Revised and Expanded Edition: The Hidden Forces That Shape Our Decisions by Dan Ariely
  7. Thinking, Fast and Slow by Daniel Kahneman
  8. The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life by Robert Trivers
  9. The Hour Between Dog and Wolf: Risk Taking, Gut Feelings and the Biology of Boom and Bust by John Coates
  10. Adapt: Why Success Always Starts with Failure by Tim Harford
  11. The Lessons of History by Will & Ariel Durant
  12. Poor Charlie’s Almanack
  13. Passions Within Reason: The Strategic Role of the Emotions by Robert H. Frank
  14. The Signal and the Noise: Why So Many Predictions Fail–but Some Don't by Nate Silver
  15. Sex at Dawn: How We Mate, Why We Stray, and What It Means for Modern Relationships by Christopher Ryan & Cacilda Jetha
  16. The Red Queen: Sex and the Evolution of Human Nature by Matt Ridley
  17. Introducing Evolutionary Psychology by Dylan Evans & Oscar Zarate
  18. Filters Against Folly: How To Survive Despite Economists, Ecologists, and the Merely Eloquent by Garrett Hardin
  19. Games of Strategy (Fourth Edition) by Avinash Dixit, Susan Skeath & David H. Reiley, Jr.
  20. The Theory of Political Coalitions by William H. Riker
  21. The Evolution of War and its Cognitive Foundations (PDF) by John Tooby & Leda Cosmides.
  22. Fight the Power: Lanchester’s Laws of Combat in Human Evolution by Dominic D.P. Johnson & Niall J. MacKay.

Opinions and Organizational Theory


Good opinions are a lot of work.

When I think about the world in which we live and the organizations in which we work, I can’t help but think that few people have the intellectual honesty, time, and discipline required to hold a view.

We have a bias for action and, equally important, a bias for the appearance of knowledge.

When's the last time you heard someone say “I don't know”?

It seems that the higher up the corporate ladder you go, the less likely you are to say or hear those three words.

No one wants to tell the boss they don't know and the boss certainly doesn't want to let on that they don't know either. We have too much of our self-worth wrapped up in our profession.

If you’re a knowledge worker and you walk around saying “I don’t know” people are going to start to think you’re pretty stupid.

So we talk in abstractions and fog and create the appearance of knowledge.

Who has time to do the work required to hold an opinion? There is always an email to respond to, an urgent request from your boss, paper to move from one side of your desk to the other, etc. So we don’t do the work. But so few others do the work either.

Perhaps an example will help.

At 4:45 pm you receive a four-page proposal in your inbox. The proposal is to be decided on the next day at a meeting of 12 people.

To reflect on the proposal seriously, you’d have to stay at work late. You'd need to turn off your email and set aside your other tasks to read the document from start to finish. And, after all, who has time to read four pages these days? (So we skim.)

If we really wanted to do the work necessary to hold an opinion, we'd have to: read the document from start to finish; talk to anyone we can find about the proposal; listen to arguments for and against it; verify the facts; consider our assumptions; talk to someone who has been through something similar before; verify that the framing of the problem is neither too narrow nor too wide; make sure the solution solves the problem; and so on.

So we don’t do the work. Yet we need an opinion for the meeting, or, perhaps more accurately, a sound bite. So we skim the document again looking for something we can support; something that shows we’ve thought about it, despite the fact we haven’t.

We spend the time we should spend learning and understanding running around trying to make it look like we know everything. We're doing work alright: busywork. We do what's easy.

We turn up at the meeting the next day to discuss the proposal, but our only real goal is to find a brief pause in the conversation so we can insert our pre-scripted, fact-deficient, obfuscating generality. We do, after all, have to maintain appearances.

The proposal ultimately reaches consensus, but this was never really in doubt. Flip this around for a second: it must be the easiest decision in the world. Think about it, here you have a room full of smart people all in agreement on the best way to proceed. How often does that happen? A no-brainer like that is hardly worth a memo, let alone a meeting.

It's easy to agree when no one is thinking.

And organizational incentives encourage this behavior.

If the group makes the right decision, you can take credit. And, if by chance things go wrong, you don't get the blame. Group decisions, especially ones reached by consensus, allow all of the participants to share the upside while few, if any, bear the downside.

When groups make decisions based on consensus, no one is really accountable if things go badly. Everyone can weasel out of responsibility (diffusion of responsibility).

When you're talking to someone familiar with the situation you might say something like, ‘we were all wrong' or ‘we all thought the same thing.'

When you're talking to someone unfamiliar with the situation you'd offer something more clever like ‘I thought that decision was wrong but no one would listen to me,' knowing full well they can't prove you wrong.

And just like that, one-by-one, everyone in attendance at the meeting is absolved.

The alternative is uncomfortable.

Say, rather than jetting off to pick up the kids at 5, you stay and do the work required to have an opinion. You get home around 11, exhausted, but you now hold a well-thought-out opinion on the proposal.

Two things can happen at this point. If you’ve done the work and reached the same conclusion as the proposal, you feel like you just wasted six hours. If, however, you do the work and reach a different conclusion, things get more interesting.

You show up at the meeting and mention that you've thought about this and reached a different conclusion; in fact, you've determined this proposal doesn’t solve the problem. It's nothing more than lipstick.

So you speak up. And in the process, you risk being labelled a dissenting troublemaker. Why? Because no one else has done the work.

You might even offer some logical flow for everyone to follow along with your thinking. So you say something along the lines of: “I think a little differently on this. Here is how I see the problem, and here are what I think are the governing variables. Here is how I weighed them. And here is how I'd address the main arguments I see against this. … What did I miss?”

In short you'd expose your thinking and open yourself up. You'd be vulnerable to people who haven't really done the work.

If you expect them to say, “OK, that sounds good,” you'd be wrong. After all, if they’re so easily swayed by your rational thinking, it looks like they haven’t done the work.

Instead, they need to show they've already thought about your reasoning and arguments and formed a different opinion.

Rather than stick to facts, they might respond with hard-to-pin-down jargon or corporate speak; facts will rarely surface in a rebuttal.

You'll hear something like “that doesn't account for the synergies” or “that doesn't line up with the strategic plan (you haven't seen).” Or maybe they point the finger at their boss who is not in the room: “That’s what I thought too but Doug, oh no, he wants it done this way.”

The only way to counter that, and actually have a conversation, is to ask what the anticipated synergies are, to talk about the plan in detail, or to bring Doug into the room. But that's a dangerous game, akin to telling the emperor he has no clothes.

If you push too far, you won’t be at the next meeting, because everyone knows you’ll do the work. Inviting you means they’ll be forced to think about things a little more, to anticipate arguments, and so on. In short, inviting you means more work for them. It's nothing personal.

Gaming the System

Some college students used game theory to get an A by exploiting a loophole in the grading curve.

Catherine Rampell explains:

In several computer science courses at Johns Hopkins University, the grading curve was set by giving the highest score on the final an A, and then adjusting all lower scores accordingly. The students determined that if they collectively boycotted, then the highest score would be a zero, and so everyone would get an A.

Inside Higher Ed writes:

“The students refused to come into the room and take the exam, so we sat there for a while: me on the inside, they on the outside,” [Peter Fröhlich, the professor,] said. “After about 20-30 minutes I would give up…. Then we all left.” The students waited outside the rooms to make sure that others honored the boycott, and were poised to go in if someone had. No one did, though.

Andrew Kelly, a student in Fröhlich’s Introduction to Programming class who was one of the boycott’s key organizers, explained the logic of the students’ decision via e-mail: “Handing out 0’s to your classmates will not improve your performance in this course,” Kelly said.

“So if you can walk in with 100 percent confidence of answering every question correctly, then your payoff would be the same for either decision. Just consider the impact on your other exam performances if you studied for [the final] at the level required to guarantee yourself 100. Otherwise, it’s best to work with your colleagues to ensure a 100 for all and a very pleasant start to the holidays.”

Bayesian Nash equilibria

In this one-off final exam, there are at least two Bayesian Nash equilibria (a stable outcome, where no student has an incentive to change his strategy after considering the other students’ strategies). Equilibrium #1 is that no one takes the test, and equilibrium #2 is that everyone takes the test. Both equilibria depend on what all the students believe their peers will do.

If all students believe that everyone will boycott with 100 percent certainty, then everyone should boycott (#1). But if anyone suspects that even one person will break the boycott, then at least someone will break the boycott, and everyone else will update their choices and decide to take the exam (#2).
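This logic can be checked mechanically. Below is a minimal Python sketch that enumerates the pure-strategy profiles for a small class and verifies which ones are Nash equilibria; the grade value, study cost, and class size are illustrative assumptions of mine, not figures from the article:

```python
from itertools import product

N = 5          # class size (illustrative)
COST = 0.2     # assumed cost of studying for / sitting the exam
A_GRADE = 1.0  # payoff of getting an A

def payoff(i, profile):
    """Payoff to student i given everyone's choices.
    If nobody takes the exam, the top score is 0, so everyone gets an A.
    If anyone takes it, takers share the top score (an A, net of effort)
    and boycotters score 0 against the curve."""
    anyone_takes = "take" in profile
    if profile[i] == "boycott":
        return A_GRADE if not anyone_takes else 0.0
    return A_GRADE - COST

def is_nash(profile):
    """True if no single student gains by switching their own choice."""
    for i in range(N):
        flipped = "take" if profile[i] == "boycott" else "boycott"
        deviation = profile[:i] + (flipped,) + profile[i + 1:]
        if payoff(i, deviation) > payoff(i, profile):
            return False
    return True

equilibria = [p for p in product(["boycott", "take"], repeat=N) if is_nash(p)]
```

Running this finds exactly two equilibria, all-boycott and all-take, matching the analysis above: a lone defector from the boycott would get an A anyway, minus the effort, while a lone boycotter against takers scores zero.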

Two incomplete thoughts

First, exploiting loopholes guarantees more rules, laws, and language (to close the previous loopholes), which creates more complexity. More complexity, in turn, creates more loopholes (among other things). … you see where this is going.

Second, ‘gaming the system’ is a form of game theory. What's best for you, the individual (or in this case, a small group), may not be best for society.

Today's college kids are tomorrow's bankers and CEOs. Just because you can do something doesn't mean you should.

Update (via metafilter): In 2009, Peter Fröhlich, the instructor mentioned above, published Game Design: Tricking Students into Learning More.

Still curious? Learn more about game theory with the Prisoners' Dilemma.

Mental Model: Game Theory

Game Theory

From Game Theory, by Morton Davis:

The theory of games is a theory of decision making. It considers how one should make decisions and to a lesser extent, how one does make them. You make a number of decisions every day. Some involve deep thought, while others are almost automatic. Your decisions are linked to your goals—if you know the consequences of each of your options, the solution is easy. Decide where you want to be and choose the path that takes you there. When you enter an elevator with a particular floor in mind (your goal), you push the button (one of your choices) that corresponds to your floor. Building a bridge involves more complex decisions but, to a competent engineer, is no different in principle. The engineer calculates the greatest load the bridge is expected to bear and designs a bridge to withstand it. When chance plays a role, however, decisions are harder to make. … Game theory was designed as a decision-making tool to be used in more complex situations, situations in which chance and your choice are not the only factors operating. … (Game theory problems) differ from the problems described earlier—building a bridge and installing telephones—in one essential respect: While decision makers are trying to manipulate their environment, their environment is trying to manipulate them. A store owner who lowers her price to gain a larger share of the market must know that her competitors will react in kind. … Because everyone's strategy affects the outcome, a player must worry about what everyone else does and knows that everyone else is worrying about him or her.

What is a game? From Game Theory and Strategy:

Game theory is the logical analysis of situations of conflict and cooperation. More specifically, a game is defined to be any situation in which:

  1. There are at least two players. A player may be an individual, but it may also be a more general entity like a company, a nation, or even a biological species.
  2. Each player has a number of possible strategies, courses of action which he or she may choose to follow.
  3. The strategies chosen by each player determine the outcome of the game.
  4. Associated to each possible outcome of the game is a collection of numerical payoffs, one to each player. These payoffs represent the value of the outcome to the different players.

…Game theory is the study of how players should rationally play games. Each player would like the game to end in an outcome which gives him as large a payoff as possible.
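The four ingredients of that definition map directly onto a data structure. Here is a minimal sketch; matching pennies is a standard textbook game I've used for illustration, not one from the excerpt:

```python
from dataclasses import dataclass

@dataclass
class Game:
    """A game per the definition: players, each player's strategies,
    and a numerical payoff for each player at each outcome."""
    players: list[str]
    strategies: dict[str, list[str]]        # player -> available choices
    payoffs: dict[tuple, dict[str, float]]  # strategy profile -> player -> payoff

# Matching pennies: Even wins if the coins match, Odd wins if they differ.
matching_pennies = Game(
    players=["Even", "Odd"],
    strategies={"Even": ["heads", "tails"], "Odd": ["heads", "tails"]},
    payoffs={
        ("heads", "heads"): {"Even": 1, "Odd": -1},
        ("heads", "tails"): {"Even": -1, "Odd": 1},
        ("tails", "heads"): {"Even": -1, "Odd": 1},
        ("tails", "tails"): {"Even": 1, "Odd": -1},
    },
)
```

Each strategy profile (one choice per player) determines an outcome, and each outcome carries one payoff per player, exactly as items 3 and 4 require.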

From Greg Mankiw's Economics textbook:

Game theory is the study of how people behave in strategic situations. By ‘strategic' we mean a situation in which each person, when deciding what actions to take, must consider how others might respond to that action. Because the number of firms in an oligopolistic market is small, each firm must act strategically. Each firm knows that its profit depends not only on how much it produces but also on how much the other firms produce. In making its production decision, each firm in an oligopoly should consider how its decision might affect the production decisions of all other firms.

Game theory is not necessary for understanding competitive or monopoly markets. In a competitive market, each firm is so small compared to the market that strategic interactions with other firms are not important. In a monopolized market, strategic interactions are absent because the market has only one firm. But, as we will see, game theory is quite useful for understanding the behavior of oligopolies.

A particularly important ‘game' is called the prisoners' dilemma.

Markets with only a few sellers

Because an oligopolistic market has only a small group of sellers, a key feature of oligopoly is the tension between cooperation and self-interest. The oligopolists are best off when they cooperate and act like a monopolist – producing a small quantity of output and charging a price above marginal cost. Yet because each oligopolist cares only about its own profit, there are powerful incentives at work that hinder a group of firms from maintaining the cooperative outcome.
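The tension Mankiw describes is the prisoners' dilemma in payoff form. A small sketch, with illustrative profit numbers of my own choosing (not from the textbook):

```python
# Duopoly as a prisoners' dilemma. Each firm either "restrict"s output
# (cooperate, monopoly-style) or "expand"s it (defect). Profits are in
# illustrative units.
PAYOFFS = {  # (firm_a_choice, firm_b_choice) -> (profit_a, profit_b)
    ("restrict", "restrict"): (60, 60),  # shared monopoly profit
    ("restrict", "expand"):   (20, 80),
    ("expand",   "restrict"): (80, 20),
    ("expand",   "expand"):   (40, 40),  # competitive outcome
}

def best_response(rival_choice):
    """Firm A's profit-maximizing choice, given firm B's choice."""
    return max(["restrict", "expand"],
               key=lambda mine: PAYOFFS[(mine, rival_choice)][0])
```

Expanding is a dominant strategy: best_response returns "expand" whichever way the rival plays, so the jointly better (restrict, restrict) outcome unravels. That is the "powerful incentive" hindering cooperation described above.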

Avinash Dixit and Barry Nalebuff, in their book “Thinking Strategically” offer:

Everyone's best choice depends on what others are going to do, whether it's going to war or maneuvering in a traffic jam.

These situations, in which people's choices depend on the behavior or the choices of other people, are the ones that usually don't permit any simple summation. Rather we have to look at the system of interaction.

Michael J. Mauboussin relates game theory to firm interaction:

How a firm interacts with other firms plays an important role in shaping sustainable value creation. Here we not only consider how many companies interact with their competitors, but how companies can co-evolve.

Game Theory is one of the best tools to understand interaction. Game Theory forces managers to put themselves in the shoes of other players rather than viewing games solely from their own perspective.

The classic two-player example of game theory is the prisoners' dilemma.

Game Theory is part of the Farnam Street latticework of Mental Models. See all posts on game theory.

The Red Queen Principle: Avoid Running Faster and Faster Only to Stay in the Same Place

“Bees have to move very fast to stay still.”
— David Foster Wallace


The Red Queen Principle

Charles Lutwidge Dodgson (1832-1898), better known by his pseudonym Lewis Carroll, was not only an author but a keen observer of human nature. His most famous works are Alice's Adventures in Wonderland and its sequel Through the Looking Glass, which have become timeless classics.

In Through the Looking Glass, Alice, a young girl, gets schooled by the Red Queen in an important life lesson that many of us fail to heed. Alice finds herself running faster and faster but staying in the same place.


Alice never could quite make out, in thinking it over afterwards, how it was that they began: all she remembers is, that they were running hand in hand, and the Queen went so fast that it was all she could do to keep up with her: and still the Queen kept crying ‘Faster! Faster!' but Alice felt she could not go faster, though she had not breath left to say so.

The most curious part of the thing was, that the trees and the other things round them never changed their places at all: however fast they went, they never seemed to pass anything. ‘I wonder if all the things move along with us?' thought poor puzzled Alice. And the Queen seemed to guess her thoughts, for she cried, ‘Faster! Don't try to talk!'

Eventually, the Queen stops running and props Alice up against a tree, telling her to rest.

Alice looked round her in great surprise. ‘Why, I do believe we've been under this tree the whole time! Everything's just as it was!'

‘Of course it is,' said the Queen, ‘what would you have it?'

‘Well, in our country,' said Alice, still panting a little, ‘you'd generally get to somewhere else — if you ran very fast for a long time, as we've been doing.'

‘A slow sort of country!' said the Queen. ‘Now, here, you see, it takes all the running you can do, to keep in the same place.

If you want to get somewhere else, you must run at least twice as fast as that!'


“It is not the strongest of the species that survives,
nor the most intelligent,
but the one most responsive to change.”

— Charles Darwin

Smarter, Not Harder

The Red Queen Principle means we can't be complacent, or we'll fall behind. To survive another day we have to run very fast and hard; we need to co-evolve with the systems we interact with.

If all animals evolved at the same rate, there would be no change in the relative interactions between species. However, not all animals evolve at the same rate. As Darwin observed, some are more “responsive to change” than others. Species that are more responsive to change can gain a relative advantage over the ones they compete with and increase the odds of survival. In the short run, these small gains don't make much of a difference, but as generations pass the advantage can compound. A compounding advantage… that sounds nice.

Everyone from entrepreneurs and Fortune 500 CEOs to best-selling authors and middle managers is embedded in their own Red Queen race. Rather than run harder, wouldn't it be nice to run smarter?

Here are just three of the ways we try to avoid the Red Queen.

  1. We invest significantly in new product development and content. Our courses, on everything from Reading to Focus and Productivity, evolve quickly, incorporating student-tested concepts that work and reducing the importance of the ones that don't. Another example: our learning community adds real-world value to people who make decisions by discussing time-tested principles. This is not a popular path, as it's incredibly expensive in time and money. Standing still, however, is more expensive. We're not in the business of edutainment but rather of providing better outcomes. If we fail to keep getting better, we won't exist.
  2. We try to spend our limited mental resources working on things that won't change next week. We call these mental models and the ones we want to focus on are the ones that stand the test of time.
  3. We recognize how the world works and not how we want it to work. When the world isn't working the way we'd like it to, it's easy to say the world is wrong and sit back to see what happens. You know what happens right? You fall behind and it's even harder to catch up. It's like you're on a plane. When you're flying into the wind you have to work very hard. When you're flying with the wind at your back, you need to expend less energy and you get there earlier. Recognizing reality and adapting your behavior creates a tailwind.




We can find many examples of this effect.

In Deep Simplicity, John Gribbin describes the Red Queen principle with frogs.

There are lots of ways in which the frogs, who want to eat flies, and the flies, who want to avoid being eaten, interact. Frogs might evolve longer tongues, for fly-catching purposes; flies might evolve faster flight, to escape. Flies might evolve an unpleasant taste, or even excrete poisons that damage the frogs, and so on. We’ll pick one possibility. If a frog has a particularly sticky tongue, it will find it easier to catch flies. But if flies have particularly slippery bodies, they will find it easier to escape, even if the tongue touches them. Imagine a stable situation in which a certain number of frogs live on a pond and eat a certain proportion of the flies around them each year.

Because of a mutation, a frog develops an extra sticky tongue. It will do well, compared with other frogs, and genes for extra sticky tongues will spread through the frog population. At first, a larger proportion of flies gets eaten. But the ones who don’t get eaten will be the more slippery ones, so genes for extra slipperiness will spread through the fly population. After a while, there will be the same number of frogs on the pond as before, and the same proportion of flies will be eaten each year. It looks as if nothing has changed – but the frogs have got stickier tongues, and the flies have got more slippery bodies.
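Gribbin's standoff is easy to simulate. In the toy model below, the catch rate depends only on the gap between the two traits, so when both sides escalate in lockstep nothing changes in relative terms. The logistic form of the catch rate and the 0.05 mutation step are my own illustrative assumptions:

```python
import math

def catch_rate(sticky, slippery):
    """Chance a frog catches a fly; depends only on the trait gap."""
    return 1 / (1 + math.exp(slippery - sticky))

sticky, slippery = 1.0, 1.0
start = catch_rate(sticky, slippery)

for generation in range(100):
    sticky += 0.05    # frogs' tongues get stickier under selection
    slippery += 0.05  # flies get more slippery in response

# Both traits have escalated sixfold, yet the catch rate hasn't moved:
# the same proportion of flies gets eaten each year.
```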

Drugs and disease also represent an “arms race.” Siddhartha Mukherjee, in his Pulitzer Prize-winning book The Emperor of All Maladies, describes this in the context of drugs and cancer.

In August 2000, Jerry Mayfield, a forty-one-year-old Louisiana policeman diagnosed with CML, began treatment with Gleevec. Mayfield’s cancer responded briskly at first. The fraction of leukemic cells in his bone marrow dropped over six months. His blood count normalized and his symptoms improved; he felt rejuvenated—“like a new man [on] a wonderful drug.” But the response was short-lived. In the winter of 2003, Mayfield’s CML stopped responding. Moshe Talpaz, the oncologist treating Mayfield in Houston, increased the dose of Gleevec, then increased it again, hoping to outpace the leukemia. But by October of that year, there was no response. Leukemia cells had fully recolonized his bone marrow and blood and invaded his spleen. Mayfield’s cancer had become resistant to targeted therapy…

… Even targeted therapy, then, was a cat-and-mouse game. One could direct endless arrows at the Achilles’ heel of cancer, but the disease might simply shift its foot, switching one vulnerability for another. We were locked in a perpetual battle with a volatile combatant. When CML cells kicked Gleevec away, only a different molecular variant would drive them down, and when they outgrew that drug, then we would need the next-generation drug. If the vigilance was dropped, even for a moment, then the weight of the battle would shift. In Lewis Carroll’s Through the Looking-Glass, the Red Queen tells Alice that the world keeps shifting so quickly under her feet that she has to keep running just to keep her position. This is our predicament with cancer: we are forced to keep running merely to keep still.

This doesn't only happen in nature; there are many business examples as well.

In describing the capital investment needed to maintain a relative placement in the textile industry, Warren Buffett writes:

Over the years, we had the option of making large capital expenditures in the textile operation that would have allowed us to somewhat reduce variable costs. Each proposal to do so looked like an immediate winner. Measured by standard return-on-investment tests, in fact, these proposals usually promised greater economic benefits than would have resulted from comparable expenditures in our highly-profitable candy and newspaper businesses.

But the promised benefits from these textile investments were illusory. Many of our competitors, both domestic and foreign, were stepping up to the same kind of expenditures and, once enough companies did so, their reduced costs became the baseline for reduced prices industrywide. Viewed individually, each company’s capital investment decision appeared cost-effective and rational; viewed collectively, the decisions neutralized each other and were irrational (just as happens when each person watching a parade decides he can see a little better if he stands on tiptoes). After each round of investment, all the players had more money in the game and returns remained anemic.

In other words, more and more money is needed just to maintain your relative position in the industry and stay in the game. This situation plays out over and over again and brings with it many ripple effects. For example, a company preoccupied with maintaining its relative position in a poor industry commits resources that are almost assured of earning a poor return on capital.

Inflation also causes a Red Queen effect. Here's Buffett again:

Unfortunately, earnings reported in corporate financial statements are no longer the dominant variable that determines whether there are any real earnings for you, the owner. For only gains in purchasing power represent real earnings on investment. If you (a) forego ten hamburgers to purchase an investment; (b) receive dividends which, after tax, buy two hamburgers; and (c) receive, upon sale of your holdings, after-tax proceeds that will buy eight hamburgers, then (d) you have had no real income from your investment, no matter how much it appreciated in dollars. You may feel richer, but you won’t eat richer.

High rates of inflation create a tax on capital that makes much corporate investment unwise—at least if measured by the criterion of a positive real investment return to owners. This “hurdle rate”, the return on equity that must be achieved by a corporation in order to produce any real return for its individual owners, has increased dramatically in recent years. The average tax-paying investor is now running up a down escalator whose pace has accelerated to the point where his upward progress is nil.
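Buffett's hamburger arithmetic, and the hurdle-rate point that follows it, can be made explicit. The first half uses the numbers from the quote; the 7% / 5% / 30% figures mentioned afterwards are my own illustrative assumptions:

```python
# Hamburger test: real return is measured in purchasing power, not dollars.
invested  = 10  # hamburgers forgone to buy the investment
dividends = 2   # hamburgers the after-tax dividends buy
proceeds  = 8   # hamburgers the after-tax sale proceeds buy
real_gain = dividends + proceeds - invested  # = 0: no real income

def real_return(nominal, inflation, tax_rate):
    """After-tax return in purchasing-power terms."""
    after_tax = nominal * (1 - tax_rate)
    return (1 + after_tax) / (1 + inflation) - 1
```

With, say, a 7% nominal return, 5% inflation, and a 30% tax rate, real_return(0.07, 0.05, 0.30) comes out slightly negative: the investor on Buffett's down escalator makes no upward progress.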

The Red Queen is part of the Farnam Street latticework of mental models.

Sources: the excellent Sanjay Bakshi; Through the Looking Glass.

The Great Ideas of the Social Sciences

What are the most important ideas ever put forward in social science?

I’m not asking what are the best ideas, so the truth of them is only obliquely relevant: a very important idea may be largely false. (I think it still must contain some germ of truth, or it would have no plausibility.) Think of it this way: if you were teaching a course called “The Great Ideas of the Social Sciences,” what would you want to make sure you included?

The list:

  • The state as the individual writ large (Plato)
  • Man is a political/social animal (Aristotle)
  • The city of God versus the city of man (Augustine)
  • What is moral for the individual may not be for the ruler (Machiavelli)
  • Invisible hand mechanisms (Hume, Smith, Ferguson)
  • Class struggle (Marx, various liberal thinkers)
  • The subconscious has a logic of its own (Freud)
  • Malthusian population theory
  • The labor theory of value (Ricardo, Marx)
  • Marginalism (Menger, Jevons, Walras)
  • Utilitarianism (Bentham, Mill, Mill)
  • Contract theory of the state (Hobbes, Locke, Rousseau)
  • Sapir-Whorf hypothesis
  • Socialist calculation problem (Mises, Hayek)
  • The theory of comparative advantage (Mill, Ricardo)
  • Game theory (von Neumann, Morgenstern, Schelling)
  • Languages come in families (Jones, Young, Bopp)
  • Theories of aggregate demand shortfall (Malthus, Sismondi, Keynes)
  • History as an independent mode of thought (Dilthey, Croce, Collingwood, Oakeshott)
  • Public choice theory (Buchanan, Tullock)
  • Rational choice theory (who?)
  • Equilibrium theorizing (who?)