Tag: Over-influence from framing effects

The Psychology of Risk and Reward

An excerpt from The Aspirational Investor: Taming the Markets to Achieve Your Life's Goals that I think you'd enjoy.

Most of us have a healthy understanding of risk in the short term.

When crossing the street, for example, you would no doubt speed up to avoid an oncoming car that suddenly rounds the corner.

Humans are wired to survive: it’s a basic instinct that takes command almost instantly, enabling our brains to resolve ambiguity quickly so that we can take decisive action in the face of a threat.

The impulse to resolve ambiguity manifests itself in many ways and in many contexts, even those less fraught with danger. Glance at the picture above for no more than a couple of seconds. What do you see?

Some observers perceive the profile of a young woman with flowing hair, an elegant dress, and a bonnet. Others see the image of a woman stooped in old age with a wart on her large nose. Still others—in the gifted minority—are able to see both of the images simultaneously.

What is interesting about this illusion is that our brains instantly decide what image we are looking at, based on our first glance. If your initial glance was toward the vertical profile on the left-hand side, you were all but destined to see the image of the elegant young woman: it was just a matter of your brain interpreting every line in the picture according to the mental image that you had already formed, even though each line can be interpreted in two different ways. Conversely, if your first glance fell on the central dark horizontal line that emphasizes the mouth and chin, your brain quickly formed an image of the older woman.

Regardless of your interpretation, your brain wasn’t confused. It simply decided what the picture was and filled in the missing pieces. Your brain resolved ambiguity and extracted order from conflicting information.

What does this have to do with decision making? Every bit of information can be interpreted differently according to our perspective. Ashvin Chhabra directs us to investing; I suggest you reframe this in the context of decision making in general.

Every trade has a seller and a buyer: your state of mind is paramount. If you are in a risk-averse mental framework, then you are likely to interpret a further fall in stocks as additional confirmation of your sell bias. If instead your framework is positive, you will interpret the same event as a buying opportunity.

The challenge of investing is compounded by the fact that our brains, which excel at resolving ambiguity in the face of a threat, are less well equipped to navigate the long term intelligently. Since none of us can predict the future, successful investing requires planning and discipline.

Unfortunately, when reason is in apparent conflict with our instincts—about markets or a “hot stock,” for example—it is our instincts that typically prevail. Our “reptilian brain” wins out over our “rational brain,” as it so often does in other facets of our lives. And as we have seen, investors trade too frequently, and often at the wrong time.

One way our brains resolve conflicting information is to seek out safety in numbers. In the animal kingdom, this is called “moving with the herd,” and it serves a very important purpose: helping to ensure survival. Just as a buffalo will try to stay with the herd in order to minimize its individual vulnerability to predators, we tend to feel safer and more confident investing alongside equally bullish investors in a rising market, and we tend to sell when everyone around us is doing the same. Even the so-called smart money falls prey to a herd mentality: one study, aptly titled “Thy Neighbor’s Portfolio,” found that professional mutual fund managers were more likely to buy or sell a particular stock if other managers in the same city were also buying or selling.

This comfort is costly. The surge in buying activity and the resulting bullish sentiment are self-reinforcing, propelling markets ever higher. That leads to overvaluation and the inevitable crash when sentiment reverses. As we shall see, such booms and busts are characteristic of all financial markets, regardless of size, location, or even the era in which they exist.

Even though the role of instinct and human emotions in driving speculative bubbles has been well documented in popular books, newspapers, and magazines for hundreds of years, these factors were virtually ignored in conventional financial and economic models until the 1970s.

This is especially surprising given that, in 1952, a young PhD student from the University of Chicago, Harry Markowitz, published two very important papers. The first, entitled “Portfolio Selection,” published in the Journal of Finance, led to the creation of what we call modern portfolio theory, together with the widespread adoption of its important ideas such as asset allocation and diversification. It earned Harry Markowitz a Nobel Prize in Economics.

The second paper, entitled “The Utility of Wealth” and published in the prestigious Journal of Political Economy, was about the propensity of people to hold insurance (safety) and to buy lottery tickets at the same time. It delved deeper into the psychological aspects of investing but was largely forgotten for decades.

The field of behavioral finance really came into its own through the pioneering work of two academic psychologists, Amos Tversky and Daniel Kahneman, who challenged conventional wisdom about how people make decisions involving risk. Their work garnered Kahneman the Nobel Prize in Economics in 2002. Behavioral finance and neuroeconomics are relatively new fields of study that seek to identify and understand human behavior and decision making with regard to choices involving trade-offs between risk and reward. Of particular interest are the human biases that prevent individuals from making fully rational financial decisions in the face of uncertainty.

As behavioral economists have documented, our propensity for herd behavior is just the tip of the iceberg. Kahneman and Tversky, for example, showed that people who were asked to choose between a certain loss and a gamble, in which they could either lose more money or break even, would tend to choose to double down (that is, to gamble to avoid the prospect of losses), a behavior the authors called “loss aversion.” Building on this work, Hersh Shefrin and Meir Statman, professors at Santa Clara University’s Leavey School of Business, have linked the propensity for loss aversion to investors’ tendency to hold losing investments too long and to sell winners too soon. They called this bias the disposition effect.

The lengthy list of behaviorally driven market effects often converges in an investor’s tale of woe. Overconfidence causes investors to hold concentrated portfolios and to trade excessively, behaviors that can destroy wealth. The illusion of control causes investors to overestimate the probability of success and underestimate risk because of familiarity—for example, causing investors to hold too much employer stock in their 401(k) plans, resulting in under-diversification. Cognitive dissonance causes us to ignore evidence that is contrary to our opinions, leading to myopic investing behavior. And the representativeness bias leads investors to assess risk and return based on superficial characteristics—for example, by assuming that shares of companies that make products you like are good investments.

Several other key behavioral biases come into play in the realm of investing. Framing can cause investors to make a decision based on how the question is worded and the choices presented. Anchoring often leads investors to unconsciously create a reference point, say for securities prices, and then adjust decisions or expectations with respect to that anchor. This bias might impede your ability to sell a losing stock, for example, in the false hope that you can earn your money back. Similarly, the endowment bias might lead you to overvalue a stock that you own and thus hold on to the position too long. And regret aversion may lead you to avoid taking a tough action for fear that it will turn out badly. This can lead to decision paralysis in the wake of a market crash, even though, statistically, it is a good buying opportunity.

Behavioral finance has generated plenty of debate. Some observers have hailed the field as revolutionary; others bemoan the discipline’s seeming lack of a transcendent, unifying theory. This much is clear: behavioral finance treats biases as mistakes that, in academic parlance, prevent investors from thinking “rationally” and cause them to hold “suboptimal” portfolios.

But is that really true? In investing, as in life, the answer is more complex than it appears. Effective decision making requires us to balance our “reptilian brain,” which governs instinctive thinking, with our “rational brain,” which is responsible for strategic thinking. Instinct must integrate with experience.

Put another way, behavioral biases are nothing more than a series of complex trade-offs between risk and reward. When the stock market is taking off, for example, a failure to rebalance by selling winners is considered a mistake. The same goes for a failure to add to a position in a plummeting market. That’s because conventional finance theory assumes markets to be inherently stable, or “mean-reverting,” so most deviations from the historical rate of return are viewed as fluctuations that will revert to the mean, or self-correct, over time.

But what if a precipitous market drop is slicing into your peace of mind, affecting your sleep, your relationships, and your professional life? What if that assumption about markets reverting to the mean doesn’t hold true and you cannot afford to hold on for an extended period of time? In both cases, it might just be “rational” to sell and accept your losses precisely when investment theory says you should be buying. A concentrated bet might also make sense, if you possess the skill or knowledge to exploit an opportunity that others might not see, even if it flies in the face of conventional diversification principles.

Of course, the time to create decision rules for extreme market scenarios and concentrated bets is when you are building your investment strategy, not in the middle of a market crisis or at the moment a high-risk, high-reward opportunity from a former business partner lands on your desk and gives you an adrenaline jolt. A disciplined process for managing risk in relation to a clear set of goals will enable you to use the insights offered by behavioral finance to your advantage, rather than fall prey to the common pitfalls. This is one of the central insights of the Wealth Allocation Framework. But before we can put these insights to practical use, we need to understand the true nature of financial markets.

Choice Under Uncertainty

People rely on general heuristics—rules of thumb—in making judgments, and these heuristics produce biases: toward classifying situations according to their representativeness, toward judging frequencies according to the availability of examples in memory, or toward interpretations warped by the way in which a problem has been framed. These heuristics have important implications for individuals and society.

Insensitivity to Base Rates
When people are given information about the probabilities of certain events (e.g., how many lawyers and how many engineers are in a population that is being sampled), and then are given some additional information as to which of the events has occurred (which person has been sampled from the population), they tend to ignore the prior probabilities in favor of incomplete or even quite irrelevant information about the individual event. Thus, if they are told that 70 percent of the population are lawyers, and if they are then given a noncommittal description of a person (one that could equally well fit a lawyer or an engineer), half the time they will predict that the person is a lawyer and half the time that he is an engineer, even though the laws of probability dictate that the best forecast is always to predict that the person is a lawyer.
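To make the arithmetic concrete, here is a minimal simulation (my sketch, not Simon's; it assumes the description is completely uninformative, so the 70/30 base rate is the only usable signal):

```python
import random

# Sketch: betting the base rate vs. ignoring it.
# Assumption: the personality description carries no information,
# so a 50/50 guess is the best a base-rate-ignoring judge can do.

random.seed(42)
TRIALS = 100_000
P_LAWYER = 0.7

always_lawyer = 0  # strategy 1: always predict the majority class
coin_flip = 0      # strategy 2: ignore the base rate, guess 50/50

for _ in range(TRIALS):
    is_lawyer = random.random() < P_LAWYER
    always_lawyer += is_lawyer                         # right iff the person is a lawyer
    coin_flip += (random.random() < 0.5) == is_lawyer  # right half the time on average

print(f"Always predict 'lawyer': {always_lawyer / TRIALS:.1%} correct")  # ~70%
print(f"Guess 50/50:             {coin_flip / TRIALS:.1%} correct")      # ~50%
```

Betting the base rate is right about 70 percent of the time; splitting predictions evenly is right only about 50 percent of the time. The gap is exactly the information subjects throw away.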

Insensitivity to Sample Size
People commonly misjudge probabilities in many other ways. Asked to estimate the probability that 60 percent or more of the babies born in a hospital during a given week are male, they ignore information about the total number of births, although it is evident that the probability of a departure of this magnitude from the expected value of 50 percent is smaller if the total number of births is larger (the standard error of a percentage varies inversely with the square root of the sample size).
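A small Monte Carlo sketch makes the point; the weekly birth counts below are illustrative assumptions, not figures from Simon's text:

```python
import random

# How often do 60% or more of a week's births come out male?
# Each birth is a fair coin flip; only the number of births differs.
# SE of a proportion ~ sqrt(p*(1-p)/n), so deviations shrink as n grows.

random.seed(0)
TRIALS = 20_000

def prob_at_least_60_pct_boys(births_per_week: int) -> float:
    hits = 0
    for _ in range(TRIALS):
        boys = sum(random.random() < 0.5 for _ in range(births_per_week))
        hits += boys >= 0.6 * births_per_week
    return hits / TRIALS

for n in (15, 45, 150):  # hypothetical small, medium, large hospitals
    print(f"{n:4d} births/week -> P(>=60% boys) ~ {prob_at_least_60_pct_boys(n):.3f}")
```

The smallest hospital sees such a lopsided week roughly 30 percent of the time; the largest almost never does. Yet people typically judge the hospitals to be equally likely to record one.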

Availability
There are situations in which people assess the frequency of a class by the ease with which instances can be brought to mind. In one experiment, subjects heard a list of names of persons of both sexes and were later asked to judge whether there were more names of men or women on the list. In lists presented to some subjects, the men were more famous than the women; in other lists, the women were more famous than the men. For all lists, subjects judged that the sex that had the more famous personalities was the more numerous.

Framing and Loss Aversion
The way in which an uncertain possibility is presented may have a substantial effect on how people respond to it. When asked whether they would choose surgery in a hypothetical medical emergency, many more people said that they would when the chance of survival was given as 80 percent than when the chance of death was given as 20 percent.

Source: Decision Making and Problem Solving, Herbert A. Simon

How Williams Sonoma Inadvertently Sold More Bread Machines

Paying attention to what your customers and clients see can be a very effective way to increase your influence and, subsequently, your business.

Steve Martin, co-author of Yes! 50 Secrets from the Science of Persuasion, tells the story:

A few years ago a well-known US kitchen retailer released its latest bread-making machine. Like any company bringing a new and improved product to market, it was excited about the extra sales revenues the product might deliver. And, like most companies, it was a little nervous about whether it had done everything to get its product launch right.

It needn’t have worried. Within a few weeks, sales had almost doubled. Surprisingly, though, it wasn’t the new product that generated the huge sales growth but an older model.

Yet there was no doubt about the role that its brand new product had played in persuading customers to buy its older and cheaper version.

Persuasion researchers suggest that when people consider a particular set of choices, they often favour alternatives that are ‘compromise choices’. That is, choices that compromise between what is needed at a minimum and what they could possibly spend at a maximum.

A key factor that often drives compromise choices is price. In the case of the bread-making machine, when customers saw the newer, more expensive product, the original, cheaper product immediately seemed a wiser, more economical and attractive choice in comparison.

Paying attention to what your customers and clients see first can be a very effective way to increase your influence and, subsequently, your business. It is useful to remember that high-end and high-priced products provide two crucial benefits. Firstly, they often serve to meet the needs of customers who are attracted to high-price offerings. A second, and perhaps less recognised, benefit is that the next-highest options are often seen as more attractively priced.

Bars and hotels often present wine lists with the cheapest wines (most often the house wine) first. But doing so means that customers may never consider some of the more expensive and potentially more profitable wines towards the end of the list. The ‘compromise’ approach suggests that reversing the order and placing more expensive wines at the top of the list would immediately make the next most expensive wines a more attractive choice — potentially increasing sales.

Original source: http://www.babusinesslife.com/Tools/Persuasion/How-compromise-choices-can-make-you-money.html

Michael Mauboussin: Getting More out of Your Brain

As Warren Buffett says, temperament is more important than IQ, because without the right temperament you take all the horsepower of your brain and dramatically reduce the output. “A lot of people start out with 400-horsepower motors but only get a hundred horsepower of output,” he said. “It’s way better to have a 200-horsepower motor and get it all into output.”

With that in mind, Michael Mauboussin, the very first guest on our podcast, writes about five pitfalls to avoid that reduce the horsepower of your brain.

Five Pitfalls to Avoid

1. Overconfidence

The first pitfall that Mauboussin mentions is overconfidence. Acting as though you are smarter than you are is a recipe for disaster.

Researchers have found that people consistently overrate their abilities, knowledge, and skill. This is especially true in areas outside of their expertise. For example, professional securities analysts and money managers were presented with ten requests for information that they were unlikely to know (e.g., total area of Lake Michigan in square miles). They were asked to respond to each question with both an answer and a “confidence range”—high and low boundaries within which they were 90% certain that the true number resides. On average, the analysts chose ranges wide enough to accommodate the correct answer only 64% of the time. Money managers were even less successful at 50%.
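Mechanically, scoring such a calibration quiz is straightforward. The sketch below uses made-up questions and ranges (only the Lake Michigan figure is real, roughly 22,400 square miles):

```python
# Score a calibration quiz: each answer is a (low, high) range the
# respondent is 90% sure contains the truth.  Entries are hypothetical.

quiz = [
    # (stated low, stated high, true value)
    (15_000, 20_000, 22_400),  # area of Lake Michigan, sq. miles -> a miss
    (200, 400, 271),           # hypothetical question -> a hit
    (1_500, 2_500, 2_193),     # hypothetical question -> a hit
]

hits = sum(low <= truth <= high for low, high, truth in quiz)
print(f"Hit rate: {hits}/{len(quiz)} = {hits / len(quiz):.0%} (target: 90%)")
```

A well-calibrated respondent captures the truth about nine times out of ten; hit rates of 64 and 50 percent mean the stated ranges were far too narrow.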

Edward Russo and Paul Schoemaker, in their article “Managing Overconfidence,” argue that this confidence quiz measures how well we recognize the gap between what we think we know and what we do know. They point out that good decision-making means knowing the limits of your knowledge. Warren Buffett echoes the point with his circle of competence concept. He argues that investors should define their circle of competence, or area of expertise, and stay within it. Overconfidence in our expertise can lead to poor decisions. In the words of Will Rogers, “It’s not what we don’t know that gets us into trouble, it’s what we know that ain’t so.”

Mauboussin suggests you know your circle of competence and not over-estimate your abilities. To compensate for inevitable error in this, he suggests adding a margin of safety. Another idea is to do the work necessary to have an opinion and test it with other people by seeking feedback.

2. Anchoring and Adjustment

Anchoring occurs when data or information, relevant or not, influences our subsequent judgments. “In considering a decision,” Mauboussin writes, “we often give disproportionate weight to the first information we receive. As a result, initial impressions, ideas, estimates, or data anchor our subsequent thoughts.”

A good decision process helps counter this by ensuring you look at decisions from various angles, consider a wide variety of sources, start from zero, and adapt to reality.

3. Improper Framing

I'll let Mauboussin explain this before I take over.

People’s decisions are affected by how a problem, or set of circumstances, is presented. Even the same problem framed in different—and objectively equal—ways can cause people to make different choices. One example is what Richard Thaler calls “mental accounting.” Say an investor buys a stock at $50 per share that surges to $100. Many investors divide the value of the stock into two distinct parts—the initial investment and the quick profit. And each is treated differently—the original investment with caution and the profit portion with considerably less discipline. Thaler and Eric Johnson call this “the house money effect.”

This effect is not limited to individuals. Hersh Shefrin documents how the committee in charge of Santa Clara University’s endowment portfolio succumbed to this effect. Because of strong market performance, the endowment crossed a preordained absolute level (the frame) ahead of the timeline set by the university president. The result? The university took some of the “house money” and added riskier investment classes to its portfolio, including venture capital, hedge funds, and private placements. Classic economic theory assumes frame independence: all money is treated the same. But empirical evidence shows that the frame indeed shapes decisions.

One of the most significant insights from prospect theory is that people exhibit significant aversion to losses when making choices between risky outcomes, no matter how small the stakes. In fact, Kahneman and Tversky find that a loss has about two and a half times the impact of a gain of the same size. In other words, people feel a lot worse about a loss of a given size than they feel good about a gain of similar magnitude. This asymmetry is loss aversion.
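For illustration, this asymmetry can be written as a simple value function. The parameters below (alpha = 0.88, lambda = 2.25) are estimates commonly cited from Tversky and Kahneman's later work, assumed here for the sketch; the text above commits only to “about two and a half times”:

```python
# Sketch of the prospect-theory value function.
# Parameter values are commonly cited estimates, assumed for illustration.

ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom larger than gains

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

print(value(100))                 # a $100 gain feels like  ~ +57.5
print(value(-100))                # a $100 loss feels like  ~ -129.5
print(-value(-100) / value(100))  # ratio = 2.25: the asymmetry itself
```

Because the pain of a loss outweighs the pleasure of an equal gain, refusing to realize a loss can feel subjectively rational even when it is not.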

To describe this loss aversion, Shefrin and Meir Statman coined the term “disposition effect,” which they amusingly suggest is shorthand for “predisposition toward get-evenitis.” Since it is difficult for investors to make peace with their losses, they tend to sell their winners too soon and hold on to their losers too long. They don’t want to take a loss on a stock; they want at least to get even, despite the fact that the original rationale for purchasing the stock no longer appears valid. This is an important insight for all investors, including those who adopt the expectations investing approach.

This is often influenced by incentives as well, because we tend to see the world through our incentives rather than as it really is.

Consider this example from Charlie Munger of how Lloyd's Insurance rewarded people:

They were paid a percentage of the gross volume that went through. And paying everybody a percentage of the gross, when what you're really interested in is the net, is a system — given the natural bias of human beings toward doing what's in their own interest even though it has terrible consequences for other people — that really did Lloyd's in.

People were rewarded for doing stupid things because from their frame of reference they acted rationally. It's hard to see the reality of a system you are a part of. To counter this bias, you need to step outside the system. This can be done by adjusting time frames (thinking about the long term and short term), considering second and third order effects, looking at the issue in different ways, and seeking out conflicting opinions.

4. Irrational Escalation of a Commitment

Sometimes we just keep digging when the best thing to do is cut our losses. Past decisions create what economists call sunk costs, which are past investments that cannot be recovered. And if we're the ones who made the decisions for those investments, we're likely to rationalize that the future is brighter. Otherwise, we'd have to admit that we were wrong and that's painful.

While the irrational escalation of a commitment can sometimes pay off, it is not a sound, probabilistic way to think about decisions.

Sticking to a good decision process is a good way to avoid irrational escalation of a commitment. Other ideas include considering only future benefits and looking inside yourself to see if you're trying to avoid admitting a mistake. As Daniel Dennett says, “you should become a connoisseur of your own mistakes.”

5. Confirmation Trap

This comes from justifying your point of view, and it is a bias we can easily observe in others yet fail to see in ourselves (again, this comes down to relativity).

Mauboussin writes:

Investors tend to seek out information that supports their existing point of view while avoiding information that contradicts their opinion. This trap not only affects where investors go for information but also how they interpret the information they receive—too much weight is given to confirming evidence and not enough to disconfirming evidence. Investors often fall into the confirmation trap after making an investment decision. For example, once investors purchase a stock, they seek evidence that confirms their thesis and dismiss or discount information that disconfirms it. This leads to a loss of objectivity.

It's this loss of objectivity that kills us. The way to counter this is to “ask questions and conduct analysis that challenges your most cherished and firmly held beliefs” and seek out disconfirming evidence, something Charles Darwin mastered.

***

Still Curious? Follow your brain over to Grey Thinking.

Social Dilemmas

Social dilemmas arise when an individual receives a higher payoff for defecting than for cooperating while everyone else cooperates, yet when everyone defects, all are worse off than if everyone had cooperated. That is, each member has a clear and unambiguous incentive to make a choice that, if made by all members, produces a worse outcome for everyone.

A great example of a social dilemma is to imagine yourself out with a group of your friends for dinner. Before the meal, you all agree to share the cost equally. Looking at the menu, you see a lot of items that appeal to you but are outside your budget.

Pondering this, you realize that you're only on the hook for 1/(number of friends at the dinner) of the bill. Now you can order the expensive items and enjoy yourself without having to pay their full cost.

But what if everyone at the table realizes the same thing? My guess is you'd all be stunned by the bill: a miniature tragedy of the commons.
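A toy payoff calculation shows the structure; all prices and enjoyment values below are made up for illustration:

```python
# Diner's dilemma sketch: n people split the bill equally.
# Hypothetical numbers; the structure is what matters.

N = 6                                  # diners splitting the bill
CHEAP, FANCY = 20.0, 40.0              # menu prices
ENJOY_CHEAP, ENJOY_FANCY = 22.0, 30.0  # subjective value of each meal

def payoff(my_price: float, my_enjoyment: float, others_price: float) -> float:
    """My enjoyment minus my equal share of the total bill."""
    bill = my_price + (N - 1) * others_price
    return my_enjoyment - bill / N

print(payoff(CHEAP, ENJOY_CHEAP, CHEAP))  #  2.00  everyone orders cheap
print(payoff(FANCY, ENJOY_FANCY, CHEAP))  #  6.67  I defect, others don't
print(payoff(FANCY, ENJOY_FANCY, FANCY))  # -10.00 everyone defects
print(payoff(CHEAP, ENJOY_CHEAP, FANCY))  # -14.67 I cooperate, others defect
```

Ordering the expensive meal is individually better no matter what the others do (6.67 beats 2.00, and -10.00 beats -14.67), yet when everyone follows that logic, all end up worse off than if everyone had shown restraint. That is the defining signature of a social dilemma.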

This is a very simple example, but you can map it to the business world by thinking about healthcare and insurance.

If that sounds a lot like game theory, you're on the right track.

I came across an excellent paper by Robyn Dawes and David Messick, which takes a closer look at social dilemmas.

A Psychological Analysis of Social Dilemmas

In the case of the public good, one strategy that has been employed is to create a moral sense of duty to support it—for instance, the public television station that one watches. The attempt is to reframe the decision as doing one's duty rather than making a difference—in this case, in the well-being of the station watched. The injection of a moral element changes the calculation from “Will I make a difference?” to “I must pay for the benefit I get.”

The final illustration, the shared meal and its more serious counterparts, requires yet another approach. Here there is no hierarchy, as in the organizational example, that can be relied upon to solve the problem. With the shared meal, all the diners need to be aware of the temptation that they have and there need to be mutually agreed-upon limits to constrain the diners. Alternatively, the rule needs to be changed so that everyone pays for what they ordered. The latter arrangement creates responsibility in that all know that they will pay for what they order. Such voluntary arrangements may be difficult to arrange in some cases. With medical insurance, the insurance company may recognize the risk and insist on a principle of co-payments for medical services. This is a step in the direction of paying for one's own meal, but it allows part of the “meal” to be shared and part of it to be paid for by the one who ordered it.

The fishing version is more difficult. To make those harvesting the fish pay for some of the costs of the catch would require some sort of taxation to deter the unbridled exploitation of the fishery. Taxation, however, leads to tax avoidance or evasion. Those who harvest the fish would have no incentive to report their catches accurately or at all, especially if they were particularly successful, which simultaneously means particularly successful—compared to others at least—in contributing to the problem of a subsequently reduced yield. Voluntary self-restraint would be punished as those with less of that personal quality would thrive while those with more would suffer. Conscience, as Hardin (1968) noted, would be self-eliminating. …

Relatively minor changes in the social environment can induce major changes in decision making because these minor changes can change the perceived appropriateness of a situation. One variable that has been shown to make such a difference is whether the decision maker sees herself as an individual or as a part of a group.