
Category Archives: Culture

Embrace the Mess: The Upside of Disorder

“We often succumb to the temptation of a tidy-minded approach
when we would be better served by embracing a degree of mess.”
— Tim Harford

***

The breadth and depth of products and services that promise to help us stay organized is almost overwhelming. Indeed, it would seem that to be messy is almost universally shunned, considered a sign of not being “put together,” while being tidy and neat is venerated to the nth degree.

Tim Harford has a different take. In his book Messy: The Power of Disorder to Transform Our Lives, he flips this notion around, showing us that there are situations in which disorder is beneficial, or at the very least that order has been oversold. (Tim previously introduced us to another counterintuitive thought with Adapt.)

***

One of the reasons why we put so much time and effort into being organized and tidy is because we make assumptions about what this will do for our productivity. If all our papers are neatly filed and email is neatly sorted, it will be easier to retrieve anything that’s important, right? Maybe not.

Harford cites a paper by Steve Whittaker and researchers at IBM called “Am I Wasting My Time Organizing Email?” to illustrate the fallacy.

Whittaker and his colleagues got permission to install logging software on the computers of several hundred office workers, and tracked around 85,000 attempts to find e-mail by clicking through folders, or by using ad hoc methods—scrolling through the inbox, clicking on a header to sort by (for example) the sender, or using the search function. Whittaker found that clicking through a folder tree took almost a minute, while simply searching took just 17 seconds. People who relied on folders took longer to find what they were looking for, but their hunts for the right e-mail were no more or less successful. In other words, if you just dump all your e-mail into a folder called “archive,” you will find your e-mails more quickly than if you hide them in a tidy structure of folders.

Okay, so taking the time to organize your email may not be as useful as we thought. Computers, after all, are designed as tools to help us work better and faster, so it makes sense that the simple search function would outperform us. But physical filing and keeping our workspace neat make us more productive, right?

Once again, maybe not.

Quite a bit of research has been done on people’s working environments, and it seems that those with big piles of paper and clutter on their desks may be just as effective as (and sometimes more effective than) those pedantic ‘filers.’

This is not to argue that a big pile of paper is the best possible filing system. But despite appearances, it’s very far from being a random assortment. A messy desk isn’t nearly as chaotic as it at first seems. There’s a natural tendency toward a very pragmatic system of organization based simply on the fact that the useful stuff keeps on getting picked up and left on the top of the pile.

David Kirsh, a cognitive scientist at the University of California, San Diego, studies the differences between the working habits of the tidy types (he calls them ‘neats’) and the messy types (he calls them ‘scruffies’). Let’s look at what he found.

…how do people orient themselves after arriving at the office or finishing a phone call? Kirsh finds that “neats” orient themselves with to-do lists and calendars, while “scruffies” orient themselves using physical cues—the report that they were working on is lying on the desk, as is a letter that needs a reply, and receipts that must be submitted for expenses. A messy desk is full of such cues. A tidy desk conveys no information at all, and it must be bolstered with the prompt of a to-do list. Both systems can work, so we should hesitate before judging other people based on their messy desks.

So if both systems work, are there times when it’s actually more advantageous to embrace messiness?

Here Harford hits upon an interesting hypothesis: Messiness may enhance certain types of creativity. In fact, creativity itself may systematically benefit from a certain amount of disorder.

When things are too neat and tidy, it’s easy for boredom to set in and creativity to suffer. We feel stifled.

A messy environment offers disruptions that seem to act as a catalyst for new ideas and creations. If you think about it, we try to avoid these same disruptions when we focus on being more “organized.” But, if you sometimes embrace a little mess, you may be opening yourself up to more creative serendipity:

Messy disruptions will be most powerful when combined with creative skill. The disruption puts an artist, scientist, or engineer in unpromising territory—a deep valley rather than a familiar hilltop. But then expertise kicks in and finds ways to move upward again: the climb finishes at a new peak, perhaps lower than the old one, but perhaps unexpectedly higher.

Think about an “inefficiently” designed office plan that looks wasteful on the surface: What’s lost in efficiency (say, putting two departments that need to talk to each other in separated areas) can be more than made up for in serendipitous encounters.

Brian Eno, considered one of the most influential and innovative figures in music over the last five decades, describes it like this:

“The enemy of creative work is boredom, actually,” he says. “And the friend is alertness. Now I think what makes you alert is to be faced with a situation that is beyond your control so you have to be watching it very carefully to see how it unfolds, to be able to stay on top of it. That kind of alertness is exciting.”

Eno created an amazing system for pushing people into ‘alertness.’ He came up with something he called “Oblique Strategies” cards. He would show up at the recording studio with a handful of cards and bring them out whenever it seemed that the group needed a nudge.

Each had a different instruction, often a gnomic one. Whenever the studio sessions were running aground, Eno would draw a card at random and relay its strange orders.

Be the first not to do what has never not been done before
Emphasize the flaws
Only a part, not the whole
Twist the spine
Look at the order in which you do things
Change instrument roles

Can you imagine asking the guitarist of a group to sit behind the drums on a track? These were the kinds of suggestions Eno became famous for, and they seem to be serving him well: at age sixty-eight he has a new album coming out in January of 2017, and variations of the cards have been available for purchase since they first appeared in 1975.
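The mechanism behind the cards is simple enough to sketch in a few lines of code: keep a deck of constraints and, when a session stalls, draw one at random and obey it. This is only an illustrative sketch in Python; apart from the three prompts quoted above, the entries in the deck below are made up for illustration, not Eno’s actual strategies.

```python
import random

# A hypothetical deck of prompts in the spirit of Oblique Strategies.
# Only the first three entries come from the examples quoted above;
# the rest are invented for illustration.
DECK = [
    "Emphasize the flaws",
    "Only a part, not the whole",
    "Change instrument roles",
    "Work backwards from the ending",      # hypothetical
    "Remove the most important element",   # hypothetical
]

def draw_card(deck=DECK, seed=None):
    """Draw one prompt at random -- the 'nudge' for a stalled session."""
    rng = random.Random(seed)
    return rng.choice(deck)

if __name__ == "__main__":
    print("Oblique strategy:", draw_card())
```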

Not all of us will be able to embrace a card from Eno’s deck. Some people do well in tidy environments and situations, and some do well in messy ones — it’s probably contingent on what you’re trying to achieve. (We wouldn’t go so far as to recommend that a CEO be disorganized.)

Reading through the book, it would seem that the key, as with most things, is to give it a try. A little “intentional messiness” could go a long way toward helping you climb out of a rut. And if you are the tidy type through and through, it’s important not to try to force that on others — you just might be taking away a good thing.

If you like the ideas in Messy, check out Harford’s other book Adapt: Why Success Always Starts With Failure, or check out another important book on things that gain from disorder, Antifragile.

Carol Dweck on Creating a Growth Mindset in the Workplace

Carol Dweck’s concept of Mindset permeates every aspect of our lives.

One area particularly affected is the workplace. We spend half of our day at work (some of you likely spend more than half), and both your mindset and the mindset of those around you will have a significant impact on your life, especially the mindset of your boss. Dweck comments:

Fixed-mindset leaders, like fixed-mindset people in general, live in a world where some people are superior and some are inferior. They must repeatedly affirm that they are superior, and the company is simply a platform for this.

These leaders tend to have a strong focus on personal reputation, generally at the expense of the company. Lee Iacocca, during his time at Chrysler, is a good example of this. Iacocca had his ego severely bruised when he was forced out of Ford. Fixed-mindset leaders tend to respond to failure with anger instead of viewing it as an opportunity to learn or get better.

So the king who had defined him as competent and worthy now rejected him as flawed. With ferocious energy, Iacocca applied himself to the monumental task of saving face and, in the process, Chrysler Motors. Chrysler, the once thriving Ford rival, was on the brink of death, but Iacocca as its new CEO acted quickly to hire the right people, bring out new models, and lobby the government for bailout loans. Just a few years after his humiliating exit from Ford, he was able to write a triumphant autobiography and in it declare, ‘Today, I’m a hero.’

He showed Ford that they made a mistake when they let him go, and he reveled in his triumph. But in his glory-basking, Iacocca forgot that the race wasn’t over yet.

This was a hard time for the American automotive industry; the Japanese were challenging the market like no one ever had before. Chrysler needed to respond to the competition or they would be in trouble again. Meanwhile, Iacocca was still focused on his reputation and legacy.

He also looked to history, to how he would be judged and remembered. But he did not address this concern by building the company. Quite the contrary. According to one of his biographers, he worried that his underlings might get credit for successful new designs, so he balked at approving them. He worried, as Chrysler faltered, that his underlings might be seen as the new saviors, so he tried to get rid of them.

Instead of listening to the advice of his designers and engineers, Iacocca dug in his heels.

See, a fixed mindset doesn’t easily allow you to change course. You believe that someone either has ‘it’ or they don’t: it’s a very binary frame of mind. You don’t believe in growth; you believe in right and wrong, and any suggestion of change or adaptation is taken as criticism. You don’t know how to adopt grey thinking. Challenges or obstacles tend to make you angry and defensive.

Iacocca was no different.

But rather than taking up the challenge and delivering better cars, Iacocca, mired in his fixed mindset, delivered blame and excuses. He went on the rampage, spewing angry diatribes against the Japanese and demanding that the American government impose tariffs and quotas that would stop them.

Blame is a big part of the fixed mindset; when something goes wrong, you don’t want to take responsibility because that would be akin to accepting inferiority. This can push some bosses to become abusive and controlling. They feel superior by making others feel inferior. Colleagues may feel this way too, but management has power. This is when you will notice the effect of mindset on your corporate culture. Everything starts to revolve around pleasing upper management.

When bosses become controlling and abusive, they put everyone into a fixed mindset. This means that instead of learning, growing, and moving the company forward, everyone starts worrying about being judged. It starts with the bosses’ worry about being judged, but it winds up being everybody’s fear about being judged. It’s hard for courage and innovation to survive a companywide fixed mindset.

In these circumstances, the fear of punishment leads to groupthink. No one wants to dissent or put their hand up because it’s likely to get slapped. 

So what can you do if you’re new to a company and working against a fixed mindset? It will be a difficult road, but there are definitely ways of nudging your company toward a growth mindset.

Dweck outlines the main attributes that create a growth-mindset environment:

  • Presenting skills as learnable
  • Conveying that the organization values learning and perseverance, not just ready-made genius or talent
  • Giving feedback in a way that promotes learning and future success
  • Presenting managers as resources for learning

At the end of each chapter of Dweck’s book, she has a brilliant section entitled ‘Grow Your Mindset.’ She reviews the chapter’s contents and asks the reader probing questions to help them evaluate their situation and suggests concrete ways to move forward. Here are a few pertinent examples to explore:

What kind of workplace are you in?

Are you in a fixed-mindset or growth-mindset workplace? Do you feel people are just judging you or are they helping you develop? Maybe you could try making it a more growth-mindset place, starting with yourself. 

Is it possible that you’re the problem?

Are there ways you could be less defensive about your mistakes? Could you profit more from the feedback you get? Are there ways you can create more learning experiences for yourself? How do you act toward others in your workplace? Are you a fixed-mindset boss, focused on your power more than on your employees’ well-being? Do you ever reaffirm your status by demeaning others? Do you ever try to hold back high-performing employees because they threaten you?

Can you foster a better environment?

Consider ways to help your employees develop on the job: Apprenticeships? Workshops? Coaching sessions? Think about how you can start seeing and treating your employees as your collaborators, as a team. Make a list of strategies and try them out. Do this even if you already think of yourself as a growth-mindset boss. Well-placed support and growth-promoting feedback never hurt.

Do you have procedures to overcome groupthink?

Is your workplace set up to promote groupthink? If so, the whole decision-making process is in trouble. Create ways to foster alternative views and constructive criticism. Assign people to play the devil’s advocate, taking opposing viewpoints so you can see the holes in your position. Get people to wage debates that argue different sides of the issue. Have an anonymous suggestion box that employees must contribute to as part of the decision-making process. Remember, people can be independent thinkers and team players at the same time. Help them fill both roles.

Mindset is filled with practical advice that will change the way you think and interact with the world. Through examples from her rigorous research, Dweck eloquently explains the nature of the two mindsets and their influence on sports, business, and relationships. Since culture eats strategy, it’s important to understand her main points. Understanding her core concepts will also add depth to your comprehension of mental models like confirmation bias and bias from overconfidence.

If you’d like a bit more on Mindset we suggest taking a look at Dweck’s Google talk or perhaps revisit a more detailed explanation of the two mindsets.

Biology Enables. Culture Forbids.

“From a biological perspective, nothing is unnatural.
Whatever is possible is by definition also natural.”
— Yuval Harari

***

We get a little confused when deciding if a particular human behavior is cultural or biological. Is homosexuality a natural act or unnatural? How about Facebook? Is it unnatural human behavior? Abortion? Non-procreative sex? Slavery? Mixing of races?

Many of these subjects are either explicitly taboo or border on it; they may not be discussed in polite company, even when the discussion is encouraged.

Yet for those of us seeking to understand reality as it is, to understand deeply the most important buckets of knowledge, taboo is no reason to avoid the hard subjects.

So how should we think about this?

Professor Yuval Harari, who has previously taught us why humans dominate the earth and the false natural state of man, has an interesting take, discussed in his book Sapiens: A Brief History of Humankind. The chapter is aptly titled “There is No Justice in History.”

Professor Harari’s well-informed heuristic boils down to: Biology Enables. Culture Forbids.

How can we distinguish what is biologically determined from what people merely try to justify through biological myths? A good rule of thumb is ‘Biology enables, culture forbids.’ Biology is willing to tolerate a very wide spectrum of possibilities. It’s culture that obligates people to realize some possibilities while forbidding others. Biology enables women to have children — some cultures oblige women to realize this possibility. Biology enables men to enjoy sex with one another — some cultures forbid them to realize this possibility.

Culture tends to argue that it forbids only that which is unnatural. But from a biological perspective, nothing is unnatural. Whatever is possible is by definition also natural. A truly unnatural behavior, one that goes against the laws of nature, simply cannot exist, so it would need no prohibition.

[…]

…Evolution has no purpose. Organs have not evolved with a purpose, and the way they are used is in constant flux. There is not a single organ in the human body that only does the job its prototype did when it first appeared hundreds of millions of years ago. Organs evolve to perform a particular function, but once they exist, they can be adapted for other usages as well. Mouths, for example, appeared because the earliest multicellular organisms needed a way to take nutrients into their bodies. We still use our mouths for that purpose, but we also use them to kiss, speak, and, if we are Rambo, to pull the pins out of hand grenades. Are any of these uses unnatural simply because our worm-like ancestors 600 million years ago didn’t do those things with their mouths?

Our biology gives us a very wide playground and a lot of leeway. We’re capable of a wide variety of activities and forms of organization, while other species generally fall into far more fixed and predictable hierarchies.

Over the course of history, humans have taken advantage of this wide range in a variety of positive and negative ways by creating and sustaining myths not supported by biological reality.

Take slavery, once a common practice throughout the world and now thankfully considered a scourge (and illegal) on all parts of the planet. Or the caste system, still in place in certain areas of the world, though perhaps less strictly than in the past.

Both slavery and the castes were carried out through a series of pseudoscientific rationalizations about the “natural order” of things, stories strong enough to be believed (in part) by all constituents of the hierarchy. This “forbidding” aspect of culture was not supported by biological differences, but that didn’t make the stories any less powerful or believable.

Even the American political system, ostensibly founded on a bedrock of “liberty and equality”, only provided those things to certain small groups. The Founders used cultural myths to rationalize a deeply divided society in which men had dominion over women, European whites had dominion over blacks and the native people, and the historically rich had dominion over the historically poor. Any other order would have been “unnatural”:

The American order consecrated the hierarchy between the rich and poor. Most Americans at that time had little problem with the inequality caused by wealthy parents passing their money and businesses onto their children. In their view, equality meant simply that the same laws applied to rich and poor. It had nothing to do with unemployment benefits, integrated education or health insurance. Liberty, too, carried very different connotations than it does today. In 1776, it did not mean that the disempowered (certainly not blacks or Indians or, God forbid, women) could gain and exercise power. It meant simply that the state could not, except in unusual circumstances, confiscate a citizen’s private property or tell him what to do with it. The American order thereby upheld the hierarchy of wealth, which some thought was mandated by God and others viewed as representing the immutable laws of nature. Nature, it was claimed, rewarded merit with wealth while penalizing indolence.

All the above-mentioned distinctions — between free persons and slaves, between whites and blacks, between rich and poor — are rooted in fictions…Yet it is an iron rule of history that every imagined hierarchy disavows its fictional origins and claims to be natural and inevitable. For instance, many people who have viewed the hierarchy of free persons and slaves as natural and correct have argued that slavery is not a human invention. Hammurabi saw it as ordained by the gods. Aristotle argued that slaves have a ‘slavish nature’ whereas free people have a ‘free nature’. Their status in society is merely a reflection of their innate nature.

This isn’t to argue that there aren’t biological differences between certain groups of people, including men and women. There are. But history has shown our tendency to exaggerate those differences and to create stories around our exaggerations, stories that uphold a certain desired hierarchy. These stories have a way of creating their own reality.

Just as frequently, we commit the opposite sin by restricting certain behavior based on some idea of what’s “natural” or “unnatural”, confusing biology with religious or cultural taboos. (And these myths die hard: it’s hard to fathom, but homosexuality wasn’t even legal in England and Wales until 1967.) As Harari rightly points out, anything we can do is perfectly natural in the biological sense. We come well-equipped for a variety of behavior.

And this certainly isn’t to argue that all behavior is equally acceptable: We put bumpers on society to reduce murder, rape, slavery, and other vile behavior that is perfectly biologically natural to us, and we should.

But unless we recognize the difference between biology and cultural myth and seek to reduce our unfair taboos wherever possible, we fail to see the world through the eyes of others, and to recognize that our imagined order is not always a fair or just one, nor a natural or inevitable one. Maybe, if we look closely enough, some of the things we see around us are just historical accidents.

Even more than that, examining the relationship between biological reality and cultural myth allows us to appreciate our basic storytelling instincts. Human beings are wired for narrative: We’ve been called the Storytelling Animal and for good reason. Our thirst and ready acceptance of narrative is a basic part of our existence; it’s hard-wired into our genetic algorithm.

Much of our narrative superpower can be observed in the structure of human language, which is unique among species in its infinite flexibility and adaptability. It makes us capable of great cooperative accomplishments, but also great evils.

Fortunately, the modern world has done a pretty good job steadily loosening the grip of mythical “natural” realities that only exist in our heads. But a fair inquiry remains: What sustaining myths still exist? Are they for good or for evil?

We leave that for you to ponder.

Check out Harari’s book Sapiens or his new book, Homo Deus.

***

If you liked this, you’ll love:

Why Humans Dominate the Earth: Myth-Making — It is our collected fictions that define us.

Religion and History: Will Durant on the Role of Religion and Morality — Religion’s ability to shape cultural behavior.

The False Allure of a “Natural State” of Man — The heated debate about Sapiens’ “natural way of life” is missing the point. Ever since the Cognitive Revolution, there hasn’t been a natural way of life for Sapiens.

Breaking the Rules: Moneyball Edition

Most of the book Simple Rules, by Donald Sull and Kathleen Eisenhardt, is about identifying a problem area (or an area ripe for “simple rules”) and then walking you through creating your own set of rules. It’s a useful mental process.

An ideal situation for simple rules is something repetitive, giving you constant feedback so you can course correct as you go. But what if your rules stop working and you need to start over completely?

Simple Rules recounts the well-known Moneyball tale in its examination of this process:

The story begins with Sandy Alderson. Alderson, a former Marine with no baseball background, became the A’s general manager in 1983. Unlike baseball traditionalists, Alderson saw scoring runs as a process, not an outcome, and imagined baseball as a factory with a flow of players moving along the bases. This view led Alderson and later his protégé and replacement, Billy Beane, to the insight that most teams overvalue batting average (hits only) and miss the relevance of on-base percentage (walks plus hits) to keeping the runners moving. Like many insightful rules, this boundary rule of picking players with a high on-base percentage has subtle second- and third-order effects. Hitters with a high on-base percentage are highly disciplined (i.e., patient, with a good eye for strikes). This means they get more walks, and their reputation for discipline encourages pitchers to throw strikes, which are easier to hit. They tire out pitchers by making them throw more pitches overall, and disciplined hitting does not erode much with age. These and other insights are at the heart of what author Michael Lewis famously described as moneyball.
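It helps to put rough numbers on the distinction the A’s exploited. Batting average counts only hits per at-bat, while on-base percentage also credits walks and hit-by-pitches. The sketch below uses the standard formulas with a made-up stat line (the figures are purely illustrative, not from the book) to show how a patient hitter with an ordinary average can still reach base at a well-above-average rate.

```python
def batting_average(hits, at_bats):
    """BA = H / AB. Walks aren't at-bats, so they are invisible to this number."""
    return hits / at_bats

def on_base_percentage(hits, walks, hbp, at_bats, sac_flies):
    """OBP = (H + BB + HBP) / (AB + BB + HBP + SF), the standard formula."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

# Hypothetical stat line for a patient, "disciplined" hitter:
# 130 hits in 500 at-bats, plus 90 walks, 5 hit-by-pitches, 5 sacrifice flies.
ba = batting_average(130, 500)                  # 0.260 -- unremarkable
obp = on_base_percentage(130, 90, 5, 500, 5)    # 0.375 -- well above average
print(f"BA: {ba:.3f}  OBP: {obp:.3f}")
```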

The Oakland A’s did everything right: they examined the issues, figured out which areas would most benefit from a set of simple rules, and implemented them. The problem was, the rules were easy to copy.

They were operating in a Red Queen Effect world where everyone around them was co-evolving, where running fast was just enough to get ahead temporarily, but not permanently. The Red Sox were the first and most successful club to copy the A’s:

By 2004, a free-spending team, the Boston Red Sox, co-opted the A’s principles and won the World Series for the first time since 1918. In contrast, the A’s went into decline, and by 2007 they were losing more games than they were winning. Moneyball had struck out.

What can we do when the rules stop working? 

We must break them.

***

When the A’s brought in Sandy Alderson, he was an outsider with no baseball background who could look at the problem in a new light. So how could that be replicated?

The team decided to bring in Farhan Zaidi as director of baseball operations in 2009. Zaidi had spent most of his life with a pretty healthy obsession for baseball, but he had a unique background: a PhD in behavioral economics.

He started on the job of breaking the old rules and crafting new ones. Like Andy Grove once did with Intel, Zaidi helped the team turn and face a new reality. Sull and Eisenhardt consider this a key trait:

To respond effectively to major change, it is essential to investigate the new situation actively, and create a reimagined vision that utilizes radically different rules.

The right choice is often to move to the new rules as quickly as possible. Performance will typically decline in the short run, but the transition to the new reality will be faster and more complete in the long run. In contrast, changing slowly often results in an awkward combination of the past and the future with neither fitting the other or working well.

Beane and Zaidi first did some housecleaning: they fired the team’s manager. Then they began breaking the old Moneyball rules, such as the one against drafting high-school players. They also decided to pay more attention to physical skills like speed and throwing.

In the short term, the team performed quite poorly as fan attendance showed a steady decline. Yet, once again, against all odds, the A’s finished first in their division in 2012. Their change worked. 

With a new set of Simple Rules, they became a dominant force in their division once again. 

Reflecting their formidable analytic skills, the A’s brass had a new mindset that portrayed baseball as a financial market rife with arbitrage possibilities and simple rules to match.

One was a how-to rule that dictated exploiting players with splits. Simply put, players with splits have substantially different performances in two seemingly similar situations. A common split is when a player hits very well against right-handed pitchers and poorly against left-handed pitchers, or vice versa. Players with splits are mediocre when they play every game, and are low paid. In contrast, most superstars play well regardless of the situation, and are paid handsomely for their versatility. The A’s insight was that when a team has a player who can perform one side of the split well and a different player who excels at the opposite split, the two positives can create a cheap composite player. So the A’s started using a boundary rule to pick players with splits and a how-to rule to exploit those splits with platooning – putting different players at the same position to take advantage of their splits against right- or left-handed pitching.

If you’re reading this as a baseball fan, you’re probably thinking that exploiting splits isn’t anything new. So why did it have such an effect on their season? Well, no one had pushed it this hard before, which had some nuanced effects that might not have been immediately apparent.

For example, exploiting these splits keeps players healthier during the long 162-game season because they don’t play every day. The rule keeps everyone motivated because everyone has a role and plays often. It provides versatility when players are injured since players can fill in for each other.
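The arithmetic of a “composite player” can be sketched by weighting each platoon partner’s production by the share of plate appearances he would actually take. The numbers below are hypothetical, and the 70/30 split between right- and left-handed pitching is an assumption used only for illustration, not a figure from the book.

```python
def composite_obp(obp_vs_rhp, obp_vs_lhp, share_vs_rhp=0.70):
    """Blend two platoon partners into one 'composite player': one bats only
    against right-handed pitching, the other only against left-handed pitching."""
    return share_vs_rhp * obp_vs_rhp + (1 - share_vs_rhp) * obp_vs_lhp

# Hypothetical platoon: Player A hits right-handers well, Player B hits
# left-handers well; each is mediocre (and cheap) when forced to play every day.
player_a_vs_rhp = 0.370
player_b_vs_lhp = 0.360

print(f"Composite OBP: {composite_obp(player_a_vs_rhp, player_b_vs_lhp):.3f}")
# ~0.367 -- near-star on-base production assembled from two inexpensive part-timers.
```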

They didn’t stop there. Zaidi and Beane looked at the data and kept rolling out new simple rules that broke with their highly successful Moneyball past.

In 2013 they added a new boundary rule to the player-selection activity: pick fly-ball hitters, meaning hitters who tend to hit the ball in the air and out of the infield (in contrast with ground-ball hitters). Sixty percent of the A’s at-bats were by fly-ball hitters in 2013, the highest percentage in major-league baseball in almost a decade, and the A’s had the highest ratio of fly balls to ground balls, by far. Why fly-ball hitters?

Since one of ten fly balls is a home run, fly-ball hitters hit more home runs: an important factor in winning games. Fly-ball hitters also avoid ground-ball double plays, a rally killer if ever there was one. They are particularly effective against ground-ball pitchers because they tend to swing underneath the ball, taking away the advantage of those pitchers. In fact, the A’s fly-ball hitters batted an all-star caliber .302 against ground-ball pitchers in 2013 on their way to their second consecutive division title despite having the fourth-lowest payroll in major-league baseball.

Unfortunately, the new rules had a short-lived effectiveness: in 2014 the A’s fell to second place, and they have been struggling for the last two seasons. Two Cinderella stories is a great achievement, but it’s hard to maintain that edge.

This wonderful demonstration of the Red Queen Effect in sports can be described as an “arms race.” As everyone tries to get ahead, a strange equilibrium is created by the simultaneous, continual improvement, and those with more limited resources must work even harder just to keep pace as the pack moves ahead.

Even though they have adapted and created some wonderful “Simple Rules” in the past, the A’s (and all of their competitors) must stay in the race in order to return to the top: no “rule” will allow them to rest on their laurels. Second-level thinking and a little real-world experience show this to be true: those who prosper consistently will think deeply, reevaluate, adapt, and continually evolve. That is the nature of a competitive world.

Religion and History: Will Durant on the Role of Religion and Morality

“Even the skeptical historian develops a humble respect for religion, since he sees it functioning, and seemingly indispensable, in every land and age.”

***

Will and Ariel Durant have written a masterpiece in The Lessons of History. Inside the book, which is a condensed version of their life’s work, you can find an interesting chapter entitled “Religion and History” that explores the role of religion throughout history.

Scientists often question the value of religion. Durant demurs:

To the unhappy, the suffering, the bereaved, the old, it has brought supernatural comforts valued by millions of souls as more precious than any natural aid. It has helped parents and teachers to discipline the young. It has conferred meaning and dignity upon the lowliest existence, and through its sacraments has made for stability by transforming human covenants into solemn relationships with God. It has kept the poor (said Napoleon) from murdering the rich. For since the natural inequality of men dooms many of us to poverty or defeat, some supernatural hope may be the sole alternative to despair. Destroy that hope, and class war is intensified. Heaven and utopia are buckets in a well: when one goes down the other goes up; when religion declines Communism grows.

The role of religion and morality is not clear at first. According to Petronius, who echoed Lucretius, “it was fear that first made the gods.” The fear he was talking about was a fear of the unexplainable — fear of hidden forces in the earth, oceans, skies, and rivers.

Religion became the propitiatory worship of these forces through offerings, sacrifice, incantation, and prayer. Only when priests used these fears and rituals to support morality and law did religion become a force vital and rival to the state. It told the people that the local code of morals and laws had been dictated by the gods.

In the eyes of the Durants, the effect of this new moral law was to dampen the worst of moral disorder—sensuality, drunkenness, coarseness, greed, dishonesty, robbery, and violence.

"Gregory VII saying Mass" (Via Wikipedia)
“Gregory VII saying Mass” (Via Wikipedia)

 

“Though the Church served the state,” they write, “it claimed to stand above all states, as morality should stand above power.” The idea of a moral superstate briefly came to fulfillment in the century after the Emperor Henry IV submitted to Pope Gregory VII at Canossa in 1077. The dream crumbled, however, under the attacks of nationalism, skepticism, and human frailty.

The Church, after all, was manned by men who proved all too human in their failings of greed and power. As states became stronger and wealthier, they made the papacy a political tool. “Kings,” the Durants write, “became strong enough to compel a pope to dissolve the Jesuit order which had so devotedly supported the popes.” In response, the Church stooped to fraud. Increasingly, the religious hierarchy spent its time promoting orthodoxy rather than morality. The Inquisition almost killed the Church.

Even while preaching peace the Church fomented religious wars in sixteenth-century France and the Thirty Years’ War in seventeenth-century Germany. It played only a modest part in the outstanding advance of modern morality— the abolition of slavery.

This allowed the philosophers to take the lead in the humanitarian movements that “alleviated the evils of our time.”

History has justified the Church in the belief that the masses of mankind desire a religion rich in miracle, mystery, and myth. Some minor modifications have been allowed in ritual, in ecclesiastical costume, and in episcopal authority; but the Church dares not alter the doctrines that reason smiles at, for such changes would offend and disillusion the millions whose hopes have been tied to inspiring and consolatory imaginations. No reconciliation is possible between religion and philosophy except through the philosophers’ recognition that they have found no substitute for the moral function of the Church, and the ecclesiastical recognition of religious and intellectual freedom.

Does history support a belief in God?

If by God we mean not the creative vitality of nature but a supreme being intelligent and benevolent, the answer must be a reluctant negative. Like other departments of biology, history remains at bottom a natural selection of the fittest individuals and groups in a struggle wherein goodness receives no favors, misfortunes abound, and the final test is the ability to survive. Add to the crimes, wars, and cruelties of man the earthquakes, storms, tornadoes, pestilences, tidal waves, and other “acts of God” that periodically desolate human and animal life, and the total evidence suggests either a blind or an impartial fatality, with incidental and apparently haphazard scenes to which we subjectively ascribe order, splendor, beauty, or sublimity. If history supports any theology this would be a dualism like the Zoroastrian or Manichaean: a good spirit and an evil spirit battling for control of the universe and men’s souls. These faiths and Christianity (which is essentially Manichaean) assured their followers that the good spirit would win in the end; but of this consummation history offers no guarantee. Nature and history do not agree with our conceptions of good and bad; they define good as that which survives, and bad as that which goes under; and the universe has no prejudice in favor of Christ as against Genghis Khan.

Our Place in the Cosmos

Bronze statue of Bruno by Ettore Ferrari at Campo de’ Fiori, Rome.

As science develops further, it shows our minuscule place in the cosmos. This knowledge further impairs religion. We can date the beginning of the decline to Copernicus (1543) and then to Giordano Bruno. In 1611 John Donne was “mourning that the earth had become a mere suburb in the world.” All was thrown into doubt. Francis Bacon proclaimed that science was the religion of the modern man. This was the generation that began the “death of God” as an external deity.

So great an effect required many causes besides the spread of science and historical knowledge. First, the Protestant Reformation, which originally defended private judgment. Then the multitude of Protestant sects and conflicting theologies, each appealing to both Scriptures and reason. Then the higher criticism of the Bible, displaying that marvelous library as the imperfect work of fallible men. Then the deistic movement in England, reducing religion to a vague belief in a God hardly distinguishable from nature. Then the growing acquaintance with other religions, whose myths, many of them pre-Christian, were distressingly similar to the supposedly factual bases of one’s inherited creed. Then the Protestant exposure of Catholic miracles, the deistic exposure of Biblical miracles, the general exposure of frauds, inquisitions, and massacres in the history of religion. Then the replacement of agriculture— which had stirred men to faith by the annual rebirth of life and the mystery of growth— with industry, humming daily a litany of machines, and suggesting a world machine. Add meanwhile the bold advance of skeptical scholarship, as in Bayle, and of pantheistic philosophy, as in Spinoza; the massive attack of the French Enlightenment upon Christianity; the revolt of Paris against the Church during the French Revolution. Add, in our own time, the indiscriminate slaughter of civilian populations in modern war. Finally, the awesome triumphs of scientific technology, promising man omnipotence and destruction, and challenging the divine command of the skies.

The ceiling of the Sistine Chapel (Via Wikipedia)

In a way, Christianity contributed to its own reduced place by fostering a moral sense in believers that could no longer tolerate the vengeful God of traditional theology.

The idea of hell disappeared from educated thought, even from pulpit homilies. Presbyterians became ashamed of the Westminster Confession, which had pledged them to belief in a God who had created billions of men and women despite his foreknowledge that, regardless of their virtues and crimes, they were predestined to everlasting hell. Educated Christians visiting the Sistine Chapel were shocked by Michelangelo’s picture of Christ hurling offenders pell-mell into an inferno whose fires were never to be extinguished; was this the “gentle Jesus, meek and mild,” who had inspired our youth?

The industrial revolution replaced Christian with secular institutions.

That states should attempt to dispense with theological supports is one of the many crucial experiments that bewilder our brains and unsettle our ways today. Laws which were once presented as the decrees of a god-given king are now frankly the confused commands of fallible men. Education, which was the sacred province of god-inspired priests, becomes the task of men and women shorn of theological robes and awe, and relying on reason and persuasion to civilize young rebels who fear only the policeman and may never learn to reason at all. Colleges once allied to churches have been captured by businessmen and scientists. The propaganda of patriotism, capitalism, or Communism succeeds to the inculcation of a supernatural creed and moral code.

But one lesson of history is that religion adapts and has a habit of resurrection. Often in the past it has nearly died only to be reborn.

Generally religion and puritanism prevail in periods when the laws are feeble and morals must bear the burden of maintaining social order; skepticism and paganism (other factors being equal) progress as the rising power of law and government permits the decline of the church, the family, and morality without basically endangering the stability of the state. In our time the strength of the state has united with the several forces listed above to relax faith and morals, and to allow paganism to resume its natural sway. Probably our excesses will bring another reaction; moral disorder may generate a religious revival; atheists may again (as in France after the debacle of 1870) send their children to Catholic schools to give them the discipline of religious belief.

Religion and Morality

If we are wondering whether history warrants the conclusion that religion is necessary for morality — “that natural ethic is too weak to withstand the savagery that lurks under civilization and emerges in our dreams, crimes, and wars” — we need look no further than the answer given by Joseph de Maistre, who said: “I do not know what the heart of a rascal may be; I know what is in the heart of an honest man; it is horrible.” Whether or not religion must be the force that tempers the hearts of future men and women, the Durants think it has certainly played that role in the past:

There is no significant example in history, before our time, of a society successfully maintaining moral life without the aid of religion. France, the United States, and some other nations have divorced their governments from all churches, but they have had the help of religion in keeping social order. Only a few Communist states have not merely dissociated themselves from religion but have repudiated its aid; and perhaps the apparent and provisional success of this experiment in Russia owes much to the temporary acceptance of Communism as the religion (or, as skeptics would say, the opium) of the people, replacing the church as the vendor of comfort and hope. If the socialist regime should fail in its efforts to destroy relative poverty among the masses, this new religion may lose its fervor and efficacy, and the state may wink at the restoration of supernatural beliefs as an aid in quieting discontent. “As long as there is poverty there will be gods.”

The Lessons of History is full of condensed wisdom on the meaning of history, the age of play, the lessons of biological history, and more.

 

Focusing Illusions


My favorite chapter in the book Rapt: Attention and the Focused Life by Winifred Gallagher is called ‘Decisions: Focusing Illusions.’ It’s a great summary of how focusing on the wrong things affects the weights we use to make decisions. There is a lot packed into this chapter, but I’ll attempt to highlight a few points.

***
Bounded Rationality

According to the principle of ‘bounded rationality,’ which (Daniel) Kahneman first applied to economic decisions and more recently to choices concerning quality of life, we are reasonable-enough beings but sometimes liable to focus on the wrong things. Our thinking gets befuddled not so much by our emotions as by our ‘cognitive illusions,’ or mistaken intuitions, and other flawed, fragmented mental constructs.

***
Loss/Risk Aversion

If you’re pondering a choice that involves risk, you might focus too much on the threat of possible loss, thereby obscuring an even likelier potential benefit. Where this common scenario is concerned, research shows that we aren’t so much risk-averse as loss-averse, in that we’re generally much more sensitive to what we might have to give up than to what we might gain.

***
The Focusing Illusion

The key to understanding why you pay more attention to your thoughts about living than to life itself is neatly summed up by what Kahneman proudly calls his ‘fortune cookie maxim’ (a.k.a. the focusing illusion): ‘Nothing in life is as important as you think it is while you are thinking about it.’ Why? ‘Because you’re thinking about it!’

In one much-cited illustration of the focusing illusion, Kahneman asked some people if they would be happier if they lived in California. Because the climate is often delightful there, most subjects thought so. For the same reason, even Californians assume they’re happier than people who live elsewhere. When Kahneman actually measured their well-being, however, Michiganders and others are just as contented as Californians. The reason is that 99 percent of the stuff of life – relationships, work, home, recreation – is the same no matter where you are, and once you settle in a place, no matter how salubrious, you don’t think about its climate very much. If you’re prompted to evaluate it, however, the weather immediately looms large, simply because you’re paying attention to it. This illusion inclines you to accentuate the difference between Place A and Place B, making it seem to matter much more than it really does, which is marginal.

To test the fortune cookie rule, you have only to ask yourself how happy you are. The question automatically summons your remembering self, which will focus on any recent change in your life – marriage or divorce, new job or home. You’ll then think about this novel event, which in turn will increase its import and influence your answer. If you’re pleased that you’ve just left the suburbs for the city, say, you’ll decide that life is pretty good. If you regret the move, you’ll be dissatisfied in general. Fifteen years on, however, the change that looms so large now will pale next to a more recent event – a career change, perhaps, or becoming a grandparent – which will draw your focus and, simply because you’re thinking about it, bias your evaluation of your general well-being.

***
The Effects of Adaptation

Like focusing too much on the opinions of your remembering self, overlooking the effects of adaptation – the process of becoming used to a situation – can obstruct wise decisions about how to live. As Kahneman says, ‘when planning for the future, we don’t consider that we will stop paying attention to a thing.’

The tendency to stop focusing on a particular event or experience over time, no matter how wonderful or awful, helps explain why the differences in well-being between groups of people in very different circumstances tend to be surprisingly small – sometimes astoundingly so. The classic examples are paraplegics and lottery winners, who respectively aren’t nearly as miserable or happy as you’d think. ‘That’s where attention comes in,’ says Kahneman. ‘People think that if they win the lottery, they’ll be happy forever. Of course, they will not. For a while, they are happy because of the novelty, and because they think about winning all the time. Then they adapt and stop paying attention to it.’ Similarly, he says, ‘Everyone is surprised by how happy paraplegics can be, but they are not paraplegic full-time. They do other things. They enjoy their meals, their friends, the newspaper. It has to do with the allocation of attention.’

Like couples who’ve just fallen in love, professionals starting a career, or children who go to camp for the first time, paraplegics and lottery winners initially pay a lot of attention to their new situation. Then, like everybody else, they get used to it and shift their focus to the next big thing. Their seemingly blasé attitude surprises us, because when we imagine ourselves in their place, we focus on how we’d feel at the moment of becoming paralyzed or wildly rich, when such an event utterly monopolizes one’s focus. We forget that we, too, would get used to wealth, a wheelchair, and most other things under the sun, then turn our attention elsewhere.

***
Good Enough

Finally, don’t worry if the choice you made wasn’t the absolute best, as long as it meets your needs. Offering the single most important lesson from his research, Schwartz says, ‘Good enough is almost always good enough. If you have that attitude, many problems about decisions and much paralysis melt away.’