Category: Learning

Let Go of the Learning Baggage

We all want to learn better. That means retaining information, processing it, and being able to use it when needed. More knowledge means better instincts and better insights into opportunities, both for you and for your organization. You will ultimately produce better work if you give yourself the space to learn. Yet organizations often get in the way of learning.

How do we learn how to learn? Usually in school, combined with instruction from our parents, we cobble together an understanding that allows us to move forward through the school years until we graduate into a job. Then, because most initial learning on the job comes from doing rather than from books, we switch to an on-the-fly approach.

Which is usually an absolute failure. Why? In part, because we layer our social values on top and end up with a hot mess of guilt and fear that stymies the learning process.

Learning is necessary for our success and personal growth. But we can’t maximize the time we spend learning because our feelings about what we ‘should’ be doing get in the way.

We are trained by our modern world to organize our day into mutually exclusive chunks called ‘work’, ‘play’, and ‘sleep’. One is done at the office; the other two are not. We are not allowed to move fluidly between these chunks, or to combine them in our 24-hour day. Lyndon Johnson got to nap at the office in the afternoon, likely because he was President and didn’t have to worry about what his boss would think. Most of us don’t have this option. And now, in the open-office debacle, we can’t even have a quiet ten minutes of rest in our cubicles.

We have become trained to equate working with doing. Thus the ‘doing’ has value. We deserve to get paid for this. And, it seems, only this.

What does this have to do with learning?

It’s this same attitude that we apply to the learning process when we are older, with similarly unsatisfying results.

If we are learning for work, then in our brains learning = work. So we have to do it during the day. At the office. And if we are not learning, then we are not working. We think that walking is not learning; it’s ‘taking a break’. We instinctively believe that reading is learning. Having discussions about what you’ve read, however, is often not considered work; again, it’s ‘taking a break’.

To many, working means sitting at your desk for eight hours a day. Being physically present is mandatory; mental engagement is optional. It means pushing out emails, rushing to meetings, and generally getting nothing done. We’ve looked at the focus aspect of this before. But what about the learning aspect?

Can we change how we approach learning, letting go of the guilt associated with not being visibly active, and embrace what seems counter-intuitive?

Thinking and talking are useful elements of learning. And what we learn in our ‘play’ time can be valuable to our ‘work’ time, and there’s nothing wrong with moving between the two (or combining them) during our day.

When mastering a subject, our brains actually use different types of processing. Barbara Oakley explains in A Mind for Numbers: How to Excel at Math and Science (Even If You Flunked Algebra) that our brain has two general modes of thinking – ‘focused’ and ‘diffuse’ – and both of these are valuable and required in the learning process.

The focused mode is what we traditionally associate with learning. Read, dive deep, absorb. Eliminate distractions and get into the material. Oakley says “the focused mode involves a direct approach to solving problems using rational, sequential, analytical approaches. … Turn your attention to something and bam – the focused mode is on, like the tight, penetrating beam of a flashlight.”

But the focused mode is not the only one required for learning because we need time to process what we pick up, to get this new information integrated into our existing knowledge. We need time to make new connections. This is where the diffuse mode comes in.

Diffuse-mode thinking is what happens when you relax your attention and just let your mind wander. This relaxation can allow different areas of the brain to hook up and return valuable insights. … Diffuse-mode insights often flow from preliminary thinking that’s been done in the focused mode.

Relying solely on the focused mode to learn is a path to burnout. We need the diffuse mode to cement our ideas, put knowledge into memory, and free up space for the next round of focused thinking. We need the diffuse mode to build wisdom. So why does diffuse-mode thinking at work generally involve feelings of guilt?

Oakley’s recommendations for ‘diffuse-mode activators’ are: go to the gym, walk, play a sport, go for a drive, draw, take a bath, listen to music (especially without words), meditate, sleep. Um, aren’t these all things to do in my ‘play’ time? And sleep? It’s a whole time chunk on its own.

Most organizations do not promote a culture that allows these activities to be integrated into the work day. Go to the gym on your lunch break. Sleep at home. Meditate on a break. Essentially: do these things while we are not paying you.

We internalize this way of thinking, associating the value of getting paid with the value of executing our task list. If something doesn’t directly contribute, it’s not valuable. If it’s not valuable, I need to do it in my non-work time or not at all. This is learned behavior from our organizational culture, and it essentially communicates that our leaders would rather see us do less than trust in the potential payoff of pursuits that aren’t as visible, or ones that don’t pay off as quickly. The ability to see something is often a large component of trust. So if we are doing any of these ‘play’ activities at work, activities whose contribution to the learning process is invisible, we feel guilty because we don’t believe we are doing what we get paid to do.

If you aren’t the CEO or the VP of HR, you can’t conjure up a policy that says ‘all employees shall do something meaningful away from their desks each day and won’t be judged for it’. So what can you do to learn better at work? Find a way to let go of the guilt baggage when you invest in proven, effective learning techniques that are out of sync with your corporate culture.

How do you let go of the guilt? How do you not feel it every time you stand up to go for a walk, close your email and put on some headphones, or have a coffee with a colleague to discuss an idea you have? Because sometimes knowing you are doing the right thing doesn’t translate into feeling it, and that’s where guilt comes in.

Guilt is insidious. Not only do we usually feel guilt, but then we feel guilty about feeling guilty. Like, I go to visit my grandmother in her old age home mostly because I feel guilty about not going, and then I feel guilty because I’m primarily motivated by guilt! Like if I were a better person I would be doing it out of love, but I’m not, so that makes me terrible.

Breaking this cycle is hard. Like anything new, it’s going to feel unnatural for a while but it can be done.

How? Be kind to yourself.

This may sound a bit touchy-feely, but it is really just a cognitive-behavioral approach with a bit of mindfulness thrown in. Dennis Tirch has done a lot of research into the positive effects of self-compassion on worry, panic, and fear. And what is guilt but worry that you aren’t doing the right thing, fear that you’re not a good person, and panic about what to do about it?

In his book, The Compassionate-Mind Guide to Overcoming Anxiety, Tirch writes:

the compassion focused model is based on research showing that some of the ways in which we instinctively regulate our response to threats have evolved from the attachment system that operates between infant and mother and from other basic relationships between mutually supportive people. We have specific systems in our brains that are sensitive to the kindness of others, and the experience of this kindness has a major impact on the way we process these threats and the way we process anxiety in particular.

The Dalai Lama defines compassion as “a sensitivity to the suffering of others, with a commitment to do something about it,” and Tirch explains that we are also greatly affected by our compassion toward ourselves.

In order to manage and overcome emotions like guilt that can prevent us from learning and achieving, we need to treat ourselves the same way we would the person we love most in the world. “We can direct our attention to inner images that evoke feelings of kindness, understanding, and support,” writes Tirch.

So the next time you look up from that proposal on the new infrastructure schematics and see that the sun is shining, go for a walk, notice where you are, and give your mind a chance to go into diffuse-mode and process what you’ve been focusing on all morning. And give yourself a hug for doing it.

Language: Why We Hear More Than Words

It’s a classic complaint in relationships, especially romantic ones: “She said she was okay with me forgetting her birthday! Then why is she throwing dishes in the kitchen? Are the two things related? I wish I had a translator for my spouse. What is going on?”

The answer: the band Extreme was right, communication is more than words. It’s how those words are said: the tone, the order, even the choice of a particular word. It’s multidimensional.

In their book, Meaning and Relevance, Deirdre Wilson and Dan Sperber explore the aspects of communication that are beyond the definitions of the words that we speak but are still encoded in the words themselves.

Consider the following example:

Peter got angry and Mary left.

Mary left and Peter got angry.

We can instantly see that these two sentences, despite having exactly the same words, do not mean the same thing. The first one has us thinking, wow, Peter must get angry often if Mary leaves to avoid his behavior. Maybe she’s been the recipient of one too many tantrums and knows that there’s nothing she can do to defuse his mood. The second sentence suggests that Peter wants more from Mary. He might have a crush on her! Same words – totally different meaning.

Human language is not a code. True codes have a one-to-one relationship with meaning. One sound, one definition. This is what we see with animals.

Wilson and Sperber explain that “coded communication works best when emitter and receiver share exactly the same code. Any difference between the emitter’s and receiver’s codes is a possible source of error in the communication process.” For animals, any evolutionary mutation that affected the innate code would be counter-adaptive. A songbird one note off key is going to have trouble finding a mate.

Not so for humans. We communicate more than the definitions of our words would suggest. (Steven Pinker argues that language itself is an instinct, wired into us at the DNA level.) And we decode more than the words spoken to us. This is inferential communication, and it means that we understand not only the words spoken, but the context in which they are spoken. Unlike the languages of other animals, which are decidedly less ambiguous, human language requires a lot of subjective interpretation.

This is probably why we can land in a country where we don’t speak the language and can’t read the alphabet, yet get the gist of what the hotel receptionist is telling us. We can find our room, and know where the breakfast is served in the morning. We may not understand her words, but we can comprehend her tone and make inferences based on the context.

Wilson and Sperber argue that mutations in our inferential abilities do not negatively impact communication, and potentially even enhance it. Essentially, because human language is not simply a one-to-one code, and because more can be communicated than the exact definitions of our words, we can easily adapt to changes in communication and interpretation that may evolve in our communities.

For one thing, we can laugh at more than physical humor. Words can leave us in stitches. Depending on how they are conveyed (the tone, the timing, the expressions that come along with them), we can find otherwise totally innocuous words hysterical.

Remember Abbott and Costello?

“Who’s on first.”
“What?”
“No, What’s on second.”

Consider Irony

Irony is a great example of how powerfully we can communicate context with a few simple words.

I choose my words as indicators of a more complex thought that may include emotions, opinions, biases, and these words will help you infer this entire package. And one of my goals as the communicator is to make it as easy as possible for you to get the meaning I’m intending to convey.

Irony is more than just stating the opposite. There must be an expectation of that opposite in at least some of the population. And choosing irony is more of a commentary on that group. Wilson and Sperber argue that “what irony essentially communicates is neither the proposition literally expressed nor the opposite of that proposition, but an attitude to this proposition and to those who might hold or have held it.”

For example:

When Mary says, after a boring party, ‘That was fun’, she is neither asserting literally that the party was fun nor asserting ‘ironically’ that the party was boring. Rather, she is expressing an attitude of scorn towards (say) the general expectation among the guests that the party would be fun.

This is a pretty complex linguistic structure. It allows us to communicate our feelings on cultural norms fairly succinctly. Mary says ‘That was fun’. Three little words. And I understand that she hated the party, couldn’t wait to get out of there, feels distant from the other party-goers and is rejecting that whole social scene. Very powerful!

Irony works because it is efficient. To communicate the same information without irony involves more sentences. And my desire as a communicator is always to express myself in the most efficient way possible to my listener.

Wilson and Sperber conclude that human communication developed and became so powerful because of two unique cognitive abilities of humans: language and the power to attribute mental states to others. We look for context for the words we hear. And we are very proficient at absorbing this context to infer meaning.

The lesson? If you want to understand reality, don't be pedantic.

Friedrich Nietzsche on Making Something Worthwhile of Ourselves

Friedrich Nietzsche (1844-1900) explored many subjects; perhaps the most important was himself.

A member of our learning community directed me to the passage below, written by Richard Schacht in the introduction to Nietzsche: Human, All Too Human: A Book for Free Spirits.

If we are to make something worthwhile of ourselves, we have to take a good hard look at ourselves. And this, for Nietzsche, means many things. It means looking at ourselves in the light of everything we can learn about the world and ourselves from the natural sciences — most emphatically including evolutionary biology, physiology and even medical science. It also means looking at ourselves in the light of everything we can learn about human life from history, from the social sciences, from the study of arts, religions, literatures, mores and other features of various cultures. It further means attending to human conduct on different levels of human interaction, to the relation between what people say and seem to think about themselves and what they do, to their reactions in different sorts of situations, and to everything about them that affords clues to what makes them tick. All of this, and more, is what Nietzsche is up to in Human, All Too Human. He is at once developing and employing the various perspectival techniques that seem to him to be relevant to the understanding of what we have come to be and what we have it in us to become. This involves gathering materials for a reinterpretation and reassessment of human life, making tentative efforts along those lines and then trying them out on other human phenomena both to put them to the test and to see what further light can be shed by doing so.

Nietzsche realized that mental models were the key not only to understanding the world but to understanding ourselves. Understanding how the world works is the key to making more effective decisions and gaining insights. But it’s through the journey of discovering these ideas that we learn about ourselves. Most of us want to skip the work, so we skim the surface of not only knowledge but ourselves.

Memory and the Printing Press

You probably know that Gutenberg invented the printing press. You probably know it was pretty important. You may have heard some stuff about everyone being able to finally read the Bible without a priest handy. But here's a point you might not be familiar with: The printing press changed why, and consequently what, we remember.

Before the printing press, memory was the main store of human knowledge. Scholars had to travel to find books, often moving from one scriptorium to another. They couldn’t buy books. Individuals did not have libraries. The ability to remember was integral to the social accumulation of knowledge.

Thus, for centuries humans had built ways to remember out of pure necessity. Because knowledge wasn’t fixed in print, remembering content was the only way to access it. Things had to be known in a deep, accessible way, as Elizabeth Eisenstein argues in The Printing Press as an Agent of Change:

As learning by reading took on new importance, the role played by mnemonic aids was diminished. Rhyme and cadence were no longer required to preserve certain formulas and recipes. The nature of the collective memory was transformed.

In the Church, for example, Eisenstein describes a multimedia approach to remembering the Bible. As a manuscript, it was not widely available, not even to many church representatives; the stories of the Bible were often pictorially represented in the churches themselves. The use of images, both physical and mental, was critical to storing knowledge in memory: they served as tools for building extensive “memory palaces” that enabled the retention of knowledge.

Not only did printing eliminate many functions previously performed by stone figures over portals and stained glass in windows, but it also affected less tangible images by eliminating the need for placing figures and objects in imaginary niches located in memory theatres.

Thus, in an age before the printing press, bits of knowledge were associated with other bits of knowledge not because they complemented each other, or allowed for insights, but merely so they could be retained.

…the heavy reliance on memory training and speech arts, combined with the absence of uniform conventions for dating and placing [meant that] classical images were more likely to be placed in niches in ‘memory theatres’ than to be assigned a permanent location in a fixed past.

In our post on memory palaces, we used the analogy of a cow and a steak. To continue with that analogy, imagine that your partner asks you to pick up steak for dinner. To increase your chances of remembering the request, you envision a cow sitting on the front porch. When you mind-walk through your palace, you see this giant cow sitting there, perhaps waving at you (so unlike a cow!), causing you to think, ‘Why is that cow there–oh yeah, pick up steak for dinner’.

Before the printing press, it wasn’t just about picking up dinner. It was all of our knowledge. Euclid's Elements and Aristotle's Politics. The works of St. Augustine and Seneca. These works were most often shared orally, passing from memory to memory. Thus, in the age of scribes, memory was not so much about remembering as it was about preserving.

Consequently, knowledge was far less widely shared, and then only with those who could understand and recall it.

To be preserved intact, techniques had to be entrusted to a select group of initiates who were instructed not only in special skills but also in the ‘mysteries’ associated with them. Special symbols, rituals, and incantations performed the necessary function of organizing data, laying out schedules, and preserving techniques in easily memorized forms.

Anyone who's played the game “Telephone” knows the problem: As knowledge is passed on, over and over, it gets transformed, sometimes distorted. This needed to be guarded against, and sometimes couldn't be. As there was no accessible reference library for knowledge, older texts were prized because they were closer to the originals.

Not only could more be learned from retrieving an early manuscript than from procuring a recent copy but the finding of lost texts was the chief means of achieving a breakthrough in almost any field.

Almost incomprehensible today, “Energies were expended on the retrieval of ancient texts because they held the promise of finding so much that still seemed new and untried.” Only by finding older texts could scholars hope to discover the original, unaltered sources of knowledge.

With the advent of the printing press, images and words became something else. Because they were now repeatable, they became fixed. No longer individual interpretations designed for memory access, they became part of the collective.

The effects of this were significant.

Difficulties engendered by diverse Greek and Arabic expressions, by medieval Latin abbreviations, by confusion between Roman letters and numbers, by neologisms, copyists’ errors and the like were so successfully overcome that modern scholars are frequently absent-minded about the limitations on progress in the mathematical sciences which scribal procedures imposed. … By the seventeenth century, Nature’s language was being emancipated from the old confusion of tongues. Diverse names for flora and fauna became less confusing when placed beneath identical pictures. Constellations and landmasses could be located without recourse to uncertain etymologies, once placed on uniform maps and globes. … The development of neutral pictorial and mathematical vocabularies made possible a large-scale pooling of talents for analyzing data, and led to the eventual achievement of a consensus that cut across all the old frontiers.

A key component of this was that apprentices and new scholars could consult books and didn’t have to exclusively rely on the memories of their superiors.

An updated technical literature enabled young men in certain fields of study to circumvent master-disciple relationships and to surpass their elders at the same time. Isaac Newton was still in his twenties when he mastered available mathematical treatises, beginning with Euclid and ending with an updated edition of Descartes. In climbing ‘on the shoulders of giants’ he was not re-enacting the experience of twelfth-century scholars for whom the retrieval of Euclid’s theorems had been a major feat.

Before the printing press, a scholar could spend his lifetime looking for a copy of Euclid’s Elements and never find one, thus having to rely on how the text was encoded in the memories of the scholars he encountered.

After the printing press, memory became less critical to knowledge. Knowledge became more widely dispersed as the reliance on memory for interpretation and understanding diminished. And with that, the collective power of the human mind was multiplied.

If you liked this post, check out our series on memory, starting with the advantages of our faulty memory, and continuing to the first part on our memory's frequent errors.

Mozart’s Brain and the Fighter Pilot

Most of us want to be smarter but have no idea how to go about improving our mental apparatus. We intuitively think that if we raised our IQ a few points, we’d be better off intellectually. This isn’t necessarily the case. I know a lot of people with high IQs who make terribly stupid mistakes. The way around this is to improve not our IQ, but our overall cognition.

Cognition, argues Richard Restak, “refers to the ability of our brain to attend, identify, and act.” You can think of this as a melange of our moods, thoughts, decisions, inclinations and actions.

Included among the components of cognition are alertness, concentration, perceptual speed, learning, memory, problem solving, creativity, and mental endurance.

All of these components have two things in common. First, our efficacy at them depends on how well the brain is functioning relative to its capabilities. Second, this efficacy function can be improved with the right discipline and the right habits.

Restak convincingly argues that we can make our brains work better by “enhancing the components of cognition.” How we go about improving our brain performance, and thus cognition, is the subject of his book Mozart’s Brain and the Fighter Pilot.

Improving Our Cognitive Power

To improve the brain we need to exercise our cognitive powers. Most of us believe that physical exercise helps us feel better and live healthier; yet how many of us exercise our brain? As with our muscles and our bones, “the brain improves the more we challenge it.”

This is possible because the brain retains a high degree of plasticity; it changes in response to experience. If the experiences are rich and varied, the brain will develop a greater number of nerve cell connections. If the experiences are dull and infrequent, the connections will either never form or die off.

If we’re in stimulating and challenging environments, we increase the number of nerve cell connections. Our brain literally gets heavier, as the number of synapses (connections between neurons) increases. The key that many people miss here is “rich and varied.”

Memory is the most important cognitive function. Imagine if you lost your memory permanently: Would you still be you?

“We are,” Restak writes, “what we remember.” And poor memories are not limited to those who suffer from Alzheimer's disease. While some of us are genetically endowed with superlative memories, the rest of us need not fear.

In a short book on memory, Aristotle suggested that our mind was a wax tablet, arguing that the passage of time fades the image unless we take steps to preserve it. He was right in ways he never knew; memory researchers now know that, like a wax tablet, our memory changes every time we access it, due to the plasticity Restak refers to. It can also be molded and improved, at least to a degree.

Long ago, the Greeks hit upon the same idea — mostly starting with Plato — that we don’t have to accept our natural memory. We can take steps to improve it.

Learning and Knowledge Acquisition

When we learn something new, we expand the complexity of our brain. We literally increase our brainpower.

[I]ncrease your memory and you increase your basic intelligence. … An increased memory leads to easier, quicker accessing of information, as well as greater opportunities for linkages and associations. And, basically, you are what you can remember.

Too many of us can’t remember these days, because we’ve outsourced our brains. One of the most common complaints at the neurologist's office from people over forty is poor memory. Luckily, most of these people do not suffer from anything neurological, but rather from the cumulative effect of disuse — a graceful degradation of their memory.

Those who are not depressed (the commonest cause of subjective complaints of memory impairment) are simply experiencing the cumulative effect of decades of memory disuse. Part of this disuse is cultural. Most businesses and occupations seldom demand that their employees recite facts and figures purely from memory. In addition, in some quarters memory is even held in contempt. ‘He’s just parroting a lot of information he doesn’t really understand’ is a common put-down when people are enviously criticizing someone with a powerful memory. Of course, on some occasions, such criticisms are justified, particularly when brute recall occurs in the absence of understanding or context. But I’m not advocating brute recall. I’m suggesting that, starting now, you aim for a superpowered memory, a memory aimed at quicker, more accurate retrieval of information.

Prior to the printing press, we had to use our memories. Epics such as The Odyssey and The Iliad were recited word-for-word. Today, however, we live in a different world, and we forget that such feats were even possible. Information is everywhere. Thanks to technology, we need not remember much of anything. This both helps and hinders the development of our memory.

[Y]ou should think of the technology of pens, paper, tape recorders, computers, and electronic diaries as an extension of the brain. Thanks to these aids, we can carry incredible amounts of information around with us. While this increase in readily available information is generally beneficial, there is also a downside. The storage and rapid retrieval of information from a computer also exerts a stunting effect on our brain’s memory capacities. But we can overcome this by working to improve our memory by aiming at the development and maintenance of a superpowered memory. In the process of improving our powers of recall, we will strengthen our brain circuits, starting at the hippocampus and extending to every other part of our brain.

Information is only as valuable as what it connects to. Echoing the latticework of mental models, Restak states:

Everything that we learn is stored in the brain within that vast, interlinking network. And everything within that network is potentially connected to everything else.

From this we can draw a reasonable conclusion: if you stop learning, mental capacity declines.

That’s because of the weakening and eventual loss of brain networks. Such brain alterations don’t take place overnight, of course. But over a varying period of time, depending on your previous training and natural abilities, you’ll notice a gradual but steady decrease in your powers if you don’t nourish and enhance these networks.

The Better Network: Your Brain or the Internet

Networking is a fundamental operating principle of the human brain. All knowledge within the brain is based on networking. Thus, any one piece of information can be potentially linked with any other. Indeed, creativity can be thought of as the formation of novel and original linkages.

In his book, Weaving the Web: The Original Design and the Ultimate Destiny of the World Wide Web, Tim Berners-Lee, the creator of the World Wide Web, distills the importance of the brain forming connections.

A piece of information is really defined only by what it’s related to, and how it’s related. There really is little else to meaning. The structure is everything. There are billions of neurons in our brains, but what are neurons? Just cells. The brain has no knowledge until connections are made between neurons. All that we know, all that we are, comes from the way our neurons are connected.

Cognitive researchers now accept that it may not be the size of the human brain that gives it such unique abilities — other animals have large brains as well. Rather, it’s the structure: the way our neurons are arranged and linked.

The more you learn, the more you can link. The more you can link, the more you increase the brain's capacity. And the more you increase the capacity of your brain the better able you’ll be to solve problems and make decisions quickly and correctly. This is real brainpower.

Multidisciplinary Learning

Restak argues that a basic insight about knowledge and intelligence is: “The existence of certain patterns, which underlie the diversity of the world around us and include our own thoughts, feelings, and behaviors.”

Intelligence enhancement therefore involves creating as many neuronal linkages as possible. But in order to do this we have to extricate ourselves from the confining and limiting idea that knowledge can be broken down into separate “disciplines” that bear little relation to one another.

This brings the entire range of ideas into play, rather than just silos of knowledge from human-created specialities. Charlie Munger and Richard Feynman would probably agree that such over-specialization can be quite limiting. As the old proverb goes, the frog in the well knows nothing of the ocean.

Charles Cameron, a game theorist, adds to this conversation:

The entire range of ideas can legitimately be brought into play: and this means not only that ideas from different disciplines can be juxtaposed, but also that ideas expressed in ‘languages’ as diverse as music, painting, sculpture, dance, mathematics and philosophy can be juxtaposed, without first being ‘translated’ into a common language.

Mozart's Brain and the Fighter Pilot goes on to provide 28 suggestions and exercises for enhancing your brain's performance, a few of which we’ll cover in future posts.

The Self-Education of Louis L’Amour


“That was Louis’s way – to find something of value from every printed page.”
— Daniel Boorstin

***

The author Louis L’Amour (1908–1988) was among America’s most prolific and most beloved writers. He wrote 105 books, most of them fiction, and at his death in 1988 all of them were still in print. Most still are today. (His prolific output recalls another great American author, Isaac Asimov.)

Two things drove L’Amour: adventure and a deep need for self-education. In his memoir, The Education of a Wandering Man, he makes it clear that the two went hand in hand. His travels were his way of learning by direct experience, but he augmented that with a voracious appetite for the vicarious learning that comes through reading.

Writing in the late 1980s, L’Amour describes his love of the written word, a pursuit he undertook at all costs:

Today you can buy the Dialogues of Plato for less than you would spend on a fifth of whisky, or Gibbon’s Decline and Fall of the Roman Empire for the price of a cheap shirt. You can buy a fair beginning of an education in any bookstore with a good stock of paperback books for less than you would spend on a week’s supply of gasoline.

Often I hear people say they do not have the time to read. That’s absolute nonsense. In one year during which I kept that kind of record, I read twenty-five books while waiting for people. In offices, applying for jobs, waiting to see a dentist, waiting in a restaurant for friends, many such places. I read on buses, trains and planes. If one really wants to learn, one has to decide what is important. Spending an evening on the town? Attending a ball game? Or learning something that can be with you your life long?

Byron’s Don Juan I read on an Arab dhow sailing north from Aden up the Red Sea to Port Tewfik on the Suez Canal. Boswell’s Life of Samuel Johnson I read while broke and on the beach in San Pedro. In Singapore, I came upon a copy of The Annals and Antiquities of Rajahstan by James Tod.

Many of us think we don’t have the time or the inclination to keep learning, but to L’Amour this was a ridiculous idea. If he didn’t educate himself, who else would do the job? In this sense, all education is self-education.

No man or woman has a greater appreciation for schools than I, although few have spent less time in them. No matter how much I admire our schools, I know that no university exists that can provide an education; what a university can provide is an outline, to give the learner a direction and guidance. The rest one has to do for oneself.

What is the point of an education? Steven Pinker would define it more precisely years later, but to L’Amour it was pretty simple, and closely aligned with our ethos at Farnam Street: To enable one to live a better life.

Education should provide the tools for a widening and deepening of life, for increased appreciation of all one sees or experiences. It should equip a person to live life well, to understand what is happening about him, for to live life well one must live with awareness.

L’Amour was clearly a proponent of direct life experience, and he had more than most. As his memoir details, his young life saw him take on the role of a traveling hobo, sailor, amateur boxer, miner, and ranch hand, jobs that took him all around the world in search of work and adventure.

But throughout, L’Amour knew that his destiny was to become a storyteller, and he also knew that to avoid a lot of misery in life would require a massive amount of experience he couldn’t obtain directly.

So he did it through books.

It is often said that one has but one life to live, but that is nonsense. For one who reads, there is no limit to the number of lives that may be lived, for fiction, biography, and history offer an inexhaustible number of lives in many parts of the world, in all periods of time.

So it was with me. I saved myself much hardship by learning from the experiences of others, learning what to expect and what to avoid. I have no doubt that my vicarious experience saved me from mistakes I might otherwise have made—not to say I did not make many along the way.

Although he didn’t set out to learn for this reason, L’Amour also discovered an important lesson in associative pattern-matching and creativity: The brain needs to be stocked full to make interesting and useful connections.

A love of learning for its own sake creates a massive ancillary benefit. What L’Amour says about writers goes for all of us, in any profession:

I have never had to strive to graduate, never to earn a degree. The only degrees I have are honorary, and I am proud to have them. I studied purely for the love of learning, wanting to know and understand. For a writer, of course, everything is grist for the mill, and a writer cannot know too much. Sooner or later everything he does know will find its uses.

A writer’s brain is like a magician’s hat. If you’re going to get anything out of it, you have to put something in first.

I have studied a thousand things I never expected to find use in a story, yet every once in a while these things will find a place.

People who read a lot, people like L’Amour, are often asked what should be read. Is there some program or direction to take?

The answer we give at Farnam Street and the answer L’Amour gave are about the same: You must follow your passions, follow your curiosities. Why does this work? Nassim Taleb once hit it on the head by saying that “Curiosity is antifragile, like an addiction it is magnified by attempts to satisfy it.”

Down the line, as those curiosities are pursued, the course tends to become quite clear. Forcing yourself through some difficult program of study is not the way to get your engines going.

Says L’Amour:

For those who have not been readers, my advice is to read what entertains you. Reading is fun. Reading is adventure. It is not important what you read at first, only that you read.

Many would advise the great books first, but often readers are not prepared for them. If you want to study the country from which you came, there are atlases with maps and there are good books on all countries, books of history, of travel, of current affairs.

Our libraries are not cloisters for an elite. They are for the people, and if they are not used, the fault belongs to those who do not take advantage of their wealth. If one does not move on from what merely amuses to what interests, the fault lies in the reader, for everything is there.

One mistake made by would-be learners is to think that they need guidance or permission to learn: that they must take a class on Shakespeare to enjoy Shakespeare, or a guided tour of the classics in order to enjoy those.

The great works of the world are there to be enjoyed by all. (Of course, we have some recommendations for how to read books in general.) But as L’Amour advises, you must learn and read what you like, unless there is an important extenuating circumstance. Boredom creates a shut-off valve in the brain. And if you’re always reading something of even moderate depth, you simply can’t avoid learning. A continually curious mind ends up at the classics one way or another anyway.

In the end, in a thought later echoed by the technology great Andrew Ng, L'Amour believed the human mind was capable of incredible creativity, perhaps beyond what we currently believe:

Personally, I do not believe the human mind has any limits but those we impose ourselves. I believe that creativity and inventiveness are there for anybody willing to apply himself. I do not believe that man has even begun to realize who he is or what he can become. So far he has been playing it by ear, following paths of least resistance, getting by — because most others were just getting by too. I believe that man has been living in a Neanderthal state of mind. Mentally, we are still flaking rocks for scraping stones or chipping them for arrowheads. […]

We simply must free the mind from its fetters and permit it to function without restraint. Many of us have learned to supply ourselves with the raw materials and then allow the subconscious to take over. This is what creativity is. One must condition oneself for the process and then let it proceed.

If you liked this post, you might like these too:

Schopenhauer on Reading and Books – One of the most timeless and beautiful meditations on reading comes from the 19th-century German philosopher Arthur Schopenhauer.

Reading a Book is a Conversation Between You and the Author – Full ownership of a book only comes when you have made it a part of yourself, and the best way to make yourself a part of it— which comes to the same thing— is by writing in it.