Tag: William Deresiewicz

Multitasking: Giving the World an Advantage it Shouldn’t Have


Echoing the comments of William Deresiewicz, Charlie Munger offers some sage advice on multitasking:

I will say this, I know no wise person who doesn’t read a lot. I suspect that you can read on the computer now and get a lot of benefit out of it, but I doubt it will work as well as reading print worked for me.

I think people that multitask pay a huge price. They think they’re being extra productive, and I think they’re (out of their mind). I use the metaphor of the one-legged man in the ass-kicking contest.

I think when you multi-task so much, you don’t have time to think about anything deeply. You’re giving the world an advantage you shouldn’t do. Practically everybody is drifting into that mistake.

Concentrating hard on something that is important is … I can’t succeed at all without doing it. I did not succeed in life by intelligence. I succeeded because I have a long attention span.

It sounds counter-intuitive, but if you want to increase discretionary time and reduce stress, you need to schedule time to think. The tiny fragments of time many of us find ourselves with have a negative effect on our ability to think deeply about a problem. Furthermore, they impede our ability to learn — we stay at a surface level and never move into a deep understanding.

Deresiewicz warns: “You simply cannot (think) in bursts of 20 seconds at a time, constantly interrupted by Facebook messages or Twitter tweets, or fiddling with your iPod, or watching something on YouTube.”

The opposite approach is to focus on a problem or subject and try to achieve a deep fluency. How many of us, however, have time? We don't do the work required to have an opinion. Instead we operate with surface knowledge. We tackle problems with the first thought that comes to mind. Because we make a poor initial decision, we spend countless hours attempting to correct it. No wonder we have no time to think. We're not heeding Joseph Tussman's advice to let the world do the work for us.

We sound good and yet we fail to learn — in part because everyone else is doing the same thing. Well, when you do what everyone else does, don't be surprised when you get the same results everyone else gets.

If you want to get off the same track that everyone else is on, start scheduling time to think. That's what Munger did when he sold himself the best hour of his day. Structure your environment in a way that promotes thinking and reduces interruption. And match your energy to your task.

Steven Pinker on What a Broad Education Should Entail

Harvard's great cognitive psychologist and linguist Steven Pinker is one of my favorites, even though I'm just starting to get into his work.

What makes him great is not just his rational mind, but his multidisciplinary approach. He pulls from many fields to make his (generally very good) arguments. And he's a rigorous scientist in his own field, even before we get to his ability to synthesize.

I first encountered Pinker in reading Poor Charlie's Almanack: Charlie Munger gives him the edge over Noam Chomsky and others in the debate over whether the capacity for language has been “built into” our DNA through natural selection. Pinker wrote the bestseller The Language Instinct, in which he argued that the capacity for complex language is innate. We develop it, of course, throughout our lives, but it's in our genes from the beginning (an idea that has since been criticized).

Pinker went on to write books with modest titles like How the Mind Works, The Blank Slate: The Modern Denial of Human Nature, and The Better Angels of Our Nature: Why Violence Has Declined. The latter is a controversial one: Bill Gates loves it, Nassim Taleb hates it. You'll have to make up your own mind.


The reason for writing about Pinker is that, while re-reading William Deresiewicz's brilliant speech Solitude and Leadership, I noticed that Deresiewicz had also written an extremely popular piece about not sending your kids to Ivy League schools. It's an interesting argument, though I'm not sure I agree with all of it.

A little Googling told me that Pinker, himself a professor at an Ivy League school, responded with an even better piece on why Deresiewicz was imprecise in his criticisms and anecdotes.

I was fascinated most by Pinker's discussion of what an elite education should entail. This tells you a lot about his mind:

This leads to Deresiewicz’s second goal, “building a self,” which he explicates as follows: “it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul.” Perhaps I am emblematic of everything that is wrong with elite American education, but I have no idea how to get my students to build a self or become a soul. It isn’t taught in graduate school, and in the hundreds of faculty appointments and promotions I have participated in, we’ve never evaluated a candidate on how well he or she could accomplish it. I submit that if “building a self” is the goal of a university education, you’re going to be reading anguished articles about how the universities are failing at it for a long, long time.

I think we can be more specific. It seems to me that educated people should know something about the 13-billion-year prehistory of our species and the basic laws governing the physical and living world, including our bodies and brains. They should grasp the timeline of human history from the dawn of agriculture to the present. They should be exposed to the diversity of human cultures, and the major systems of belief and value with which they have made sense of their lives. They should know about the formative events in human history, including the blunders we can hope not to repeat. They should understand the principles behind democratic governance and the rule of law. They should know how to appreciate works of fiction and art as sources of aesthetic pleasure and as impetuses to reflect on the human condition.

On top of this knowledge, a liberal education should make certain habits of rationality second nature. Educated people should be able to express complex ideas in clear writing and speech. They should appreciate that objective knowledge is a precious commodity, and know how to distinguish vetted fact from superstition, rumor, and unexamined conventional wisdom. They should know how to reason logically and statistically, avoiding the fallacies and biases to which the untutored human mind is vulnerable. They should think causally rather than magically, and know what it takes to distinguish causation from correlation and coincidence. They should be acutely aware of human fallibility, most notably their own, and appreciate that people who disagree with them are not stupid or evil. Accordingly, they should appreciate the value of trying to change minds by persuasion rather than intimidation or demagoguery.

I believe (and believe I can persuade you) that the more deeply a society cultivates this knowledge and mindset, the more it will flourish. The conviction that they are teachable gets me out of bed in the morning. Laying the foundations in just four years is a formidable challenge. If on top of all this, students want to build a self, they can do it on their own time.

If this seems familiar to some of you, that's because it very closely parallels thoughts by Charlie Munger, who has argued many times for something similar in his demand for multidisciplinary worldly wisdom. We must learn the big ideas from the big disciplines. Notice the buckets Pinker talks about: 13 billion years of organic and inorganic history, 10,000 years of human culture, hundreds of years of modern civilization. These are the most reliable forms of wisdom.

So if the education system won't do it for you, the job must be done anyway. Pinker and Munger have laid out the kinds of things you want to go about learning. Don't let the education system keep you from having a real education. Learn how to think. Figure out how to spend more time reading. When you do, focus on the most basic and essential wisdom — including the lessons from history.

Of course, if you're reading Farnam Street, you're already on the right track.


William Deresiewicz: How To Learn How To Think

“I’ve spent my life trying to undo habits—especially habits of thinking. They narrow your interaction with the world. They’re the phrases that come easily to your mind, like: ‘I know what I think,’ or ‘I know what I like,’ or ‘I know what’s going to happen today.’ If you just replace ‘know’ with ‘don’t know,’ then you start to move into the unknown. And that’s where the interesting stuff happens.”  — Humans of New York


I've read Solitude and Leadership, an essay by William Deresiewicz, before. In fact, I even pointed out some of its leadership lessons. However, after a friend prompted me to revisit the very same essay, I realized that I had missed a key part.

How do you learn to think?

Let’s start with how you don’t learn to think. A study by a team of researchers at Stanford came out a couple of months ago. The investigators wanted to figure out how today’s college students were able to multitask so much more effectively than adults. How do they manage to do it, the researchers asked? The answer, they discovered—and this is by no means what they expected—is that they don’t. The enhanced cognitive abilities the investigators expected to find, the mental faculties that enable people to multitask effectively, were simply not there. In other words, people do not multitask effectively. And here’s the really surprising finding: the more people multitask, the worse they are, not just at other mental abilities, but at multitasking itself.

One thing that made the study different from others is that the researchers didn’t test people’s cognitive functions while they were multitasking. They separated the subject group into high multitaskers and low multitaskers and used a different set of tests to measure the kinds of cognitive abilities involved in multitasking. They found that in every case the high multitaskers scored worse. They were worse at distinguishing between relevant and irrelevant information and ignoring the latter. In other words, they were more distractible. They were worse at what you might call “mental filing”: keeping information in the right conceptual boxes and being able to retrieve it quickly. In other words, their minds were more disorganized. And they were even worse at the very thing that defines multitasking itself: switching between tasks.

Multitasking, in short, is not only not thinking, it impairs your ability to think. Thinking means concentrating on one thing long enough to develop an idea about it. Not learning other people’s ideas, or memorizing a body of information, however much those may sometimes be useful. Developing your own ideas. In short, thinking for yourself. You simply cannot do that in bursts of 20 seconds at a time, constantly interrupted by Facebook messages or Twitter tweets, or fiddling with your iPod, or watching something on YouTube.

I find for myself that my first thought is never my best thought. My first thought is always someone else’s; it’s always what I’ve already heard about the subject, always the conventional wisdom. It’s only by concentrating, sticking to the question, being patient, letting all the parts of my mind come into play, that I arrive at an original idea. By giving my brain a chance to make associations, draw connections, take me by surprise. And often even that idea doesn’t turn out to be very good. I need time to think about it, too, to make mistakes and recognize them, to make false starts and correct them, to outlast my impulses, to defeat my desire to declare the job done and move on to the next thing.

I used to have students who bragged to me about how fast they wrote their papers. I would tell them that the great German novelist Thomas Mann said that a writer is someone for whom writing is more difficult than it is for other people. The best writers write much more slowly than everyone else, and the better they are, the slower they write. James Joyce wrote Ulysses, the greatest novel of the 20th century, at the rate of about a hundred words a day—half the length of the selection I read you earlier from Heart of Darkness—for seven years. T. S. Eliot, one of the greatest poets our country has ever produced, wrote about 150 pages of poetry over the course of his entire 25-year career. That’s half a page a month. So it is with any other form of thought. You do your best thinking by slowing down and concentrating.


And there you have it. An argument to spend more time thinking.


William Deresiewicz with an insightful article in The American Scholar arguing that we've fallen into the trap of scientism: the belief that science is the only valid form of knowledge.

Reading fiction increases our ability to empathize with others? Did we really need science to tell us that? Apparently, we need science to tell us everything. ….

In The Prisoner of Sex, Norman Mailer wrote that he was “sufficiently intimate with magazine readers to know the age of technology had left them with an inability to respect writing which lacked the authority of statistics.” I don’t know about readers, but I do know about editors, and most of them don’t like it when you rest your argument on literary sources. They want numbers, studies, sociology. Aristotle, Montaigne, and Emerson are not valid authorities on the topic, say, of friendship, but a study of 50 college students is enough to convince an editor of anything.

Oh, those studies. They always have a lot of data, but they so often miss the point. Their focus is too narrow, or they ignore the important factors, or they fail to grasp the underlying questions. They’re either jaw-droppingly obvious or head-clutchingly misguided. Science is bad enough, where it doesn’t belong, but the social sciences are even worse, precisely because they pretend to scientific rigor. As Allan Bloom pointed out, when the social sciences committed themselves to the principle of measurement, they gave up the ability to talk about anything that can’t be measured.

Still curious? Reading fiction is good for you.

William Deresiewicz on Learning To Lead and the Ills of Exposing Yourself to a Constant Stream of Other People’s Thoughts

William Deresiewicz delivered a stunning lecture on Solitude and Leadership to the United States Military Academy at West Point.

In the lecture, Deresiewicz convincingly argues that

  1. We don't teach leadership;
  2. Excellence doesn't get you up the greasy pole of bureaucracy;
  3. We constantly bombard ourselves with the opinions of others; and
  4. Leaders need to spend some time alone with their thoughts and ideas so they know why and where they are leading.

While it contradicts a lot of today's common practice, it's also an antidote to many of our ills.

Here are two parts to whet your appetite.

My title must seem like a contradiction. What can solitude have to do with leadership? Solitude means being alone, and leadership necessitates the presence of others — the people you're leading. When we think about leadership in American history we are likely to think of Washington, at the head of an army, or Lincoln, at the head of a nation, or King, at the head of a movement — people with multitudes behind them, looking to them for direction. And when we think of solitude, we are apt to think of Thoreau, a man alone in the woods, keeping a journal and communing with nature in silence.

Leadership is what you are here to learn — the qualities of character and mind that will make you fit to command a platoon, and beyond that, perhaps, a company, a battalion, or, if you leave the military, a corporation, a foundation, a department of government. Solitude is what you have the least of here, especially as plebes. You don't even have privacy, the opportunity simply to be physically alone, never mind solitude, the ability to be alone with your thoughts. And yet I submit to you that solitude is one of the most important necessities of true leadership. This lecture will be an attempt to explain why.


The very rigor and regimentation to which you are quite properly subject here naturally has a tendency to make you lose touch with the passion that brought you here in the first place. I saw exactly the same kind of thing at Yale. It’s not that my students were robots. Quite the reverse. They were intensely idealistic, but the overwhelming weight of their practical responsibilities, all of those hoops they had to jump through, often made them lose sight of what those ideals were. Why they were doing it all in the first place.

… Here’s the other problem with Facebook and Twitter and even The New York Times. When you expose yourself to those things, especially in the constant way that people do now—older people as well as younger people—you are continuously bombarding yourself with a stream of other people’s thoughts.


If you liked this, you'll love:

Learning how to think — The journey of learning requires patience, concentration, and most importantly time for thinking.