Charles Dickens to The Times — I Stand Astounded and Appalled


On November 13, 1849, a crowd of over 30,000 people gathered outside a prison in South London to witness the public execution of Marie and Frederick Manning. Marie and Frederick, a married couple, had recently murdered Marie’s wealthy former lover, Patrick O’Connor. Given that this was the first married couple to be hanged together in over a century, the publicity was intense, and the event became known as “The hanging of the century.” It also attracted the pen of Charles Dickens, who shared his opinion with The Times and its readers.

Devonshire Terrace,
Tuesday, Thirteenth November, 1849

Sir,
I was a witness of the execution at Horsemonger Lane this morning. I went there with the intention of observing the crowd gathered to behold it, and I had excellent opportunities of doing so, at intervals all through the night, and continuously from daybreak until after the spectacle was over. I do not address you on the subject with any intention of discussing the abstract question of capital punishment, or any of the arguments of its opponents or advocates. I simply wish to turn this dreadful experience to some account for the general good, by taking the readiest and most public means of adverting to an intimation given by Sir G. Grey in the last session of Parliament, that the Government might be induced to give its support to a measure making the infliction of capital punishment a private solemnity within the prison walls (with such guarantees for the last sentence of the law being inexorably and surely administered as should be satisfactory to the public at large), and of most earnestly beseeching Sir G. Grey, as a solemn duty which he owes to society, and a responsibility which he cannot for ever put away, to originate such a legislative change himself. I believe that a sight so inconceivably awful as the wickedness and levity of the immense crowd collected at that execution this morning could be imagined by no man, and could be presented in no heathen land under the sun. The horrors of the gibbet and of the crime which brought the wretched murderers to it faded in my mind before the atrocious bearing, looks, and language of the assembled spectators. When I came upon the scene at midnight, the shrillness of the cries and howls that were raised from time to time, denoting that they came from a concourse of boys and girls already assembled in the best places, made my blood run cold. As the night went on, screeching, and laughing, and yelling in strong chorus of parodies on negro melodies, with substitutions of “Mrs. Manning” for “Susannah” and the like, were added to these. When the day dawned, thieves, low prostitutes, ruffians, and vagabonds of every kind, flocked on to the ground, with every variety of offensive and foul behaviour. Fightings, faintings, whistlings, imitations of Punch, brutal jokes, tumultuous demonstrations of indecent delight when swooning women were dragged out of the crowd by the police, with their dresses disordered, gave a new zest to the general entertainment. When the sun rose brightly— as it did— it gilded thousands upon thousands of upturned faces, so inexpressibly odious in their brutal mirth or callousness, that a man had cause to feel ashamed of the shape he wore, and to shrink from himself, as fashioned in the image of the Devil. When the two miserable creatures who attracted all this ghastly sight about them were turned quivering into the air, there was no more emotion, no more pity, no more thought that two immortal souls had gone to judgement, no more restraint in any of the previous obscenities, than if the name of Christ had never been heard in this world, and there were no belief among men but that they perished like the beasts.

I have seen, habitually, some of the worst sources of general contamination and corruption in this country, and I think there are not many phases of London life that could surprise me. I am solemnly convinced that nothing that ingenuity could devise to be done in this city, in the same compass of time, could work such ruin as one public execution, and I stand astounded and appalled by the wickedness it exhibits. I do not believe that any community can prosper where such a scene of horror and demoralization as was enacted this morning outside Horsemonger Lane Gaol is presented at the very doors of good citizens, and is passed by, unknown or forgotten. And when in our prayers and thanksgivings for the season we are humbly expressing before God our desire to remove the moral evils of the land, I would ask your readers to consider whether it is not a time to think of this one, and to root it out.

I am, Sir, your faithful Servant.
Charles Dickens

This letter and many others can be found in Letters of Note: An Eclectic Collection of Correspondence Deserving of a Wider Audience.

Elon Musk Recommends 12 Books

The best thing about Elon Musk is that he makes us dream big again. Musk, of course, is the billionaire behind Tesla and SpaceX.

Charlie Munger was asked about him at the 2014 Daily Journal Meeting, and he replied:

I think Elon Musk is a genius, and I don’t use that word lightly. I think he’s also one of the boldest men that ever came down the pike.

Whenever anyone asks him how he learned to build rockets, he says, ‘I read books.’ Not only does he read them, according to his interview with Esquire, he devours them. After meeting Musk, people tend to walk away with the same reaction: ‘He’s the smartest guy I’ve ever met.’

Not to be outdone by his friend and PayPal co-founder Peter Thiel, who offered reading recommendations of his own, Musk shared a few books that influenced him.

In an interview with Design and Architecture, Musk said, “In terms of sci-fi books, I think Isaac Asimov is really great. I like the Foundation series, probably one of the all-time best. Robert Heinlein, obviously. I like The Moon Is a Harsh Mistress and I like Stranger in a Strange Land, although it kind of goes off the rails at the end.” He continued: “There’s a good book on structural design called Structures: Or Why Things Don’t Fall Down. It is really, really good if you want a primer on structural design.”

Here are some of his other reading recommendations.

The Lord of the Rings by J.R.R. Tolkien
He told the New Yorker that as an “undersized and picked upon smart-aleck,” he turned to reading fantasy and science fiction. “The heroes of the books I read, ‘The Lord of the Rings’ and the ‘Foundation’ series, always felt a duty to save the world.”

Benjamin Franklin: An American Life by Walter Isaacson
“He was an entrepreneur,” Musk says in an interview. “He started from nothing. He was just a runaway kid.”

In that same interview he recommends Einstein: His Life and Universe, also by Isaacson.

Zero to One: Notes on Startups, or How to Build the Future by Peter Thiel
I’ve already said this is required reading for Farnam Streeters. Of this book Musk says: “Peter Thiel has built multiple breakthrough companies, and (this book) shows how.”

Superintelligence: Paths, Dangers, Strategies by Nick Bostrom
“Worth reading Superintelligence by Bostrom. We need to be super careful with AI. Potentially more dangerous than nukes,” he tweeted. Of course I bought this.

Howard Hughes: His Life and Madness by Donald L. Barlett and James B. Steele
Recently, in an interview with CNN, he mentioned having just finished this book. Musk calls it a “cautionary tale.”

The Hitchhiker’s Guide to the Galaxy by Douglas Adams
Here is an excerpt from an interview where he explains why this was a key book for him:

Alison van Diggelen: I understand Hitchhikers Guide to the Galaxy, that wonderful book by Douglas Adams, that was a key book for you. What was it about that book that fired your imagination?

Elon Musk: I guess when I was around 12 or 15 … I had an existential crisis, and I was reading various books on trying to figure out the meaning of life and what does it all mean? It all seemed quite meaningless and then we happened to have some books by Nietzsche and Schopenhauer in the house, which you should not read at age 14 (laughter). It is bad, it’s really negative. So then I read Hitchhikers Guide to the Galaxy which is quite positive I think and it highlighted an important point which is that a lot of times the question is harder than the answer. And if you can properly phrase the question, then the answer is the easy part. So, to the degree that we can better understand the universe, then we can better know what questions to ask. Then whatever the question is that most approximates: what’s the meaning of life? That’s the question we can ultimately get closer to understanding. And so I thought to the degree that we can expand the scope and scale of consciousness and knowledge, then that would be a good thing.

Finally we get to the rocket science part.

Ignition!: An Informal History of Liquid Rocket Propellants by John D. Clark
“There is a good book on rocket stuff called ‘Ignition!’ by John Clark that’s a really fun one,” Musk said in an interview. Becoming a rocket scientist isn’t cheap: this recommendation from Musk will set you back about $3,000 for a used copy (though it’s also available free on the web).

(Additional Sources: Business Insider and favobooks)

Andy Warhol on Loneliness

"As soon as you stop waning something you get it. I’ve found that to be absolutely axiomatic."
“As soon as you stop wanting something you get it. I’ve found that to be absolutely axiomatic.”

In his pseudo memoir, The Philosophy of Andy Warhol (From A to B and Back Again), which is more a collection of his thoughts on various subjects, Andy Warhol writes about the paradox of getting what you don’t want.

I had an incredible number of roommates. To this day almost every night I go out in New York I run into somebody I used to room with who invariably explains to my date, “I used to live with Andy.” I always turn white—I mean whiter. After the same scene happens a few times, my date can’t figure out how I could have lived with so many people, especially since they only know me as the loner I am today. Now, people who imagine me as the 60s media partygoer who traditionally arrived at parties with a minimum six-person “retinue” may wonder how I dare to call myself a “loner,” so let me explain how I really mean that and why it’s true. At the time in my life when I was feeling the most gregarious and looking for bosom friendships, I couldn’t find any takers, so that exactly when I was alone was when I felt the most like not being alone. The moment I decided I’d rather be alone and not have anyone telling me their problems, everybody I’d never even seen before in my life started running after me to tell me things I’d just decided I didn’t think it was a good idea to hear about. As soon as I became a loner in my own mind, that’s when I got what you might call a “following.”

As soon as you stop wanting something you get it. I’ve found that to be absolutely axiomatic.

The Philosophy of Andy Warhol is an examination of things important to him—love, beauty, art, fame, and business.

Tiny Beautiful Things

On March 11, 2010, a new writer took over “Dear Sugar,” an advice column on the website The Rumpus.

She claimed she would offer a combination of “the by-the-book common sense of Dear Abby and the earnest spiritual cheesiness of Cary Tennis and the butt-pluggy irreverence of Dan Savage and the closeted Upper East Side nymphomania of Miss Manners.”

It became clear after a while that she was an advice columnist unlike others: intimate and frank, dispensing advice built on a foundation of deep personal experience.

Slowly over the next two years, we learned a little more about her until eventually Sugar formally introduced herself as Cheryl Strayed. Strayed is the author behind the book Wild: From Lost to Found on the Pacific Crest Trail. I remember reading this book cover-to-cover on a flight. When the pilot announced that we’d be circling Heathrow for 20 minutes, I was the only one happy. I only had a few pages left.

In a way, Sugar’s advice columns — combined into the amazing collection Tiny Beautiful Things: Advice on Love and Life from Dear Sugar — represent an ad hoc memoir.

“But it’s a memoir with an agenda,” Strayed’s friend Steve Almond writes in the introduction, “With great patience, and eloquence, (Sugar) assures her readers that within the chaos of our shame and disappointment and rage there is meaning, and within that meaning is the possibility of rescue.”

Inexplicable sorrows await all of us. … Life isn’t some narcissistic game you play online. It all matters— every sin, every regret, every affliction.

One of my favorite letters, the one from which the book takes its title, comes in response to this question.

Dear Sugar,

I read your column religiously. I’m twenty-two. From what I can tell by your writing, you’re in your early forties. My question is short and sweet: What would you tell your twentysomething self if you could talk to her now?

Love, Seeking Wisdom

Think, dear reader, for a moment about how you would respond before continuing. Here is what Sugar, or should I say, Cheryl, had to say.

These words will touch your soul.

Dear Seeking Wisdom,

Stop worrying about whether you’re fat. You’re not fat. Or rather, you’re sometimes a little bit fat, but who gives a shit? There is nothing more boring and fruitless than a woman lamenting the fact that her stomach is round. Feed yourself. Literally. The sort of people worthy of your love will love you more for this, sweet pea.

In the middle of the night in the middle of your twenties when your best woman friend crawls naked into your bed, straddles you, and says, You should run away from me before I devour you, believe her.

You are not a terrible person for wanting to break up with someone you love. You don’t need a reason to leave. Wanting to leave is enough. Leaving doesn’t mean you’re incapable of real love or that you’ll never love anyone else again. It doesn’t mean you’re morally bankrupt or psychologically demented or a nymphomaniac. It means you wish to change the terms of one particular relationship. That’s all. Be brave enough to break your own heart.

When that really sweet but fucked-up gay couple invites you over to their cool apartment to do Ecstasy with them, say no.

There are some things you can’t understand yet. Your life will be a great and continuous unfolding. It’s good you’ve worked hard to resolve childhood issues while in your twenties, but understand that what you resolve will need to be resolved again. And again. You will come to know things that can only be known with the wisdom of age and the grace of years. Most of those things will have to do with forgiveness.

One evening you will be rolling around on the wooden floor of your apartment with a man who will tell you he doesn’t have a condom. You will smile in this spunky way that you think is hot and tell him to fuck you anyway. This will be a mistake for which you alone will pay.

Don’t lament so much about how your career is going to turn out. You don’t have a career. You have a life. Do the work. Keep the faith. Be true blue. You are a writer because you write. Keep writing and quit your bitching. Your book has a birthday. You don’t know what it is yet.

You cannot convince people to love you. This is an absolute rule. No one will ever give you love because you want him or her to give it. Real love moves freely in both directions. Don’t waste your time on anything else.

Most things will be okay eventually, but not everything will be. Sometimes you’ll put up a good fight and lose. Sometimes you’ll hold on really hard and realize there is no choice but to let go. Acceptance is a small, quiet room.

One hot afternoon during the era in which you’ve gotten yourself ridiculously tangled up with heroin, you will be riding the bus and thinking what a worthless piece of crap you are when a little girl will get on the bus holding the strings of two purple balloons. She’ll offer you one of the balloons, but you won’t take it because you believe you no longer have a right to such tiny beautiful things. You’re wrong. You do.

Your assumptions about the lives of others are in direct relation to your naïve pomposity. Many people you believe to be rich are not rich. Many people you think have it easy worked hard for what they got. Many people who seem to be gliding right along have suffered and are suffering. Many people who appear to you to be old and stupidly saddled down with kids and cars and houses were once every bit as hip and pompous as you.

When you meet a man in the doorway of a Mexican restaurant who later kisses you while explaining that this kiss doesn’t “mean anything” because, much as he likes you, he is not interested in having a relationship with you or anyone right now, just laugh and kiss him back. Your daughter will have his sense of humor. Your son will have his eyes.

The useless days will add up to something. The shitty waitressing jobs. The hours writing in your journal. The long meandering walks. The hours reading poetry and story collections and novels and dead people’s diaries and wondering about sex and God and whether you should shave under your arms or not. These things are your becoming.

One Christmas at the very beginning of your twenties when your mother gives you a warm coat that she saved for months to buy, don’t look at her skeptically after she tells you she thought the coat was perfect for you. Don’t hold it up and say it’s longer than you like your coats to be and too puffy and possibly even too warm. Your mother will be dead by spring. That coat will be the last gift she gave you. You will regret the small thing you didn’t say for the rest of your life.

Say thank you.

Yours,
Sugar

“Tiny Beautiful Things will endure as a piece of literary art,” Almond writes, “as will Cheryl’s other books (Torch and Wild), because they do the essential work of literary art: they make us more human than we were before.”

The Future of Writing in the Age of Information

David Foster Wallace remains both loved and hated. His wisdom shows itself in argumentative writing, ambition and perfectionism, and perhaps one of the best, most profound, commencement addresses ever. He’s revered, in part, because he makes us think … about ourselves, about society, and about things we don’t generally want to think about.

In this interview from May of 1996 with Charlie Rose, Wallace addresses “the future of fiction in the information age.” His thoughts highlight the difficulties of reading in an age of distraction and are worth considering in a world where we often prefer being entertained to being educated.

On commercial entertainment for the masses and how it changes what we seek, Wallace comments:

Commercial entertainment — its efficiency, its sheer ability to deliver pleasure in large doses — changes people’s relationship to art and entertainment, it changes what an audience is looking for. I would argue that it changes us in deeper ways than that. And that some of the ways that commercial culture and commercial entertainment affects human beings is one of the things that I sort of think that serious fiction ought to be doing right now.

[…]

There’s this part that makes you feel full. There’s this part that is redemptive and instructive, [so that] when you read something, it’s not just delight — you go, “Oh my god, that’s me! I’ve lived like that, I’ve felt like that, I’m not alone in the world …”

What’s tricky for me is … It would be one thing if everybody was absolutely delighted watching TV 24/7. But we have, as a culture, not only an enormous daily watching rate but we also have a tremendous cultural contempt for TV … Now TV that makes fun of TV is itself popular TV. There’s a way in which we who are watching a whole lot are also aware that we’re missing something — that there’s something else, there’s something more. While at the same time, because TV is really darn easy, you sit there and you don’t have to do very much.

Commenting on our need for easy fun, he elaborates:

Because commercial entertainment has conditioned readers to want easy fun, I think that avant garde and art fiction has sort of relinquished the field. Basically I don’t read much avant garde stuff because it’s hellaciously un-fun. … A lot of it is academic and foisted and basically written for critics.

What got him started writing?

Fiction for me, mostly as a reader, is a very weird double-edged sword — on the one hand, it can be difficult and it can be redemptive and morally instructive and all the good stuff we learn in school; on the other hand, it’s supposed to be fun, it’s a lot of fun. And what drew me into writing was mostly memories of really fun rainy afternoons spent with a book. It was a kind of a relationship.

I think part of the fun, for me, was being part of some kind of an exchange between consciousnesses, a way for human beings to talk to each other about stuff we can’t normally talk about.

(h/t Brainpickings)

Adding Tools to Your Mental Toolbox

In The Art of War, Sun Tzu said, “The general who wins a battle makes many calculations in his temple before the battle is fought.”

Those ‘calculations’ are the tools we have available to think better. One of the best questions we can ask is how to make our mental processes work better.

Charlie Munger says that “developing the habit of mastering the multiple models which underlie reality is the best thing you can do.”

Those models are mental models.

They fall into two categories: (1) ones that help us simulate time (and predict the future) and better understand how the world works (e.g., a useful idea like autocatalysis), and (2) ones that help us better understand how our mental processes lead us astray (e.g., availability bias).

When our mental models line up with reality, they help us avoid problems. When they don’t, however, they cause problems, because we end up believing something that isn’t true.

In Seeking Wisdom, Peter Bevelin highlights Munger talking about autocatalysis:

If you get a certain kind of process going in chemistry, it speeds up on its own. So you get this marvellous boost in what you’re trying to do that runs on and on. Now, the laws of physics are such that it doesn’t run on forever. But it runs on for a goodly while. So you get a huge boost. You accomplish A – and, all of a sudden, you’re getting A + B + C for awhile.

He continues telling us how this idea can be applied:

Disney is an amazing example of autocatalysis … They had those movies in the can. They owned the copyright. And just as Coke could prosper when refrigeration came, when the videocassette was invented, Disney didn’t have to invent anything or do anything except take the thing out of the can and stick it on the cassette.

***

This leads us to an interesting problem. The world is always changing, so which models should we prioritize learning?

How we prioritize our learning has implications beyond the day-to-day. Often we focus on things that change quickly. We chase the latest study, the latest findings, the most recent best-sellers. We do this to keep up-to-date with the latest-and-greatest.

Despite our intentions, learning in this way fails to account for cumulative knowledge. Instead, we spend all of our time keeping up to date.

If we are to prioritize learning, we should focus on things that change slowly. Munger elaborates:

The models that come from hard science and engineering are the most reliable models on this Earth. And engineering quality control – at least the guts of it that matters to you and me and people who are not professional engineers – is very much based on the elementary mathematics of Fermat and Pascal: It costs so much and you get so much less likelihood of it breaking if you spend this much…

And, of course, the engineering idea of a backup system is a very powerful idea. The engineering idea of breakpoints – that’s a very powerful model, too. The notion of a critical mass – that comes out of physics – is a very powerful model.

After we learn a model we have to make it useful. We have to integrate it into our existing knowledge.

Our world is multi-dimensional and our problems are complicated. Most problems cannot be solved using one model alone. The more models we have, the better able we are to rationally solve problems.

But if we don’t have the models we become the proverbial man with a hammer. To the man with a hammer everything looks like a nail. If you only have one model you will fit whatever problem you face to the model you have. If you have more than one model, however, you can look at the problem from a variety of perspectives and increase the odds you come to a better solution.

“Since no single discipline has all the answers,” Peter Bevelin writes in Seeking Wisdom, “we need to understand and use the big ideas from all the important disciplines: Mathematics, physics, chemistry, engineering, biology, psychology, and rank and use them in order of reliability.”

Charles Munger illustrates the importance of this:

Suppose you want to be good at declarer play in contract bridge. Well, you know the contract – you know what you have to achieve. And you can count up the sure winners you have by laying down your high cards and your invincible trumps.

But if you’re a trick or two short, how are you going to get the other needed tricks? Well, there are only six or so different, standard methods: You’ve got long-suit establishment. You’ve got finesses. You’ve got throw-in plays. You’ve got cross-ruffs. You’ve got squeezes. And you’ve got various ways of misleading the defense into making errors. So it’s a very limited number of models. But if you only know one or two of those models, then you’re going to be a horse’s patoot in declarer play…

If you don’t have the full repertoire, I guarantee you that you’ll overutilize the limited repertoire you have – including use of models that are inappropriate just because they’re available to you in the limited stock you have in mind.

As for how we can use different ideas, Munger again shows the way …

Have a full kit of tools … go through them in your mind checklist-style … you can never make any explanation that can be made in a more fundamental way in any other way than the most fundamental way. And you always take with full attribution to the most fundamental ideas that you are required to use. When you’re using physics, you say you’re using physics. When you’re using biology, you say you’re using biology.

But ideas alone are not enough. We need to understand how they interact and combine. This leads to lollapalooza effects.

You get lollapalooza effects when two, three or four forces are all operating in the same direction. And, frequently, you don’t get simple addition. It’s often like a critical mass in physics where you get a nuclear explosion if you get to a certain point of mass – and you don’t get anything much worth seeing if you don’t reach the mass.

Sometimes the forces just add like ordinary quantities and sometimes they combine on a break-point or critical-mass basis … More commonly, the forces coming out of … models are conflicting to some extent. And you get huge, miserable trade-offs … So you [must] have the models and you [must] see the relatedness and the effects from the relatedness.

Peter Thiel Recommends 7 Reads

Eccentric billionaire Peter Thiel’s book Zero To One should be required reading for Farnam Street readers. Like The Hard Thing About Hard Things, it’s nice to see another business leader come out and write about life in the trenches in their own voice. I pointed out eight lessons that I took away, although there are many more hidden in the book.

In 2012, the Wall Street Journal asked him which books he had enjoyed most that year. He responded with the following three suggestions:

100 Plus, Sonia Arrison

… was first published in 2011, but its message is evergreen: how scientists are directly attacking the problem of aging and death and why we should fight for life instead of accepting decay as inevitable. The goal of longer life doesn’t just mean more years at the margin; it means a healthier old age. There is nothing to fear but our own complacency.

Bloodlands, Timothy Snyder

… He tells how the Nazis and the Soviets drove each other to ever more murderous atrocities as they fought to dominate Eastern Europe in the 1930s and ’40s. Even as he calculates the death toll painstakingly, Mr. Snyder reminds us that the most important number is one: Each victim was an individual whose life cannot be reduced to the violence that cut it short.

Resurrection From the Underground, René Girard

… the great French thinker René Girard’s classic study of Fyodor Dostoevsky …. There is no better way to think about human irrationality than to read Dostoevsky, and there is no better reader of Dostoevsky than Mr. Girard. For a fresh application of Mr. Girard’s insights into power politics, that great international theater of irrationality, try Jean-Michel Oughourlian’s “Psychopolitics,” a brief, freewheeling 2012 work by one of Mr. Girard’s closest collaborators.

Of course those were only his favorite books that year. So what, then, influenced his thinking overall? Luckily, he answered this question in a Reddit AMA. Prefacing his response with “I like the genre of past books written about the future,” he went on to list four books:

New Atlantis by Francis Bacon
Bacon writes of a utopian land called Bensalem where people live better lives because of science. Bacon “focuses on the duty of the state toward science, and his projections for state-sponsored research anticipate many advances in medicine and surgery, meteorology, and machinery.” Keep in mind this was written in 1627.

The American Challenge by Jean-Jacques Servan-Schreiber
A book that foresaw the information age. Here is a powerful quote from the book: “The signs and instruments of power are no longer armed legions or raw materials or capital… The wealth we seek does not lie in the earth or in numbers of men or in machines, but in the human spirit. And particularly in the ability of men to think and to create.”

The Great Illusion: A Study of the Relation of Military Power to National Advantage by Norman Angell
I’d never heard of this book before now, but as one Amazon reviewer summed it up: “(this is) a tightly reasoned and broadly historical perspective challenging the reigning view that man’s nature is inherently evil and that evil nature must dictate human relations.”

The Diamond Age: Or, a Young Lady’s Illustrated Primer by Neal Stephenson
I started reading this once and was mesmerized by Stephenson’s imaginative future world. If you like artificial intelligence and nanotechnology, this is the book for you.

The Improbable Story of the Online Encyclopedia

More important than determining who deserved credit is appreciating the dynamics that occur when people share ideas.

 

Walter Isaacson is the rare sort of writer whose books, if you’re like me, you simply pre-order. The first thing of his I read was the Einstein biography, then the Steve Jobs biography, then I went back and ordered everything else. He’s out with a new book, The Innovators, which recounts the story of the people who created the Internet. From Ada Lovelace, Lord Byron’s daughter, who pioneered computer programming in the 1840s, long before anyone else, through to Steve Jobs, Tim Berners-Lee, and Larry Page, Isaacson shows not only the people but how their minds worked.

Below is an excerpt from The Innovators, recounting the improbable story of Wikipedia.

When he launched the Web in 1991, Tim Berners-Lee intended it to be used as a collaboration tool, which is why he was dismayed that the Mosaic browser did not give users the ability to edit the Web pages they were viewing. It turned Web surfers into passive consumers of published content. That lapse was partly mitigated by the rise of blogging, which encouraged user-generated content. In 1995 another medium was invented that went further toward facilitating collaboration on the Web. It was called a wiki, and it worked by allowing users to modify Web pages—not by having an editing tool in their browser but by clicking and typing directly onto Web pages that ran wiki software.

The application was developed by Ward Cunningham, another of those congenial Midwest natives (Indiana, in his case) who grew up making ham radios and getting turned on by the global communities they fostered. After graduating from Purdue, he got a job at an electronic equipment company, Tektronix, where he was assigned to keep track of projects, a task similar to what Berners-Lee faced when he went to CERN.

To do this he modified a superb software product developed by one of Apple’s most enchanting innovators, Bill Atkinson. It was called HyperCard, and it allowed users to make their own hyper-linked cards and documents on their computers. Apple had little idea what to do with the software, so at Atkinson’s insistence Apple gave it away free with its computers. It was easy to use, and even kids—especially kids—found ways to make HyperCard stacks of linked pictures and games.

Cunningham was blown away by HyperCard when he first saw it, but he found it cumbersome. So he created a super simple way of creating new cards and links: a blank box on each card in which you could type a title or word or phrase. If you wanted to make a link to Jane Doe or Harry’s Video Project or anything else, you simply typed those words in the box. “It was fun to do,” he said.

Then he created an Internet version of his HyperText program, writing it in just a few hundred lines of Perl code. The result was a new content management application that allowed users to edit and contribute to a Web page. Cunningham used the application to build a service, called the Portland Pattern Repository, that allowed software developers to exchange programming ideas and improve on the patterns that others had posted. “The plan is to have interested parties write web pages about the People, Projects and Patterns that have changed the way they program,” he wrote in an announcement posted in May 1995. “The writing style is casual, like email . . . Think of it as a moderated list where anyone can be moderator and everything is archived. It’s not quite a chat, still, conversation is possible.”

Now he needed a name. What he had created was a quick Web tool, but QuickWeb sounded lame, as if conjured up by a committee at Microsoft. Fortunately, there was another word for quick that popped from the recesses of his memory. When he was on his honeymoon in Hawaii thirteen years earlier, he remembered, “the airport counter agent directed me to take the wiki wiki bus between terminals.” When he asked what it meant, he was told that wiki was the Hawaiian word for quick, and wiki wiki meant superquick. So he named his Web pages and the software that ran them WikiWikiWeb, wiki for short.

In his original version, the syntax Cunningham used for creating links in a text was to smash words together so that there would be two or more capital letters—as in CapitalLetters—in a term. It became known as CamelCase, and its resonance would later be seen in scores of Internet brands such as AltaVista, MySpace, and YouTube.
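As an aside, the CamelCase convention Isaacson describes is simple enough to express in a few lines of code. The sketch below is my own illustration, not Cunningham’s original Perl: a regular expression finds terms made of two or more capitalized word-parts smashed together and wraps them in a purely hypothetical wiki-link markup.

```python
import re

# Two or more capitalized word-parts smashed together,
# e.g. "CapitalLetters" or "PortlandPatternRepository".
CAMEL_CASE = re.compile(r"\b(?:[A-Z][a-z]+){2,}\b")

def link_camel_case(text: str) -> str:
    """Wrap every CamelCase term in an illustrative wiki-link tag."""
    return CAMEL_CASE.sub(
        lambda m: f'<a href="/wiki/{m.group(0)}">{m.group(0)}</a>', text
    )

print(link_camel_case("See PortlandPatternRepository for examples."))
# -> See <a href="/wiki/PortlandPatternRepository">PortlandPatternRepository</a> for examples.
```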

WardsWiki (as it became known) allowed anyone to edit and contribute, without even needing a password. Previous versions of each page would be stored, in case someone botched one up, and there would be a “Recent Changes” page so that Cunningham and others could keep track of the edits. But there would be no supervisor or gatekeeper preapproving the changes. It would work, he said with cheery midwestern optimism, because “people are generally good.” It was just what Berners-Lee had envisioned, a Web that was read-write rather than read-only. “Wikis were one of the things that allowed collaboration,” Berners-Lee said. “Blogs were another.”
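The storage model behind that promise is also easy to sketch. Here is a minimal, hypothetical Python version of the idea described above, not WardsWiki’s actual code: every edit is appended to a page’s history with no gatekeeper, and a bad edit can be undone by restoring an earlier revision.

```python
from dataclasses import dataclass, field

@dataclass
class WikiPage:
    """A toy wiki page: every version is kept, nothing is ever lost."""
    title: str
    revisions: list = field(default_factory=lambda: [""])

    @property
    def current(self) -> str:
        return self.revisions[-1]

    def edit(self, new_text: str) -> None:
        # No password, no preapproval: any caller may edit.
        self.revisions.append(new_text)

    def revert(self) -> None:
        # Restore the previous version by appending it again,
        # so even the bad edit remains in the history.
        if len(self.revisions) > 1:
            self.revisions.append(self.revisions[-2])

page = WikiPage("PeopleProjectsAndPatterns")
page.edit("Patterns that changed the way we program.")
page.edit("GRAFFITI")
page.revert()
print(page.current)  # -> Patterns that changed the way we program.
```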

Like Berners-Lee, Cunningham made his basic software available for anyone to modify and use. Consequently, there were soon scores of wiki sites as well as open-source improvements to his software. But the wiki concept was not widely known beyond software engineers until January 2001, when it was adopted by a struggling Internet entrepreneur who was trying, without much success, to build a free, online encyclopedia.

***

Jimmy Wales was born in 1966 in Huntsville, Alabama, a town of rednecks and rocket scientists. Six years earlier, in the wake of Sputnik, President Eisenhower had personally gone there to open the Marshall Space Flight Center. “Growing up in Huntsville during the height of the space program kind of gave you an optimistic view of the future,” Wales observed. “An early memory was of the windows in our house rattling when they were testing the rockets. The space program was basically our hometown sports team, so it was exciting and you felt it was a town of technology and science.”

Wales, whose father was a grocery store manager, went to a one-room private school that was started by his mother and grandmother, who taught music. When he was three, his mother bought a World Book Encyclopedia from a door-to-door salesman; as he learned to read, it became an object of veneration. It put at his fingertips a cornucopia of knowledge along with maps and illustrations and even a few cellophane layers of transparencies you could lift to explore such things as the muscles, arteries, and digestive system of a dissected frog. But Wales soon discovered that the World Book had shortcomings: no matter how much was in it, there were many more things that weren’t. And this became more so with time. After a few years, there were all sorts of topics—moon landings and rock festivals and protest marches, Kennedys and kings—that were not included. World Book sent out stickers for owners to paste on the pages in order to update the encyclopedia, and Wales was fastidious about doing so. “I joke that I started as a kid revising the encyclopedia by stickering the one my mother bought.”

After graduating from Auburn and a halfhearted stab at graduate school, Wales took a job as a research director for a Chicago financial trading firm. But it did not fully engage him. His scholarly attitude was combined with a love for the Internet that had been honed by playing Multi-User Dungeons fantasies, which were essentially crowdsourced games. He founded and moderated an Internet mailing list discussion on Ayn Rand, the Russian-born American writer who espoused an objectivist and libertarian philosophy. He was very open about who could join the discussion forum, frowned on rants and the personal attack known as flaming, and managed comportment with a gentle hand. “I have chosen a ‘middle-ground’ method of moderation, a sort of behind-the-scenes prodding,” he wrote in a posting.

Before the rise of search engines, among the hottest Internet services were Web directories, which featured human-assembled lists and categories of cool sites, and Web rings, which created through a common navigation bar a circle of related sites that were linked to one another. Jumping on these bandwagons, Wales and two friends in 1996 started a venture that they dubbed BOMIS, for Bitter Old Men in Suits, and began casting around for ideas. They launched a panoply of startups that were typical of the dotcom boom of the late ’90s: a used-car ring and directory with pictures, a food-ordering service, a business directory for Chicago, and a sports ring. After Wales relocated to San Diego, he launched a directory and ring that served as “kind of a guy-oriented search engine,” featuring pictures of scantily clad women.

The rings showed Wales the value of having users help generate the content, a concept that was reinforced as he watched how the crowds of sports bettors on his site provided a more accurate morning line than any single expert could. He also was impressed by Eric Raymond’s The Cathedral and the Bazaar, which explained why an open and crowd-generated bazaar was a better model for a website than the carefully controlled top-down construction of a cathedral.

Wales next tried an idea that reflected his childhood love of the World Book: an online encyclopedia. He dubbed it Nupedia, and it had two attributes: it would be written by volunteers, and it would be free. It was an idea that had been proposed in 1999 by Richard Stallman, the pioneering advocate of free software. Wales hoped eventually to make money by selling ads. To help develop it, he hired a doctoral student in philosophy, Larry Sanger, whom he first met in online discussion groups. “He was specifically interested in finding a philosopher to lead the project,” Sanger recalled.

Sanger and Wales developed a rigorous, seven-step process for creating and approving articles, which included assigning topics to proven experts, whose credentials had been vetted, and then putting the drafts through outside expert reviews, public reviews, professional copy editing, and public copy editing. “We wish editors to be true experts in their fields and (with few exceptions) possess Ph.Ds.,” the Nupedia policy guidelines stipulated. “Larry’s view was that if we didn’t make it more academic than a traditional encyclopedia, people wouldn’t believe in it and respect it,” Wales explained. “He was wrong, but his view made sense given what we knew at the time.” The first article, published in March 2000, was on atonality by a scholar at the Johannes Gutenberg University in Mainz, Germany.

***

It was a painfully slow process and, worse yet, not a lot of fun. The whole point of writing for free online, as Justin Hall had shown, was that it produced a jolt of joy. After a year, Nupedia had only about a dozen articles published, making it useless as an encyclopedia, and 150 that were still in draft stage, which indicated how unpleasant the process had become. It had been rigorously engineered not to scale.

This hit home to Wales when he decided that he would personally write an article on Robert Merton, an economist who had won the Nobel Prize for creating a mathematical model for markets containing derivatives. Wales had published a paper on option pricing theory, so he was very familiar with Merton’s work. “I started to try to write the article and it was very intimidating, because I knew they were going to send my draft out to the most prestigious finance professors they could find,” Wales said. “Suddenly I felt like I was back in grad school, and it was very stressful. I realized that the way we had set things up was not going to work.”

That was when Wales and Sanger discovered Ward Cunningham’s wiki software. Like many digital-age innovations, the application of wiki software to Nupedia in order to create Wikipedia—combining two ideas to create an innovation—was a collaborative process involving thoughts that were already in the air. But in this case a very non-wiki-like dispute erupted over who deserved the most credit.

The way Sanger remembered the story, he was having lunch in early January 2001 at a roadside taco stand near San Diego with a friend named Ben Kovitz, a computer engineer. Kovitz had been using Cunningham’s wiki and described it at length. It then dawned on Sanger, he claimed, that a wiki could be used to help solve the problems he was having with Nupedia. “Instantly I was considering whether wiki would work as a more open and simple editorial system for a free, collaborative encyclopedia,” Sanger later recounted. “The more I thought about it, without even having seen a wiki, the more it seemed obviously right.” In his version of the story, he then convinced Wales to try the wiki approach.

Kovitz, for his part, contended that he was the one who came up with the idea of using wiki software for a crowdsourced encyclopedia and that he had trouble convincing Sanger. “I suggested that instead of just using the wiki with Nupedia’s approved staff, he open it up to the general public and let each edit appear on the site immediately, with no review process,” Kovitz recounted. “My exact words were to allow ‘any fool in the world with Internet access’ to freely modify any page on the site.” Sanger raised some objections: “Couldn’t total idiots put up blatantly false or biased descriptions of things?” Kovitz replied, “Yes, and other idiots could delete those changes or edit them into something better.”

As for Wales’s version of the story, he later claimed that he had heard about wikis a month before Sanger’s lunch with Kovitz. Wikis had, after all, been around for more than four years and were a topic of discussion among programmers, including one who worked at BOMIS, Jeremy Rosenfeld, a big kid with a bigger grin. “Jeremy showed me Ward’s wiki in December 2000 and said it might solve our problem,” Wales recalled, adding that when Sanger showed him the same thing, he responded, “Oh, yes, wiki, Jeremy showed me this last month.” Sanger challenged that recollection, and a nasty crossfire ensued on Wikipedia’s discussion boards. Wales finally tried to de-escalate the sniping with a post telling Sanger, “Gee, settle down,” but Sanger continued his battle against Wales in a variety of forums.

The dispute presented a classic case of a historian’s challenge when writing about collaborative creativity: each player has a different recollection of who made which contribution, with a natural tendency to inflate his own. We’ve all seen this propensity many times in our friends, and perhaps even once or twice in ourselves. But it is ironic that such a dispute attended the birth of one of history’s most collaborative creations, a site that was founded on the faith that people are willing to contribute without requiring credit. (Tellingly, and laudably, Wikipedia’s entries on its own history and the roles of Wales and Sanger have turned out, after much fighting on the discussion boards, to be balanced and objective.)

More important than determining who deserved credit is appreciating the dynamics that occur when people share ideas. Ben Kovitz, for one, understood this. He was the player who had the most insightful view—call it the “bumblebee at the right time” theory—on the collaborative way that Wikipedia was created. “Some folks, aiming to criticize or belittle Jimmy Wales, have taken to calling me one of the founders of Wikipedia, or even ‘the true founder,’” he said. “I suggested the idea, but I was not one of the founders. I was only the bumblebee. I had buzzed around the wiki flower for a while, and then pollinated the free-encyclopedia flower. I have talked with many others who had the same idea, just not in times or places where it could take root.”

That is the way that good ideas often blossom: a bumblebee brings half an idea from one realm, and pollinates another fertile realm filled with half-formed innovations. This is why Web tools are valuable, as are lunches at taco stands.

***

Cunningham was supportive, indeed delighted when Wales called him up in January 2001 to say he planned to use the wiki software to juice up his encyclopedia project. Cunningham had not sought to patent or copyright either the software or the wiki name, and he was one of those innovators who was happy to see his products become tools that anyone could use or adapt.

At first Wales and Sanger conceived of Wikipedia merely as an adjunct to Nupedia, sort of like a feeder product or farm team. The wiki articles, Sanger assured Nupedia’s expert editors, would be relegated to a separate section of the website and not be listed with the regular Nupedia pages. “If a wiki article got to a high level it could be put into the regular Nupedia editorial process,” he wrote in a post. Nevertheless, the Nupedia purists pushed back, insisting that Wikipedia be kept completely segregated, so as not to contaminate the wisdom of the experts. The Nupedia Advisory Board tersely declared on its website, “Please note: the editorial processes and policies of Wikipedia and Nupedia are totally separate; Nupedia editors and peer reviewers do not necessarily endorse the Wikipedia project, and Wikipedia contributors do not necessarily endorse the Nupedia project.” Though they didn’t know it, the pedants of the Nupedia priesthood were doing Wikipedia a huge favor by cutting the cord.

Unfettered, Wikipedia took off. It became to Web content what GNU/Linux was to software: a peer-to-peer commons collaboratively created and maintained by volunteers who worked for the civic satisfactions they found. It was a delightful, counterintuitive concept, perfectly suited to the philosophy, attitude, and technology of the Internet. Anyone could edit a page, and the results would show up instantly. You didn’t have to be an expert. You didn’t have to fax in a copy of your diploma. You didn’t have to be authorized by the Powers That Be. You didn’t even have to be registered or use your real name. Sure, that meant vandals could mess up pages. So could idiots or ideologues. But the software kept track of every version. If a bad edit appeared, the community could simply get rid of it by clicking on a “revert” link. “Imagine a wall where it was easier to remove graffiti than add it” is the way the media scholar Clay Shirky explained the process. “The amount of graffiti on such a wall would depend on the commitment of its defenders.” In the case of Wikipedia, its defenders were fiercely committed. Wars have been fought with less intensity than the reversion battles on Wikipedia. And somewhat amazingly, the forces of reason regularly triumphed.

One month after Wikipedia’s launch, it had a thousand articles, approximately seventy times the number that Nupedia had after a full year. By September 2001, after eight months in existence, it had ten thousand articles. That month, when the September 11 attacks occurred, Wikipedia showed its nimbleness and usefulness; contributors scrambled to create new pieces on such topics as the World Trade Center and its architect. A year after that, the article total reached forty thousand, more than were in the World Book that Wales’s mother had bought. By March 2003 the number of articles in the English-language edition had reached 100,000, with close to five hundred active editors working almost every day. At that point, Wales decided to shut Nupedia down.

By then Sanger had been gone for a year. Wales had let him go. They had increasingly clashed on fundamental issues, such as Sanger’s desire to give more deference to experts and scholars. In Wales’s view, “people who expect deference because they have a Ph.D. and don’t want to deal with ordinary people tend to be annoying.” Sanger felt, to the contrary, that it was the nonacademic masses who tended to be annoying. “As a community, Wikipedia lacks the habit or tradition of respect for expertise,” he wrote in a New Year’s Eve 2004 manifesto that was one of many attacks he leveled after he left. “A policy that I attempted to institute in Wikipedia’s first year, but for which I did not muster adequate support, was the policy of respecting and deferring politely to experts.” Sanger’s elitism was rejected not only by Wales but by the Wikipedia community. “Consequently, nearly everyone with much expertise but little patience will avoid editing Wikipedia,” Sanger lamented.

Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd. Early in Wikipedia’s development, I was researching a book about Albert Einstein and I noticed that the Wikipedia entry on him claimed that he had traveled to Albania in 1935 so that King Zog could help him escape the Nazis by getting him a visa to the United States. This was completely untrue, even though the passage included citations to obscure Albanian websites where this was proudly proclaimed, usually based on some third-hand series of recollections about what someone’s uncle once said a friend had told him. Using both my real name and a Wikipedia handle, I deleted the assertion from the article, only to watch it reappear. On the discussion page, I provided sources for where Einstein actually was during the time in question (Princeton) and what passport he was using (Swiss). But tenacious Albanian partisans kept reinserting the claim. The Einstein-in-Albania tug-of-war lasted weeks. I became worried that the obstinacy of a few passionate advocates could undermine Wikipedia’s reliance on the wisdom of crowds. But after a while, the edit wars ended, and the article no longer had Einstein going to Albania. At first I didn’t credit that success to the wisdom of crowds, since the push for a fix had come from me and not from the crowd. Then I realized that I, like thousands of others, was in fact a part of the crowd, occasionally adding a tiny bit to its wisdom.

A key principle of Wikipedia was that articles should have a neutral point of view. This succeeded in producing articles that were generally straightforward, even on controversial topics such as global warming and abortion. It also made it easier for people of different viewpoints to collaborate. “Because of the neutrality policy, we have partisans working together on the same articles,” Sanger explained. “It’s quite remarkable.” The community was usually able to use the lodestar of the neutral point of view to create a consensus article offering competing views in a neutral way. It became a model, rarely emulated, of how digital tools can be used to find common ground in a contentious society.

Not only were Wikipedia’s articles created collaboratively by the community; so were its operating practices. Wales fostered a loose system of collective management, in which he played guide and gentle prodder but not boss. There were wiki pages where users could jointly formulate and debate the rules. Through this mechanism, guidelines were evolved to deal with such matters as reversion practices, mediation of disputes, the blocking of individual users, and the elevation of a select few to administrator status. All of these rules grew organically from the community rather than being dictated downward by a central authority. Like the Internet itself, power was distributed. “I can’t imagine who could have written such detailed guidelines other than a bunch of people working together,” Wales reflected. “It’s common in Wikipedia that we’ll come to a solution that’s really well thought out because so many minds have had a crack at improving it.”

As it grew organically, with both its content and its governance sprouting from its grassroots, Wikipedia was able to spread like kudzu. At the beginning of 2014, there were editions in 287 languages, ranging from Afrikaans to Žemaitška. The total number of articles was 30 million, with 4.4 million in the English-language edition. In contrast, the Encyclopedia Britannica, which quit publishing a print edition in 2010, had eighty thousand articles in its electronic edition, less than 2 percent of the number in Wikipedia. “The cumulative effort of Wikipedia’s millions of contributors means you are a click away from figuring out what a myocardial infarction is, or the cause of the Agacher Strip War, or who Spangles Muldoon was,” Clay Shirky has written. “This is an unplanned miracle, like ‘the market’ deciding how much bread goes in the store. Wikipedia, though, is even odder than the market: not only is all that material contributed for free, it is available to you free.” The result has been the greatest collaborative knowledge project in history.

***


So why do people contribute? Harvard Professor Yochai Benkler dubbed Wikipedia, along with open-source software and other free collaborative projects, examples of “commons-based peer production.” He explained, “Its central characteristic is that groups of individuals successfully collaborate on large-scale projects following a diverse cluster of motivational drives and social signals, rather than either market prices or managerial commands.” These motivations include the psychological reward of interacting with others and the personal gratification of doing a useful task. We all have our little joys, such as collecting stamps or being a stickler for good grammar, knowing Jeff Torborg’s college batting average or the order of battle at Trafalgar. These all find a home on Wikipedia.

There is something fundamental, almost primordial at work. Some Wikipedians refer to it as “wiki-crack.” It’s the rush of dopamine that seems to hit the brain’s pleasure center when you make a smart edit and it appears instantly in a Wikipedia article. Until recently, being published was a pleasure afforded only to a select few. Most of us in that category can remember the thrill of seeing our words appear in public for the first time. Wikipedia, like blogs, made that treat available to anyone. You didn’t have to be credentialed or anointed by the media elite.

For example, many of Wikipedia’s articles on the British aristocracy were largely written by a user known as Lord Emsworth. They were so insightful about the intricacies of the peerage system that some were featured as the “Article of the Day,” and Lord Emsworth rose to become a Wikipedia administrator. It turned out that Lord Emsworth, a name taken from P. G. Wodehouse’s novels, was actually a 16-year-old schoolboy in South Brunswick, New Jersey. On Wikipedia, nobody knows you’re a commoner.

Connected to that is the even deeper satisfaction that comes from helping to create the information that we use rather than just passively receiving it. “Involvement of people in the information they read,” wrote the Harvard professor Jonathan Zittrain, “is an important end itself.” A Wikipedia that we create in common is more meaningful than would be the same Wikipedia handed to us on a platter. Peer production allows people to be engaged.

Jimmy Wales often repeated a simple, inspiring mission for Wikipedia: “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That’s what we’re doing.” It was a huge, audacious, and worthy goal. But it badly understated what Wikipedia did. It was about more than people being “given” free access to knowledge; it was also about empowering them, in a way not seen before in history, to be part of the process of creating and distributing knowledge. Wales came to realize that. “Wikipedia allows people not merely to access other people’s knowledge but to share their own,” he said. “When you help build something, you own it, you’re vested in it. That’s far more rewarding than having it handed down to you.”

Wikipedia took the world another step closer to the vision propounded by Vannevar Bush in his 1945 essay, “As We May Think,” which predicted, “Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.” It also harkened back to Ada Lovelace, who asserted that machines would be able to do almost anything, except think on their own. Wikipedia was not about building a machine that could think on its own. It was instead a dazzling example of human-machine symbiosis, the wisdom of humans and the processing power of computers being woven together like a tapestry. When Wales and his new wife had a daughter in 2011, they named her Ada, after Lady Lovelace.

The Innovators is a must-read for anyone looking to better understand the creative mind.

(h/t The Daily Beast)