The Future of Writing in the Age of Information

David Foster Wallace remains both loved and hated. His wisdom shows itself in his argumentative writing, his ambition and perfectionism, and perhaps one of the best, most profound commencement addresses ever. He’s revered, in part, because he makes us think … about ourselves, about society, and about things we don’t generally want to think about.

In this interview from May of 1996 with Charlie Rose, Wallace addresses “the future of fiction in the information age.” His thoughts highlight the difficulties of reading in an age of distraction and are worth considering in a world where we often prefer being entertained to being educated.

On commercial entertainment for the masses and how it changes what we seek, Wallace comments:

Commercial entertainment — its efficiency, its sheer ability to deliver pleasure in large doses — changes people’s relationship to art and entertainment, it changes what an audience is looking for. I would argue that it changes us in deeper ways than that. And that some of the ways that commercial culture and commercial entertainment affects human beings is one of the things that I sort of think that serious fiction ought to be doing right now.

[…]

There’s this part that makes you feel full. There’s this part that is redemptive and instructive, [so that] when you read something, it’s not just delight — you go, “Oh my god, that’s me! I’ve lived like that, I’ve felt like that, I’m not alone in the world …

What’s tricky for me is … It would be one thing if everybody was absolutely delighted watching TV 24/7. But we have, as a culture, not only an enormous daily watching rate but we also have a tremendous cultural contempt for TV … Now TV that makes fun of TV is itself popular TV. There’s a way in which we who are watching a whole lot are also aware that we’re missing something — that there’s something else, there’s something more. While at the same time, because TV is really darn easy, you sit there and you don’t have to do very much.

Commenting on our need for easy fun, he elaborates:

Because commercial entertainment has conditioned readers to want easy fun, I think that avant garde and art fiction has sort of relinquished the field. Basically I don’t read much avant garde stuff because it’s hellaciously un-fun. … A lot of it is academic and foisted and basically written for critics.

What got him started writing?

Fiction for me, mostly as a reader, is a very weird double-edged sword — on the one hand, it can be difficult and it can be redemptive and morally instructive and all the good stuff we learn in school; on the other hand, it’s supposed to be fun, it’s a lot of fun. And what drew me into writing was mostly memories of really fun rainy afternoons spent with a book. It was a kind of a relationship.

I think part of the fun, for me, was being part of some kind of an exchange between consciousnesses, a way for human beings to talk to each other about stuff we can’t normally talk about.

(h/t Brainpickings)

Adding Tools to Your Mental Toolbox

In The Art of War, Sun Tzu said, “The general who wins a battle makes many calculations in his temple before the battle is fought.”

Those ‘calculations’ are the tools we have available to think better. One of the best questions we can ask is how to make our mental processes work better.

Charlie Munger says that “developing the habit of mastering the multiple models which underlie reality is the best thing you can do.”

Those models are mental models.

They fall into two categories: (1) ones that help us simulate time (and predict the future) and better understand how the world works (e.g., a useful idea from chemistry such as autocatalysis), and (2) ones that help us better understand how our mental processes lead us astray (e.g., availability bias).

When our mental models line up with reality, they help us avoid problems. When they don’t line up with reality, they cause problems, because we end up believing something that isn’t true.

In Seeking Wisdom, Peter Bevelin highlights Munger talking about autocatalysis:

If you get a certain kind of process going in chemistry, it speeds up on its own. So you get this marvellous boost in what you’re trying to do that runs on and on. Now, the laws of physics are such that it doesn’t run on forever. But it runs on for a goodly while. So you get a huge boost. You accomplish A – and, all of a sudden, you’re getting A + B + C for awhile.

He continues telling us how this idea can be applied:

Disney is an amazing example of autocatalysis … They had those movies in the can. They owned the copyright. And just as Coke could prosper when refrigeration came, when the videocassette was invented, Disney didn’t have to invent anything or do anything except take the thing out of the can and stick it on the cassette.
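
Munger’s chemistry analogy is, roughly, a logistic growth curve: the process feeds on itself and accelerates, then flattens out as it runs into physical limits. Here is a minimal sketch of that shape; the rate and limit are invented numbers, purely for illustration.

```python
# Toy model of an autocatalytic process: growth feeds on what already exists,
# then levels off as it approaches a physical limit (a logistic curve).
# The parameters are arbitrary, chosen only to show the shape Munger describes.

def simulate_autocatalysis(x0=1.0, rate=0.5, limit=100.0, steps=20):
    """Return the amount of 'product' at each step of a self-accelerating process."""
    amounts = [x0]
    x = x0
    for _ in range(steps):
        # Growth is proportional to what already exists (autocatalysis),
        # damped as the amount approaches the limit.
        x += rate * x * (1 - x / limit)
        amounts.append(x)
    return amounts

for step, amount in enumerate(simulate_autocatalysis()):
    print(f"step {step:2d}: {amount:6.1f}")
```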

***

This leads us to an interesting problem. The world is always changing, so which models should we prioritize learning?

How we prioritize our learning has implications beyond the day-to-day. Often we focus on things that change quickly. We chase the latest study, the latest findings, the most recent best-sellers. We do this to stay up to date with the latest and greatest.

Despite our intentions, learning in this way fails to account for cumulative knowledge. Instead we consume all of our time keeping up to date.

If we want to prioritize our learning, we should focus on things that change slowly. As Munger puts it:

The models that come from hard science and engineering are the most reliable models on this Earth. And engineering quality control – at least the guts of it that matters to you and me and people who are not professional engineers – is very much based on the elementary mathematics of Fermat and Pascal: It costs so much and you get so much less likelihood of it breaking if you spend this much…

And, of course, the engineering idea of a backup system is a very powerful idea. The engineering idea of breakpoints – that’s a very powerful model, too. The notion of a critical mass – that comes out of physics – is a very powerful model.
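
The backup-system idea is a good place to run the “elementary mathematics of Fermat and Pascal” that Munger mentions. Below is a rough sketch of that arithmetic; the failure probabilities and costs are invented, and the components are assumed to fail independently.

```python
# Back-of-the-envelope reliability math in the spirit of Munger's point:
# a modest spend on a redundant (backup) component buys a large drop in the
# probability of total failure. All numbers are invented for illustration.

p_failure = 0.05                 # chance a single component fails (assumed, independent)
cost_of_failure = 1_000_000      # cost of a total failure (assumed)
cost_of_backup = 10_000          # cost of adding one redundant component (assumed)

# With independent components, the system fails only if *both* fail.
p_failure_with_backup = p_failure ** 2

expected_loss_single = p_failure * cost_of_failure
expected_loss_backup = p_failure_with_backup * cost_of_failure + cost_of_backup

print(f"Expected loss, no backup:   ${expected_loss_single:,.0f}")   # $50,000
print(f"Expected loss, with backup: ${expected_loss_backup:,.0f}")   # $12,500
```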

After we learn a model, we have to make it useful. We have to integrate it into our existing knowledge.

Our world is multi-dimensional and our problems are complicated. Most problems cannot be solved using one model alone. The more models we have, the better able we are to rationally solve problems.

But if we don’t have the models we become the proverbial man with a hammer. To the man with a hammer everything looks like a nail. If you only have one model you will fit whatever problem you face to the model you have. If you have more than one model, however, you can look at the problem from a variety of perspectives and increase the odds you come to a better solution.

“Since no single discipline has all the answers,” Peter Bevelin writes in Seeking Wisdom, “we need to understand and use the big ideas from all the important disciplines: Mathematics, physics, chemistry, engineering, biology, psychology, and rank and use them in order of reliability.”

Charles Munger illustrates the importance of this:

Suppose you want to be good at declarer play in contract bridge. Well, you know the contract – you know what you have to achieve. And you can count up the sure winners you have by laying down your high cards and your invincible trumps.

But if you’re a trick or two short, how are you going to get the other needed tricks? Well, there are only six or so different, standard methods: You’ve got long-suit establishment. You’ve got finesses. You’ve got throw-in plays. You’ve got cross-ruffs. You’ve got squeezes. And you’ve got various ways of misleading the defense into making errors. So it’s a very limited number of models. But if you only know one or two of those models, then you’re going to be a horse’s patoot in declarer play…

If you don’t have the full repertoire, I guarantee you that you’ll overutilize the limited repertoire you have – including use of models that are inappropriate just because they’re available to you in the limited stock you have in mind.

As for how we can use different ideas, Munger again shows the way …

Have a full kit of tools … go through them in your mind checklist-style … you can never make any explanation that can be made in a more fundamental way in any other way than the most fundamental way. And you always take with full attribution to the most fundamental ideas that you are required to use. When you’re using physics, you say you’re using physics. When you’re using biology, you say you’re using biology.

But ideas alone are not enough. We need to understand how they interact and combine. This leads to lollapalooza effects.

You get lollapalooza effects when two, three or four forces are all operating in the same direction. And, frequently, you don’t get simple addition. It’s often like a critical mass in physics where you get a nuclear explosion if you get to a certain point of mass – and you don’t get anything much worth seeing if you don’t reach the mass.

Sometimes the forces just add like ordinary quantities and sometimes they combine on a break-point or critical-mass basis … More commonly, the forces coming out of … models are conflicting to some extent. And you get huge, miserable trade-offs … So you [must] have the models and you [must] see the relatedness and the effects from the relatedness.

Peter Thiel Recommends 7 Reads

Eccentric billionaire Peter Thiel’s book Zero To One should be required reading for Farnam Street readers. As with The Hard Thing About Hard Things, it’s nice to see another business leader write about life in the trenches in his own voice. I pointed out eight lessons that I took away, although there are many more hidden in the book.

In 2012, the Wall Street Journal asked him which books he had enjoyed most that year. He responded with the following three suggestions:

100 Plus, Sonia Arrison

… was first published in 2011, but its message is evergreen: how scientists are directly attacking the problem of aging and death and why we should fight for life instead of accepting decay as inevitable. The goal of longer life doesn’t just mean more years at the margin; it means a healthier old age. There is nothing to fear but our own complacency.

Bloodlands, Timothy Snyder

… He tells how the Nazis and the Soviets drove each other to ever more murderous atrocities as they fought to dominate Eastern Europe in the 1930s and ’40s. Even as he calculates the death toll painstakingly, Mr. Snyder reminds us that the most important number is one: Each victim was an individual whose life cannot be reduced to the violence that cut it short.

Resurrection From the Underground, René Girard

… the great French thinker René Girard’s classic study of Fyodor Dostoevsky …. There is no better way to think about human irrationality than to read Dostoevsky, and there is no better reader of Dostoevsky than Mr. Girard. For a fresh application of Mr. Girard’s insights into power politics, that great international theater of irrationality, try Jean-Michel Oughourlian’s “Psychopolitics,” a brief, freewheeling 2012 work by one of Mr. Girard’s closest collaborators.

Of course those were only his favorite books that year. So what then influenced his thinking overall? Luckily, he answered this question in a Reddit AMA. Prefacing his response with “I like the genre of past books written about the future,” he went on to list four books:

New Atlantis by Francis Bacon
Bacon writes of a utopian land called Bensalem where people live better lives because of science. Bacon “focuses on the duty of the state toward science, and his projections for state-sponsored research anticipate many advances in medicine and surgery, meteorology, and machinery.” Keep in mind this was written in 1627.

The American Challenge by Jean-Jacques Servan-Schreiber
A book that foresaw the information age. Here is a powerful quote from the book: “The signs and instruments of power are no longer armed legions or raw materials or capital… The wealth we seek does not lie in the earth or in numbers of men or in machines, but in the human spirit. And particularly in the ability of men to think and to create.”

The Great Illusion: A Study of the Relation of Military Power to National Advantage by Norman Angell
I’d never heard of this book before now, but as one Amazon reviewer summed it up: “(this is) a tightly reasoned and broadly historical perspective challenging the reigning view that man’s nature is inherently evil and that evil nature must dictate human relations.”

The Diamond Age: Or, a Young Lady’s Illustrated Primer by Neal Stephenson
I started reading this once and was mesmerized by Stephenson’s imaginative future world. If you like artificial intelligence and nanotechnology, this is the book for you.

The Improbable Story of the Online Encyclopedia

More important than determining who deserved credit is appreciating the dynamics that occur when people share ideas.

 

Walter Isaacson is the rare sort of writer whose books, if you’re like me, you simply pre-order. The first thing of his I read was the Einstein biography, then the Steve Jobs biography; then I went back and ordered everything else. He’s out with a new book, The Innovators, which recounts the story of the people who created the computer and the Internet. From Ada Lovelace, Lord Byron’s daughter, who pioneered computer programming in the 1840s, long before anyone else, through to Steve Jobs, Tim Berners-Lee, and Larry Page, Isaacson shows not only the people but how their minds worked.

Below is an excerpt from The Innovators, recounting the improbable story of Wikipedia.

When he launched the Web in 1991, Tim Berners-Lee intended it to be used as a collaboration tool, which is why he was dismayed that the Mosaic browser did not give users the ability to edit the Web pages they were viewing. It turned Web surfers into passive consumers of published content. That lapse was partly mitigated by the rise of blogging, which encouraged user-generated content. In 1995 another medium was invented that went further toward facilitating collaboration on the Web. It was called a wiki, and it worked by allowing users to modify Web pages—not by having an editing tool in their browser but by clicking and typing directly onto Web pages that ran wiki software.

The application was developed by Ward Cunningham, another of those congenial Midwest natives (Indiana, in his case) who grew up making ham radios and getting turned on by the global communities they fostered. After graduating from Purdue, he got a job at an electronic equipment company, Tektronix, where he was assigned to keep track of projects, a task similar to what Berners-Lee faced when he went to CERN.

To do this he modified a superb software product developed by one of Apple’s most enchanting innovators, Bill Atkinson. It was called HyperCard, and it allowed users to make their own hyper-linked cards and documents on their computers. Apple had little idea what to do with the software, so at Atkinson’s insistence Apple gave it away free with its computers. It was easy to use, and even kids—especially kids—found ways to make HyperCard stacks of linked pictures and games.

Cunningham was blown away by HyperCard when he first saw it, but he found it cumbersome. So he created a super simple way of creating new cards and links: a blank box on each card in which you could type a title or word or phrase. If you wanted to make a link to Jane Doe or Harry’s Video Project or anything else, you simply typed those words in the box. “It was fun to do,” he said.

Then he created an Internet version of his HyperText program, writing it in just a few hundred lines of Perl code. The result was a new content management application that allowed users to edit and contribute to a Web page. Cunningham used the application to build a service, called the Portland Pattern Repository, that allowed software developers to exchange programming ideas and improve on the patterns that others had posted. “The plan is to have interested parties write web pages about the People, Projects and Patterns that have changed the way they program,” he wrote in an announcement posted in May 1995. “The writing style is casual, like email . . . Think of it as a moderated list where anyone can be moderator and everything is archived. It’s not quite a chat, still, conversation is possible.”

Now he needed a name. What he had created was a quick Web tool, but QuickWeb sounded lame, as if conjured up by a committee at Microsoft. Fortunately, there was another word for quick that popped from the recesses of his memory. When he was on his honeymoon in Hawaii thirteen years earlier, he remembered, “the airport counter agent directed me to take the wiki wiki bus between terminals.” When he asked what it meant, he was told that wiki was the Hawaiian word for quick, and wiki wiki meant superquick. So he named his Web pages and the software that ran them WikiWikiWeb, wiki for short.

In his original version, the syntax Cunningham used for creating links in a text was to smash words together so that there would be two or more capital letters—as in CapitalLetters—in a term. It became known as CamelCase, and its resonance would later be seen in scores of Internet brands such as AltaVista, MySpace, and YouTube.
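
Cunningham’s convention is simple enough to sketch in a few lines. What follows is a hypothetical illustration of how wiki software might turn CamelCase terms into page links; it is not Cunningham’s actual Perl code.

```python
import re

# A term made of two or more capitalized chunks smashed together (e.g. "CapitalLetters")
# is treated as a link to a wiki page of that name -- the CamelCase convention.
CAMEL_CASE = re.compile(r"\b(?:[A-Z][a-z0-9]+){2,}\b")

def render_links(text: str) -> str:
    """Replace every CamelCase term with an HTML link to a wiki page of the same name."""
    return CAMEL_CASE.sub(lambda m: f'<a href="/wiki/{m.group(0)}">{m.group(0)}</a>', text)

print(render_links("See HarrysVideoProject and the PortlandPatternRepository for details."))
```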

WardsWiki (as it became known) allowed anyone to edit and contribute, without even needing a password. Previous versions of each page would be stored, in case someone botched one up, and there would be a “Recent Changes” page so that Cunningham and others could keep track of the edits. But there would be no supervisor or gatekeeper preapproving the changes. It would work, he said with cheery midwestern optimism, because “people are generally good.” It was just what Berners-Lee had envisioned, a Web that was read-write rather than read-only. “Wikis were one of the things that allowed collaboration,” Berners-Lee said. “Blogs were another.”
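
The mechanics described here (every version kept, a “Recent Changes” log, no gatekeeper) fit comfortably in a few lines of code. A minimal, hypothetical sketch of that storage model:

```python
from datetime import datetime, timezone

class WikiPage:
    """A tiny model of WardsWiki-style storage: every edit is kept, so any edit can be undone."""

    def __init__(self, title: str, text: str = ""):
        self.title = title
        self.versions = [text]        # full history, oldest first
        self.recent_changes = []      # (timestamp, author) log for a "Recent Changes" page

    def edit(self, new_text: str, author: str = "anonymous"):
        # No password, no pre-approval: the edit is stored immediately.
        self.versions.append(new_text)
        self.recent_changes.append((datetime.now(timezone.utc), author))

    def revert(self):
        # Undoing a botched edit is as easy as dropping the latest version.
        if len(self.versions) > 1:
            self.versions.pop()

    @property
    def current(self) -> str:
        return self.versions[-1]

page = WikiPage("PeopleProjectsAndPatterns", "Write about the patterns that changed how you program.")
page.edit("Graffiti!!!", author="vandal")
page.revert()
print(page.current)   # back to the original text
```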

Like Berners-Lee, Cunningham made his basic software available for anyone to modify and use. Consequently, there were soon scores of wiki sites as well as open-source improvements to his software. But the wiki concept was not widely known beyond software engineers until January 2001, when it was adopted by a struggling Internet entrepreneur who was trying, without much success, to build a free, online encyclopedia.

***

Jimmy Wales was born in 1966 in Huntsville, Alabama, a town of rednecks and rocket scientists. Six years earlier, in the wake of Sputnik, President Eisenhower had personally gone there to open the Marshall Space Flight Center. “Growing up in Huntsville during the height of the space program kind of gave you an optimistic view of the future,” Wales observed. “An early memory was of the windows in our house rattling when they were testing the rockets. The space program was basically our hometown sports team, so it was exciting and you felt it was a town of technology and science.”

Wales, whose father was a grocery store manager, went to a one-room private school that was started by his mother and grandmother, who taught music. When he was three, his mother bought a World Book Encyclopedia from a door-to-door salesman; as he learned to read, it became an object of veneration. It put at his fingertips a cornucopia of knowledge along with maps and illustrations and even a few cellophane layers of transparencies you could lift to explore such things as the muscles, arteries, and digestive system of a dissected frog. But Wales soon discovered that the World Book had shortcomings: no matter how much was in it, there were many more things that weren’t. And this became more so with time. After a few years, there were all sorts of topics—moon landings and rock festivals and protest marches, Kennedys and kings—that were not included. World Book sent out stickers for owners to paste on the pages in order to update the encyclopedia, and Wales was fastidious about doing so. “I joke that I started as a kid revising the encyclopedia by stickering the one my mother bought.”

After graduating from Auburn and a halfhearted stab at graduate school, Wales took a job as a research director for a Chicago financial trading firm. But it did not fully engage him. His scholarly attitude was combined with a love for the Internet that had been honed by playing Multi-User Dungeons fantasies, which were essentially crowdsourced games. He founded and moderated an Internet mailing list discussion on Ayn Rand, the Russian-born American writer who espoused an objectivist and libertarian philosophy. He was very open about who could join the discussion forum, frowned on rants and the personal attack known as flaming, and managed comportment with a gentle hand. “I have chosen a ‘middle-ground’ method of moderation, a sort of behind-the-scenes prodding,” he wrote in a posting.

Before the rise of search engines, among the hottest Internet services were Web directories, which featured human-assembled lists and categories of cool sites, and Web rings, which created through a common navigation bar a circle of related sites that were linked to one another. Jumping on these bandwagons, Wales and two friends in 1996 started a venture that they dubbed BOMIS, for Bitter Old Men in Suits, and began casting around for ideas. They launched a panoply of startups that were typical of the dotcom boom of the late ’90s: a used-car ring and directory with pictures, a food-ordering service, a business directory for Chicago, and a sports ring. After Wales relocated to San Diego, he launched a directory and ring that served as “kind of a guy-oriented search engine,” featuring pictures of scantily clad women.

The rings showed Wales the value of having users help generate the content, a concept that was reinforced as he watched how the crowds of sports bettors on his site provided a more accurate morning line than any single expert could. He also was impressed by Eric Raymond’s The Cathedral and the Bazaar, which explained why an open and crowd-generated bazaar was a better model for a website than the carefully controlled top-down construction of a cathedral.

Wales next tried an idea that reflected his childhood love of the World Book: an online encyclopedia. He dubbed it Nupedia, and it had two attributes: it would be written by volunteers, and it would be free. It was an idea that had been proposed in 1999 by Richard Stallman, the pioneering advocate of free software. Wales hoped eventually to make money by selling ads. To help develop it, he hired a doctoral student in philosophy, Larry Sanger, whom he first met in online discussion groups. “He was specifically interested in finding a philosopher to lead the project,” Sanger recalled.

Sanger and Wales developed a rigorous, seven-step process for creating and approving articles, which included assigning topics to proven experts, whose credentials had been vetted, and then putting the drafts through outside expert reviews, public reviews, professional copy editing, and public copy editing. “We wish editors to be true experts in their fields and (with few exceptions) possess Ph.Ds.,” the Nupedia policy guidelines stipulated. “Larry’s view was that if we didn’t make it more academic than a traditional encyclopedia, people wouldn’t believe in it and respect it,” Wales explained. “He was wrong, but his view made sense given what we knew at the time.” The first article, published in March 2000, was on atonality by a scholar at the Johannes Gutenberg University in Mainz, Germany.

***

It was a painfully slow process and, worse yet, not a lot of fun. The whole point of writing for free online, as Justin Hall had shown, was that it produced a jolt of joy. After a year, Nupedia had only about a dozen articles published, making it useless as an encyclopedia, and 150 that were still in draft stage, which indicated how unpleasant the process had become. It had been rigorously engineered not to scale.

This hit home to Wales when he decided that he would personally write an article on Robert Merton, an economist who had won the Nobel Prize for creating a mathematical model for markets containing derivatives. Wales had published a paper on option pricing theory, so he was very familiar with Merton’s work. “I started to try to write the article and it was very intimidating, because I knew they were going to send my draft out to the most prestigious finance professors they could find,” Wales said. “Suddenly I felt like I was back in grad school, and it was very stressful. I realized that the way we had set things up was not going to work.”

That was when Wales and Sanger discovered Ward Cunningham’s wiki software. Like many digital-age innovations, the application of wiki software to Nupedia in order to create Wikipedia—combining two ideas to create an innovation—was a collaborative process involving thoughts that were already in the air. But in this case a very non-wiki-like dispute erupted over who deserved the most credit.

The way Sanger remembered the story, he was having lunch in early January 2001 at a roadside taco stand near San Diego with a friend named Ben Kovitz, a computer engineer. Kovitz had been using Cunningham’s wiki and described it at length. It then dawned on Sanger, he claimed, that a wiki could be used to help solve the problems he was having with Nupedia. “Instantly I was considering whether wiki would work as a more open and simple editorial system for a free, collaborative encyclopedia,” Sanger later recounted. “The more I thought about it, without even having seen a wiki, the more it seemed obviously right.” In his version of the story, he then convinced Wales to try the wiki approach.

Kovitz, for his part, contended that he was the one who came up with the idea of using wiki software for a crowdsourced encyclopedia and that he had trouble convincing Sanger. “I suggested that instead of just using the wiki with Nupedia’s approved staff, he open it up to the general public and let each edit appear on the site immediately, with no review process,” Kovitz recounted. “My exact words were to allow ‘any fool in the world with Internet access’ to freely modify any page on the site.” Sanger raised some objections: “Couldn’t total idiots put up blatantly false or biased descriptions of things?” Kovitz replied, “Yes, and other idiots could delete those changes or edit them into something better.”

As for Wales’s version of the story, he later claimed that he had heard about wikis a month before Sanger’s lunch with Kovitz. Wikis had, after all, been around for more than four years and were a topic of discussion among programmers, including one who worked at BOMIS, Jeremy Rosenfeld, a big kid with a bigger grin. “Jeremy showed me Ward’s wiki in December 2000 and said it might solve our problem,” Wales recalled, adding that when Sanger showed him the same thing, he responded, “Oh, yes, wiki, Jeremy showed me this last month.” Sanger challenged that recollection, and a nasty crossfire ensued on Wikipedia’s discussion boards. Wales finally tried to de-escalate the sniping with a post telling Sanger, “Gee, settle down,” but Sanger continued his battle against Wales in a variety of forums.

The dispute presented a classic case of a historian’s challenge when writing about collaborative creativity: each player has a different recollection of who made which contribution, with a natural tendency to inflate his own. We’ve all seen this propensity many times in our friends, and perhaps even once or twice in ourselves. But it is ironic that such a dispute attended the birth of one of history’s most collaborative creations, a site that was founded on the faith that people are willing to contribute without requiring credit. (Tellingly, and laudably, Wikipedia’s entries on its own history and the roles of Wales and Sanger have turned out, after much fighting on the discussion boards, to be balanced and objective.)

More important than determining who deserved credit is appreciating the dynamics that occur when people share ideas. Ben Kovitz, for one, understood this. He was the player who had the most insightful view—call it the “bumblebee at the right time” theory—on the collaborative way that Wikipedia was created. “Some folks, aiming to criticize or belittle Jimmy Wales, have taken to calling me one of the founders of Wikipedia, or even ‘the true founder,’” he said. “I suggested the idea, but I was not one of the founders. I was only the bumblebee. I had buzzed around the wiki flower for a while, and then pollinated the free-encyclopedia flower. I have talked with many others who had the same idea, just not in times or places where it could take root.”

That is the way that good ideas often blossom: a bumblebee brings half an idea from one realm, and pollinates another fertile realm filled with half-formed innovations. This is why Web tools are valuable, as are lunches at taco stands.

***

Cunningham was supportive, indeed delighted when Wales called him up in January 2001 to say he planned to use the wiki software to juice up his encyclopedia project. Cunningham had not sought to patent or copyright either the software or the wiki name, and he was one of those innovators who was happy to see his products become tools that anyone could use or adapt.

At first Wales and Sanger conceived of Wikipedia merely as an adjunct to Nupedia, sort of like a feeder product or farm team. The wiki articles, Sanger assured Nupedia’s expert editors, would be relegated to a separate section of the website and not be listed with the regular Nupedia pages. “If a wiki article got to a high level it could be put into the regular Nupedia editorial process,” he wrote in a post. Nevertheless, the Nupedia purists pushed back, insisting that Wikipedia be kept completely segregated, so as not to contaminate the wisdom of the experts. The Nupedia Advisory Board tersely declared on its website, “Please note: the editorial processes and policies of Wikipedia and Nupedia are totally separate; Nupedia editors and peer reviewers do not necessarily endorse the Wikipedia project, and Wikipedia contributors do not necessarily endorse the Nupedia project.” Though they didn’t know it, the pedants of the Nupedia priesthood were doing Wikipedia a huge favor by cutting the cord.

Unfettered, Wikipedia took off. It became to Web content what GNU/Linux was to software: a peer-to-peer commons collaboratively created and maintained by volunteers who worked for the civic satisfactions they found. It was a delightful, counterintuitive concept, perfectly suited to the philosophy, attitude, and technology of the Internet. Anyone could edit a page, and the results would show up instantly. You didn’t have to be an expert. You didn’t have to fax in a copy of your diploma. You didn’t have to be authorized by the Powers That Be. You didn’t even have to be registered or use your real name. Sure, that meant vandals could mess up pages. So could idiots or ideologues. But the software kept track of every version. If a bad edit appeared, the community could simply get rid of it by clicking on a “revert” link. “Imagine a wall where it was easier to remove graffiti than add it” is the way the media scholar Clay Shirky explained the process. “The amount of graffiti on such a wall would depend on the commitment of its defenders.” In the case of Wikipedia, its defenders were fiercely committed. Wars have been fought with less intensity than the reversion battles on Wikipedia. And somewhat amazingly, the forces of reason regularly triumphed.

One month after Wikipedia’s launch, it had a thousand articles, approximately seventy times the number that Nupedia had after a full year. By September 2001, after eight months in existence, it had ten thousand articles. That month, when the September 11 attacks occurred, Wikipedia showed its nimbleness and usefulness; contributors scrambled to create new pieces on such topics as the World Trade Center and its architect. A year after that, the article total reached forty thousand, more than were in the World Book that Wales’s mother had bought. By March 2003 the number of articles in the English-language edition had reached 100,000, with close to five hundred active editors working almost every day. At that point, Wales decided to shut Nupedia down.

By then Sanger had been gone for a year. Wales had let him go. They had increasingly clashed on fundamental issues, such as Sanger’s desire to give more deference to experts and scholars. In Wales’s view, “people who expect deference because they have a Ph.D. and don’t want to deal with ordinary people tend to be annoying.” Sanger felt, to the contrary, that it was the nonacademic masses who tended to be annoying. “As a community, Wikipedia lacks the habit or tradition of respect for expertise,” he wrote in a New Year’s Eve 2004 manifesto that was one of many attacks he leveled after he left. “A policy that I attempted to institute in Wikipedia’s first year, but for which I did not muster adequate support, was the policy of respecting and deferring politely to experts.” Sanger’s elitism was rejected not only by Wales but by the Wikipedia community. “Consequently, nearly everyone with much expertise but little patience will avoid editing Wikipedia,” Sanger lamented.

Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd. Early in Wikipedia’s development, I was researching a book about Albert Einstein and I noticed that the Wikipedia entry on him claimed that he had traveled to Albania in 1935 so that King Zog could help him escape the Nazis by getting him a visa to the United States. This was completely untrue, even though the passage included citations to obscure Albanian websites where this was proudly proclaimed, usually based on some third-hand series of recollections about what someone’s uncle once said a friend had told him. Using both my real name and a Wikipedia handle, I deleted the assertion from the article, only to watch it reappear. On the discussion page, I provided sources for where Einstein actually was during the time in question (Princeton) and what passport he was using (Swiss). But tenacious Albanian partisans kept reinserting the claim. The Einstein-in-Albania tug-of-war lasted weeks. I became worried that the obstinacy of a few passionate advocates could undermine Wikipedia’s reliance on the wisdom of crowds. But after a while, the edit wars ended, and the article no longer had Einstein going to Albania. At first I didn’t credit that success to the wisdom of crowds, since the push for a fix had come from me and not from the crowd. Then I realized that I, like thousands of others, was in fact a part of the crowd, occasionally adding a tiny bit to its wisdom.

A key principle of Wikipedia was that articles should have a neutral point of view. This succeeded in producing articles that were generally straightforward, even on controversial topics such as global warming and abortion. It also made it easier for people of different viewpoints to collaborate. “Because of the neutrality policy, we have partisans working together on the same articles,” Sanger explained. “It’s quite remarkable.” The community was usually able to use the lodestar of the neutral point of view to create a consensus article offering competing views in a neutral way. It became a model, rarely emulated, of how digital tools can be used to find common ground in a contentious society.

Not only were Wikipedia’s articles created collaboratively by the community; so were its operating practices. Wales fostered a loose system of collective management, in which he played guide and gentle prodder but not boss. There were wiki pages where users could jointly formulate and debate the rules. Through this mechanism, guidelines were evolved to deal with such matters as reversion practices, mediation of disputes, the blocking of individual users, and the elevation of a select few to administrator status. All of these rules grew organically from the community rather than being dictated downward by a central authority. Like the Internet itself, power was distributed. “I can’t imagine who could have written such detailed guidelines other than a bunch of people working together,” Wales reflected. “It’s common in Wikipedia that we’ll come to a solution that’s really well thought out because so many minds have had a crack at improving it.”

As it grew organically, with both its content and its governance sprouting from its grassroots, Wikipedia was able to spread like kudzu. At the beginning of 2014, there were editions in 287 languages, ranging from Afrikaans to Žemaitėška. The total number of articles was 30 million, with 4.4 million in the English-language edition. In contrast, the Encyclopedia Britannica, which quit publishing a print edition in 2010, had eighty thousand articles in its electronic edition, less than 2 percent of the number in Wikipedia. “The cumulative effort of Wikipedia’s millions of contributors means you are a click away from figuring out what a myocardial infarction is, or the cause of the Agacher Strip War, or who Spangles Muldoon was,” Clay Shirky has written. “This is an unplanned miracle, like ‘the market’ deciding how much bread goes in the store. Wikipedia, though, is even odder than the market: not only is all that material contributed for free, it is available to you free.” The result has been the greatest collaborative knowledge project in history.

***


So why do people contribute? Harvard Professor Yochai Benkler dubbed Wikipedia, along with open-source software and other free collaborative projects, examples of “commons-based peer production.” He explained, “Its central characteristic is that groups of individuals successfully collaborate on large-scale projects following a diverse cluster of motivational drives and social signals, rather than either market prices or managerial commands.” These motivations include the psychological reward of interacting with others and the personal gratification of doing a useful task. We all have our little joys, such as collecting stamps or being a stickler for good grammar, knowing Jeff Torborg’s college batting average or the order of battle at Trafalgar. These all find a home on Wikipedia.

There is something fundamental, almost primordial at work. Some Wikipedians refer to it as “wiki-crack.” It’s the rush of dopamine that seems to hit the brain’s pleasure center when you make a smart edit and it appears instantly in a Wikipedia article. Until recently, being published was a pleasure afforded only to a select few. Most of us in that category can remember the thrill of seeing our words appear in public for the first time. Wikipedia, like blogs, made that treat available to anyone. You didn’t have to be credentialed or anointed by the media elite.

For example, many of Wikipedia’s articles on the British aristocracy were largely written by a user known as Lord Emsworth. They were so insightful about the intricacies of the peerage system that some were featured as the “Article of the Day,” and Lord Emsworth rose to become a Wikipedia administrator. It turned out that Lord Emsworth, a name taken from P. G. Wodehouse’s novels, was actually a 16-year-old schoolboy in South Brunswick, New Jersey. On Wikipedia, nobody knows you’re a commoner.

Connected to that is the even deeper satisfaction that comes from helping to create the information that we use rather than just passively receiving it. “Involvement of people in the information they read,” wrote the Harvard professor Jonathan Zittrain, “is an important end itself.” A Wikipedia that we create in common is more meaningful than would be the same Wikipedia handed to us on a platter. Peer production allows people to be engaged.

Jimmy Wales often repeated a simple, inspiring mission for Wikipedia: “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That’s what we’re doing.” It was a huge, audacious, and worthy goal. But it badly understated what Wikipedia did. It was about more than people being “given” free access to knowledge; it was also about empowering them, in a way not seen before in history, to be part of the process of creating and distributing knowledge. Wales came to realize that. “Wikipedia allows people not merely to access other people’s knowledge but to share their own,” he said. “When you help build something, you own it, you’re vested in it. That’s far more rewarding than having it handed down to you.”

Wikipedia took the world another step closer to the vision propounded by Vannevar Bush in his 1945 essay, “As We May Think,” which predicted, “Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.” It also harkened back to Ada Lovelace, who asserted that machines would be able to do almost anything, except think on their own. Wikipedia was not about building a machine that could think on its own. It was instead a dazzling example of human-machine symbiosis, the wisdom of humans and the processing power of computers being woven together like a tapestry. When Wales and his new wife had a daughter in 2011, they named her Ada, after Lady Lovelace.

The Innovators is a must-read for anyone looking to better understand the creative mind.

(h/t The Daily Beast)

Four Reasons Why Plato Matters

Plato devoted his life to one goal: helping people reach a state of fulfillment. To this day, his ideas remain deeply relevant, provocative, and fascinating. Philosophy, to Plato, was a tool to help us change the world.

In this short video, Alain de Botton reminds us of the four big ideas Plato had for making life more fulfilling.

Transcribed highlights below.

1. Think More

We rarely give ourselves time to think carefully and logically about our lives and how to lead them. Sometimes we just go along with what the Greeks called Doxa, or common sense. In the thirty-six books he wrote, Plato showed this common sense to be riddled with errors, prejudice, and superstition. … The problem is that popular opinions edge us toward the wrong values. … Plato’s answer is know yourself. (This) means doing a special kind of therapy: Philosophy. This means subjecting your ideas to examination rather than acting on impulse. … This kind of examination is called a Socratic discussion.

2. Let Your Lover Change You

That sounds weird if you think that love means finding someone who wants you just the way you are. In his dialogue, The Symposium, … Plato says true love is admiration. In other words, the person you need to get together with should have very good qualities, which you yourself lack. … By getting close to this person you can become a little like they are. The right person for us helps us grow to our full potential. … For Plato ‘a couple shouldn’t love each other exactly as they are right now,’ rather they should be committed to educating each other and enduring the stormy passages that inevitably involves. Each person should want to seduce the other into becoming a better version of themselves.

3. Decode the Message of Beauty

Everyone pretty much likes beautiful things but Plato was the first to ask why do we like them? He found a fascinating reason: beautiful objects are whispering important truths to us about the good life. We find things beautiful when we sense qualities in them that we need but are constantly missing in our lives: gentleness; harmony; balance; peace; (and) strength. Beautiful objects therefore have a really important function: they help to educate our souls.

4. Reform Society

Plato spent a lot of time thinking about how the government and society should ideally be. He was the world’s first utopian thinker.

In this, he was inspired by Athens’s great rival: Sparta. This was a city-sized machine for turning out great soldiers. Everything the Spartans did – how they raised their children, how their economy was organised, whom they admired, how they had sex, what they ate – was tailored to that one goal. And Sparta was hugely successful, from a military point of view.

But that wasn’t Plato’s concern. He wanted to know: how could a society get better at producing not military power but eudaimonia? How could it reliably help people towards fulfillment?

In his book, The Republic, Plato identifies a number of changes that should be made:

We need new heroes

Athenian society was very focused on the rich, like the louche aristocrat Alcibiades, and sports celebrities, like the boxer Milo of Croton. Plato wasn’t impressed: it really matters who we admire, for celebrities influence our outlook, ideas and conduct. And bad heroes give glamour to flaws of character.

Plato therefore wanted to give Athens new celebrities, replacing the current crop with ideally wise and good people he called Guardians: models for everyone’s good development. These people would be distinguished by their record of public service, their modesty and simple habits, their dislike of the limelight and their wide and deep experience. They would be the most honoured and admired people in society.

End Democracy

He also wanted to end democracy in Athens. He wasn’t crazy; he just observed how few people think properly before they vote. Therefore we get very substandard rulers. He didn’t want to replace democracy with a dictatorship, but he wanted to prevent people from voting until they’d started to think rationally. That is, until they became philosophers. … To help the process Plato started a school: The Academy.

Still curious? So where do you go from here? The Great Books program at St. John’s College in Annapolis recommends this edition of Plato’s Complete Works. Another place to start is this slightly more detailed introduction to Plato.

The Ten Pillars of Cutthroat Zen

Dan Harris turned to meditation after a panic attack on live TV in front of millions of people.

In the back of his excellent book, 10% Happier: How I Tamed the Voice in My Head, Reduced Stress Without Losing My Edge, and Found Self-Help That Actually Works–A True Story, he includes a section that he wanted to call “The Ten Pillars of Cutthroat Zen” but ended up calling The Way of the Worrier.

1. Don’t Be a Jerk
2. (And/ But . . .) When Necessary, Hide the Zen
3. Meditate
4. The Price of Security Is Insecurity— Until It’s Not Useful
5. Equanimity Is Not the Enemy of Creativity
6. Don’t Force It
7. Humility Prevents Humiliation
8. Go Easy with the Internal Cattle Prod
9. Nonattachment to Results
10. What Matters Most?

Don’t Be a Jerk

It is, of course, common for people to succeed while occasionally being nasty. I met a lot of characters like this during the course of my career, but they never really seemed very happy to me. It is sometimes assumed that success in a competitive business requires the opposite of compassion. In my experience, though, that only reduced my clarity and effectiveness, leading to rash decisions. The virtuous cycle that Joseph described (more metta, better decisions, more happiness, and so on) is real. To boot, compassion has the strategic benefit of winning you allies. And then there’s the small matter of the fact that it makes you a vastly more fulfilled person.

(And/ But . . .) When Necessary, Hide the Zen

Be nice, but don’t be a palooka.

Even though I’d achieved a degree of freedom from the ego, I still had to operate in a tough professional context. Sometimes you need to compete aggressively, plead your own case, or even have a sharp word with someone. It’s not easy, but it’s possible to do this calmly and without making the whole thing overly personal.

Meditate

Meditation is the superpower that makes all the other precepts possible. The practice has countless benefits— from better health to increased focus to a deeper sense of calm— but the biggie is the ability to respond instead of react to your impulses and urges. We live our life propelled by desire and aversion. In meditation, instead of succumbing to these deeply rooted habits of mind, you are simply watching what comes up in your head nonjudgmentally. For me, doing this drill over and over again had massive off-the-cushion benefits, allowing me—at least 10% of the time— to shut down the ego with a Reaganesque “There you go again.”

The Price of Security Is Insecurity— Until It’s Not Useful

Mindfulness proved a great mental thresher for separating wheat from chaff, for figuring out when my worrying was worthwhile and when it was pointless. Vigilance, diligence, the setting of audacious goals— these are all the good parts of “insecurity.” Hunger and perfectionism are powerful energies to harness. Even the much-maligned “comparing mind” can be useful. I compared myself to Joseph, Mark, and Sharon, and it made me happier. I compared myself to Bianca and it made me nicer. I compared myself to Bill Weir, David Muir, Chris Cuomo, David Wright, et al., and it upped my game. In my view, Buddhists underplay the utility of constructive anguish. In one of his dharma talks, I heard Joseph quote a monk who said something like, “There’s no point in being unhappy about things you can’t change, and no point being unhappy about things you can.” To me, this gave short shrift to the broad gray area where it pays to wring your hands at least a little bit.

Equanimity Is Not the Enemy of Creativity

Being happier did not, as many fear, make me a blissed-out zombie. This myth runs deep, all the way back to Aristotle, who said, “All men who have attained excellence in philosophy, in poetry, in art and in politics . . . had a melancholic habitus.” I found that rather than rendering me boringly problem-free, mindfulness made me, as an eminent spiritual teacher once said, “a connoisseur of my neuroses.” One of the most interesting discoveries of this whole journey was that I didn’t need my demons to fuel my drive— and that taming them was a more satisfying exercise than indulging them. Jon Kabat-Zinn has theorized that science may someday show that mindfulness actually makes people more creative, by clearing out the routinized rumination and unhelpful assumptions, making room for new and different thoughts. On retreat, for example, I would be flooded with ideas, filling notebooks with them, scribbling them down on the little sheets of paper between sitting and walking. So, who knows, maybe Van Gogh would have been an even better painter if he hadn’t been so miserable that he sliced off his ear?

Don’t Force It

It’s hard to open a jar when every muscle in your arm is tense. A slight relaxation served me well on the set of GMA, in interpersonal interactions, and when I was writing scripts. I came to see the benefits of purposeful pauses, and the embracing of ambiguity. It didn’t work every time, mind you, but it was better than my old technique of bulldozing my way to an answer.

Humility Prevents Humiliation

We’re all the stars of our own movies, but cutting back on the number of Do you know who I am? thoughts made my life infinitely smoother. When you don’t dig in your heels and let your ego get into entrenched positions from which you mount vigorous, often irrational defenses, you can navigate tricky situations in a much more agile way. For me humility was a relief, the opposite of humiliation. It sanded the edges off of the comparing mind. Of course, striking the right balance is delicate; it is possible to take this too far and become a pushover. (See precept number two, regarding hiding the Zen.)

Go Easy with the Internal Cattle Prod

As part of my “price of security” mind-set, I had long assumed that the only route to success was harsh self-criticism. However, research shows that “firm but kind” is the smarter play. People trained in self-compassion meditation are more likely to quit smoking and stick to a diet. They are better able to bounce back from missteps. All successful people fail. If you can create an inner environment where your mistakes are forgiven and flaws are candidly confronted, your resilience expands exponentially.

Nonattachment to Results

Nonattachment to results + self compassion = a supple relentlessness that is hard to match. Push hard, play to win, but don’t assume the fetal position if things don’t go your way. This, I came to believe, is what T. S. Eliot meant when he talked about learning “to care and not to care.”

What Matters Most?

One day, I was having brunch with Mark and Joseph, forcing them to help me think about the balance between ambition and equanimity for the umpteenth time. After the entrées and before dessert, Joseph got up to hit the bathroom. He came back smiling and pronounced, “I’ve figured it out. A useful mantra in those moments is ‘What matters most?’ ” At first, this struck me as somewhat generic, but as I sat with the idea for a while, it eventually emerged as the bottom-line, gut-check precept. When worrying about the future, I learned to ask myself: What do I really want? While I still loved the idea of success, I realized there was only so much suffering I was willing to endure. What I really wanted was aptly summed up during an interview I once did with Robert Schneider, the self-described “spastic” lead singer for the psych-pop group, Apples in Stereo. He was one of the happiest-seeming people I’d ever met: constantly chatting, perpetually in motion— he just radiated curiosity and enthusiasm. Toward the end of our interview, he said, “The most important thing to me is probably, like, being kind and also trying to do something awesome.”

If you think you’re on the verge of losing your way in life, I highly recommend Dan’s book.

What Is Time?


St. Augustine, the theologian and philosopher, famously posed the question ‘What is time?’ in The Confessions. After waxing on for a bit about what he can say about time, he admits that he is in a “sorry state, for I do not even know what I do not know!” Augustine is not alone.

Introducing Time: A Graphic Guide aims to help us understand the concept of time and its related puzzles “such as whether the past and future are real, whether time travel is possible, and the explanation of the direction of time.”

Clocks

In everyday life, we are probably most familiar with time from two sources: clocks, and our inner psychological experience of time.

Clocks are everywhere. There are grandfather clocks, watches, alarm clocks, even incense clocks that let you tell the time through scent.

There are also natural clocks.

But clocks existed well before the modern invention of portable artificial ones.

Over four thousand years ago, the Egyptians used obelisk shadow clocks, sundials, and water clocks which measured time by the flow of water passing through a stone vessel.

By 1800 BC, the ancient Babylonians had divided the day into hours, the hour into sixty minutes, and the minute into sixty seconds.

All the great civilizations of the past used the positions of the sun or stars to tell the time.

Looking at the stars with the naked eye, an ancient astronomer could tell the time to within fifteen minutes. And anyone can tell roughly the time merely by looking up at the sun.

Psychological Time

We also feel time pass. In addition to the physical time measured by various clocks, there is also psychological time. We have memories of the past and anticipations of the future. And we experience temporal durations of different sizes. We are personally, subjectively aware of time passing.

So time isn’t limited to clocks or to our experience of it; it’s more than both. But could time be merely in our heads? That’s what Augustine argued. The Persian philosopher Avicenna agreed with him: “Time is merely a feature of our memories and expectations.”

But can this be right?

Although people disagree about their feelings of how much time has passed, they also enjoy remarkable agreement about the temporal ordering of events.

[…]

Except in rare circumstances, everyone (who has the same information available) agrees – for the most part – on the time order of events. There is definitely something objective and independent of a particular person’s feelings about the time ordering. The objectivity of the ordering of events in time proves that there is more to time than just our psychological sense of its passage. There is the fact that events seem to be laid out in a unique and observer-independent succession in time.

Maybe all there is to time is clocks.

This is actually already a deep question. But, at least at first glance, it seems the answer is “no,” for we often talk about a clock being wrong. You might say your watch is ten minutes slow or even completely off. This may be your excuse for being late for an appointment. But is your watch an infallible guide to time? No; we know it will “lose” a few seconds per year, even if it’s pretty good.

Clocks and Time

Between each “tick” of the clock, we want the same amount of time to pass. It should be no surprise that pendulums, which have regular periodic motion, can be used as clocks. But pendulums aren’t perfect. On a boat in high seas their motion will be disrupted, and in hot weather they may behave differently than in cold weather.
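A quick aside from standard physics, not from the book: for small swings, a pendulum’s period depends only on its length and on gravity, roughly T ≈ 2π√(L/g). Heat makes the rod expand, the length L grows, each swing takes slightly longer, and the clock runs slow; cold does the reverse. Even a well-made pendulum, in other words, is only as uniform as the conditions around it.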

Consider a pendulum swinging back and forth twice. How do we know that the amount of time that passed on its first trip back and forth is the same as the amount of time that passed on its second trip? This question illustrates what the German philosopher Hans Reichenbach (1891–1953) called the “problem of the uniformity of time”.

The Uniformity of Time

Firstly, your personal estimates of time won’t be precise enough for science; we need to know whether the first trip took exactly the same amount of time as the second. Secondly, your feeling as to the amount of time that passed is subjective: you might say the same amount of time went by, but your friend might not think so. Thirdly, and most importantly, you’re measuring the time that passed with your thoughts, but these are – plausibly – physical processes, and so this merely pushes the question back a step. We would then have to ask how you know how long your thoughts last.

Newtonian Time

Real time, according to Newton, does not depend on any particular clock, or even any particular material object in the universe. Time is independent of the contents of the universe. It is this time that is used in the unchanging laws of physics. The laws of physics tell things where to be and when to be there. In telling them when to be where, Nature assumes a particular time measure.

According to Newton, we shouldn’t confuse any of our actual, imperfect clocks with the perfect, invisible clock that is independent of any physical object: Time.

Not everyone agrees with Newton. His idea of absolute time continues to be both influential and hugely controversial. In the language of philosophers of science, Newton is both a Realist – he thinks that the time mentioned in the laws of physics is really Time itself – and an Absolutist – he thinks that time is independent of any particular physical process.

Relationalism

Opponents of Absolutism, known as Relationalists, hold that time is essentially just change, or the measure of change. By change we mean change in the relationships between physical objects. Aristotle (384–322 BC), the Greek philosopher, held that time is simply the measure of motion. Time is the measure of one physical process against another.

In this view, contrary to Newton’s, time is dependent on the physical contents of the universe since time is defined via their change. Time for Aristotle is dependent on its sensible measure – actual physical clocks.

In the Relationalist view, because time is dependent on physical movement, it seems time doesn’t pass when there is no change. Can we conceive of looking at the stars, having them stop, and still being able to experience time passing? Aristotle considered this question and pointed out that in such a case we’re still measuring the progression of time with our changing thoughts and feelings. We need these to stop too.

Because our brains will be frozen too, it is true that we wouldn’t notice the passing of time. Could time pass by nonetheless? It would, if time is independent of change. So, in Newton’s view, it would at least be conceivable for time to pass without any change at all. But according to Relationalism, this is impossible. Time is just the measure of change. No change, no time.

Tensed Time
One part of the book that stood out for me was the tensed theory of time, which offers another use of branching. I connected the theory of tensed time to alternative histories.

The tensed theory of time probably best corresponds with one’s intuitive idea of time, or the idea of time shared with the proverbial “man in the street”. On this theory, the future is unreal. The event corresponding to what you will do after you read this sentence does not exist. The future is unsettled and ripe with possibility. As time passes, the world “chooses” one path from among all the available ones. The past is set and the present is that instantaneous point where the past and future meet. The world, in this picture, has the structure of a branching tree …

This theory corresponds to our idea that “what’s done is done”, that the past cannot be changed, and that the future can be changed because it is “open”.

Life Without Time
Adding to our thoughts on time is this beautiful passage from Mitch Albom’s The Time Keeper:

Try to imagine a life without timekeeping. You probably can’t. You know the month, the year, the day of the week. There is a clock on your wall or the dashboard of your car. You have a schedule, a calendar, a time for dinner or a movie. Yet all around you, timekeeping is ignored. Birds are not late. A dog does not check its watch. Deer do not fret over passing birthdays. Man alone chimes the hour. And, because of this, man alone suffers a paralyzing fear that no other creature endures. A fear of time running out.

Introducing Time goes on to further explore the nature of time and our relationship to it.

John Steinbeck on Love

Nobel laureate John Steinbeck is best known as the author of The Grapes of Wrath and Of Mice and Men, but we can pull from his letters a mix of insight and language that rivals that of Hunter S. Thompson.

Steinbeck hated the telephone. Letter writing was a more natural way for him to share his thoughts, on all manner of subjects, with both the people he liked and the ones he hated.

In Steinbeck: A Life in Letters, we find the master’s beautiful and passionate response to his eldest son Thom’s 1958 letter confessing his love for a girl named Susan.

While Steinbeck urges patience, a value increasingly lost in today’s hyper-connected world, he also highlights two kinds of love: one destructive, the other liberating.

New York
November 10, 1958

Dear Thom:

We had your letter this morning. I will answer it from my point of view and of course Elaine will from hers.

First — if you are in love — that’s a good thing — that’s about the best thing that can happen to anyone. Don’t let anyone make it small or light to you.

Second — There are several kinds of love. One is a selfish, mean, grasping, egotistical thing which uses love for self-importance. This is the ugly and crippling kind. The other is an outpouring of everything good in you — of kindness and consideration and respect — not only the social respect of manners but the greater respect which is recognition of another person as unique and valuable. The first kind can make you sick and small and weak but the second can release in you strength, and courage and goodness and even wisdom you didn’t know you had.

You say this is not puppy love. If you feel so deeply — of course it isn’t puppy love.

But I don’t think you were asking me what you feel. You know better than anyone. What you wanted me to help you with is what to do about it — and that I can tell you.

Glory in it for one thing and be very glad and grateful for it.

The object of love is the best and most beautiful. Try to live up to it.

If you love someone — there is no possible harm in saying so — only you must remember that some people are very shy and sometimes the saying must take that shyness into consideration.

Girls have a way of knowing or feeling what you feel, but they usually like to hear it also.

It sometimes happens that what you feel is not returned for one reason or another — but that does not make your feeling less valuable and good.

Lastly, I know your feeling because I have it and I’m glad you have it.

We will be glad to meet Susan. She will be very welcome. But Elaine will make all such arrangements because that is her province and she will be very glad to. She knows about love too and maybe she can give you more help than I can.

And don’t worry about losing. If it is right, it happens — The main thing is not to hurry. Nothing good gets away.

Love,