Tag: Isaac Asimov

Fiction that Influences and Inspires

Reading nonfiction is a fantastic way to expand your mind and give you an edge in this world. It’s especially useful when we have a specific idea or concept we’d like to learn more about. However, it’s important not to overlook everything we can learn from fiction.

Fiction resonates with us because it shows us truths about the human condition through great storytelling and compelling narratives. Through an engaging story we can be introduced to big ideas that just don’t resonate the same way in nonfiction: the medium allows for freedom of thought through creativity.

With this short book list, we’d like to take a look at a handful of novels that have inspired some truly extraordinary thinkers, especially today's leaders in technology. Some of these you're probably already aware of. Some not. But they're all worth a look.


The Great Gatsby by F. Scott Fitzgerald

Considered one of Fitzgerald’s greatest works, the novel follows the wealthy Jay Gatsby and his love for Daisy Buchanan during the Roaring Twenties. With its focus on wealth, excess, status, and privilege, some have called it a cautionary tale about the American Dream. It’s also just a hell of a yarn.

This is one of Bill and Melinda Gates’s favorite books. Mr. Gates says it’s “the novel that I reread the most. Melinda and I love one line so much that we had it painted on a wall in our house: ‘His dream must have seemed so close that he could hardly fail to grasp it.’”

It’s not only the Gateses who adore this book: the author Haruki Murakami has called it one of his favorites, and Chuck Palahniuk has said it was a source of inspiration for Fight Club. “It showed me how to write a ‘hero’ story by using an apostle as the narrator. Really it’s the basis of the triangle of two men and one woman in my book, Fight Club. I read the book at least once a year and it continues to surprise me with layers of emotion.”

The Remains of the Day by Kazuo Ishiguro

The story paints a spiritual portrait of the quintessential English butler as his world changes from World War I era to the 1950s. The themes of professionalism and dignity versus authenticity are prevalent throughout the novel.

This is Jeff Bezos’s favorite book. “If you read The Remains of the Day, which is my favorite book of all time, you can’t help but come away and think, I just spent 10 hours living an alternate life and I learned something about life and about regret.”

Actress and UN Goodwill Ambassador Emma Watson has also cited this book as one of her favorites. “When I was growing up, my family, particularly my father, were very stoic. Part of me is very resentful of this British mentality that it's not good to express feelings of any kind – that it's not proper or brave.” She has said she appreciates the book for how it expressed the consequences of this type of discretion.

The Catcher in the Rye by J.D. Salinger

The book that introduced us to the ever-loved and ever-hated Holden Caulfield. The unique narrative gives us a glimpse into the mind of a 16-year-old boy and the events surrounding his expulsion from prep school.

Bill Gates has said, “I read this when I was 13. It’s my favorite book. It acknowledges that young people are a little confused, but can be smart, and see things that adults don’t.”

Salman Khan, founder of Khan Academy, also lists this as one of his favorite books.

A Wrinkle in Time by Madeleine L'Engle

The second book centered on a teenager is A Wrinkle in Time, which brings us into science fiction. Some of the most innovative ideas of the last two centuries (trains, planes, robots) were considered science fiction at one point and appeared in stories before they came about in real life. Science fiction is thus a window into our visions of the future, and it tells us a great deal about what people of certain eras were both looking forward to and afraid of.

A Wrinkle in Time follows high schooler Meg Murry as she travels through space and time on a quest to save her father. The novel uses Meg’s extreme, out-of-this-world situations as a way to explore the very real trials of teenagers.

Sheryl Sandberg, COO of Facebook, has called A Wrinkle in Time her favorite book as a child.

I wanted to be Meg Murry, the admittedly geeky heroine of “A Wrinkle in Time,” by Madeleine L’Engle. I loved how she worked with others to fight against an unjust system and how she fought to save her family against very long odds. I was also captivated by the concept of time travel. I keep asking Facebook’s engineers to build me a tesseract so I, too, could fold the fabric of time and space. But so far no one has even tried.

Jeff Bezos also loved the book. “I remember in fourth grade we had this wonderful contest — there was some prize — whoever could read the most Newbery Award winners in a year. I didn't end up winning. I think I read like 30 Newbery Award winners that year, but somebody else read more. The standout there is the old classic that I think so many people have read and enjoyed, A Wrinkle in Time, and I just remember loving that book.”

Seveneves / Snow Crash / Cryptonomicon by Neal Stephenson

The sci-fi author Neal Stephenson comes up multiple times in the reading lists of some incredibly successful individuals. Above are the three titles that came up the most.

Bill Gates has said that Stephenson’s novel Seveneves rekindled his love for sci-fi, a genre he thinks can be used as a vehicle to help people think about big ideas. With Seveneves in particular, he was struck by “the way the book pushes you to think big and long-term. If everyone learned that the world would end two days from now, there would be global panic, plus a big dose of hedonism. But what if it were ending two years from now? Would people keep going to work? Would kids go to school? If they did, what would you teach them?”

The novel gives us an idea of what might happen if the world were ending and we were forced to escape to space. If that idea wasn’t interesting enough, the book also shoots forward 5,000 years and has the humans going back to what once was Earth.

Larry Page, co-founder of Google, has Stephenson’s Snow Crash in his list of favorite books.

That story takes place in a future America where our protagonist, Hiro, is a hacker and pizza delivery boy for the mafia in reality and a warrior prince in the Metaverse. Stephenson gives us a glimpse of what a world would look like in which much of our time, and much of our sense of self, plays out in a shared virtual space, and he effortlessly weaves together concepts from religion, economics, politics, linguistics, and computer science.

Meanwhile, Samuel Arbesman, the complexity scientist and author of The Half-Life of Facts (whom we interviewed recently), told us that Stephenson's Cryptonomicon is one of the best books he's ever read, saying:

The idea that there can be a book that weaves together an amazing plot as well as some really really profound ideas on philosophy and computer science and technology together, that was, I think one of the first times I had seen a book that had really done this. There were these unbelievably informational pieces. It’s also an unbelievably fun read. I’m a big fan of most of Stephenson’s work. I love his stuff, but I would say Cryptonomicon was one in particular that really demonstrated that you could do this kind of thing together.

The Foundation Trilogy by Isaac Asimov

Isaac Asimov was another author who appeared on multiple lists; his Foundation series in particular has influenced an extraordinary number of people. The series centers on a group of academics (the Foundation) as they struggle to preserve civilization during the fall of the Galactic Empire.

In more than one interview, Elon Musk has said that he was greatly influenced by the Foundation series. The books taught him that “the lessons of history would suggest that civilizations move in cycles. You can track that back quite far — the Babylonians, the Sumerians, followed by the Egyptians, the Romans, China. We're obviously in a very upward cycle right now and hopefully that remains the case. But it may not. There could be some series of events that cause that technology level to decline. Given that this is the first time in 4.5bn years where it's been possible for humanity to extend life beyond Earth, it seems like we'd be wise to act while the window was open and not count on the fact it will be open a long time.”

The series also influenced the likes of George Lucas and Douglas Adams. Speaking of…

The Hitchhiker’s Guide to the Galaxy by Douglas Adams

The story chronicles earthling Arthur Dent’s amazing voyage through space after he escapes the destruction of Earth.

Elon Musk considers Douglas Adams one of the great modern philosophers. It was Adams who taught him that “the question is harder than the answer. When we ask questions they come along with our biases. You should really ask, ‘Is this the right question?’ And that’s hard to figure out.”

It’s interesting to note that Musk happened upon the book at a time when he says he was going through an existential crisis (between the ages of 12 and 15). He first turned to Friedrich Nietzsche and Arthur Schopenhauer but found what he needed in Adams instead. Salman Khan, founder of Khan Academy, also lists this as one of his favorite books.


This is in no way an exhaustive list of fiction that has influenced people whom we admire, but we hope that it has inspired you to find more places for those big ideas. Happy Reading!

If you enjoyed this post, check out a few other book recommendation lists we've put out recently:

Book Recommendations by the Legendary Washington Post CEO Don Graham – Among his answers are his favorite fiction and nonfiction books and the book that will stay with him forever.

A Short List of Books for Doing New Things – Andrew Ng thinks innovation and creativity can be learned — that they are pattern-recognition and combinatorial creativity exercises which can be performed by an intelligent and devoted practitioner with the right approach. He also encourages the creation of new things: new businesses, new technologies. And on that topic, Ng has a few book recommendations.

What Can We Learn From the Prolific Mr. Asimov?

To learn is to broaden, to experience more, to snatch new aspects of life for yourself. To refuse to learn or to be relieved at not having to learn is to commit a form of suicide; in the long run, a more meaningful type of suicide than the mere ending of physical life. 

Knowledge is not only power; it is happiness, and being taught is the intellectual analog of being loved.

— Isaac Asimov, Yours, Isaac Asimov: A Life in Letters


Fans estimate that the erudite polymath Isaac Asimov authored nearly 500 full-length books during his life. Even if some that “don't count” are removed from the list — anthologies he edited, short science books he wrote for young people, and so on — Asimov's output still reaches into the many hundreds of titles. Starting with a spate of science-fiction novels in the 1950s, including the now-classic Foundation series, Asimov's writing eventually ranged into nonfiction with works of popular science, Big History, and even annotated guides to classics like Paradise Lost and Gulliver's Travels.

Among his works was a 1,200-page Guide to the Bible. He also wrote books on Greece, Rome, Egypt, and the Middle East; a wonderful Guide to Shakespeare and a comprehensive Chronology of the World; and books on carbon, nitrogen, photosynthesis, the Moon, the Sun, and the human body, along with many more scientific topics. He coined the term “robotics,” and his stories led to modern movies like I, Robot and Bicentennial Man. He wrote one of the most popular short stories of all time, “The Last Question.” He even wrote a few joke books and a book of limericks.

His Intelligent Man's Guide to Science, a 500,000 word epic written in a mad dash of eight months, was nominated for a National Book Award in 1961, losing only to William Shirer's bestselling history of Nazi Germany, The Rise and Fall of the Third Reich.

His science-fiction books continue to sell to this day and are considered foundational works of the genre. He won more than a dozen book awards. His science and history books were considered some of the best published for lay audiences — the only real complaint we can make is that a few of them are outdated now. (We'll give Asimov a pass for not updating them, since he's been dead for almost 25 years.)

In his free time, he was reputed to have written over 90,000 letters while keeping a monthly column in The Magazine of Fantasy and Science Fiction for 33 years, from 1958 to 1991. Between the magazine and numerous other outlets, Asimov compiled somewhere near 1,600 essays over his life.

In other words, the man was a writer through and through, which leads to an obvious question:

What can we mortals learn from the Prolific Mr. Asimov?

Make the Time — No Excuses

Many people complain that they don't have time for their passions because of the unavoidable duties that suck up every free moment. Well, Asimov had duties too, but he got his writing career started anyway. From 1949 until 1958, Asimov doubled as a professor of biochemistry at Boston University, during which time he completed 28 novels and a list of short stories long enough to fill most writers' entire careers. He simply made the time to write.

In a posthumously published memoir, Asimov reflects on the “candy store” schedule instilled in him by his father, who'd worked long hours running a convenience store in New York after emigrating from Russia. When Asimov became a professional writer, he kept the heroic schedule for himself:

I wake at five in the morning. I get to work as early as I can. I work as long as I can. I do this every day of the week, including holidays. I don't take vacations voluntarily and I try to do my work even when I'm on vacation. (And even when I'm in the hospital.)

In other words, I am still and forever in the candy store. Of course, I'm not waiting on customers; I'm not taking money and making change; I'm not forced to be polite to everyone who comes in (in actual fact, I was never good at that). I am, instead, doing things I very much want to do — but the schedule is there; the schedule that was ground into me; the schedule you would think I would have rebelled against once I had the chance.

Know Your Spots, and Stick to Those Spots

“I'm no genius, but I'm smart in spots, and I stay around those spots.”
—Thomas Watson, Sr., Founder of IBM

Even though he'd been writing in his spare time as a professor, Asimov was not doing any academic research, which did not go unnoticed by his superiors at Boston University. Asimov's success as an author, combined with his dedication to his craft, forced him into a decision: be an academic or be a popular writer. The decision needed no fretting — he was making so much money and having such a large impact as a writer that he knew he'd be a fool to give it up. His rationalization to the school was wise and instructive:

I finally felt angry enough to say, “…as a science writer, I am extraordinary. I plan to be the best science writer in the world and I will shed luster on the medical school [at BU]. As a researcher, I am simply mediocre and…if there's one thing this school does not need, it is one more merely mediocre researcher.”

[One faculty member complimented him on his bravery in fighting for academic freedom.] I shrugged. “There's no bravery about it. I have academic freedom, and I can give it to you in two words.”

“What's that?” he said.

“Outside income,” I said.

In other words, Asimov knew his circle of competence and knew himself. He made that clear again in a 1988 interview, when he was asked about a number of projects and interests outside of writing. He demurred on all of them:

SW: Do you have any time left for other things besides writing?

IA: All I do is write. I do practically nothing else, except eat, sleep and talk to my wife.


SW: Have you ever written any screenplays for SF movies?

IA: No, I have no talent for that, and I don't want to get mixed up with Hollywood. If they are going to do something of mine, they will have to find someone else to write the screenplays.


SW: Do you like the covers of your books? Do you have any input in their design?

IA: No, I don't have any input into that. Publishers take care of that entirely. They never ask any questions and I never offer any advice, because my artistic talent is zero.


SW: Do you have a favorite SF painter?

IA: Well, there are a number of painters that I like very much. To name just a few: Michael Whelan and Boris Vallejo are among my favorites. I'm impressed by them, but that doesn't necessarily mean anything – I don't know that I have any taste in art.


SW: Have you ever tried to paint something yourself?

IA: No, I can't even draw a straight line with a ruler.


SW: Do you have any favorite SF writers?

IA: My favorite is Arthur C. Clarke. I also like people like Fred Pohl or Larry Niven and others who know their science. I like Harlan Ellison, too, although his stories are terribly emotional. But I don't consider myself a judge of good science fiction – not even my own.

Asimov knew and recognized his own constitution at a fairly early age, smartly seizing opportunities to build his life around that self-awareness in the way Hunter S. Thompson would advise young people to do years later.

In a separate posthumously published autobiography, Asimov reflected on his highly independent nature:

I never found true peace till I turned my whole working life into self-employment. I was not made to be an employee.

For that matter, I strongly suspect I was not made to be an employer either. At least I have never had the urge to have a secretary or helper of any kind. My instinct tells me that there would surely be interactions that would slow me down. Better to be a one-man operation, which I eventually became and remained.

Find What You Love, and Work Like Hell

To be prolific, he warns, one must be a “driven, non-stop person.”
— Interview with Isaac Asimov, 1979

Although Asimov was working the “candy store” hours and producing more output than nearly anyone of his generation, it was clear that he did it out of love.  The only reason he was able to write so much, he said, was “pure hedonism.”  He simply couldn't not write. That would have been unfathomable.

One admission from his autobiography tells the tale best:

One of the few depressing lunches I have had with Austin Olney [Houghton Mifflin editor] came on July 7, 1959. I incautiously told him of the various books I had in progress, and he advised me strongly not to write so busily. He said my books would compete with each other, interfere with each other's sales, and do less well per book if there were many.

The one thing I had learned in my ill-fated class in economics in high school was “the law of diminishing returns,” whereby working ten times as hard or investing ten times as much or producing ten times the quantity does not yield ten times the return.

I was rather glum after that meal and gave the matter much thought afterward.

What I decided was that I wasn't writing ten times as many books in order to get ten times the monetary returns, but in order to have ten times the pleasure.

One of Asimov's best methods for keeping the work flowing was to have more than one project going at a time. If he got writer's block or got bored with one project, he simply switched to another, a tactic that kept him from stopping work to agonize and procrastinate. By the time he came back to the first project, he found the writing flowed easily once again.

This sort of “switching” is a hugely useful method for improving your overall productivity and avoiding major hair-pulling roadblocks. You can also use the tactic with books to improve your overall reading yield, switching between them as your mood and energy dictate.

Never Stop Learning

If anything besides sheer productivity defined Asimov, it was a thirst for knowledge. He simply never stopped learning, and with that attitude, he grew into a mental giant who was more than once accused of “knowing everything”:

Nothing goes to waste, if you're determined to learn. I had already learned, for instance, that although I was one of the most overeducated people I knew, I couldn't possibly write the variety of books I manage to do out of the knowledge I had gained in school alone. I had to keep a program of self-education in process. 


And, as I went on to discover, each time I wrote a book on some subject outside my immediate field it gave me courage and incentive to do another one that was perhaps even farther outside the narrow range of my training…I advanced from chemical writer to science writer, and, eventually, I took all of my learning for my subject (or at least all that I could cram into my head — which, alas, had a sharply limited capacity despite all I could do).

As I did so, of course, I found that I had to educate myself. I had to read books on physics to reverse my unhappy experiences in school on the subject and to learn at home what I had failed to learn in the classroom — at least up to the point where my limited knowledge of mathematics prevented me from going farther.

When the time came, I read biology, medicine, and geology. I collected commentaries on the Bible and on Shakespeare. I read history books. Everything led to something else. I became a generalist by encouraging myself to be generally interested in all matters.


As I look back on it, it seems quite possible that none of this would have happened if I had stayed at school and had continued to think of myself as, primarily, a biochemist…[so] I was forced along the path I ought to have taken of my own accord if I had had the necessary insight into my own character and abilities.

(Source: It's Been a Good Life)

Still interested? Check out Asimov's memoir I, Asimov, his collection of stories I, Robot, or his collection of letters, Yours, Isaac Asimov: A Life in Letters.

Isaac Asimov: Integrity over Honesty

This thought from Isaac Asimov sums up much of Farnam Street’s ethos in how to operate ethically in the world.

From the book he co-authored with his wife How to Enjoy Writing: A Book of Aid and Comfort.

Integrity is, to me, a somewhat stronger word than “honesty.” “Honesty” often implies truth-telling and little more, but “integrity” implies wholeness, soundness, a complex philosophy of life.

To have integrity is to stand by your word, to have a sense of honor, to do what you have agreed to do and to do it as best as you can. To have integrity is to be satisfied with nothing less than the best job you can do.

In that sense, anyone can have integrity, regardless of how small and unimportant a role he may play in the world…

A bit later, Asimov gives a short example of his concept of integrity:

Integrity not only simplifies your life by making it easy to come to a decision, but it may keep you out of trouble.

A writer I knew slightly once suggested that I write a book very quickly and that I then engage in complicated financial dealings that would involve my risking some money to begin with. The book I wrote would, however, fail and that would enable me to write off so much money as a loss that I would save on taxes many, many times what I had invested in the book. Of course, we would have to be certain that my book would be a failure, so I would have to undertake to write a really bad one. I would be taking advantage of a “tax shelter” in this way, and it was all perfectly legal.

I shook my head. “No,” I said. “It’s perfectly possible for me to write a bad book while I am trying honestly to write a good one, but writing a bad one on purpose is more than I can undertake to do, no matter how much money it would save me on taxes and no matter how legal it might be.”

I walked away and, a couple of years later, I read that the fellow who had advanced this proposition to me was now on trial for this same “tax shelter.” I was rather relieved that I had been simpleminded enough to have integrity.

Still Interested? See our post on The Difference Between Truth and Honesty.

How Do People Get New Ideas?

In a previously unpublished 1959 essay, Isaac Asimov explores how people get new ideas.

Echoing Einstein and Seneca, Asimov believes that new ideas come from combining things together. Steve Jobs thought the same thing.

What if the same earth-shaking idea occurred to two men, simultaneously and independently? Perhaps, the common factors involved would be illuminating. Consider the theory of evolution by natural selection, independently created by Charles Darwin and Alfred Wallace.

There is a great deal in common there. Both traveled to far places, observing strange species of plants and animals and the manner in which they varied from place to place. Both were keenly interested in finding an explanation for this, and both failed until each happened to read Malthus’s “Essay on Population.”

Both then saw how the notion of overpopulation and weeding out (which Malthus had applied to human beings) would fit into the doctrine of evolution by natural selection (if applied to species generally).

Obviously, then, what is needed is not only people with a good background in a particular field, but also people capable of making a connection between item 1 and item 2 which might not ordinarily seem connected.

Undoubtedly in the first half of the 19th century, a great many naturalists had studied the manner in which species were differentiated among themselves. A great many people had read Malthus. Perhaps some both studied species and read Malthus. But what you needed was someone who studied species, read Malthus, and had the ability to make a cross-connection.

That is the crucial point that is the rare characteristic that must be found. Once the cross-connection is made, it becomes obvious. Thomas H. Huxley is supposed to have exclaimed after reading On the Origin of Species, “How stupid of me not to have thought of this.”


Making the cross-connection requires a certain daring. It must, for any cross-connection that does not require daring is performed at once by many and develops not as a “new idea,” but as a mere “corollary of an old idea.”

It is only afterward that a new idea seems reasonable. To begin with, it usually seems unreasonable. It seems the height of unreason to suppose the earth was round instead of flat, or that it moved instead of the sun, or that objects required a force to stop them when in motion, instead of a force to keep them moving, and so on.

The paradox here is that crazy people are good at seeing new connections too, one notable difference being the outcome.

As a brief aside, I wonder if people are creative in part because they are autodidacts, rather than autodidacts because they are creative. The formal education system doesn't exactly encourage creativity. Generally, there are right and wrong answers, and we're taught to get the right answer. Autodidacts try new things, often learning negative knowledge instead of positive knowledge.

When you're right about connections that others cannot see, you are called a creative genius. When you're wrong, however, you're often labeled mentally ill.

This comes back to Keynes: “Worldly wisdom teaches that it is better for the reputation to fail conventionally than to succeed unconventionally.”

A great way to connect things is with a commonplace book.

The Relativity of Wrong

“The basic trouble, you see, is that people think that ‘right’ and ‘wrong’ are absolute; that everything that isn't perfectly and completely right is totally and equally wrong.”
— Isaac Asimov

Isaac Asimov received a letter one day from a fellow who wanted to argue with one of Asimov's essays.

Asimov used this short essay to highlight the nuances of being wrong.

It seemed that in one of my innumerable essays, I had expressed a certain gladness at living in a century in which we finally got the basis of the universe straight.

I didn't go into detail in the matter, but what I meant was that we now know the basic rules governing the universe, together with the gravitational interrelationships of its gross components, as shown in the theory of relativity worked out between 1905 and 1916. We also know the basic rules governing the subatomic particles and their interrelationships, since these are very neatly described by the quantum theory worked out between 1900 and 1930. What's more, we have found that the galaxies and clusters of galaxies are the basic units of the physical universe, as discovered between 1920 and 1930.

These are all twentieth-century discoveries, you see.

The young specialist in English Lit, having quoted me, went on to lecture me severely on the fact that in every century people have thought they understood the universe at last, and in every century they were proved to be wrong. It follows that the one thing we can say about our modern “knowledge” is that it is wrong. The young man then quoted with approval what Socrates had said on learning that the Delphic oracle had proclaimed him the wisest man in Greece. “If I am the wisest man,” said Socrates, “it is because I alone know that I know nothing.” The implication was that I was very foolish because I was under the impression I knew a great deal.

My answer to him was, “John, when people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.”

The basic trouble, you see, is that people think that “right” and “wrong” are absolute; that everything that isn't perfectly and completely right is totally and equally wrong.

However, I don't think that's so. It seems to me that right and wrong are fuzzy concepts, and I will devote this essay to an explanation of why I think so.

When my friend the English literature expert tells me that in every century scientists think they have worked out the universe and are always wrong, what I want to know is how wrong are they? Are they always wrong to the same degree?

Asimov's friend, with his mental framing of absolute rights and wrongs, believed that all theories are wrong because they are eventually proven incorrect. But he ignored the degree of incorrectness: there is an important distinction to be made between degrees of wrongness.

What actually happens is that once scientists get hold of a good concept they gradually refine and extend it with greater and greater subtlety as their instruments of measurement improve. Theories are not so much wrong as incomplete.

This can be pointed out in many cases other than just the shape of the earth. Even when a new theory seems to represent a revolution, it usually arises out of small refinements. If something more than a small refinement were needed, then the old theory would never have endured.

Copernicus switched from an earth-centered planetary system to a sun-centered one. In doing so, he switched from something that was obvious to something that was apparently ridiculous. However, it was a matter of finding better ways of calculating the motion of the planets in the sky, and eventually the geocentric theory was just left behind. It was precisely because the old theory gave results that were fairly good by the measurement standards of the time that kept it in being so long.

Again, it is because the geological formations of the earth change so slowly and the living things upon it evolve so slowly that it seemed reasonable at first to suppose that there was no change and that the earth and life always existed as they do today. If that were so, it would make no difference whether the earth and life were billions of years old or thousands. Thousands were easier to grasp.

But when careful observation showed that the earth and life were changing at a rate that was very tiny but not zero, then it became clear that the earth and life had to be very old. Modern geology came into being, and so did the notion of biological evolution.

If the rate of change were more rapid, geology and evolution would have reached their modern state in ancient times. It is only because the difference between the rate of change in a static universe and the rate of change in an evolutionary one is that between zero and very nearly zero that the creationists can continue propagating their folly.

Since the refinements in theory grow smaller and smaller, even quite ancient theories must have been sufficiently right to allow advances to be made; advances that were not wiped out by subsequent refinements.

The Greeks introduced the notion of latitude and longitude, for instance, and made reasonable maps of the Mediterranean basin even without taking sphericity into account, and we still use latitude and longitude today.

The Sumerians were probably the first to establish the principle that planetary movements in the sky exhibit regularity and can be predicted, and they proceeded to work out ways of doing so even though they assumed the earth to be the center of the universe. Their measurements have been enormously refined but the principle remains.

Naturally, the theories we now have might be considered wrong in the simplistic sense of my English Lit correspondent, but in a much truer and subtler sense, they need only be considered incomplete.

Kathryn Schulz writes about something very similar:

Because so many scientific theories from bygone eras have turned out to be wrong, we must assume that most of today’s theories will eventually prove incorrect as well. And what goes for science goes in general. Politics, economics, technology, law, religion, medicine, child-rearing, education: no matter the domain of life, one generation’s verities so often become the next generation’s falsehoods that we might as well have a Pessimistic Meta-Induction from the History of Everything.

Good scientists understand this. They recognize that they are part of a long process of approximation. They know that they are constructing models rather than revealing reality…

Still curious? Read Kathryn Schulz's explanation of how we feel when people disagree with us. Also check out why old knowledge isn't necessarily a waste.

The Half-life of Facts

Facts change all the time. Smoking has gone from doctor-recommended to deadly. We used to think the Earth was the center of the universe and that Pluto was a planet. For decades we were convinced that the brontosaurus was a real dinosaur.

Knowledge, like milk, has an expiry date. That's the key message behind Samuel Arbesman's excellent new book The Half-life of Facts: Why Everything We Know Has an Expiration Date.

We're bombarded with studies that seemingly prove this or that. Caffeine is good for you one day and bad for you the next. What we think we know and understand about the world is constantly changing. Nothing is immune. While big ideas are overturned infrequently, little ideas churn regularly.

As scientific knowledge grows, we end up rethinking old knowledge. Arbesman calls this “a churning of knowledge.” But understanding that facts change (and how they change) helps us cope in a world of constant uncertainty. We can never be too sure of what we know.

In introducing this idea, Arbesman writes:

Knowledge is like radioactivity. If you look at a single atom of uranium, whether it’s going to decay — breaking down and unleashing its energy — is highly unpredictable. It might decay in the next second, or you might have to sit and stare at it for thousands, or perhaps even millions, of years before it breaks apart.

But when you take a chunk of uranium, itself made up of trillions upon trillions of atoms, suddenly the unpredictable becomes predictable. We know how uranium atoms work in the aggregate. As a group of atoms, uranium is highly regular. When we combine particles together, a rule of probability known as the law of large numbers takes over, and even the behavior of a tiny piece of uranium becomes understandable. If we are patient enough, half of a chunk of uranium will break down in 704 million years, like clockwork. This number — 704 million years — is a measurable amount of time, and it is known as the half-life of uranium.

It turns out that facts, when viewed as a large body of knowledge, are just as predictable. Facts, in the aggregate, have half-lives: We can measure the amount of time for half of a subject’s knowledge to be overturned. There is science that explores the rates at which new facts are created, new technologies developed, and even how facts spread. How knowledge changes can be understood scientifically.

This is a powerful idea. We don’t have to be at sea in a world of changing knowledge. Instead, we can understand how facts grow and change in the aggregate, just like radioactive materials. This book is a guide to the startling notion that our knowledge — even what each of us has in our head — changes in understandable and systematic ways.
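Arbesman's uranium analogy can be made concrete with a small simulation. The sketch below (a simple illustration, not anything from the book) gives each of many "facts" the same small chance of being overturned each year, chosen so the aggregate half-life comes out to a chosen value. Any single fact's fate is unpredictable, but the law of large numbers makes the aggregate regular:

```python
import random

def simulate_decay(n_facts, half_life, years, seed=42):
    """Simulate n_facts items, each with a constant per-year chance of
    surviving, chosen so the aggregate half-life equals half_life.
    Returns the fraction of facts still standing after `years` years."""
    rng = random.Random(seed)
    # Per-year survival probability implied by the half-life:
    # surviving `half_life` years in a row should leave half standing.
    p_survive = 0.5 ** (1 / half_life)
    alive = n_facts
    for _ in range(years):
        alive = sum(1 for _ in range(alive) if rng.random() < p_survive)
    return alive / n_facts

# Individually random, collectively predictable: after one half-life,
# very close to half of the facts remain.
frac = simulate_decay(n_facts=100_000, half_life=45, years=45)
print(round(frac, 2))  # ≈ 0.5
```

With 100,000 simulated facts the result lands within a fraction of a percent of one half, which is exactly the point: the churn of knowledge, like radioactive decay, is only unpredictable one fact at a time.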

Why does this happen? Why does knowledge churn? In Zen and the Art of Motorcycle Maintenance, Robert Pirsig writes:

If all hypotheses cannot be tested, then the results of any experiment are inconclusive and the entire scientific method falls short of its goal of establishing proven knowledge.

About this Einstein had said, “Evolution has shown that at any given moment out of all conceivable constructions a single one has always proved itself absolutely superior to the rest,” and let it go at that.

… But there it was, the whole history of science, a clear story of continuously new and changing explanations of old facts. The time spans of permanence seemed completely random, he could see no order in them. Some scientific truths seemed to last for centuries, others for less than a year. Scientific truth was not dogma, good for eternity, but a temporal quantitative entity that could be studied like anything else.

A few pages later, Pirsig continues:

The purpose of scientific method is to select a single truth from among many hypothetical truths. That, more than anything else, is what science is all about. But historically science has done exactly the opposite. Through multiplication upon multiplication of facts, information, theories and hypotheses, it is science itself that is leading mankind from single absolute truths to multiple, indeterminate, relative ones.

With that, let's dig into how this looks. Arbesman offers an example:

A few years ago a team of scientists at a hospital in Paris decided to actually measure this (churning of knowledge). They decided to look at fields that they specialized in: cirrhosis and hepatitis, two areas that focus on liver diseases. They took nearly five hundred articles in these fields from more than fifty years and gave them to a battery of experts to examine.

Each expert was charged with saying whether the paper was factual, out-of-date, or disproved, according to more recent findings. Through doing this they were able to create a simple chart (see below) that showed the amount of factual content that had persisted over the previous decades. They found something striking: a clear decay in the number of papers that were still valid.

Furthermore, they got a clear measurement of the half-life of facts in these fields by looking at where the curve crosses 50 percent on this chart: 45 years. Essentially, information is like radioactive material: Medical knowledge about cirrhosis or hepatitis takes about forty-five years for half of it to be disproven or become out-of-date.
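The 45-year figure implies a simple closed-form decay curve. As a minimal sketch (the formula is standard exponential decay; the default half-life is the Paris study's number for cirrhosis and hepatitis research), we can estimate what fraction of a field's findings should still be standing after a given number of years:

```python
def fraction_still_valid(years, half_life=45.0):
    """Exponential decay: the fraction of a field's findings expected
    to remain valid after `years` years, given the field's half-life."""
    return 0.5 ** (years / half_life)

print(round(fraction_still_valid(45), 2))  # 0.5  -- one half-life
print(round(fraction_still_valid(90), 2))  # 0.25 -- two half-lives
print(round(fraction_still_valid(10), 2))  # most findings still hold
```

Of course, this treats every paper as equally likely to be overturned, which the study's aggregate curve supports but which won't hold for any individual result.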

[Figure: the half-life of facts, showing the decay in the truth of knowledge]

Old knowledge, however, isn't a waste. It's not like we have to start from scratch. “Rather,” writes Arbesman, “the accumulation of knowledge can then lead us to a fuller and more accurate picture of the world around us.”

Isaac Asimov, in a wonderful essay, uses the Earth's curvature to help explain this:

When people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.

When our knowledge in a field is immature, discoveries come easily and often explain the main ideas. “But there are uncountably more discoveries, although far rarer, in the tail of this distribution of discovery. As we delve deeper, whether it's into discovering the diversity of life in the oceans or the shape of the earth, we begin to truly understand the world around us.”

So what we're really dealing with is the long tail of discovery. Our search for what's way out at the end of that tail, while it might not be as important or as Earth-shattering as the blockbuster discoveries, can be just as exciting and surprising. Each new little piece can teach us something about what we thought was possible in the world and help us to asymptotically approach a more complete understanding of our surroundings.

In an interview with the Economist, Arbesman was asked which scientific fields decay the slowest and fastest, and what causes that difference.

Well it depends, because these rates tend to change over time. For example, when medicine transitioned from an art to a science, its half-life was much more rapid than it is now. That said, medicine still has a very short half-life; in fact it is one of the areas where knowledge changes the fastest. One of the slowest is mathematics, because when you prove something in mathematics it is pretty much a settled matter unless someone finds an error in one of your proofs.

One thing we have seen is that the social sciences have a much faster rate of decay than the physical sciences, because in the social sciences there is a lot more “noise” at the experimental level. For instance, in physics, if you want to understand the arc of a parabola, you shoot a cannon 100 times and see where the cannonballs land. And when you do that, you are likely to find a really nice cluster around a single location. But if you are making measurements that have to do with people, things are a lot messier, because people respond to a lot of different things, and that means the effect sizes are going to be smaller.

Arbesman concludes his Economist interview:

I want to show people how knowledge changes. But at the same time I want to say, now that you know how knowledge changes, you have to be on guard, so you are not shocked when your children (are) coming home to tell you that dinosaurs have feathers. You have to look things up more often and recognise that most of the stuff you learned when you were younger is not at the cutting edge. We are coming a lot closer to a true understanding of the world; we know a lot more about the universe than we did even just a few decades ago. It is not the case that just because knowledge is constantly being overturned we do not know anything. But too often, we fail to acknowledge change.

Some fields are starting to recognise this. Medicine, for example, has got really good at encouraging its practitioners to stay current. A lot of medical students are taught that everything they learn is going to be obsolete soon after they graduate. There is even a website called “up to date” that constantly updates medical textbooks. In that sense we could all stand to learn from medicine; we constantly have to make an effort to explore the world anew—even if that means just looking at Wikipedia more often. And I am not just talking about dinosaurs and outer space. You see this same phenomenon with knowledge about nutrition or childcare—the stuff that has to do with how we live our lives.

Even when we find new information that contradicts what we thought we knew, we're likely to be slow to change our minds. “A prevailing theory or paradigm is not overthrown by the accumulation of contrary evidence,” writes Richard Zeckhauser, “but rather by a new paradigm that, for whatever reasons, begins to be accepted by scientists.”

In this view, scientific scholars are subject to status quo persistence. Far from being objective decoders of the empirical evidence, scientists have decided preferences about the scientific beliefs they hold. From a psychological perspective, this preference for beliefs can be seen as a reaction to the tensions caused by cognitive dissonance.

A lot of scientific advancement happens only when the old guard dies off. Many years ago Max Planck offered this insight: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

While we have the best intentions and our minds change slowly, a lot of what we think we know is actually just temporary knowledge, to be updated in the future by more complete knowledge. I think this is why Nassim Taleb argues that we should read Seneca and not worry about someone like Jonah Lehrer bringing us sexy narratives of the latest discoveries. It turns out most of these discoveries are based on very little data and, while they may add to our cumulative knowledge, they are not likely to be around in 10 years.

The Half-life of Facts is a good read that helps put what we think we understand about the world into perspective.

Follow your curiosity and read my interview with the author. Knowing that knowledge has a half-life isn't enough; we can use that understanding to help us determine what to read.