

John Gray: Is Human Progress an Illusion?

“Straw Dogs is an attack on the unthinking beliefs of thinking people.”
— John Gray

***

We like to think that the tide of history is an inexorable march from barbarity to civilization, with humans “progressing” from one stage to the next through a gradual process of enlightenment. Modern humanists like Steven Pinker argue forcefully for this way of thinking.

But is this really so? Is this reality?

One of the leading challengers to that type of thinking has been the English writer and philosopher John Gray, the idiosyncratic author of books like Straw Dogs: Thoughts on Humans and Other Animals, The Soul of the Marionette, and The Silence of Animals.

To Gray, the concept of “progress” is closer to an illusion, or worse, a delusion of the modern age. Civilization is not a permanent state of being, but something that can quickly recede during a time of stress.

He outlines his basic idea in a foreword to Straw Dogs:

Straw Dogs is an attack on the unthinking beliefs of thinking people. Today, liberal humanism has the pervasive power that was once possessed by revealed religion. Humanists like to think they have a rational view of the world; but their core belief in progress is a superstition, further from the truth about the human animal than any of the world's religions.

Outside of science, progress is simply a myth. In some readers of Straw Dogs this observation seems to have produced a moral panic. Surely, they ask, no one can question the central article of faith of liberal societies? Without it, will we not despair? Like trembling Victorians terrified of losing their faith, these humanists cling to the moth-eaten brocade of progressive hope. Today religious believers are more free-thinking. Driven to the margins of a culture in which science claims authority over all of human knowledge, they have had to cultivate a capacity for doubt. In contrast, secular believers — held fast by the conventional wisdom of the time — are in the grip of unexamined dogmas.

And what, pray tell, are those dogmas? They are numerous, but the central one must be that the human march of science and technology creates good for the world. Gray's not as sure: He sees science and technology as magnifying humanity “warts and all”.

Our tools allow us to go to the Moon but also murder each other with great alacrity. They have no morality attached to them.

In science, the growth of knowledge is cumulative. But human life as a whole is not a cumulative activity; what is gained in one generation may be lost in the next. In science, knowledge is an unmixed good; in ethics and politics it is bad as well as good. Science increases human power — and magnifies the flaws in human nature. It enables us to live longer and have higher living standards than in the past. At the same time it allows us to wreak destruction — on each other and the Earth — on a larger scale than ever before.

The idea of progress rests on the belief that the growth of knowledge and the advance of the species go together—if not now, then in the long run. The biblical myth of the Fall of Man contains the forbidden truth. Knowledge does not make us free. It leaves us as we have always been, prey to every kind of folly. The same truth is found in Greek myth. The punishment of Prometheus, chained to a rock for stealing fire from the gods, was not unjust.

Gray has a fairly heretical view of technology itself, pointing out that no one really controls its development or use, making humanity as a group closer to subjects than masters. Technology is both a giver of good and an ongoing source of tragedy, because it is used by fallible human beings.

Those who ignore the destructive potential of future technologies can do so only because they ignore history. Pogroms are as old as Christendom; but without railways, the telegraph and poison gas there could have been no Holocaust. There have always been tyrannies; but without modern means of transport and communication, Stalin and Mao could not have built their gulags. Humanity's worst crimes were made possible only by modern technology.

There is a deeper reason why “humanity” will never control technology. Technology is not something that humankind can control. It is an event that has befallen the world.

Once a technology enters human life — whether it be fire, the wheel, the automobile, radio, television, or the internet — it changes it in ways we can never fully understand.

[…]

Nothing is more commonplace than to lament that moral progress has failed to keep pace with scientific knowledge. If only we were more intelligent and more moral, we could use technology only for benign ends. The fault is not in our tools, we say, but in ourselves.

In one sense this is true. Technical progress leaves only one problem unsolved: the frailty of human nature. Unfortunately that problem is insoluble.

This reminds one of Garrett Hardin's idea that no system, however technically advanced, can be flawless because the human being at the center of it will always be fallible. (Our technologies, after all, are geared around our needs.) Even if we create technologies that “don't need us” — we are still fallible creators.

Gray's real problem with the idea of moral, technical, and scientific progress is that, even were they real, they would be unending. In the modern conception of the world, unlike the ancient past where everything was seen as cyclical, growth has no natural stop-point. It's just an infinite path to the heavens. This manifests itself in our constant disdain for idleness.

Nothing is more alien to the present age than idleness. If we think of resting from our labours, it is only in order to return to them.

In thinking so highly of work we are aberrant. Few other cultures have ever done so. For nearly all of history and all prehistory, work was an indignity.

Among Christians, only Protestants have ever believed that work smacks of salvation; the work and prayer of medieval Christendom were interspersed with festivals. The ancient Greeks sought salvation in philosophy, the Indians in meditation, the Chinese in poetry and the love of nature. The pygmies of the African rainforests — now nearly extinct — work only to meet the needs of the day, and spend most of their lives idling.

Progress condemns idleness. The work needed to deliver humanity is vast. Indeed it is limitless, since as one plateau of achievement is reached another looms up. Of course this is only a mirage; but the worst of progress is not that it is an illusion. It is that it is endless.

Gray then goes on to compare our ideas of progress to Sisyphus forever pushing the boulder up the mountain.

He's an interesting thinker, Gray. In all of his works, though he certainly takes issue with our current modes of liberal progressive thought and is certainly not a religious man, one only finds hints of a “better” worldview being proposed. One is never sure if he even believes in “better”.

The closest thing to advice comes from the conclusion to his book The Silence of Animals. What is the point of life if not progress? Simply to see. Simply to be human. To contemplate. We must deal with human life the way we always have.

Godless contemplation is a more radical and transient condition: a temporary respite from the all-too-human world, with nothing particular in mind. In most traditions the life of contemplation promises redemption from being human: in Christianity, the end of tragedy and a glimpse of the divine comedy; in Jeffers's pantheism, the obliteration of the self in an ecstatic unity. Godless mysticism cannot escape the finality of tragedy, or make beauty eternal. It does not dissolve inner conflict into the false quietude of any oceanic calm. All it offers is mere being.

There is no redemption from being human. But no redemption is needed.

In the end, reading Gray is a good way to challenge yourself; to think about the world in a different way, and to examine your dogmas. Even the most cherished one of all.

Krista Tippett: On Generous Listening and Asking Better Questions

Krista Tippett, whose wonderful book Becoming Wise: An Inquiry Into the Art of Living distills many of her conversations, offers us a window into exploring ourselves and others through generous listening, asking better questions, and moving away from the false refuge of certitude.

On the art of starting new kinds of conversations Tippett offers shining wisdom, countering the notion that we need to win or lose.

I find myself drawn to black holes in common life— painful, complicated, shameful things we can scarcely talk about at all, alongside the arguments we replay ad nauseam, with the same polar opposites defining, winning, or losing depending on which side you’re on, with predictable dead-end results. The art of starting new kinds of conversations, of creating new departure points and new outcomes in our common grappling, is not rocket science. But it does require that we nuance or retire some habits so ingrained that they feel like the only way it can be done. We’ve all been trained to be advocates for what we care about. This has its place and its value in civil society, but it can get in the way of the axial move of deciding to care about each other.

Listening is an everyday act, and perhaps art, that many of us neglect.

Listening is more than being quiet while the other person speaks until you can say what you have to say.

Tippett introduces us to generous listening, language she picked up from a conversation with Rachel Naomi Remen, who uses it to describe what doctors should practice. Tippett explains:

Generous listening is powered by curiosity, a virtue we can invite and nurture in ourselves to render it instinctive. It involves a kind of vulnerability— a willingness to be surprised, to let go of assumptions and take in ambiguity. The listener wants to understand the humanity behind the words of the other, and patiently summons one’s own best self and one’s own best words and questions.

Of the many reasons we would want to engage and renew our listening skills, asking better questions is near the top.

[W]e trade mostly in answers— competing answers— and in questions that corner, incite, or entertain. In journalism we have a love affair with the “tough” question, which is often an assumption masked as an inquiry and looking for a fight. … My only measure of the strength of a question now is in the honesty and eloquence it elicits.

Questions are the means by which we explore ourselves, each other, and the world.

If I’ve learned nothing else, I’ve learned this: a question is a powerful thing, a mighty use of words. Questions elicit answers in their likeness. Answers mirror the questions they rise, or fall, to meet. So while a simple question can be precisely what’s needed to drive to the heart of the matter, it’s hard to meet a simplistic question with anything but a simplistic answer. It’s hard to transcend a combative question. But it’s hard to resist a generous question. We all have it in us to formulate questions that invite honesty, dignity, and revelation. There is something redemptive and life-giving about asking a better question.

Questions themselves need not demand immediate answers. Counter to our notion that everything must have an answer, some of the most worthwhile questions are the ones with no immediate answers.

And yet we insist on dividing so much of life into competing certainties.

We want others to acknowledge that our answers are right. We call the debate or get on the same page or take a vote and move on. The alternative involves a different orientation to the point of conversing in the first place: to invite searching— not on who is right and who is wrong and the arguments on every side; not on whether we can agree; but on what is at stake in human terms for us all. There is value in learning to speak together honestly and relate to each other with dignity, without rushing to common ground that would leave all the hard questions hanging.

In a way answers are like the goals that Scott Adams brought to our attention — a false, but comforting, refuge. Yet, for many of us probing ourselves with questions about how we should live and what it means to be a citizen in a global world, it is in the search that we find meaning.

The Most Respectful Interpretation

Consider this situation: You email a colleague with a question expecting a prompt response, but hours or days later you’ve yet to hear from them. Perhaps you can’t move forward on your project without their input so you find yourself blocked. How do you imagine you feel in this situation?

For many of us, situations like this result in feelings of anger, frustration, or annoyance. Maybe we take it personally and conclude that our colleague is lazy or that they don’t value our time or our work. Perhaps we send off a terse reminder asking for an update.

If we’re feeling particularly revengeful, we alert the person’s manager or mention our grievance to another colleague looking for validation that the offending colleague is in fact lazy and disrespectful – a form of confirmation bias.

Perhaps this colleague has been slow to respond to communications in the past, thus we extrapolate that to all of their communications, a case of the fundamental attribution error.

Of course, it's natural to feel anger and frustration when faced with these situations. But is anger the appropriate response?

In the Nicomachean Ethics Aristotle wrote about The Virtue Concerned with Anger. He begins Book IV with a description of good temper:

The man who is angry at the right things and with the right people, and, further, as he ought, when he ought, and as long as he ought, is praised. This will be the good-tempered man, then, since good temper is praised.

Aristotle tells us that anger has a time and place and that when applied to the right people and for the right reason, is justified and even praiseworthy. But we have to use anger judiciously:

For the good-tempered man tends to be unperturbed and not to be led by passion, but to be angry in the manner, at the things, and for the length of time, that reason dictates; but he is thought to err rather in the direction of deficiency; for the good-tempered man is not revengeful, but rather tends to make allowances.

In Aristotle’s description of good temper, he encourages us to err in the direction of “making allowances”. But how can we do this in practice?

Let’s return to our example.

We take our colleague's lack of response personally and assume they are lazy or disrespectful, but it is important for us to recognize that we are assuming. We often instinctively choose to assume the worst of people, because it slips easily into mind. But what if instead we chose to assume the best?

In her book Rising Strong, Brené Brown describes how she learned to assume that people are doing the best they can and shares a concept introduced to her by Dr. Jean Kantambu Latting, a professor at the University of Houston. Brown writes:

Whenever someone would bring up a conflict with a colleague, she would ask, ‘What is the hypothesis of generosity? What is the most generous assumption you can make about this person’s intentions or what this person said?’

By pausing to reflect on our anger we can recognize that we are making a negative assumption and challenge ourselves to invert the situation and consider the opposite: “What is the most generous assumption I can make?”

Perhaps our colleague has been given a higher priority project, or they don’t understand that we’re blocked without their input. Maybe they are dealing with some personal challenges outside of the office, or they need input from somebody else to reply to our message and thus they’re blocked as well. Perhaps they've decided to reduce their email frequency in order to focus on important work.

When we pause to look at the situation from another angle, not only do we entertain some explanations that frame our colleagues in a more positive light, but we put ourselves into their shoes: the very definition of empathy.

We’ve all had competing priorities, distractions from personal issues outside of work, miscommunications regarding the urgent need of our response, etc. Do we think others judged us fairly or unfairly in those moments?

The point is not to make excuses or avoid addressing problems with our colleagues, but that if we recognize we are making negative assumptions by default, we might need to challenge ourselves to consider more generous alternatives. This may alter the way we approach our colleague to address the situation. It takes effort and a commitment to think about people differently.

Someone who knew this best was the late, great author David Foster Wallace.

***

In his beautiful commencement speech to the Kenyon graduating class of 2005, Wallace reminds the students that the old cliché of liberal arts education teaching you to think is truer than they might want to believe. He warns that one of the biggest challenges the graduates will face in life is to challenge their self-centered view of the world – a view that we all have by default.

Using some of life’s more mundane and annoying activities like shopping and commuting, Wallace writes:

The point here is that I think this is one part of what teaching me how to think is really supposed to mean. To be just a little less arrogant. To have just a little critical awareness about myself and my certainties. Because a huge percentage of the stuff that I tend to be automatically certain of is, it turns out, totally wrong and deluded. I have learned this the hard way, as I predict you graduates will, too.

Here is just one example of the total wrongness of something I tend to be automatically sure of: everything in my own immediate experience supports my deep belief that I am the absolute centre of the universe; the realest, most vivid and important person in existence. We rarely think about this sort of natural, basic self-centeredness because it’s so socially repulsive. But it’s pretty much the same for all of us. It is our default setting, hard-wired into our boards at birth. Think about it: there is no experience you have had that you are not the absolute centre of. The world as you experience it is there in front of YOU or behind YOU, to the left or right of YOU, on YOUR TV or YOUR monitor. And so on. Other people’s thoughts and feelings have to be communicated to you somehow, but your own are so immediate, urgent, real.

Please don’t worry that I’m getting ready to lecture you about compassion or other-directedness or all the so-called virtues. This is not a matter of virtue. It’s a matter of my choosing to do the work of somehow altering or getting free of my natural, hard-wired default setting which is to be deeply and literally self-centered and to see and interpret everything through this lens of self. People who can adjust their natural default setting this way are often described as being “well-adjusted”, which I suggest to you is not an accidental term.

The recognition that we are inherently self-centered and that this affects the way in which we interpret the world seems so obvious when pointed out, but how often do we stop to consider it? This is our hard-wired default setting, so it's quite a challenge to become willing to think differently.

As an example, Wallace describes a situation where he is disgusted by the gas guzzling Hummer in front of him in traffic. The idea of these cars offends him and he starts making assumptions about the drivers: they're wasteful, inconsiderate of the planet, and inconsiderate of future generations.

Look, if I choose to think this way in a store and on the freeway, fine. Lots of us do. Except thinking this way tends to be so easy and automatic that it doesn’t have to be a choice. It is my natural default setting. It’s the automatic way that I experience the boring, frustrating, crowded parts of adult life when I’m operating on the automatic, unconscious belief that I am the centre of the world, and that my immediate needs and feelings are what should determine the world’s priorities.

But then he challenges himself to consider alternative interpretations, something often described as making the Most Respectful Interpretation (MRI). Wallace decides to consider more respectful interpretations of the other drivers – maybe they have a legitimate need to be driving a large SUV or to be rushing through traffic.

In this traffic, all these vehicles stopped and idling in my way, it’s not impossible that some of these people in SUV’s have been in horrible auto accidents in the past, and now find driving so terrifying that their therapist has all but ordered them to get a huge, heavy SUV so they can feel safe enough to drive. Or that the Hummer that just cut me off is maybe being driven by a father whose little child is hurt or sick in the seat next to him, and he’s trying to get this kid to the hospital, and he’s in a bigger, more legitimate hurry than I am: it is actually I who am in HIS way.

Again, please don't think that I'm giving you moral advice, or that I'm saying you're “supposed to” think this way, or that anyone expects you to just automatically do it, because it's hard, it takes will and mental effort, and if you're like me, some days you won't be able to do it, or you just flat out won't want to. But most days, if you’re aware enough to give yourself a choice, you can choose to look differently at this fat, dead-eyed, over-made-up lady who just screamed at her kid in the checkout line. Maybe she’s not usually like this. Maybe she’s been up three straight nights holding the hand of a husband who is dying of bone cancer. Or maybe this very lady is the low-wage clerk at the motor vehicle department, who just yesterday helped your spouse resolve a horrific, infuriating, red-tape problem through some small act of bureaucratic kindness. Of course, none of this is likely, but it’s also not impossible. It just depends what you want to consider. If you’re automatically sure that you know what reality is, and you are operating on your default setting, then you, like me, probably won’t consider possibilities that aren’t annoying and miserable. But if you really learn how to pay attention, then you will know there are other options.

A big part of learning to think is recognizing our default reactions and responses to situations — the so-called “System 1” thinking described by Daniel Kahneman. Learning to be “good-tempered” and “well-adjusted” requires us to try to be more self-aware, situationally aware, and to acknowledge our self-centered nature; to put the brakes on and use System 2 instead.

So the next time you find yourself annoyed with your colleagues, angry at other drivers on the road, or judgmental about people standing in line at the store, use it as an opportunity to challenge your negative assumptions and try to interpret the situation in a more respectful and generous way. You might eventually realize that the broccoli tastes good.

The 16 Best Books of 2016

Rewarding reads on love, life, knowledge, history, the future, and tools for thinking. Out of all the books I read this year, here is a list of what I found most worth reading in 2016.

1. The Psychology of Man’s Possible Evolution
These lectures, which were originally called Six Psychological Lectures, were first privately printed in the 1940s. Of the first run of 150 copies, none were sold. The essays were published once again after Ouspensky’s death, and this time became a hit. While the book is about psychology, it’s different from what we think of as psychology — “for thousands of years psychology existed under the name philosophy.” Consider this a study in what man may become — by working simultaneously on knowledge and inner unity.

2. The Island of Knowledge: The Limits of Science and the Search for Meaning
Imagine the sum of our knowledge as an Island in a vast and endless ocean. This is the Island of Knowledge. The coastline represents the boundary between the known and unknown. As we grow our understanding of the world, the Island grows, and with it so do the shores of our ignorance. “We strive toward knowledge, always more knowledge,” Gleiser writes, “but must understand that we are, and will remain, surrounded by mystery.” The book is a fascinating and wide-ranging tour through scientific history. (Dig Deeper into this amazing read here.)

3. When Breath Becomes Air
It’s been a while since I’ve cried reading a book. This beautifully written memoir, by a young neurosurgeon diagnosed with terminal cancer, attempts to answer the question What makes a life worth living? If you read this and you’re not feeling something you’re probably a robot.

4. The Sovereign Individual: Mastering the Transition to the Information Age
The book, which argues “the information revolution will destroy the monopoly power of the nation-state as surely as the Gunpowder Revolution destroyed the Church’s monopoly,” is making the rounds in Silicon Valley and being passed around like candy. Even if its forecasts are controversial, the book is a good read and it’s full of interesting and detailed arguments. I have underlines on nearly every page. “Information societies,” the authors write, “promise to dramatically reduce the returns to violence … When the payoff for organizing violence at a large scale tumbles, the payoff from violence at a smaller scale is likely to jump. Violence will become more random and localized.” The Sovereign Individual, who, for the first time “can educate and motivate himself,” will be “almost entirely free to invest their own work and realize the full benefits of their own productivity.” An unleashing of human potential which will, the authors argue, shift the greatest source of wealth to ideas rather than physical capital — “anyone who thinks clearly will potentially be rich.” Interestingly, in this potential transition, the effects are “likely to be centered among those of the middle talent in currently rich countries. They particularly may come to feel that information technology poses a threat to their way of life.” The book predicts the death of politics, “weakened by the challenge from technology, the state will treat increasingly autonomous individuals, its former citizens, with the same range of ruthlessness and diplomacy it has heretofore displayed in its dealings with other governments.” As technology reshapes the world, it also “antiquates laws, reshapes morals, and alters preconceptions. This book explains how.”

5. To Kill a Mockingbird
I know, I know. Hear me out. Someone I respect mentioned that he thought Atticus Finch was the perfect blend of human characteristics. Tough and skilled, yet humble and understanding. He’s frequently rated as a “most admired” hero in fiction, yet he’s a lawyer competing with Jedis, Detectives, Spies, and Superheroes. Isn’t that kind of interesting? Since it had been at least 15 years since I’d read TKM, I wanted to go back and remember what made Atticus so admired. His courage, his humility, his understanding of people. I forgot just how perceptive Finch was when it came to what we’d call “group social dynamics” — he forgives the individual members of the mob that show up to hurt Tom Robinson simply because he understands that mob psychology is capable of overwhelming otherwise good people. How many of us would be able to do that? Atticus Finch is certainly a fictional, and perhaps “unattainably” moral hero. But I will point out that not only do real-life “Finches” exist, but that even if we don’t “arrive” at a Finchian level of heroic integrity and calm temperament, it’s certainly a goal worth pursuing. Wise words from the book Rules for a Knight sum it up best: “To head north, a knight may use the North Star to guide him, but he will not arrive at the North Star. A knight’s duty is to proceed in that direction.” (Here are some of the lessons I took away from the book.)

6. Lee Kuan Yew: The Grand Master’s Insights on China, the United States, and the World
If you’re not familiar with Lee Kuan Yew, he’s the “Father of Modern Singapore,” the man who took a small, poor island just north of the equator in Southeast Asia with GDP per capita of ~$500 in 1965 and turned it into a modern powerhouse with GDP per capita of over $70,000 as of 2014, with some of the lowest rates of corruption and highest rates of economic freedom in the world. Finding out how he did it is worth anyone’s time. This book is a short introduction to his style of thinking: A series of excerpts of his thoughts on modern China, the modern U.S., Islamic Terrorism, economics, and a few other things. It’s a wonderful little collection. (We’ve actually posted about it before.) Consider this an appetizer (a delicious one) for the main course: From Third World to First, Yew’s full account of the rise of Singapore. (Dig deeper here.)

7. An Illustrated Book of Bad Arguments
Perfect summer reading for adults and kids alike. One friend of mine has created a family game where they all try to spot the reasoning flaws of others. The person with the most points at the end of the week gets to pick where they go for dinner. I have a suspicion his kids will turn out to be politicians or lawyers.

8. Intuition Pumps and Other Tools for Thinking
Dan Dennett is one of the best-known cognitive scientists on the planet. This book is a collection of 77 short essays on different “thinking tools,” basically thought experiments Dennett uses to slice through tough problems, including some tools for thinking about computing, thinking about meaning, and thinking about consciousness. Like Richard Feynman’s great books, this one acts as a window into a brilliant mind and how it handles interesting and difficult problems. If you only walk away with a few new mental tools, it’s well worth the time spent. (You can learn a lot more about Dennett here, here, and here.)

9. The Seven Sins of Memory (How the Mind Forgets and Remembers)
I found this in the bibliography of Judith Rich Harris’ No Two Alike. Schacter is a psychology professor at Harvard who runs the Schacter Memory Lab. The book explores the seven “issues” we tend to find with regard to our memory: Absent-mindedness, transience, blocking, misattribution, suggestibility, bias, and persistence. The fallibility of memory is so fascinating: We rely on it so heavily and trust it so deeply, yet as Schacter shows, it’s extremely faulty. It’s not just about forgetting where you left your keys. Modern criminologists know that eyewitness testimony is deeply flawed. Some of our deepest and most hard-won memories — the things we know are true — are frequently wrong or distorted. Learning to calibrate our confidence in our own memory is not at all easy. Very interesting topic to explore. (We did a three part series on this book. Introduction and parts One, Two, and Three).

10. Talk Lean: Shorter Meetings. Quicker Results. Better Relations
This book is full of useful tips on listening better, being candid and courteous, and learning what derails meetings, conversations, and relationships with people at work. Don’t worry. It’s not about leaving things unsaid that might be displeasing for other people. In fact, leaving things unsaid is often more detrimental to the relationship than airing them out. Rather, it’s about finding a way to say them so people will hear them and not feel defensive. If you want to get right to the point and not alienate people, this book will help you. I know because this is something, personally, I struggle with at times.

11. The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World
I recently had a fascinating multi-hour dinner with the author, Pedro Domingos, on where knowledge comes from. Historically, at least, the answer has been evolution, experience, and culture. Now, however, there is a new source of knowledge: machine learning. The book offers an accessible overview of the different approaches to machine learning and the search for a single, unifying “master algorithm.” It also covers how machine learning works and gives Pedro’s thoughts on where we’re headed. (Dig deeper in this podcast.)

12. Why Don’t We Learn from History?
This is a short (~120-page) book by the military historian and strategist B.H. Liddell Hart, a man who not only wrote military history but surely influenced it, especially in Germany in the period of the World Wars. He wrote this short synthesis at the end of his life and didn’t have a chance to finish it, but the result is still fascinating. Hart takes a “negative” view of history; in other words: What went wrong? How can we avoid it? The result of that study, as he writes in the introduction, is that “History teaches us personal philosophy.” Those who learn vicariously as well as directly have a big leg up. Something to take to heart. I plan to read more of his work.

13. A Powerful Mind: The Self-Education of George Washington
What a great book idea by Adrienne Harrison. There are a zillion biographies of George Washington out there, with Chernow's getting a lot of praise recently. But Harrison zeroes in on Washington’s autodidactic nature. Why did he read so much? How did he educate himself? Any self-motivated learner will probably enjoy this.

14. Sapiens: A Brief History of Humankind
One of the best books I’ve come across in a long time. Sapiens is a work of “Big History” — in the style of Jared Diamond’s Guns, Germs, and Steel — that seeks to understand humanity in a deep way. Many of Professor Harari’s conclusions will be uncomfortable for some to read (there is no attempt at political correctness), but his diagnosis of human history is undeniably interesting and at least partially correct. He draws on many fields to arrive at his conclusions, a grand method of synthesis that will be familiar to long-time Farnam Street readers. The book is almost impossible to summarize given the multitude of ideas presented. But then again, most great books are. (Dig deeper into this amazing read here, here, and here.)

15. Becoming Wise: An Inquiry into the Mystery and Art of Living
A refreshing signal in a world of noise that should be read and immediately re-read. There is so much goodness in here that you will scarcely find more than a page or two in my copy without a mark, bent page, or highlight. The entire book offers texture to thoughts you knew you had but didn't know how to express.

16. The Happiness Trap: How to Stop Struggling and Start Living
The way most of us search for and attempt to hold onto fleeting moments of happiness ends up ensuring that we’re miserable. A great practical book on developing mindfulness, which is so important in many aspects of your life, including satisfaction. Might be the best self-help book I’ve read.

Epistemology: How Do You Know That You Know What You Know?

The role of perception in knowledge

It is hard to imagine a world that exists outside of what we can perceive. In the effort to get through each day without crashing our cars or suffering some other calamity, we make assumptions about the objects in our physical world: their continuity, their behaviour.

Some of these assumptions are based on our own experience, some on the knowledge imparted by others of their experience, and some on inferences of logic.

Experience, however, comes through the lens of perception. How things look, how they feel, how they sound.

Our understanding of, and interaction with, the world comes through particular constructs of the human body – eyes, ears, fingers, etc. Most people intuitively understand the subjectivity of some of our perceptions.

Colors look 'different' to people who are color blind. Our feeling of temperature is affected by immediate contrast: people stepping outside the doors of an airport will have a different impression of the temperature depending on whether they have just come from Moose Jaw or Cancun.

Even more substantial understandings come to us through the lens of our senses. We can see the shape of a tree, or we could close our eyes and infer the shape through touch, but in either case, or even combining the two, we are relying on our senses to impart an understanding of the physical world.

The question of what objectively 'is' has long been one of the subjects of philosophy. Philosophers from Descartes to Kant have tried to describe our existence in such a way as to arrive at an understanding of the physical world in which things can be conclusively known.

Descartes introduces the idea in his Meditations: “Surely whatever I had admitted until now as most true I received either from the senses or through the senses. However, I have noticed that the senses are sometimes deceptive; and it is a mark of prudence never to place our complete trust in those who have deceived us even once.”

Descartes famously employed systematic doubt, questioning all knowledge conveyed by his experience in the world until the only knowledge he couldn't doubt was the fact that he could doubt.

Therefore I suppose that everything I see is false. I believe that none of what my deceitful memory represents ever existed. I have no sense whatever. Body, shape, extension, movement, and place are all chimeras. What then will be true? … Thus, after everything has been most carefully weighed, it must finally be established that this pronouncement “I am, I exist” is necessarily true every time I utter it or conceive it in my mind. (Descartes, Meditations)

Descartes confirmed that we have a self. Unfortunately, this self could be the one we see in the mirror each morning, or a brain in a vat. If the only thing we cannot doubt is that we can doubt, then all we are guaranteed is the mechanism of doubt. No body. We could therefore be isolated brains, manipulated by things unknown, our entire world a mirage.

How then can we hope to claim knowledge about the physical world?

For Locke, our understanding of the world comes from our experience of it. It is this experience that provides knowledge. He says, in his Essay Concerning Human Understanding:

Let us then suppose the mind to be, as we say, white paper, void of all characters, without any ideas: – How comes it to be furnished? Whence comes it by that vast store which the busy and boundless fancy of man has painted on it with an almost endless variety? Whence has it all the materials of reason and knowledge? To this I answer, in one word, from EXPERIENCE. In that all our knowledge is founded; and from that it ultimately derives itself.

He wrote that there were two types of qualities, ones that existed innately in an object or series of objects, such as size, number, or motion, and those that are wholly dependent on our perception of them, such as color or smell.

The particular bulk, number, figure, and motion of the parts of fire or snow are really in them, whether one's senses perceive them or no: and therefore they may be called real qualities, because they really exist in those bodies. But light, heat, whiteness, or coldness are not more really in them than sickness or pain is in manna. (Locke, An Essay Concerning Human Understanding)

Experience then, as long as we have an understanding of the limitations of our perception, will confer certain truths about the physical world we inhabit. For example, through experience we can claim knowledge of how many crows are perched on a telephone wire, but not how many of them have 'black' as an intrinsic property of their feathers.

Quite in opposition to this was George Berkeley (pronounced Bar-clay), for whom ‘to be' was ‘to be perceived'. Berkeley wrote in A Treatise Concerning the Principles of Human Knowledge:

Besides all that endless variety of ideas or objects of knowledge, there is likewise something which knows or perceives them and exercises divers operations, as willing, imagining, remembering, about them. This perceiving … does not denote any one of my ideas, but a thing entirely distinct from them, wherein they exist or, which is the same thing, whereby they are perceived – for the existence of an idea consists in being perceived.

Because our knowledge of the world comes from our perception of it, it is impossible to conclusively know the existence of anything independent of our perception. Berkeley wrote:

Hence, as it is impossible for me to see or feel anything without an actual sensation of that thing, so it is impossible for me to conceive in my thoughts any sensible thing or object distinct from the sensation or perception of it.

This line of inquiry ultimately results in the entire physical world being called into question, as Berkeley observed:

If we have any knowledge at all of external things, it must be by reason, inferring their existence from what is immediately perceived by sense. [However] it is granted on all hands (and what happens in dreams, frenzies, and the like, puts it beyond dispute) that it is possible we might be affected with all the ideas we have now, though no bodies existed without resembling them.

If we cannot know things outside of perception, and our perceptions are entirely unreliable, where does that leave us? It certainly isn't useful to imagine your existence as the sum total of your knowledge, or to treat our experiences as inherently untrustworthy.

What these philosophies are useful for, though, is understanding that what we often consider knowledge is more of a general social agreement on a somewhat consistent comprehension of the things before us. For example, we appreciate that the color green can be perceived differently by different people, but we organize our language around a general understanding of the color green without worrying about the particular experience of green that any individual may have.

For David Hume, there definitely was a physical world, our perception of which was ultimately responsible for all of our ideas, no matter how complex or abstract. He wrote in An Enquiry Concerning Human Understanding:

When we analyze our thoughts or ideas, however compounded or sublime, we always find that they resolve themselves into such simple ideas as were copied from a precedent feeling or sentiment. Even those ideas, which, at first view, seem the most wide of this origin, are found, upon a nearer scrutiny, to be derived from it.

Furthermore, since all of our perceptions of the physical world are coming from the same physical world, and the nature of perceiving works more or less the same in each person, we can achieve a consistency in our understanding.

So although it may not be possible to know things with the same certainty as we know ourselves, or to truly describe the construct of the world outside of our perception of it, at least we can get along with each other thanks to a general consistency of experience.

However, this experience still admits a certain fragility. There is no guarantee that past experiences will be consistent with future ones. In An Enquiry Concerning Human Understanding, Hume observes:

Being determined by custom to transfer the past to the future, in all our inferences; where the past has been entirely regular and uniform, we expect the event with the greatest assurance and leave no room for any contrary supposition. But where different effects have been found to follow from causes, which are to appearance exactly similar, all these various effects must occur to the mind in transferring the past to the future, and enter into our consideration, when we determine the probability of the event.

Holding all of these possible effects in mind when considering a future event is not necessarily a limitation, thanks to our amazingly sophisticated brains. Immanuel Kant thought that the way we process the information provided by our senses was an important component of knowledge. Kant wrote in the Prolegomena to Any Future Metaphysics:

The difference between truth and dreaming is not ascertained by the nature of the representations which are referred to objects (for they are the same in both cases), but by their connection according to those rules which determine the coherence of the representation in the concept of an object, and by ascertaining whether they can subsist together in experience or not.

Kant did not hold that the subjectivity of our perceptions calls the existence of objects into question, but neither did he hold that all knowledge of the physical world comes from experience. Kant argued:

Experience teaches us what exists and how it exists, but never that it must necessarily exist so and not otherwise. Experience therefore can never teach us the nature of things in themselves.

Knowledge, then, is made up of things we infer, things we experience, and the way our brain processes both. The great metaphysical question of 'Why is it all this way?' may always be out of our reach.

Understanding some of this metaphysical uncertainty in knowledge does not mean that we have to give up on knowing anything. It simply points to a certain subjectivity, an allowance for different conceptions of the world. And hopefully it offers a set of tools with which to evaluate or build claims of knowledge.

The Island of Knowledge: Science and the Meaning of Life

“As the Island of Knowledge grows, so do the shores of our ignorance—the boundary between the known and unknown. Learning more about the world doesn't lead to a point closer to a final destination—whose existence is nothing but a hopeful assumption anyways—but to more questions and mysteries. The more we know, the more exposed we are to our ignorance, and the more we know to ask.”

***

Common across human history is our longing to better understand the world we live in, and how it works. But how much can we actually know about the world?

In his book The Island of Knowledge: The Limits of Science and the Search for Meaning, physicist Marcelo Gleiser traces the progress of modern science in its pursuit of the most fundamental questions: existence, the origin of the universe, and the limits of knowledge.

What we know of the world is limited by what we can see and what we can describe, but our tools have evolved over the years to reveal ever more of the fabric of knowledge. Gleiser celebrates this persistent struggle to understand our place in the world, traveling through our history from ancient knowledge to our current understanding.

While science is not the only way to see and describe the world we live in, it is a response to the questions of who we are, where we are, and how we got here. “Science speaks directly to our humanity, to our quest for light, ever more light.”

To move forward, science needs to fail, which runs counter to our human desire for certainty. “We are surrounded by horizons, by incompleteness.” Rather than give up, we struggle along a scale of progress. What makes us human is this journey to understand more about the mysteries of the world and explain them with reason. This is the core of our nature.

While the pursuit is never ending, the curious journey offers insight not just into the natural world, but insight into ourselves.

“What I see in Nature is a magnificent structure that we can comprehend only
very imperfectly,
and that must fill a thinking person with a feeling of humility.”
— Albert Einstein

We tend to think that what we see is all there is — that there is nothing we cannot see. We know it isn't true when we stop and think, yet we still get lulled into a trap of omniscience.

Science is thus limited, offering only part of the story — the part we can see and measure. The other part remains beyond our immediate reach.

“What we see of the world,” Gleiser begins, “is only a sliver of what's out there.”

There is much that is invisible to the eye, even when we augment our sensorial perception with telescopes, microscopes, and other tools of exploration. Like our senses, every instrument has a range. Because much of Nature remains hidden from us, our view of the world is based only on the fraction of reality that we can measure and analyze. Science, as our narrative describing what we see and what we conjecture exists in the natural world, is thus necessarily limited, telling only part of the story. … We strive toward knowledge, always more knowledge, but must understand that we are, and will remain, surrounded by mystery. This view is neither antiscientific nor defeatist. … Quite the contrary, it is the flirting with this mystery, the urge to go beyond the boundaries of the known, that feeds our creative impulse, that makes us want to know more.

While we may broadly understand the map of what we call reality, we fail to understand its terrain. Reality, Gleiser argues, “is an ever-shifting mosaic of ideas.”

However…

The incompleteness of knowledge and the limits of our scientific worldview only add to the richness of our search for meaning, as they align science with our human fallibility and aspirations.

What we call reality is a (necessarily) limited synthesis. It is certainly our reality, as it must be, but it is not the entire reality itself:

My perception of the world around me, as cognitive neuroscience teaches us, is synthesized within different regions of my brain. What I call reality results from the integrated sum of countless stimuli collected through my five senses, brought from the outside into my head via my nervous system. Cognition, the awareness of being here now, is a fabrication of a vast set of chemicals flowing through myriad synaptic connections between my neurons. … We have little understanding as to how exactly this neuronal choreography engenders us with a sense of being. We go on with our everyday activities convinced that we can separate ourselves from our surroundings and construct an objective view of reality.

The brain is a great filtering tool, deaf and blind to vast amounts of information around us that offer no evolutionary advantage. Part of it we can see and simply ignore. Other parts, like dust particles and bacteria, go unseen because of limitations of our sensory tools.

As the Fox said to the Little Prince in Antoine de Saint-Exupery's fable, “What is essential is invisible to the eye.” There is no better example than oxygen.

Science has increased our view. Our measurement tools and instruments can see bacteria and radiation, subatomic particles and more. However precise these tools have become, their view is still limited.

There is no such thing as an exact measurement. Every measurement must be stated within its precision and quoted together with “error bars” estimating the magnitude of errors. High-precision measurements are simply measurements with small error bars or high confidence levels; there are no perfect, zero-error measurements.

[…]

Technology limits how deeply experiments can probe into physical reality. That is to say, machines determine what we can measure and thus what scientists can learn about the Universe and ourselves. Being human inventions, machines depend on our creativity and available resources. When successful, they measure with ever-higher accuracy and on occasion may also reveal the unexpected.

“All models are wrong, some are useful.”
— George Box

What we know about the world is only what we can detect and measure, even as we improve our detecting and measuring over time. Thus we base our conclusions about reality on what we can currently “see.”

We see much more than Galileo, but we can't see it all. And this restriction is not limited to measurements: speculative theories and models that extrapolate into unknown realms of physical reality must also rely on current knowledge. When there is no data to guide intuition, scientists impose a “compatibility” criterion: any new theory attempting to extrapolate beyond tested ground should, in the proper limit, reproduce current knowledge.

[…]

If large portions of the world remain unseen or inaccessible to us, we must consider the meaning of the word “reality” with great care. We must consider whether there is such a thing as an “ultimate reality” out there — the final substrate of all there is — and, if so, whether we can ever hope to grasp it in its totality.

[…]

We thus must ask whether grasping reality's most fundamental nature is just a matter of pushing the limits of science or whether we are being quite naive about what science can and can't do.

Here is another way of thinking about this: if someone perceives the world through her senses only (as most people do), and another amplifies her perception through the use of instrumentation, who can legitimately claim to have a truer sense of reality? One “sees” microscopic bacteria, faraway galaxies, and subatomic particles, while the other is completely blind to such entities. Clearly they “see” different things and—if they take what they see literally—will conclude that the world, or at least the nature of physical reality, is very different.

Asking who is right misses the point, although surely the person using tools can see further into the nature of things. Indeed, to see more clearly what makes up the world and, in the process to make more sense of it and ourselves is the main motivation to push the boundaries of knowledge. … What we call “real” is contingent on how deeply we are able to probe reality. Even if there is such thing as the true or ultimate nature of reality, all we have is what we can know of it.

[…]

Our perception of what is real evolves with the instruments we use to probe Nature. Gradually, some of what was unknown becomes known. For this reason, what we call “reality” is always changing. … The version of reality we might call “true” at one time will not remain true at another. … Given that our instruments will always evolve, tomorrow's reality will necessarily include entities not known to exist today. … More to the point, as long as technology advances—and there is no reason to suppose that it will ever stop advancing for as long as we are around—we cannot foresee an end to this quest. The ultimate truth is elusive, a phantom.

Gleiser makes his point with a beautiful metaphor. The Island of Knowledge.

Consider, then, the sum total of our accumulated knowledge as constituting an island, which I call the “Island of Knowledge.” … A vast ocean surrounds the Island of Knowledge, the unexplored ocean of the unknown, hiding countless tantalizing mysteries.

The Island of Knowledge grows as we learn more about the world and ourselves. And as the island grows, so too “do the shores of our ignorance—the boundary between the known and unknown.”

Learning more about the world doesn't lead to a point closer to a final destination—whose existence is nothing but a hopeful assumption anyways—but to more questions and mysteries. The more we know, the more exposed we are to our ignorance, and the more we know to ask.

As we move forward we must remember that despite our quest, the shores of our ignorance grow as the Island of Knowledge grows. And while we will struggle with the fact that not all questions will have answers, we will continue to progress. “It is also good to remember,” Gleiser writes, “that science only covers part of the Island.”

Richard Feynman has pointed out that science can only answer the subset of questions that go, roughly, “If I do this, what will happen?” Answers to questions like Why do the rules operate that way? and Should I do it? are not really questions of a scientific nature; they are moral, human questions, if they are knowable at all.

There are many ways of understanding and knowing that should, ideally, feed each other. “We are,” Gleiser concludes, “multidimensional creatures and search for answers in many, complementary ways. Each serves a purpose and we need them all.”

“The quest must go on. The quest is what makes us matter: to search for more answers, knowing that the significant ones will often generate surprising new questions.”

The Island of Knowledge is a wide-ranging tour through scientific history from planetary motions to modern scientific theories and how they affect our ideas on what is knowable.