Farnam Street helps you make better decisions, innovate, and avoid stupidity.
With over 400,000 monthly readers and more than 93,000 subscribers to our popular weekly digest, we've become an online intellectual hub.
We’re constantly asked for examples of the “multiple mental models” approach in practice. Our standard response includes great books like Garrett Hardin’s Filters Against Folly and Will Durant’s The Lessons of History.
One of the well-known examples of this brand of thinking is Guns, Germs, and Steel, a book that opened thousands of eyes to the power of leaping across the walls of history, sociology, biology, geography and other fields to truly understand the world. (If you haven’t read it yet, why are you still here? Go order it and read it!)
Jared Diamond, the book’s author, is a great master of synthesis across many fields — works like The Third Chimpanzee and Collapse show great critical thinking prowess, even if you don’t come to 100% agreement with him.
Lesser known than Guns, Germs, and Steel is a follow-up talk Diamond gave entitled How to Get Rich:
… probably most lectures one hears at the museum are on fascinating but impractical subjects: namely, they don’t help you to get rich. This evening I plan to redress the balance and talk about the natural history of becoming rich.
The talk is a great, and short, introduction to “multiple mental models” thinking. Diamond, of course, does not literally answer the question of How to Get Rich. He’s smart enough to know that this is charlatan territory if answered too literally. (Three steps to surefire wealth!)
But he does effectively answer an interesting part of the equation of getting rich: What conditions do we need to set up maximal productivity, learning, and cooperation among our groups?
Diamond answers his question through the same use of inter-disciplinary synthesis his readers would be familiar with: As you read it, you’ll see models from biology, military history, business/economics, and geography.
His answer has two main parts: Optimal group size/fragmentation, and optimal exposure to outside competition:
So what this suggests is that we can extract from human history a couple of principles. First, the principle that really isolated groups are at a disadvantage, because most groups get most of their ideas and innovations from the outside. Second, I also derive the principle of intermediate fragmentation: you don’t want excessive unity and you don’t want excessive fragmentation; instead, you want your human society or business to be broken up into a number of groups which compete with each other but which also maintain relatively free communication with each other. And those I see as the overall principles of how to organize a business and get rich.
Those are wonderful lessons, and you should read the piece to see how he arrives at them. But there’s another important reason we bring the talk to your attention, one of methodology.
Diamond’s talk offers us a powerful principle for our efforts to understand the world: Look for and study natural experiments, the more controlled, the better.
I propose to try to learn from human history. Human history over the last 13,000 years comprises tens of thousands of different experiments. Each human society represents a different natural experiment in organizing human groups. Human societies have been organized very differently, and the outcomes have been very different. Some societies have been much more productive and innovative than others. What can we learn from these natural experiments of history that will help us all get rich? I propose to go over two batches of natural experiments that will give you insights into how to get rich.
This wonderfully useful approach, reminiscent of Peter Kaufman’s idea about the Three Buckets of Knowledge, is one we see used effectively all the time.
Judith Rich Harris used the naturally controlled experiment of identical twins separated at birth to solve the problem of human personality development. Michael Abrashoff had a naturally controlled experiment in leadership principles when he had to turn around the USS Benfold without hiring or firing, changing ships or missions, or offering any financial incentive to his crew. Ken Iverson had a naturally controlled experiment in business principles by succeeding dramatically in a business with massive headwinds and no tailwinds.
And so if we follow in the steps of Diamond, Peter Kaufman, Judith Rich Harris, Ken Iverson, and Michael Abrashoff, we might find natural experiments that help illuminate the solutions to our problems in unusual ways. As Diamond says in his talk, the world has already tried thousands of things: All we have to do is study them and then align with the way the world works.
What if eating right wasn’t actually all that complicated?
What if you read enough to see patterns develop, to realize that when you stripped away all the confusing bits that maybe the skeleton underneath was actually pretty simple?
This is what happened to author Michael Pollan a few years ago when he started doing research to figure out what he should be eating.
Most of the time when I embark on such an investigation, it quickly becomes clear that matters are much more complicated and ambiguous — several shades grayer — than I thought going in. Not this time. The deeper I delved into the confused and confusing thicket of nutritional science, sorting through the long-running fats versus carb wars, the fiber skirmishes and the raging dietary supplement debates, the simpler the picture gradually became. I learned that in fact science knows a lot less about nutrition than you would expect – that in fact nutrition science is, to put it charitably, a very young science. It’s still trying to figure out exactly what happens in your body when you sip a soda, or what is going on deep in the soul of a carrot to make it so good for you, or why in the world you have so many neurons – brain cells! – in your stomach, of all places. It’s a fascinating subject, and someday the field may produce definitive answers to the nutritional questions that concern us, but — as nutritionists themselves will tell you — they’re not there yet. Not even close. Nutrition science, which after all only got started less than two hundred years ago, is today approximately where surgery was in the year 1650 – very promising, and very interesting to watch, but are you ready to let them operate on you? I think I’ll wait awhile.
The diet industry brings in billions of dollars every year, and some of the latest internet celebrities are food and fitness models and gurus. Is it any surprise? Our survival and well-being depend largely on our health and (arguably) our looks. The diet industry taps directly into one of our basic survival instincts. Food is cultural.
There is good money to be had if you can find the magical thing that will help people lose weight and feel better. Unfortunately, there is also good money to be had in treating people for illnesses that occur from poor diet and lack of exercise. In short, complexity is good for business. (This is a misaligned incentive problem of the highest order.)
… consider first the complexity that now attends this most basic of creaturely activities. Most of us have come to rely on experts of one kind or another to tell us how to eat — doctors and diet books, media accounts of the latest findings in nutritional science, government advisories and food pyramids, the proliferating health claims on food packages. We may not always heed these experts’ advice, but their voices are in our heads every time we order from a menu or wheel down the aisle in the supermarket. Also in our heads today resides an astonishing amount of biochemistry. How odd is it that everybody now has at least a passing acquaintance with words like “antioxidant,” “saturated fat,” “omega-3 fatty acids,” “carbohydrates,” “polyphenols,” “folic acid,” “gluten,” and “probiotics”? It’s gotten to the point where we don’t see foods anymore but instead look right through them to the nutrients (good and bad) they contain, and of course to the calories — all these invisible qualities in our food that properly understood, supposedly hold the secret to eating well.
But for all the scientific and pseudoscientific food baggage we’ve taken on in recent years we still don’t know what we should be eating. Should we worry more about the fats or the carbohydrates? Then what about the “good” fats? Or the “bad” carbohydrates, like high-fructose corn syrup? How much should we be worrying about gluten? What’s the deal with artificial sweeteners? Is it really true that this breakfast cereal will improve my son’s focus at school or that other cereal will protect me from a heart attack? And when did eating a bowl of breakfast cereal become a therapeutic procedure, anyway?
For Pollan, the picture actually got clearer the further he traveled down the rabbit hole.
While his research uncovered the fact that we don’t know a whole lot about nutrition — there’s a lot of pseudoscience here — one obvious finding kept recurring: populations that eat a Western diet are generally less healthy than those that eat more traditional diets.
What does Pollan mean by “more traditional diet”?
These diets run the gamut from ones very high in fat (the Inuit in Greenland subsist largely on seal blubber) to ones high in carbohydrate (Central American Indians subsist largely on maize and beans) to ones very high in protein (Masai tribesmen in Africa subsist chiefly on cattle blood, meat, and milk), to cite three rather extreme examples. But much the same holds true for more mixed traditional diets. What this suggests is that there is no single ideal human diet but that the human omnivore is exquisitely adapted to a wide range of different foods and a variety of different diets. Except, that is, for one: the relatively new (in evolutionary terms) Western diet that most of us now are eating.
Research has shown that moving away from the Western diet can reduce your chances of developing the chronic illnesses it causes. Pollan believes this shift is most easily made by coming up with a set of simple rules to govern how we eat and interact with food. (This idea reminded us a lot of Donald Sull’s work in Simple Rules — more specifically, his decision rules, which help us set boundaries, prioritize, and know when to stop an action.)
No one is quite sure which parts of the Western diet are the most destructive. There are a lot of confounding variables here — one type of food or macronutrient is tough to isolate. Gary Taubes thinks it’s the easily digestible carbohydrates. Others disagree. And since we’re not quite sure, Pollan thinks we should stick with a set of heuristics to get as close as we can.
Pollan curated these rules into a book called Food Rules: An Eater’s Manual. Let’s take a closer look at some of our favorites.
Don’t Eat Anything Your Great Grandmother Wouldn’t Recognize as Food
Agriculture has come a long way since your great grandmother was born. Many chemicals have been created both to enhance the flavor of food and to help with its shelf life. While all these additives aren’t necessarily bad for you, it’s still smart to avoid them most of the time. So if you think great grandma wouldn’t be able to pronounce or understand most of the words on that box of frozen lasagna you’re holding, it’s best to pass it up. Speaking of that frozen entree …
Eat Only Foods That Have Been Cooked by Humans
Pollan means buying raw ingredients and making the food yourself, rather than buying food pre-cooked and pre-packaged. Corporations tend to load the food they cook for you with junk. Cooking your own meals is one of the biggest predictors of a healthy diet.
Eat All the Junk Food You Want as Long as You Cook It Yourself
This is an interesting rule because generally we try to remove or go around obstacles, and in this instance we are very purposefully adding one. If you have a sweet tooth, there is nothing wrong with eating cake on occasion. The key here is to eat those unhealthful foods only occasionally: taking the time to make the food means you have to be incredibly motivated to have that cake. (And you’re probably not going to whip up a bag of Oreos or potato chips.)
If You’re Not Hungry Enough to Eat an Apple, Then You’re Probably Not Hungry
This is another obstacle style rule, but it also offers you the opportunity to get better in tune with your hunger. Are you grabbing that candy bar from the vending machine at two o’clock in the afternoon because you are hungry or because you do that same thing at two o’clock every day? There are many reasons why we eat and hunger is only one of them.
Stop Eating Before You’re Full
This probably sounds a bit crazy to the average North American. In our society we eat because we are hungry and we stop because we are full. This is our tradition, but in many other cultures the goal of eating is simply to stop the hunger, which is actually quite different. Try this experiment for yourself: wait until you are genuinely hungry for your next meal — you want to be able to feel it. Then, as you are eating, try to be mindful of the moment you stop feeling hungry. You’ll notice that this moment comes quite a few bites before that full feeling arrives.
Food Rules is a succinct tool to help you navigate the confusing nutritional landscape — a quick read packed with a lot of information. Imitate Bruce Lee: “Absorb what is useful, discard what is useless and add what is specifically your own.”
Still Interested? The Omnivore’s Dilemma: A Natural History of Four Meals is one of the best food books we’ve ever read.
It’s a wonderful idea to try to find a set of systems and principles that “work better” for big swaths of your life. Better habits, better mental tendencies, better methods of inquiry, and so on. We’re strong advocates of this approach, believing that good thinking and good decision making can be learned the same as a good golf swing can: Through practice and instruction.
So, read what follows with this caveat in mind: constant learning and self-improvement can and must be pursued for great results in life.
Now, with that out of the way.
The problem with the search for self-improvement methods, including the kind of multidisciplinary thinking we espouse, is that many, perhaps most of them, are a snare and a delusion for most people. And there’s a simple reason why: They won’t actually do it.
Think about it. Isn’t that the most common result? That you don’t do it?
For example, we heard from many people after we wrote a piece late last year on Reading 25 Pages a Day, a little practice that we think would benefit almost anyone in creating a very desirable reading habit.
What we suspect, though, is that even of the subset of people who felt so strongly about the idea that they contacted us, only a minority followed through and have maintained the habit to this day, ten months later.
Why is that? A huge part of it is Homeostasis: The basic self-regulating feedback loops that keep us repeating the same habits over and over. Predictable forces that keep us from changing ourselves, just as some forces keep us from changing organizations. (Or any self-regulating system.)
The failure to follow new systems and habits (mental or physical) follows this basic formula:
Now, with regard to the 25-pages-a-day “system” we outlined, we were careful not to make a “no broccoli” promise: All we said was that reading 25 pages per day was a habit that almost anyone could form, and that it would lead them far. But you still have to do all the reading. You still have to *do* the hard thing. That’s the part where everyone falls away.
We suspect that some people thought it would be easy to read 25 pages per day. That the pages would essentially “read themselves,” or that the time to do so would spontaneously free up, just because they started wanting it.
This is never, ever the case. At some point, to be healthy, you do need to suck it up and eat some broccoli! And for many days in a row. Or, more to the point: The “failure point” with any new system; any method of improvement; any proposed solution to a life problem or an organization problem, is when the homeostatic regulation kicks in, when we realize some part of it will be hard, new, or unnatural.
Even a really well-designed system can only cut the broccoli into little pieces and sneak it into your mac-and-cheese. A popular example is a fitness system whereby you do one pushup the first day, two pushups the second day, three the third day, and so on. It makes the habit digestible at first, as you get used to it. This is plenty smart.
But eventually, if you’re going to hang on to that habit, you’ll have to do a whole lot of pushups every day! You can’t just go back to plain mac-and-cheese, no broccoli. When the newness of the “one day at a time” system wears off, you’ll be left with a heaping portion of broccoli. Will you continue eating it?
The point is this: When you’re evaluating a proposed improvement to your life or to your organization, you must figure out when and where the broccoli will get eaten, and understand that you will have to sacrifice something (even if it’s just comfort) to get what you want. And if anyone ever promises you “no broccoli,” it’s probably a sham.
Remember that anything really worth doing is probably hard work and will absolutely require you to do things you don’t currently do, which will feel uncomfortable for a while. This is a hard truth we must all face. If it were easy, everyone would already be doing it.
Let’s take the example of learning how to give better feedback. What could be a more useful skill? But actually doing so, actually following through with the idea, is not at all easy. You have to overcome your natural impulse to criticize. You have to get over your natural ego. You have to be very careful to watch your words, trying to decipher what will be heard when you deliver feedback. All of these are hard things to do, all of them unnatural. All will require some re-doubling to accomplish.
Thus, most people won’t actually do it. This is an Iron Rule of life: Biological systems tend toward what is comfortable. (Yes, human beings are “biological systems.”)
But this Iron Rule is a problem and an opportunity wrapped together. As the saying goes, “If you do what everyone else does, you’ll get what everyone else gets.” If you can recognize that all things worth doing are hard at first, and that there is always some broccoli to be eaten, you are part of the way toward true advantageous differentiation. The rest is self-discipline.
And the real and comforting truth is that you might really start liking, and even get used to eating, broccoli. Eating potato chips and candy will eventually feel like the uncomfortable and unnatural thing.
And that’s when you know you’ve really got a great new discipline: Going back would feel like cutting off your hands.
We’ve written before about the legendary businessman Ken Iverson, the former CEO of Nucor Steel, who took it from a tiny steel operation to a true steel powerhouse in his own lifetime.
To recap, in Iverson’s tenure, Nucor:
And so on. He was incredible.
His short business memoir, Plain Talk, describes a much different kind of company than most; one where a culture of teamwork and group winning trumped personal fiefdom. He also got the incentives right. Boy did it ever work.
Turns out Iverson had some thoughts on business education as well.
What are we really missing?
In his recommended curriculum, Iverson highlights just how different his thoughts are: No classes on grand strategy (Henry Singleton would agree), or sales, or marketing, or financial structuring. (Not that those can’t be useful. Just not enough.)
His idea? Teach aspiring managers how to truly interact with, understand, and lead the people who work for them by having young MBAs take on an “internship” as a leader, similar to the way doctors complete a residency before practicing on their own.
In the epilogue to Plain Talk, Iverson calls this the Cure for the Common MBA.
Here are some of the subjects that might form the core of first-year MBA curricula:
Earning Employees’ Trust and Loyalty
Far too many managers have no clue how their employees feel or even what their people’s work lives are like, day to day. Employees pick up on this lack of insight in a heartbeat, and that realization taints everything their managers say to them from that point forward. Conversely, employees clearly give the benefit of the doubt to managers whom they see as understanding “what’s really going on” and “what we’re really up against.” That’s only natural.
I’d suggest, then, that every MBA candidate be required to spend at least a few weeks engaged in manual, clerical, and/or other forms of non-management labor.
Further, they should be required to keep a journal of their experiences—the kinds of problems they encounter, their frustrations, their successes, and so forth. They will find that what seems a small thing to them as managers often takes on great significance to them as employees.
Developing managers should also contemplate the implicit and explicit commitments they will make to the people who work for them. They should understand their obligations under those commitments as well as the limitations of those obligations. And they should grasp the consequences of failing to be consistently trustworthy.
Listening is among the scarcest of all human skills, in and out of management. Listening requires concentration, skill, patience, and a lot of practice. But such practice is a very sound investment of the developing manager’s time.
Real listening enables managers not only to hear what people say to them, but to sense what may be behind what is said (i.e., employees’ emotions, assumptions, biases).
Better still, their reputation for competent listening will encourage others to bring them information. Listening proficiency is an immense advantage to any manager. No MBA should be sent forth into the business world without it.
The Hazards of Hierarchical Power
Inexperienced managers tend to lean heavily on formal, hierarchical sources of authority. This is understandable. They have not yet had the opportunity to develop other forms of authority such as experience, expertise, and seniority.
The problem is, young managers don’t often comprehend the hazards of hierarchical power. They do not understand that, by setting themselves above and apart from their employees, they may actually be digging themselves into a hole. I think it is only fair, then, that we warn inexperienced managers of the hazards of hierarchical power.
Principles of Equitable Treatment
Few managers receive much in the way of explicit instruction in the principles of equitable treatment of employees, either in business school or in the course of management development. All too often, managers fill that vacuum with their own self-serving precepts of what is equitable. A few common-sense principles, clearly stated and strongly advocated in the business schools, could make the business world a better, more equitable place for employees and managers alike.
The notion of an internship for managers has a precedent in medical education, of course. Doctors intern for a number of years before they are turned loose on the world. There ought to be a comparable transitional step in completing the requirements for an MBA. Further, that transition should focus on providing the management candidate hands-on experience. Any MBA who ventures into business with the intent of managing people should first develop his or her skills under the watchful guidance of an experienced manager.
The fact is, few business school professors have ever managed anything, and their lack of hands-on experience shows in their students. Medical school faculties, in contrast, are comprised of the best and most respected practicing physicians.
MBA candidates should preferably complete their internships within relatively small, self-contained operations, so they can perceive the operation in its entirety and grasp the overall dynamics of a business.
People throughout the corporate world lament that other parts of their company don’t understand them or what they do. They’re usually right. It takes an extraordinary individual to understand aspects of a business to which he or she has never been exposed. We are expecting far too many managers to be extraordinary.
In 2015, Mark Zuckerberg did something slightly unusual for a CEO of a major technology company: He started a book club!
In that year, Zuckerberg ended up recommending and discussing 23 books with the group — about one every two weeks. We found it a great list of interesting reads. Let’s check it out.
Creativity, Inc. is a book for managers who want to lead their employees to new heights, a manual for anyone who strives for originality, and the first-ever, all-access trip into the nerve center of Pixar Animation—into the meetings, postmortems, and “Braintrust” sessions where some of the most successful films in history are made. It is, at heart, a book about how to build a creative culture—but it is also, as Pixar co-founder and president Ed Catmull writes, “an expression of the ideas that I believe make the best in us possible.”
One hundred thousand years ago, at least six different species of humans inhabited Earth. Yet today there is only one — Homo sapiens. What happened to the others? And what may happen to us?
Most books about the history of humanity pursue either a historical or a biological approach, but Dr. Yuval Noah Harari breaks the mold with this highly original book that begins about 70,000 years ago with the appearance of modern cognition. From examining the role evolving humans have played in the global ecosystem to charting the rise of empires, Sapiens integrates history and science to reconsider accepted narratives, connect past developments with contemporary concerns, and examine specific events within the context of larger ideas.
In a bold and provocative interpretation of economic history, Matt Ridley, the New York Times-bestselling author of Genome and The Red Queen, makes the case for an economics of hope, arguing that the benefits of commerce, technology, innovation, and change—what Ridley calls cultural evolution—will inevitably increase human prosperity.
With The Structure of Scientific Revolutions, Kuhn challenged long-standing linear notions of scientific progress, arguing that transformative ideas don’t arise from the day-to-day, gradual process of experimentation and data accumulation but that the revolutions in science, those breakthrough moments that disrupt accepted thinking and offer unanticipated ideas, occur outside of “normal science,” as he called it. Though Kuhn was writing when physics ruled the sciences, his ideas on how scientific revolutions bring order to the anomalies that amass over time in research experiments are still instructive in our biotech age.
Daron Acemoglu and James Robinson conclusively show that it is man-made political and economic institutions that underlie economic success (or lack of it). Korea, to take just one of their fascinating examples, is a remarkably homogeneous nation, yet the people of North Korea are among the poorest on earth while their brothers and sisters in South Korea are among the richest. The south forged a society that created incentives, rewarded innovation, and allowed everyone to participate in economic opportunities.
In The End of Power, award-winning columnist and former Foreign Policy editor Moisés Naím illuminates the struggle between once-dominant megaplayers and the new micropowers challenging them in every field of human endeavor. Drawing on provocative, original research, Naím shows how the antiestablishment drive of micropowers can topple tyrants, dislodge monopolies, and open remarkable new opportunities, but it can also lead to chaos and paralysis. Naím deftly covers the seismic changes underway in business, religion, education, within families, and in all matters of war and peace.
Once in a great while a book comes along that changes the way we see the world and helps to fuel a nationwide social movement. The New Jim Crow is such a book. Praised by Harvard Law professor Lani Guinier as “brave and bold,” this book directly challenges the notion that the election of Barack Obama signals a new era of colorblindness. With dazzling candor, legal scholar Michelle Alexander argues that “we have not ended racial caste in America; we have merely redesigned it.” By targeting black men through the War on Drugs and decimating communities of color, the U.S. criminal justice system functions as a contemporary system of racial control—relegating millions to a permanent second-class status—even as it formally adheres to the principle of colorblindness. In the words of Benjamin Todd Jealous, president and CEO of the NAACP, this book is a “call to action.”
Arguably the most significant scientific discovery of the new century, the mapping of the twenty-three pairs of chromosomes that make up the human genome raises almost as many questions as it answers. Questions that will profoundly impact the way we think about disease, about longevity, and about free will. Questions that will affect the rest of your life.
Genome offers extraordinary insight into the ramifications of this incredible breakthrough. By picking one newly discovered gene from each pair of chromosomes and telling its story, Matt Ridley recounts the history of our species and its ancestors from the dawn of life to the brink of future medicine. From Huntington's disease to cancer, from the applications of gene therapy to the horrors of eugenics, Matt Ridley probes the scientific, philosophical, and moral issues arising as a result of the mapping of the genome. It will help you understand what this scientific milestone means for you, for your children, and for humankind.
Nearly forty percent of humanity lives on an average of two dollars a day or less. If you’ve never had to survive on an income so small, it is hard to imagine. How would you put food on the table, afford a home, and educate your children? How would you handle emergencies and old age? Every day, more than a billion people around the world must answer these questions. Portfolios of the Poor is the first book to systematically explain how the poor find solutions to their everyday financial problems.
In Dealing with China, Paulson draws on his unprecedented access to modern China’s political and business elite, including its three most recent heads of state, to answer several key questions:
How did China become an economic superpower so quickly?
How does business really get done there?
What are the best ways for Western business and political leaders to work with, compete with, and benefit from China?
How can the U.S. negotiate with and influence China given its authoritarian rule, its massive environmental concerns, and its huge population’s unrelenting demands for economic growth and security?
The Varieties of Religious Experience: A Study in Human Nature is a book by Harvard University psychologist and philosopher William James. It comprises his edited Gifford Lectures on natural theology, which were delivered at the University of Edinburgh in Scotland in 1901 and 1902. The lectures concerned the nature of religion and the neglect of science in the academic study of religion.
Believe it or not, today we may be living in the most peaceful moment in our species’ existence. In his gripping and controversial new work, New York Times bestselling author Steven Pinker shows that despite the ceaseless news about war, crime, and terrorism, violence has actually been in decline over long stretches of history. Exploding myths about humankind’s inherent violence and the curse of modernity, this ambitious book continues Pinker’s exploration of the essence of human nature, mixing psychology and history to provide a remarkable picture of an increasingly enlightened world.
Set against the backdrop of China’s Cultural Revolution, a secret military project sends signals into space to establish contact with aliens. An alien civilization on the brink of destruction captures the signal and plans to invade Earth. Meanwhile, on Earth, different camps start forming, planning to either welcome the superior beings and help them take over a world seen as corrupt, or to fight against the invasion. The result is a science fiction masterpiece of enormous scope and vision.
When first-year graduate student Sudhir Venkatesh walked into an abandoned building in one of Chicago’s most notorious housing projects, he hoped to find a few people willing to take a multiple-choice survey on urban poverty–and impress his professors with his boldness. He never imagined that as a result of this assignment he would befriend a gang leader named JT and spend the better part of a decade embedded inside the projects under JT’s protection. From a privileged position of unprecedented access, Venkatesh observed JT and the rest of his gang as they operated their crack-selling business, made peace with their neighbors, evaded the law, and rose up or fell within the ranks of the gang’s complex hierarchical structure.
As Einstein pointed out in his famous equation, E = mc², all matter can be described as energy. It is everywhere; it is everything. In this engaging book, prolific author and academic Vaclav Smil provides an introduction to this far-reaching concept and gives the reader a greater understanding of energy’s place in both past and present society. Starting with an explanation of the concept, he goes on to cover such exciting topics as the inner workings of the human body, and the race for more efficient and environmentally friendly fuels. With global warming becoming a mainstream political issue, this guide will help shed light on the science behind it, efforts to prevent it, and how our seemingly insignificant daily decisions affect energy consumption. Whether you’re after insight or dinner table conversation, “Energy: A Beginner’s Guide” will amaze and inform, uncovering the science behind one of the most important concepts in our universe.
In an extraordinary demonstration of the emerging supermedium’s potential to engender new forms of creativity, Huber’s book boldly reimagines 1984 from the computer’s point of view. After first scanning all of Orwell’s writings into his personal computer, Huber used the machine to rewrite the book completely, for the most part using Orwell’s own language. Alternating fiction and non-fiction chapters, Huber advances Orwell’s plot to a surprising new conclusion while seamlessly interpolating his own explanations and arguments. The result is a fascinating utopian work which envisions a world at our fingertips of ever-increasing information, equal opportunity, and freedom of choice.
Why do Internet, financial service, and beer commercials dominate Super Bowl advertising? How do political ceremonies establish authority? Why does repetition characterize anthems and ritual speech? Why were circular forms favored for public festivals during the French Revolution? This book answers these questions using a single concept: common knowledge.
The Muqaddimah, often translated as “Introduction” or “Prolegomenon,” is the most important Islamic history of the premodern world. Written by the great fourteenth-century Arab scholar Ibn Khaldûn (d. 1406), this monumental work established the foundations of several fields of knowledge, including the philosophy of history, sociology, ethnography, and economics. The first complete English translation, by the eminent Islamicist and interpreter of Arabic literature Franz Rosenthal, was published in three volumes in 1958 as part of the Bollingen Series and received immediate acclaim in the United States and abroad. A one-volume abridged version of Rosenthal’s masterful translation first appeared in 1969.
The Culture – a human/machine symbiotic society – has thrown up many great Game Players, and one of the greatest is Gurgeh. Jernau Morat Gurgeh. The Player of Games. Master of every board, computer and strategy. Bored with success, Gurgeh travels to the Empire of Azad, cruel and incredibly wealthy, to try their fabulous game…a game so complex, so like life itself, that the winner becomes emperor. Mocked, blackmailed, almost murdered, Gurgeh accepts the game, and with it the challenge of his life – and very possibly his death.
In this bold, fascinating book, Eula Biss addresses our fear of the government, the medical establishment, and what may be in our children’s air, food, mattresses, medicines, and vaccines. Reflecting on her own experience as a new mother, she suggests that we cannot immunize our children, or ourselves, against the world. As she explores the metaphors surrounding immunity, Biss extends her conversations with other mothers to meditations on the myth of Achilles, Voltaire’s Candide, Bram Stoker’s Dracula, Rachel Carson’s Silent Spring, Susan Sontag’s AIDS and Its Metaphors, and beyond. On Immunity is an inoculation against our fear and a moving account of how we are all interconnected: our bodies and our fates.
In this groundbreaking book, award-winning physicist David Deutsch argues that explanations have a fundamental place in the universe—and that improving them is the basic regulating principle of all successful human endeavor. Taking us on a journey through every fundamental field of science, as well as the history of civilization, art, moral values, and the theory of political institutions, Deutsch tracks how we form new explanations and drop bad ones, explaining the conditions under which progress—which he argues is potentially boundless—can and cannot happen. Hugely ambitious and highly original, The Beginning of Infinity explores and establishes deep connections between the laws of nature, the human condition, knowledge, and the possibility for progress.
Henry Kissinger offers in World Order a deep meditation on the roots of international harmony and global disorder. Drawing on his experience as one of the foremost statesmen of the modern era—advising presidents, traveling the world, observing and shaping the central foreign policy events of recent decades—Kissinger now reveals his analysis of the ultimate challenge for the twenty-first century: how to build a shared international order in a world of divergent historical perspectives, violent conflict, proliferating technology, and ideological extremism.
From its beginnings in the 1920s until its demise in the 1980s, Bell Labs (officially, the research and development wing of AT&T) was the biggest, and arguably the best, laboratory for new ideas in the world. From the transistor to the laser, from digital communications to cellular telephony, it’s hard to find an aspect of modern life that hasn’t been touched by Bell Labs. In The Idea Factory, Jon Gertner traces the origins of some of the twentieth century’s most important inventions and delivers a riveting and heretofore untold chapter of American history. At its heart this is a story about the life and work of a small group of brilliant and eccentric men (Mervin Kelly, Bill Shockley, Claude Shannon, John Pierce, and Bill Baker) who spent their careers at Bell Labs. Today, when the drive to invent has become a mantra, Bell Labs offers us a way to enrich our understanding of the challenges and solutions to technological innovation. Here, after all, was where the foundational ideas on the management of innovation were born.
Common across human history is our longing to better understand the world we live in, and how it works. But how much can we actually know about the world?
In his book The Island of Knowledge: The Limits of Science and the Search for Meaning, physicist Marcelo Gleiser traces the progress of modern science in its pursuit of the most fundamental questions: existence, the origin of the universe, and the limits of knowledge.
What we know of the world is limited by what we can see and what we can describe, but our tools have evolved over the years to reveal ever more pleats in the fabric of knowledge. Gleiser celebrates this persistent struggle to understand our place in the world and traces our history from ancient knowledge to our current understanding.
While science is not the only way to see and describe the world we live in, it is a response to the questions of who we are, where we are, and how we got here. “Science speaks directly to our humanity, to our quest for light, ever more light.”
To move forward, science needs to fail, which runs counter to our human desire for certainty. “We are surrounded by horizons, by incompleteness.” Rather than give up, we struggle along a scale of progress. What makes us human is this journey to understand more about the mysteries of the world and explain them with reason. This is the core of our nature.
While the pursuit is never ending, the curious journey offers insight not just into the natural world, but insight into ourselves.
“What I see in Nature is a magnificent structure that we can comprehend only very imperfectly, and that must fill a thinking person with a feeling of humility.”
— Albert Einstein
We tend to think that what we see is all there is — that there is nothing we cannot see. We know it isn’t true when we stop and think, yet we still get lulled into a trap of omniscience.
Science is thus limited, offering only part of the story — the part we can see and measure. The other part remains beyond our immediate reach.
“What we see of the world,” Gleiser begins, “is only a sliver of what’s out there.”
There is much that is invisible to the eye, even when we augment our sensorial perception with telescopes, microscopes, and other tools of exploration. Like our senses, every instrument has a range. Because much of Nature remains hidden from us, our view of the world is based only on the fraction of reality that we can measure and analyze. Science, as our narrative describing what we see and what we conjecture exists in the natural world, is thus necessarily limited, telling only part of the story. … We strive toward knowledge, always more knowledge, but must understand that we are, and will remain, surrounded by mystery. This view is neither antiscientific nor defeatist. … Quite the contrary, it is the flirting with this mystery, the urge to go beyond the boundaries of the known, that feeds our creative impulse, that makes us want to know more.
While we may broadly understand the map of what we call reality, we fail to understand its terrain. Reality, Gleiser argues, “is an ever-shifting mosaic of ideas.”
The incompleteness of knowledge and the limits of our scientific worldview only add to the richness of our search for meaning, as they align science with our human fallibility and aspirations.
What we call reality is a (necessarily) limited synthesis. It is certainly our reality, as it must be, but it is not the entire reality itself:
My perception of the world around me, as cognitive neuroscience teaches us, is synthesized within different regions of my brain. What I call reality results from the integrated sum of countless stimuli collected through my five senses, brought from the outside into my head via my nervous system. Cognition, the awareness of being here now, is a fabrication of a vast set of chemicals flowing through myriad synaptic connections between my neurons. … We have little understanding as to how exactly this neuronal choreography engenders us with a sense of being. We go on with our everyday activities convinced that we can separate ourselves from our surroundings and construct an objective view of reality.
The brain is a great filtering tool, deaf and blind to vast amounts of information around us that offer no evolutionary advantage. Part of it we can see and simply ignore. Other parts, like dust particles and bacteria, go unseen because of limitations of our sensory tools.
As the Fox said to the Little Prince in Antoine de Saint-Exupéry’s fable, “What is essential is invisible to the eye.” There is no better example than oxygen.
Science has extended our view. Our measurement instruments can detect bacteria and radiation, subatomic particles and more. However precise these instruments have become, their view is still limited.
There is no such thing as an exact measurement. Every measurement must be stated within its precision and quoted together with “error bars” estimating the magnitude of errors. High-precision measurements are simply measurements with small error bars or high confidence levels; there are no perfect, zero-error measurements.
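Gleiser’s point about error bars can be made concrete with a toy calculation (my sketch, not from the book; the readings are invented for illustration): even a simple repeated measurement is honestly reported only as a value together with its uncertainty.

```python
import math

# Hypothetical repeated measurements of the same length, in cm
# (invented numbers, purely for illustration).
readings = [9.98, 10.02, 10.01, 9.97, 10.03, 10.00, 9.99, 10.01]

n = len(readings)
mean = sum(readings) / n

# Sample variance with Bessel's correction (divide by n - 1).
variance = sum((x - mean) ** 2 for x in readings) / (n - 1)

# Standard error of the mean: the size of the "error bar."
std_error = math.sqrt(variance) / math.sqrt(n)

# The honest report is never the bare value, but value +/- uncertainty.
print(f"{mean:.3f} +/- {std_error:.3f} cm")
```

More readings shrink the error bar (roughly as 1/√n) but never to zero, which is exactly Gleiser’s point: there are no perfect, zero-error measurements.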
Technology limits how deeply experiments can probe into physical reality. That is to say, machines determine what we can measure and thus what scientists can learn about the Universe and ourselves. Being human inventions, machines depend on our creativity and available resources. When successful, they measure with ever-higher accuracy and on occasion may also reveal the unexpected.
“All models are wrong, some are useful.”
— George Box
What we know about the world is only what we can detect and measure, even as our powers of detecting and measuring improve over time. We thus base our conclusions about reality on what we can currently “see.”
We see much more than Galileo, but we can’t see it all. And this restriction is not limited to measurements: speculative theories and models that extrapolate into unknown realms of physical reality must also rely on current knowledge. When there is no data to guide intuition, scientists impose a “compatibility” criterion: any new theory attempting to extrapolate beyond tested ground should, in the proper limit, reproduce current knowledge.
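A standard textbook illustration of this compatibility criterion (my example, not Gleiser’s) is the correspondence between special relativity and Newtonian mechanics: for speeds small compared with the speed of light, the relativistic kinetic energy must reduce to the familiar Newtonian formula.

```latex
E_k = (\gamma - 1)\,mc^2,
\qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
\approx 1 + \frac{1}{2}\frac{v^2}{c^2} + \cdots
\quad (v \ll c)
\;\;\Rightarrow\;\;
E_k \approx \frac{1}{2}mv^2
```

The newer, broader theory extends into untested territory while reproducing the older, well-tested result in the proper limit.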
If large portions of the world remain unseen or inaccessible to us, we must consider the meaning of the word “reality” with great care. We must consider whether there is such a thing as an “ultimate reality” out there — the final substrate of all there is — and, if so, whether we can ever hope to grasp it in its totality.
We thus must ask whether grasping reality’s most fundamental nature is just a matter of pushing the limits of science or whether we are being quite naive about what science can and can’t do.
Here is another way of thinking about this: if someone perceives the world through her senses only (as most people do), and another amplifies her perception through the use of instrumentation, who can legitimately claim to have a truer sense of reality? One “sees” microscopic bacteria, faraway galaxies, and subatomic particles, while the other is completely blind to such entities. Clearly they “see” different things and—if they take what they see literally—will conclude that the world, or at least the nature of physical reality, is very different.
Asking who is right misses the point, although surely the person using tools can see further into the nature of things. Indeed, to see more clearly what makes up the world and, in the process to make more sense of it and ourselves is the main motivation to push the boundaries of knowledge. … What we call “real” is contingent on how deeply we are able to probe reality. Even if there is such thing as the true or ultimate nature of reality, all we have is what we can know of it.
Our perception of what is real evolves with the instruments we use to probe Nature. Gradually, some of what was unknown becomes known. For this reason, what we call “reality” is always changing. … The version of reality we might call “true” at one time will not remain true at another. … Given that our instruments will always evolve, tomorrow’s reality will necessarily include entities not known to exist today. … More to the point, as long as technology advances—and there is no reason to suppose that it will ever stop advancing for as long as we are around—we cannot foresee an end to this quest. The ultimate truth is elusive, a phantom.
Gleiser makes his point with a beautiful metaphor: the Island of Knowledge.
Consider, then, the sum total of our accumulated knowledge as constituting an island, which I call the “Island of Knowledge.” … A vast ocean surrounds the Island of Knowledge, the unexplored ocean of the unknown, hiding countless tantalizing mysteries.
The Island of Knowledge grows as we learn more about the world and ourselves. And as the island grows, so too “do the shores of our ignorance—the boundary between the known and unknown.”
Learning more about the world doesn’t bring us closer to a final destination—whose existence is nothing but a hopeful assumption anyway—but to more questions and mysteries. The more we know, the more exposed we are to our ignorance, and the more we know to ask.
As we move forward we must remember that despite our quest, the shores of our ignorance grow as the Island of Knowledge grows. And while we will struggle with the fact that not all questions will have answers, we will continue to progress. “It is also good to remember,” Gleiser writes, “that science only covers part of the Island.”
Richard Feynman pointed out that science can only answer the subset of questions that go, roughly, “If I do this, what will happen?” Answers to questions like Why do the rules operate that way? and Should I do it? are not really questions of a scientific nature; they are moral, human questions, if they are knowable at all.
There are many ways of understanding and knowing that should, ideally, feed each other. “We are,” Gleiser concludes, “multidimensional creatures and search for answers in many, complementary ways. Each serves a purpose and we need them all.”
The Island of Knowledge is a wide-ranging tour through scientific history, from planetary motions to modern scientific theories, and how they affect our ideas of what is knowable.