Seneca on Gathering Ideas And Combinatorial Creativity

Bees

“Combinatory play,” said Einstein, “seems to be the essential feature in productive thought.”

Ruminating on the necessity of both reading and writing, so as not to confine ourselves to either, Seneca, in one of his Epistles, advised that we gather ideas, sift them, and combine them into a new creation.

We should follow, men say, the example of the bees, who flit about and cull the flowers that are suitable for producing honey, and then arrange and assort in their cells all that they have brought in; these bees, as our Vergil says,

Pack close the flowing honey,
And swell their cells with nectar sweet.

It is not certain whether the juice which they obtain from the flowers forms at once into honey, or whether they change that which they have gathered into this delicious object by blending something therewith and by a certain property of their breath. For some authorities believe that bees do not possess the art of making honey, but only of gathering it … Certain others maintain that the materials which the bees have culled from the most delicate of blooming and flowering plants is transformed into this peculiar substance by a process of preserving and careful storing away, aided by what might be called fermentation,— whereby separate elements are united into one substance.

But I must not be led astray into another subject than that which we are discussing. We also, I say, ought to copy these bees, and sift whatever we have gathered from a varied course of reading, for such things are better preserved if they are kept separate; then, by applying the supervising care with which our nature has endowed us,— in other words, our natural gifts,— we should so blend those several flavors into one delicious compound that, even though it betrays its origin, yet it nevertheless is clearly a different thing from that whence it came.

Montaigne, perhaps echoing Seneca, reasoned that we must take knowledge and make it our own. Seneca comments:

We must digest it; otherwise it will merely enter the memory and not the reasoning power. Let us loyally welcome such foods and make them our own, so that something that is one may be formed out of many elements, just as one number is formed of several elements whenever, by our reckoning, lesser sums, each different from the others, are brought together. This is what our mind should do: it should hide away all the materials by which it has been aided, and bring to light only what it has made of them. Even if there shall appear in you a likeness to him who, by reason of your admiration, has left a deep impress upon you, I would have you resemble him as a child resembles his father, and not as a picture resembles its original; for a picture is a lifeless thing.

The Loeb Classical Library collection of Seneca’s Epistles in three volumes (1-65, 66-92, and 93-124) should be read by all in its entirety. Of course, if you don’t have time to read them all, you can read a heavily curated version of them.


Claude Shannon: The Man Who Turned Paper Into Pixels

"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning."— Claude Shannon (1948)
“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning.”— Claude Shannon (1948)

Claude Shannon is the most important man you’ve probably never heard of. If Alan Turing is to be considered the father of modern computing, then the American mathematician Claude Shannon is the architect of the Information Age.

The video, created by the British filmmaker Adam Westbrook, echoes Nassim Taleb’s point that boosting the signal does not mean you remove the noise; in fact, just the opposite: you amplify it.

Any time you try to send a message from one place to another, something always gets in the way. The original signal is always distorted. Wherever there is signal there is also noise.

So what do you do? Well, the best anyone could do back then was to boost the signal. But then all you do is boost the noise.

Thing is we were thinking about information all wrong. We were obsessed with what a message meant.

A Renoir and a receipt? They’re different, right? Was there a way to think of them in the same way? Like so many breakthroughs the answer came from an unexpected place. A brilliant mathematician with a flair for blackjack.

***

The transistor was invented in 1948 at Bell Telephone Laboratories. This remarkable achievement, however, “was only the second most significant development of that year,” writes James Gleick in his fascinating book The Information: A History, a Theory, a Flood. The most important development of 1948, and the one that still underpins modern technology, is the bit.

An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release. It carried a title both simple and grand, “A Mathematical Theory of Communication,” and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon. The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity— a fundamental unit of measure.

But measuring what? “A unit for measuring information,” Shannon wrote, as though there were such a thing, measurable and quantifiable, as information.

[...]

Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. It led to compact discs and fax machines, computers and cyberspace, Moore’s law and all the world’s Silicon Alleys. Information processing was born, along with information storage and information retrieval. People began to name a successor to the Iron Age and the Steam Age.
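
A brief aside, not from Gleick’s text: Shannon made information measurable by defining the entropy of a message source, in bits. For a source whose possible messages occur with probabilities $p_i$, the entropy is

$$H = -\sum_i p_i \log_2 p_i,$$

so a fair coin flip carries $H = -(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2}) = 1$ bit, while a heavily biased (more predictable) coin carries less.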

Gleick also recounts the relationship between Turing and Shannon:

In 1943 the English mathematician and code breaker Alan Turing visited Bell Labs on a cryptographic mission and met Shannon sometimes over lunch, where they traded speculation on the future of artificial thinking machines. (“Shannon wants to feed not just data to a Brain, but cultural things!” Turing exclaimed. “He wants to play music to it!”)

Commenting on the vitality of information, Gleick writes:

(Information) pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. … Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level— an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions.… If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.

The bit is the very core of the information age.

The bit is a fundamental particle of a different sort: not just tiny but abstract— a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence.

In the words of John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, information gives rise to “every it— every particle, every field of force, even the spacetime continuum itself.”

This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer — a cosmic information-processing machine.

The greatest gift of Prometheus to humanity was not fire after all: “Numbers, too, chiefest of sciences, I invented for them, and the combining of letters, creative mother of the Muses’ arts, with which to hold all things in memory.”

Information technologies are relative to the time in which they were created, yet absolute in their significance. Gleick writes:

The alphabet was a founding technology of information. The telephone, the fax machine, the calculator, and, ultimately, the computer are only the latest innovations devised for saving, manipulating, and communicating knowledge. Our culture has absorbed a working vocabulary for these useful inventions. We speak of compressing data, aware that this is quite different from compressing a gas. We know about streaming information, parsing it, sorting it, matching it, and filtering it. Our furniture includes iPods and plasma displays, our skills include texting and Googling, we are endowed, we are expert, so we see information in the foreground. But it has always been there. It pervaded our ancestors’ world, too, taking forms from solid to ethereal, granite gravestones and the whispers of courtiers. The punched card, the cash register, the nineteenth-century Difference Engine, the wires of telegraphy all played their parts in weaving the spiderweb of information to which we cling. Each new information technology, in its own time, set off blooms in storage and transmission. From the printing press came new species of information organizers: dictionaries, cyclopaedias, almanacs— compendiums of words, classifiers of facts, trees of knowledge. Hardly any information technology goes obsolete. Each new one throws its predecessors into relief. Thus Thomas Hobbes, in the seventeenth century, resisted his era’s new-media hype: “The invention of printing, though ingenious, compared with the invention of letters is no great matter.” Up to a point, he was right. Every new medium transforms the nature of human thought. In the long run, history is the story of information becoming aware of itself.

The Information: A History, a Theory, a Flood is a fascinating read.


The Uses Of Being Wrong

Confessions of wrongness are the exception, not the rule.

Daniel Drezner, a professor of international politics at the Fletcher School of Law and Diplomacy at Tufts University, pointing to the difference between being wrong in a prediction and making an error, writes:

Error, even if committed unknowingly, suggests sloppiness. That carries a more serious stigma than making a prediction that fails to come true.

The social sciences, unlike the physical and natural sciences, suffer from a shortage of high-quality data on which to base predictions.

How Does Science Advance?

A theory may be scientific even if there is not a shred of evidence in its favour, and it may be pseudoscientific even if all the available evidence is in its favour. That is, the scientific or non-scientific character of a theory can be determined independently of the facts. A theory is ‘scientific’ if one is prepared to specify in advance a crucial experiment (or observation) which can falsify it, and it is pseudoscientific if one refuses to specify such a ‘potential falsifier’. But if so, we do not demarcate scientific theories from pseudoscientific ones, but rather scientific methods from non-scientific method.

Karl Popper viewed the progression of science as falsification: science advances by eliminating what does not hold up. Yet Popper’s falsifiability criterion ignores the tenacity of scientific theories, even in the face of disconfirming evidence. Scientists, like many of us, do not abandon a theory merely because some evidence contradicts it.

The wake of science is littered with discussions on anomalies and not refutations.

Another theory of scientific advancement, proposed by Thomas Kuhn, the distinguished American philosopher of science, holds that science proceeds through a series of revolutions, each amounting to something like a religious conversion.

Imre Lakatos, a Hungarian philosopher of mathematics and science, wrote:

(The) history of science, of course, is full of accounts of how crucial experiments allegedly killed theories. But all such accounts are fabricated long after the theory has been abandoned.

Lakatos bridged the gap between Popper and Kuhn by addressing what they failed to solve.

The hallmark of empirical progress is not trivial verifications: Popper is right that there are millions of them. It is no success for Newtonian theory that stones, when dropped, fall towards the earth, no matter how often this is repeated. But, so-called ‘refutations’ are not the hallmark of empirical failure, as Popper has preached, since all programmes grow in a permanent ocean of anomalies. What really counts are dramatic, unexpected, stunning predictions: a few of them are enough to tilt the balance; where theory lags behind the facts, we are dealing with miserable degenerating research programmes.

Now, how do scientific revolutions come about? If we have two rival research programmes, and one is progressing while the other is degenerating, scientists tend to join the progressive programme. This is the rationale of scientific revolutions. But while it is a matter of intellectual honesty to keep the record public, it is not dishonest to stick to a degenerating programme and try to turn it into a progressive one.

As opposed to Popper the methodology of scientific research programmes does not offer instant rationality. One must treat budding programmes leniently: programmes may take decades before they get off the ground and become empirically progressive. Criticism is not a Popperian quick kill, by refutation. Important criticism is always constructive: there is no refutation without a better theory. Kuhn is wrong in thinking that scientific revolutions are sudden, irrational changes in vision. [The history of science refutes both Popper and Kuhn: ] On close inspection both Popperian crucial experiments and Kuhnian revolutions turn out to be myths: what normally happens is that progressive research programmes replace degenerating ones.

***

A lot of the falsification effort is devoted to proving others wrong, not ourselves. “It’s rare for academics,” Drezner writes, “to publicly disavow their own theories and hypotheses.”

Indeed, a common lament in the social sciences is that negative findings—i.e., empirical tests that fail to support an author’s initial hypothesis—are never published.

Why is it so hard for us to see when we are wrong?

It is not necessarily concern for one’s reputation. Even predictions that turn out to be wrong can be intellectually profitable—all social scientists love a good straw-man argument to pummel in a literature review. Bold theories get cited a lot, regardless of whether they are right.

Part of the reason is simple psychology; we all like being right much more than being wrong.

As Kathryn Schulz observes in Being Wrong, “the thrill of being right is undeniable, universal, and (perhaps most oddly) almost entirely undiscriminating … . It’s more important to bet on the right foreign policy than the right racehorse, but we are perfectly capable of gloating over either one.”

As we create arguments and gather supporting evidence (while discarding evidence that does not fit) we increasingly persuade ourselves that we are right. We gain confidence and try to sway the opinions of others.

There are benefits to being wrong.

Schulz argues in Being Wrong that “the capacity to err is crucial to human cognition. Far from being a moral flaw, it is inextricable from some of our most humane and honorable qualities: empathy, optimism, imagination, conviction, and courage. And far from being a mark of indifference or intolerance, wrongness is a vital part of how we learn and change.”

Drezner argues that some of the tools of the information age give us hope that we might become increasingly likely to admit being wrong.

Blogging and tweeting encourages the airing of contingent and tentative arguments as events play out in real time. As a result, far less stigma attaches to admitting that one got it wrong in a blog post than in peer-reviewed research. Indeed, there appears to be almost no professional penalty for being wrong in the realm of political punditry. Regardless of how often pundits make mistakes in their predictions, they are invited back again to pontificate more.

As someone who has blogged for more than a decade, I’ve been wrong an awful lot, and I’ve grown somewhat more comfortable with the feeling. I don’t want to make mistakes, of course. But if I tweet or blog my half-formed supposition, and it then turns out to be wrong, I get more intrigued about why I was wrong. That kind of empirical and theoretical investigation seems more interesting than doubling down on my initial opinion. Younger scholars, weaned on the Internet, more comfortable with the push and pull of debate on social media, may well feel similarly.

Still curious? Daniel W. Drezner is the author of The System Worked: How the World Stopped Another Great Depression.

Sir William Osler: A Way of Life

"No mind, however dull, can escape the brightness that comes from steady application."
“No mind, however dull, can escape the brightness that comes from steady application.”

In several of his speeches, Charlie Munger has referred to Sir William Osler, the Canadian physician and one of the founding professors of Johns Hopkins Hospital. The first to bring medical students out of the classroom and directly into the hospital for clinical training, Osler is often described as the “Father of Modern Medicine.”

Osler was a fascinating, accomplished, and erudite man who liked to quote Thomas Carlyle’s prescription that “Our main business is not to see what lies dimly in the distance, but to do what lies clearly at hand.”

As I followed up on Osler, I quickly came to his speech “A Way of Life,” delivered to students at Yale University in 1913. True to Carlyle’s prescription, Osler proposes that men work steadily towards success and fulfillment in life by taking the world in strict 24-hour increments, letting neither yesterday nor tomorrow be a worry today. (He called it “Life in day-tight compartments.”)

Below are some of my favorite excerpts from this wonderful talk. I recommend you read it slowly and read it twice.

***

While we are all fools to some extent, Osler expounds on the value of putting one foot in front of the other and slowly progressing.

I wish to point out a path in which the way-faring man, though a fool, cannot err; not a system to be worked out painfully only to be discarded, not a formal scheme, simply a habit as easy or as hard to adopt as any other habit, good or bad … The way of life that I preach is a habit to be acquired gradually by long and steady repetition: It is the practice of living for the day only, and for the day’s work; Life in day-tight compartments.

***

Tomorrow is uncertain and yesterday is history. Osler advises that we need to find peace in the moment.

The workers in Christ’s vineyard were hired by the day; only for this day are we to ask for our daily bread, and we are expressly bidden to take no thought for the morrow.

To the modern world, these commands have an Oriental savor, counsels of perfection akin to certain of the Beatitudes, stimuli to aspiration, not to action. I am prepared on the contrary to urge the literal acceptance of the advice … since the chief worries of life arise from the foolish habit of looking before and after. As a patient with double vision from some transient unequal action of the muscles of the eye finds magical relief from well-adjusted glasses, so, returning to the clear binocular vision of today, the over-anxious student finds peace when he looks neither backward to the past nor forward to the future.

***

In De Oratore, Cicero tells the story of how Themistocles was approached by someone offering to teach him the “art of memory,” which would enable him to remember everything. Themistocles, however, told the man that he would be more grateful if he could teach him how to forget. In a similar vein, Osler advises unshackling yourself from the daily problems of life.

As a vaccine against all morbid poisons left in the system by the infections of yesterday, I offer “a way of life.” Undress your soul at night; not by self-examination, but by shedding, as you do your garments, the daily sins, whether of omission or of commission, and you will wake a free man, with a new life.

***

Long before grit became a subject of study, Osler anticipated the advice of Tyler Cowen that one of the keys to success is the ability to sit, focus your attention, and wrestle with your problems.

Realise that you have sixteen waking hours, three or four of which at least should be devoted to making a silent conquest of your mental machinery. Concentration, by which is grown gradually the power to wrestle successfully with any subject, is the secret of successful study. No mind, however dull, can escape the brightness that comes from steady application … Shut closer in hour-tight compartments, with the mind directed intensely upon the subject in hand, you will acquire the capacity to do more and more, you will get into training; and once the mental habit is established, you are safe for life … Concentration is an art of slow acquisition, but little by little the mind is accustomed to habits of slow eating and careful digestion …

***

Osler counselled living a quiet and peaceful life, as this would help you with your responsibilities.

The quiet life in day-tight compartments will help you to bear your own and others’ burdens with a light heart.

***

Want more Osler? Check out this collection of his addresses and letters, and this biography.

Forget The “To-Do” List, You Need A ‘Stop Doing’ List

Warren Buffett

It’s interesting to think about the things you want to accomplish in life and work towards those goals.

This is, after all, what we’ve been taught to do since birth. But over time we accumulate other habits and end up spending our time on things that aren’t important to us.

Jim Collins, author of the cult business classics Good to Great and Great by Choice, suggests an interesting thought experiment (reminiscent of Alan Watts) to help clean the windshield, so to speak.

Suppose you woke up tomorrow and received two phone calls. The first phone call tells you that you have inherited $20 million, no strings attached. The second tells you that you have an incurable and terminal disease, and you have no more than 10 years to live. What would you do differently, and, in particular, what would you stop doing?

What would you stop doing?

In his book How To Avoid Work, William J. Reilly offers the three most common reasons we give for not doing what we want.

Whenever a person is not doing what he says he wants to do, he always has what sounds like a good excuse. And it’s always one or more of three:

  1. ‘I haven’t the time.’
  2. ‘I haven’t the money.’
  3. ‘My folks don’t want me to.’

Each of these, Reilly argues, “melts away as an imaginary obstacle when we shine the light of intelligence upon it.” Time is the key. “Without time nothing is possible.”

Everything requires Time. Time is the only permanent and absolute ruler in the universe. But she is a scrupulously fair ruler. She treats every living person exactly alike every day. No matter how much of the world’s goods you have managed to accumulate, you cannot successfully plead for a single moment more than the pauper receives without ever asking for it. Time is the one great leveler. Everyone has the same amount to spend every day.

The next time you feel that you ‘haven’t the time’ to do what you really want to do, it may be worth-while for you to remember that you have as much time as anyone else — twenty-four hours a day. How you spend that twenty-four hours is really up to you.

We invest time consciously and unconsciously. If the wisest Americans are to be believed, many of us only think about how our time was spent at the end of our lives, and find regret over how our most precious resource was squandered on the meaningless.

Collins’ thought experiment is an attempt to help us think about how we’re spending our time today, while we can still do something to change our ways. We don’t want to wake up at 80, for instance, and realize that we unconsciously allocated our thought and effort to things that didn’t matter.

But the value of this experiment applies not only to people but to organizations. The velocity and complexity of problems are increasing. In part to ward off this pressure and delegate decisions to lower levels, organizations respond with perpetually increasing internal information velocity. New policies and procedures are easily added, while legacy ones are only slowly removed. Culturally, we value decisions to add things more than decisions to remove things.

Echoing the words of Steve Jobs on focus, Collins writes:

(This) lesson came back to me a number of years later while puzzling over the research data on 11 companies that turned themselves from mediocrity to excellence, from good to great. In cataloguing the key steps that ignited the transformations, my research team and I were struck by how many of the big decisions were not what to do, but what to stop doing.

A lot of people wait until the start of the New Year to pause and reflect, but there is no better time than now.

Collins also suggests that you ask yourself these three questions as “a personal guidance mechanism.” The answers can be used to course-correct.

  1. What are you deeply passionate about?
  2. What are you genetically encoded for — what activities do you feel just “made to do”?
  3. What makes economic sense — what can you make a living at?

Think of the three circles as a personal guidance mechanism. As you navigate the twists and turns of a chaotic world, it acts like a compass. Am I on target? Do I need to adjust left, up, down, right? If you make an inventory of your activities today, what percentage of your time falls outside the three circles?

If it is more than 50%, then the stop doing list might be your most important tool. The question is: Will you accept good as good enough, or do you have the courage to sell the mills?

Question 3 is the most complicated, perhaps because, contrary to the ‘find your passion’ movement, doing what you love will not necessarily lead to a living. Cal Newport argues the counterpoint: following your passion is horrible advice.


French Nobility And The Origins of Modern Culture

"Nobles explored alternatives to patriarchal ideology not because it was weakening in the seventeenth century but because its hold over them was so very strong."
“Nobles explored alternatives to patriarchal ideology not because it was weakening in the seventeenth century but because its hold over them was so very strong.”

With increasing subordination of the noble individual to the collective, seventeenth-century French nobles moved away from tradition and towards individualism. They began to see themselves more as individuals than as the product of inheritance and tradition. Accompanying this preoccupation with the self was an increasingly critical view of society, monarchy, and religious teachings. The very foundations of the monarchy were questioned, and answers that looked to tradition to guide behaviour were rejected. With the newfound view that most social relations were artificial, the French nobility turned increasingly toward meaningful relationships with friends and lovers. In so doing, they illustrated the emergence of a modern culture in the wake of a traditional social order.

In his book Aristocratic Experience and the Origins of Modern Culture, Jonathan Dewald, a distinguished professor of history at the State University of New York at Buffalo, explores the history of individuality and the cultural push back by the seventeenth-century French nobility. It addresses how the nobility thought about their world and themselves by looking at their responses to increasing tensions around personal worth, ambition, careers, money, civic order, and sexuality. While these are different topics, Dewald argues they “can also be understood as aspects of a single larger problem. Each represents one form of connection between the individual and her or his society.” The book then is an elegant essay on “how aristocratic men and women understood their bonds to the society around them at a decisive moment in the evolution of early modern society.”

While it may seem foreign now, at the time, the concept of the person as an individual flew in the face of tradition. Questioning this perspective sowed the seeds of a new worldview where tradition and social hierarchy were displaced in favor of individual pursuits.

Seventeenth century nobles became preoccupied with the nature of selfhood … and they came at the same time to doubt many of the moral underpinnings of their society. They came, in other words, to see the isolated self as real, important, and complicated, and they correspondingly doubted the value, even the reality, of the social conventions that surrounded it.

Answering cultural questions by focusing on such a small segment of society, perhaps 1 percent, begs an explanation. While limited in size, the French aristocracy at the end of the seventeenth century “exercised an influence on the rest of society out of proportion to its numbers.”

For sixteenth- and seventeenth-century nobles, everything rested on traditional order and familial continuity.

Property and political rights descended from the past, and so too did personal qualities, a dual inheritance from the individual family and the larger aristocratic order. Most nobles simply assumed these values … Yet the French nobles also participated enthusiastically in many of the most innovative currents in early modern culture. They followed and helped to shape cultural movements toward individualism, skepticism about established social arrangements, and belief in the primacy of change in human affairs.

This tension started to show itself, even in public ideological defences of the aristocracy, and was “still more evident when the nobles spoke privately, in memoirs, letters, fiction.” These intimate private thoughts exposed assumptions and fears that differed from the confident and public projections of tradition.

***

In the shadow of the culture he encountered in the United States, Alexis de Tocqueville ruminated on the cultural implications of aristocratic society in his masterwork Democracy in America:

Take the case of an aristocratic people interested in literature … When a small, unchanging group of men are concerned at the same time with the same subject, they easily get together and agree on certain guiding principles to direct their efforts. If it is literature with which they are concerned, strict canons will soon prescribe rules that may not be broken. If these men occupy a hereditary position in their country they will naturally be inclined not only to invent rules for themselves but to follow those laid down by their ancestors. Their code will be both strict and traditional. … Such men, beginning and ending their lives in comfortable circumstances, naturally conceive a taste for choice pleasures, full of refinement and delicacy. Moreover, the long and peaceful enjoyment of so much wealth will have induced a certain softness of thought and feeling, and even in their enjoyments they will avoid anything too unexpected or too lively. They would rather be amused than deeply moved; they want to be interested but not carried away.

The problem with Tocqueville, Dewald argues, is that he “seeks to connect specific cultural expressions both to experience and to the ideology of inheritance that undergirded the aristocracy’s existence during the Old Regime.” But treating aristocratic culture as “essentially ideological” comes with some limitations: “it implies a fundamental unity in culture, and thus shields from our view its points of uncertainty or contradiction; it often implies a functionalist view of how ideas and values form, and this seems inadequate to the complexities of both the ideas themselves and of the processes by which they developed; above all, an ideological approach to aristocratic culture treats culture as only a reflection of deeper realities, a secondary level of reality, a superstructure.”

Dewald spends most of his time in the nooks and crannies of uncertainty and contradiction.

***

As the nobles struggled to hold on to a slipping aristocracy, they increasingly found their lives shaped by new pressures to subdue the individual in deference to the family.

Lineage gained increasing importance in public life, as social status became more clearly a matter of birth and as venal office-holding created castes within the military and the civil service; in consequence, families increasingly organized themselves along dynastic lines, celebrating paternal authority and subordinating individual desires to dynastic needs. Standards of personal behaviour rose, a process encouraged by both a reinvigorated Catholicism and by courtly libertinism; each demanded that men and women more tightly control their impulses and fit their behaviour to elaborate standards.

Against this backdrop, the state too heightened its demands on citizens for conformity, the “rigid subordination of individual impulse to collective orderings.” The state—through a web of political influences and ambition—also made it clear that nothing was off limits, intervening regularly on issues of property, law, and distinctions of birth. This is where Dewald’s book takes us: to “individuals’ responses to these pressures.” While some responses were enthusiastic, producing elaborate “defences of their order’s superiority to the rest of society and emphasiz(ing) the value of noble birth,” others ran contrary to expectations of aristocratic life and “directly undercut respect for tradition and inheritance.”

Enthusiasm for courtly manners involved a startlingly explicit rejection of the past as a guide; and this rejection recurred in other domains, as nobles stressed the superiority of their own culture to that of the past. Similarly, the conditions of seventeenth-century warfare required sophisticated political and numerical calculations and encouraged familiarity with classical writers, who acquired renewed relevance in an age dominated by carefully organized masses of infantry. Seventeenth-century political careers demanded similar thought and focused attention on individual ambition rather than dynastic continuity as a key to understanding social arrangements. By selling high positions and by intervening so often in matters of property, the state itself disrupted belief in a stable social order and forced nobles to think carefully about money; in such circumstances, nobles came to view their society as in some sense an artificial creation rather than an organic hierarchy. In these and a variety of other specific ways, seventeenth-century conditions undermined patriarchal ideas and forced nobles into more individualistic modes of thought.

The increasing demands placed on the seventeenth-century’s nobility resulted in both “inner rebellions as well as celebrations” creating a paradox: “as family, state, and ethical ideals increasingly demanded renunciation of individual desires, men and women became increasingly absorbed in understanding themselves as individuals, and indeed in understanding personal desire itself.”

They explored their inner lives in autobiographies and novels, and they presented their lives in terms of personal achievement. They became increasingly preoccupied with emotion, which attached them to friends and lovers—in other words, to chosen objects of affection. Such deepening concern with the personal offered one response to the oppressiveness of seventeenth-century expectations.

In something J.K. Rowling could be proud of, many of the French nobility sought out social settings where distinctions of birth were disguised by anonymity.

The rejection was explicit in the case of the Académie Française, which admitted men without reference to rank. It was implicit in such events as the masked ball, the gambling party, and the decision (taken with growing frequency in the seventeenth century) to write or appear in published literary works, works that exposed author and subject to the judgment of any book-buyer. All of these choices presented momentary, experimental departures from aristocratic society itself. Like the exploration of the personal, they expressed nobles’ ambivalence about their social order. Nobles fully accepted the ordering that dynastic ideology proposed and that accorded them such a privileged place. But they also felt acutely the weight of that order.

Ironically, the conflicting demands of seventeenth-century aristocratic culture created an untenable state that fostered the more egalitarian ideologies of the eighteenth century and the weakening of the patriarchal ideology.

Nobles explored alternatives to patriarchal ideology not because it was weakening in the seventeenth century but because its hold over them was so very strong. … Seventeenth-century aristocratic culture … placed contradictory demands on its participants, between, for instance, ideals of inheritance and of individual ambition. By the early eighteenth century, the weight of these contradictions had for many nobles become intolerable. The more egalitarian ideologies of the eighteenth century, it may be suggested, offered resolution of contradictions that had become burdensome beyond endurance. From this vantage point, Louis XIV’s rise to power seems less a turning point than does his death.

Aristocratic Experience and the Origins of Modern Culture offers a beautiful exploration of how French nobles, in the face of tightening restrictions on the individual, sowed the seeds of modern culture.


Charlie Munger on the Value of Thinking Backward and Forward

One of the five simple notions for solving problems is inversion. To solve problems, we need to look at them both forward and backward.

But how does this look in practice? Let me give you an example that Charlie Munger gave during a speech.

Munger liked to give his family little puzzles. And one of the puzzles he gave his family was:

There’s an activity in America, with one-on-one contests, and a national championship. The same person won the championship on two occasions about 65 years apart.

“Now,” I said, “name the activity.”

Any ideas? How would you answer this?

“In my family,” Munger said, “not a lot of light bulbs were flashing.” Except for one.

But I have a physicist son who has been trained more in the type of thinking I like. And he immediately got the right answer, and here’s the way he reasoned:

It can’t be anything requiring a lot of hand-eye coordination. Nobody 85 years of age is going to win a national billiards tournament, much less a national tennis tournament. It just can’t be. Then he figured it couldn’t be chess, which this physicist plays very well, because it’s too hard. The complexity of the system, the stamina required are too great. But that led into checkers. And he thought, “Ah ha! There’s a game where vast experience might guide you to be the best even though you’re 85 years of age.”

And sure enough, that was the right answer.

Flipping a problem both forward and backward is a powerful mental trick that will improve your thinking.

An 18-Minute Plan for Managing Your Day And Finding Focus

We start every day knowing we’re not going to get it all done or fit it all in. How we spend our time is really a function of priorities. That’s why Peter Bregman argues in 18 Minutes: Find Your Focus, Master Distraction, and Get the Right Things Done that we need to plan ahead, “create a to-do list and an ignore list, and use our calendars.”

“The hardest attention to focus,” he writes, “is our own.”

***
The Ritual of Managing Our Day

We need a ritual to manage our days, one “clear enough to keep us focused on our priorities. Efficient enough not to get in the way.”

Bregman argues the ritual should take 18 minutes a day, spread across three steps: Your Morning Minutes, an hourly Refocus, and Your Evening Minutes (five minutes in the morning, one minute each hour of an eight-hour workday, and five minutes at night).

***
Step 1 (5 Minutes): Your Morning Minutes

Echoing Tim Ferriss, Bregman recommends planning ahead. Ferriss prefers the night before; Bregman prefers the morning.

Before you turn on your computer, sit down with your to-do list and “decide what will make this day highly successful.”

Take the items off your to-do list (a picture of Bregman’s to-do list is below) and schedule them into your day.

Bregman’s To Do List

“Make sure,” he writes, “that anything that’s been on your list for three days gets a slot somewhere in your calendar or move it off the list.”

***
Step 2 (1 Minute Every Hour): Refocus

Some interruptions help us course correct.

Set your watch, phone, or computer to ring every hour and start the work that’s listed on your calendar. When you hear the beep, take a deep breath and ask yourself if you spent your last hour productively. Then look at your calendar and deliberately recommit to how you are going to use the next hour. Manage your day hour by hour. Don’t let the hours manage you.

***
Step 3 (5 Minutes): Your Evening Minutes

“At the end of your day,” Bregman writes, “shut off your computer and review how the day went.”

Ask yourself three sets of questions:

  1. How did the day go? What success did I experience? What challenges did I endure?
  2. What did I learn today? About myself? About others? What do I plan to do—differently or the same— tomorrow?
  3. Whom did I interact with? Anyone I need to update? Thank? Ask a question of? Share feedback with?

***

The key to this is the ritual and its predictability.

If you do the same thing in the same way over and over again, the outcome is predictable. In the case of 18 minutes, you’ll get the right things done.

Bregman speaks worldwide on how we can lead, work, and live more powerfully. 18 Minutes: Find Your Focus, Master Distraction, and Get the Right Things Done is an easy-to-read book that will add a few tools to your toolbox.