Category: Psychology

Kristin Dombek: The Selfishness of Others

I'll bet you think this article is about you.

“We all know selfishness when we see it,” writes essayist Kristin Dombek at the opening of The Selfishness of Others: An Essay on the Fear of Narcissism. She's right. We see it everywhere, from TV characters to family and lovers. Playing in the tension between pathology and common selfishness, her book offers a thought-provoking look at how narcissism became a cultural phenomenon and a repository for our fears.

What, she asks, is wrong with the narcissist?

This is harder to know. If you see the smile on the face of a murderer, you must run. But if you are unlucky enough to love someone who seems suddenly so into himself that he doesn't care who he hurts, someone who turns from warm to gone when he doesn't need you, so self-adoring or wounded he meets criticism with violence or icy rage, who turns into another person in front of your eyes, or simply turns away when he said he'd be there—if you love someone who seems to have the particular twenty-first-century selfishness in some more subtle, or worse, invisible way, you will likely go to the internet for help.

The internet of course offers answers to even the wrong questions.

You'll read, in that sizable portion of the self-help internet we might call, awkwardly, the narcisphere, a story that can change the way you see everything if you start believing in it, giving you the uncanny but slightly exciting sensation that you're living in a movie. It's familiar, this movie, as if you've seen it before, and it's a creepy one, but you have the most important role in the script. You're the hero.

The basic script plays out like this.

At first, the narcissist is extraordinarily charming, even kind and sweet. Then, after a while, he seems full of himself. It could be a “he” or a “she,” but let's stick with “he.” That's what you start to think, when you know someone like this: he's full of himself. But the narcissist is empty.

Normal, healthy people are full of self, a kind of substance like a soul or personhood that, if you have it, emanates warmly from inside of you toward the outside of you. No one knows what it is, but everyone agrees that narcissists do not have it. Disturbingly, however, they are often better than anyone else at seeming to have it. Because what they have inside is empty space, they have had to make a study of the selves of others in order to invent something that looks and sounds like one. Narcissists are imitators par excellence. The murderer plagiarized most of his manifesto, obviously and badly, but often narcissists are so good at imitating that you won't even notice. And they do not copy the small, boring parts of selves. They take what they think are the biggest, most impressive parts of other selves, and devise a hologram of self that seems superpowered. Let's call it “selfiness,” this simulacrum of a superpowered self. Sometimes they seem crazy or are really dull, but often, perhaps because they have had to try harder than most to make it, the selfiness they've come up with is qualitatively better, when you first encounter it, than the ordinary, naturally occurring selves of normal, healthy people.


Because for the narcissist, this appreciation of you is entirely contingent on the idea that you will help him to maintain his selfiness. If you do not, or if you are near him when someone or something does not, then God help you. When that picture shatters, his hurt and his rage will be unmatched in its heat or, more often, its coldness. He will unfriend you, stop following you, stop returning your emails, stop talking to you completely. He will cheat on you without seeming to think it's a big deal, or break up with you, when he has said he'd be with you forever. He will fire you casually and without notice. Whatever hurts most, he will do it. Whatever you need the most, he will withhold it. He cannot feel other people's feelings, but he is uncannily good at figuring out how to demolish yours.


It isn't that the narcissist is just not a good person; she's like a caricature of what we mean by “not a good person.” She's not just bad; she's a living, breathing lesson in what badness is.

Immanuel Kant offered a formulation for how to do the right thing: ask yourself, if everyone acted this way, would the world be a better place? Good people, we tend to believe, treat others as ends in themselves, not as means. Narcissists, along with psychopaths, do the opposite. For them, people are means toward other ends. “If everyone were to follow suit,” Dombek writes, “the world would go straight to hell.”

The realization that the narcissist is not so much selfish as lacking a self changes everything. Suddenly you can see them for what they are: puppets or clowns. They may look human, but they are not.

So what should you do when you are confronted with a narcissist?

It seems that no matter what you answer, you'll be haunted forever. With equal certainty, the internet offers two pieces of common advice: love them, expect nothing, and hope that they change; or run as fast and as far as you can.

If the conventional wisdom that narcissism is becoming more and more common is true, today's prevailing advice doesn't scale.

Kant's advice no longer holds. But that is not the worst of it. Running is an act of the very same coldness described by the diagnosis. What if the only way to escape a narcissist is to act like one yourself?

The question of the selfishness of others, though, leads quickly to the very difficult question of how we know things about others at all, and the mind-knotting question of how we know things at all.

Dombek goes on to explore provocative questions about ourselves: Why can most of us be put in environments where we display situational narcissism? Why is having a boyfriend or a boss like having a villain? Why do descriptions of narcissists (“in moments you quietly bury deep inside you”) remind you of yourself?


Words Like Loaded Pistols: Wartime Rhetoric

Rhetoric, or the art of persuasion, is an ancient topic that's no less relevant today. We are in a golden age of information sharing, which means you are swimming in a pool of rhetoric every day, whether you realize it or not.

The book Words Like Loaded Pistols: Rhetoric from Aristotle to Obama by Sam Leith is one tool to help navigate our choppy waters. Leith does an impressive job of unpacking rhetorical concepts while also providing all the knowledge and nuance required to be a powerful speaker.

The book is laid out beautifully, with sections entitled ‘Champions of Rhetoric,’ in which he dissects the work of some of the most famous orators. The chapter comparing Adolf Hitler to Winston Churchill is particularly interesting. 


Churchill was a prolific speaker: between 1900 and 1955 he averaged one speech a week. (That's 2,860 speeches, for those who like math.) And they were not just speeches; they carried some of the most famous sayings produced in the twentieth century:

Among the phrases he minted were ‘blood, toil, tears, and sweat,’ ‘their finest hour,’ ‘the few,’ ‘the end of the beginning,’ ‘business as usual,’ ‘iron curtain,’ ‘summit meeting,’ and ‘peaceful coexistence.’
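The speech count quoted a few lines up is easy to verify; a quick sketch of the arithmetic (the weekly average is Leith's figure, the variable names are mine):

```python
# Churchill's span of public speaking, per the text: 1900 to 1955
years = 1955 - 1900          # 55 years
speeches = years * 52        # one speech a week, on average
print(speeches)              # 2860, matching the figure in the text
```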

While this impressive résumé solidified his place on the throne of oratorical excellence, it's important to note that he wasn't a “born speaker.” In fact, he made many mistakes. And he learned.

Like many of us, Churchill would get nervous to the point of nausea before addressing the public. To counter this, he engaged in deliberate practice. He would rehearse his speeches in the mirror, modify them as needed, and scribble meticulous notes, including pauses and stage directions. In other words, one of history's great orators painfully engaged in a process of trial, error, and practice.

To shape himself as an orator he learned by heart the speeches of Disraeli, Gladstone, Cromwell, Burke, and Pitt. Churchill combined their example with his father Randolph’s gift for invective. But he added something of his own – and it was this that helped tether his high style to something more conversational. He was a master of the sudden change of register – a joke, or a phrase of unexpected intimacy.

Stylistically, Churchill was known for building up to a great crescendo and then suddenly becoming gentle and quiet. Students of rhetoric recognize this as a device to keep your audience engaged, to surprise it. The joking and intimacy showed his prowess with another important rhetorical device, ethos.

Ethos is about establishing a connection with your audience. A joke can help with this because humor is often based on joint assumptions and beliefs; sharing a laugh with someone tends to make us feel closer to them. It’s human nature to gravitate towards those people who are like us (see the principles of influence). 

Yet, for all the aspects of the ethos appeal which Churchill got right, on more than one occasion he didn’t judge his audience well and was unable to persuade them.

When he was an MP in 1935, his colleague Herbert Samuel reported, ‘The House always crowds in to hear him. It listens and admires. It laughs when he would have it laugh, and it trembles when he would have it tremble… but it remains unconvinced, and in the end it votes against him.’

Much like today, parliament in Churchill's time was designed for call-and-response dialogue, not the grand soapbox speeches he was so fond of.

Leith argues that if it wasn’t for the war, Churchill might have never found his audience and surely would have been remembered much differently, if at all.

The thing about Churchill was that, like the stopped clock that’s right twice a day, he occupied one position and waited for the world to come to him. He spent much of his political career predicting the imminent end of Western civilization — and it was only by the damnedest good luck that it happened to be on his watch that it suddenly appeared to be coming about. If not, he might have been remembered as a self-aggrandizing windbag with an old-fashioned speaking style and a love of the sound of his own voice.

But when the country really was under threat, Churchill’s fierce certainties were what an anxious audience wanted, while his style — steeped in the language of the previous centuries — seemed to encapsulate the very traditions that he was exhorting them to fight for. What at another time might have been faults became rhetorical strengths. That, you could say, is kairos writ large.

What does that last word, “kairos,” mean? It's all about timing and fit:

As a rhetorical concept, decorum encompasses not only the more obvious features of style, but kairos, or the timeliness of a speech, the tone and physical comportment of the speaker, the commonplaces and topics of argument chosen, and so on. It is a giant umbrella concept meaning no more nor less than the fitting of a speech to the temper and expectations of its audience.

You could argue that the war needed Churchill and that Churchill needed the war. And unlike in conflicts of the past, he had access to the public like no leader before him. You didn't need to crowd into a square to hear Churchill speak; you needed only to turn on the radio.

One of the virtues of Churchill’s wartime rhetoric, however, was that whatever his peers in the House of Commons thought, he was able to speak — as politicians a generation before had not been able to — directly to the public through the wireless.

After delivering many of his key speeches in the Commons, Churchill read them out on the radio. Here, that presidential style — all that gruffness and avuncularity, all those rumbling climaxes — was able to take full effect without being interrupted by rustling order papers and barracking Opposition MPs. He was pure voice.

Churchill was indeed pure of voice, but there was another loud voice in this conflict: Adolf Hitler. When it came to speaking, the two had much in common, but their differences were just as noticeable.


Hitler understood the power of words: He saw them as a tool which he needed to master if he wanted to achieve his goals. He had a strong vision which he believed in passionately and he knew that he needed his people to share that passion if he was to succeed.

From Mein Kampf:

The power which has always started the greatest religious and political avalanches in history has been, from time immemorial, none but the magic power of the word, and that alone. Particularly the broad masses of the people can be moved only by the power of speech… Only a storm of hot passion can turn the destinies of peoples, and he alone can arouse passion who hears it within himself.

It would seem that Hitler associated passion with anger; his speeches were known to peak in shouting that resembled rage. Even when writing his speeches, he would work himself into a frenzy.

Traudl Junge, the young secretary whose memoir of the last days in the Fuhrerbunker formed the basis for the film Downfall, recalled him composing the speech he gave to mark the tenth anniversary of his dictatorship. He started out mumbling almost inaudibly, and pacing up and down, but by the time his speech reached its crescendo he had his back to her and was yelling at the wall.

Like Churchill, Hitler would often practice in front of a mirror and choreograph the whole performance, but he would take it much further. With an eye for theatrics, he would pay close attention to the acoustics of the venue to accent both his booming voice and the martial music that would accompany him. He was particular about the visuals, with his dramatic lights and placement of flags.

Hitler also used pauses to his advantage. While Churchill would use them mid speech to maintain an audience's attention or ‘reel them in’, Hitler would use them at the beginning.

It could go on for anything up to half a minute, which is (you’ll know if you’ve tried it) a very, very long time to stand on a stage without saying or doing anything. When he started – which he’d typically do while the applause was still fading out, causing the audience to prick up its ears the more — he would do so at a slow pace and in a deep voice. The ranting was something he built up to, taking the audience with him.

Hitler liked to control every aspect of his performance and paid close attention to those details that others dismissed, specifically the time of day that he gave his speeches (a lesson infomercials learned).

He preferred to speak in the evening, believing that ‘in the morning and during the day it seems that the power of the human will rebel with its strongest energy against any attempt to impose upon it the will or opinion of another. On the other hand, in the evening it easily succumbs to the domination of a stronger will.’

Hitler had a keen interest and insight into human nature. He knew what he needed from the German people and knew the psychological devices to use to sway the masses. He was even cognizant of how his attire would resonate with the population.

While other senior Nazis went about festooned with ribbons and medals, Hitler always dressed in a plain uniform, the only adornment being the Iron Cross First Class that he had won in 1914. That medal, let it be noted, is a token of bravery, not of rank.

This was a calculated move, an appeal to ethos: I am one of you. It was a tricky balance, because he needed to seem like one of the people but also to project an air of exceptionality. Why else would people follow him if he wasn't the only one who could tend to Germany in its time of need?

As a wartime leader, you need to make yourself both of and above your audience. You need to stress the identity of their interests with yours, to create unity in a common purpose. You need, therefore, to cast yourself as the ideal exemplar of all that is best and most determined and most courageous in your people.

As expected, the same type of thing happens in modern politics, which is especially amplified during election time. Everyone is scrambling to seem like a leader of the people and to establish trust while still setting themselves apart from the crowd, convincing us that they are the only person fit for the job.

If you look closely, many of the rhetorical devices examined in Words Like Loaded Pistols are in heavy use today. Leith discusses a speechwriter for Reagan, and one of his Champions of Rhetoric is Obama; these sections of the book are just as interesting as the pieces on Churchill and Hitler.


Still interested? If you have a fascination with politics and/or rhetoric (or just want someone to skillfully distill the considerable amounts of information in the Ad Herennium and Aristotle's Rhetoric), then we highly recommend you pick the book up.

The Fundamental Attribution Error: Why Predicting Behavior is so Hard

“Psychologists refer to the inappropriate use of dispositional explanation as the fundamental attribution error, that is, explaining situation-induced behavior as caused by enduring character traits of the agent.”
— Jon Elster


The problem with any concept of “character” driving behavior is that “character” is pretty hard to pin down. We call someone “moral” or “honest,” we call them “courageous” or “naive” or any other number of names. The implicit connotation is that someone “honest” in one area will be “honest” in most others, or someone “moral” in one situation is going to be “moral” elsewhere.

Old-time folk psychology supports the notion, of course. As Jon Elster points out in his wonderful book Explaining Social Behavior, folk wisdom would have us believe that much of this “predicting and understanding behavior” thing is pretty darn easy! Simply ascertain character, and use that as a basis to predict or explain action.

People are often assumed to have personality traits (introvert, timid, etc.) as well as virtues (honesty, courage, etc.) or vices (the seven deadly sins, etc.). In folk psychology, these features are assumed to be stable over time and across situations. Proverbs in all languages testify to this assumption. “Who tells one lie will tell a hundred.” “Who lies also steals.” “Who steals an egg will steal an ox.” “Who keeps faith in small matters, does so in large ones.” “Who is caught red-handed once will always be distrusted.” If folk psychology is right, predicting and explaining behavior should be easy.

A single action will reveal the underlying trait or disposition and allow us to predict behavior on an indefinite number of other occasions when the disposition could manifest itself. The procedure is not tautological, as it would be if we took cheating on an exam as evidence of dishonesty and then used the trait of dishonesty to explain the cheating. Instead, it amounts to using cheating on an exam as evidence for a trait (dishonesty) that will also cause the person to be unfaithful to a spouse. If one accepts the more extreme folk theory that all virtues go together, the cheating might also be used to predict cowardice in battle or excessive drinking. 

This is a very natural and tempting way to approach the understanding of people. We like to think of actions that “speak volumes” about others' character, thus using that as a basis to predict or understand their behavior in other realms.

For example, let's say you were interviewing a financial advisor. He shows up on time, in a nice suit, and buys lunch. He says all the right words. Will he handle your money correctly?

Almost all of us would be led to believe he would, reasoning that his sharp appearance, timeliness, and generosity point towards his “good character”.

But the study of history shows us that appearances are misleading, and behavior in one context often does not correlate with behavior in other contexts. Judging character becomes complex when we appreciate the situational nature of our actions. U.S. President Lyndon Johnson was an arrogant bully and a liar who stole an election when he was young. He also fought like hell to pass the Civil Rights Act, something almost no other politician could have done.

Henry Ford standardized and streamlined the modern automobile and made it affordable to the masses, while paying “better than fair” wages to his employees and generally treating them well and with respect, something many “Titans” of business had trouble with in his day. He was also a notorious anti-Semite! If it's true that “He who is moral in one respect is also moral in all respects,” then what are we to make of this?

Jon Elster has some other wonderful examples coming from the world of music, regarding impulsivity versus discipline:

The jazz musician Charlie Parker was characterized by a doctor who knew him as “a man living from moment to moment. A man living for the pleasure principle, music, food, sex, drugs, kicks, his personality arrested at an infantile level.” Another great jazz musician, Django Reinhardt, had an even more extreme present-oriented attitude in his daily life, never saving any of his substantial earnings, but spending them on whims or on expensive cars, which he quickly proceeded to crash. In many ways he was the incarnation of the stereotype of “the Gypsy.” Yet you do not become a musician of the caliber of Parker and Reinhardt if you live in the moment in all respects. Proficiency takes years of utter dedication and concentration. In Reinhardt's case, this was dramatically brought out when he damaged his left hand severely in a fire and retrained himself so that he could achieve more with two fingers than anyone else with four. If these two musicians had been impulsive and carefree across the board — if their “personality” had been consistently “infantile” — they could never have become such consummate artists.

Once we realize this truth, it seems obvious. We begin seeing it everywhere. Dan Ariely wrote a book about situational dishonesty and cheating which we have written about before. Judith Rich Harris based her theory of child development on the idea that children do not behave the same elsewhere as they do at home, misleading parents into thinking they were molding their children. Good interviewing and hiring is a notoriously difficult problem because we are consistently misled into thinking that what we learn in the interview process is representative of the interviewee's general competence. Books have been written about the Halo Effect, a similar idea that good behavior in one area creates a “halo” around all behavior.

The reason we see this everywhere is because it's how the world works!

This tendency has a name: the Fundamental Attribution Error, the mistaken belief that behavior in one context carries over with any consistency into other areas.

Studying the error leads us to conclude that we have a natural tendency to:

A. Over-rate some general consideration of “character” and,
B. Under-rate the “power of the situation”, and its direct incentives, to compel a variety of behavior.

Elster describes a social psychology experiment that effectively demonstrates how quickly any thought of “morality” can be lost in the right situation:

In another experiment, theology students were told to prepare themselves to give a brief talk in a nearby building. One-half were told to build the talk around the Good Samaritan parable(!), whereas the others were given a more neutral topic. One group was told to hurry since the people in the other building were waiting for them, whereas another was told that they had plenty of time. On their way to the other building, subjects came upon a man slumping in the doorway, apparently in distress. Among the students who were told they were late, only 10 percent offered assistance; in the other group, 63 percent did so. The group that had been told to prepare a talk on the Good Samaritan was not more likely to behave as one. Nor was the behavior of the students correlated with answers to a questionnaire intended to measure whether their interest in religion was due to the desire for personal salvation or to a desire to help others. The situational factor — being hurried or not — had much greater explanatory power than any dispositional factor.

So with a direct incentive in front of them — not wanting to be late when people were waiting for them, which could cause shame — the idea of being a Good Samaritan was thrown right out the window! So much for good character.

What we need to appreciate is that, in the words of Elster, “Behavior is often no more stable than the situations that shape it.” A shy young boy on the playground might be the most outgoing and aggressive boy in his group of friends. A moral authority in the realm of a religious institution might well cheat on their taxes. A woman who treats her friends poorly might treat her family with reverence and care.

We can't throw the baby out with the bathwater, of course. Elster refers to contingent response tendencies that do carry from situation to situation, but they tend to be specific rather than general. If we break character down into specific interactions between the person and types of situations, we can understand things a little more accurately.

Instead of calling someone a “liar,” we might understand that they lie on their taxes but are honest with their spouse. Instead of calling someone a “hard worker,” we might come to understand that they drive hard in work situations, but simply cannot be bothered to work around the house. And so on. We should pay attention to the interplay between the situation, the incentives, and the nature of the person, rather than just assuming that a broad character trait applies in all situations.

This carries two corollaries:

A. As we learn to think more accurately, we get one step closer to understanding human nature as it really is. We can better understand the people with whom we coexist.

B. We might better understand ourselves! Imagine if you could be the rare individual whose positive traits truly did carry over into all, or at least all important, situations. You would be traveling an uncrowded road.


Want More? Check out our ever-growing database of mental models.

Our Genes and Our Behavior

“But now we are starting to show genetic influence on individual differences using DNA. DNA is a game changer; it's a lot harder to argue with DNA than it is with a twin study or an adoption study.”
— Robert Plomin


It's not controversial to say that our genetics help explain our physical traits. Tall parents will, on average, have tall children. Overweight parents will, on average, have overweight children. Irish parents have Irish-looking kids. This is true to the point of banality, and only the willfully ignorant would dispute it.

It's slightly more controversial to talk about genes influencing behavior. For a long time, it was denied entirely. For most of the 20th century, the “experts” in human behavior had decided that “nurture” beat “nature” with a score of 100-0. Particularly influential was the child's early life — the way their parents treated them in the womb and throughout early childhood. (Thanks Freud!)

So, where are we at now?

Genes and Behavior

Developmental scientists and behavioral scientists eventually got to work with twin studies and adoption studies, which tended to show that certain traits were almost certainly heritable and not reliant on environment, thanks to the natural controlled experiments of twins separated at birth. (This eventually provided fodder for Judith Rich Harris's wonderful work on development and personality.)

All throughout, the geneticists, starting with Gregor Mendel and his peas, kept on working. As behavioral geneticist Robert Plomin explains, the genetic camp split early on. Some people wanted to understand the gene itself in detail, using very simple traits to figure it out (eye color, long or short wings, etc.) and others wanted to study the effect of genes on complex behavior, generally:

People realized these two views of genetics could come together. Nonetheless, the two worlds split apart because Mendelians became geneticists who were interested in understanding genes. They would take a convenient phenotype, a dependent measure, like eye color in flies, just something that was easy to measure. They weren't interested in the measure, they were interested in how genes work. They wanted a simple way of seeing how genes work.

By contrast, the geneticists studying complex traits—the Galtonians—became quantitative geneticists. They were interested in agricultural traits or human traits, like cardiovascular disease or reading ability, and would use genetics only insofar as it helped them understand that trait. They were behavior centered, while the molecular geneticists were gene centered. The molecular geneticists wanted to know everything about how a gene worked. For almost a century these two worlds of genetics diverged.

Eventually, the two began to converge. One camp (the gene people) figured out that once we could sequence the genome, they might be able to understand more complicated behavior by looking directly at genes in specific people with unique DNA, and contrasting them against one another.

The reason this whole gene-behavior game is hard is that, as Plomin makes clear, complex traits like intelligence are not like eye color. There's no “smart gene” — intelligence emerges from the interaction of thousands of genes and can occur in a variety of combinations. Basic Mendel-style counting (the dominant/recessive eye-color exercise you learned in high-school biology) doesn't work for analyzing the influence of genes on complex traits:

The word gene wasn't invented until 1903. Mendel did his work in the mid-19th century. In the early 1900s, when Mendel was rediscovered, people finally realized the impact of what he did, which was to show the laws of inheritance of a single gene. At that time, these Mendelians went around looking for Mendelian 3:1 segregation ratios, which was the essence of what Mendel showed, that inheritance was discrete. Most of the socially, behaviorally, or agriculturally important traits aren't either/or traits, like a single-gene disorder. Huntington's disease, for example, is a single-gene dominant disorder, which means that if you have that mutant form of the Huntington's gene, you will have Huntington's disease. It's necessary and sufficient. But that's not the way complex traits work.
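To make the 3:1 ratio concrete, here's a minimal sketch of a single-gene cross between two heterozygous carriers; the `punnett` helper is my illustration, not something from Plomin:

```python
from itertools import product

def punnett(parent1: str, parent2: str) -> list[str]:
    """Enumerate all offspring genotypes of a single-gene cross."""
    return [a + b for a, b in product(parent1, parent2)]

# Cross two heterozygous carriers: Aa x Aa
offspring = punnett("Aa", "Aa")          # ['AA', 'Aa', 'aA', 'aa']
shows_dominant = [g for g in offspring if "A" in g]

# Three of the four genotypes express the dominant trait: Mendel's 3:1 ratio
print(len(shows_dominant), ":", len(offspring) - len(shows_dominant))  # 3 : 1
```

Exactly this kind of discrete bookkeeping breaks down for complex traits, where thousands of variants each nudge the outcome a little.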

The importance of genetics is hard to overstate, but until the right technology came along, we could only observe it indirectly. A study might show that 50% of the variance in cognitive ability was due to genetics, but we had no idea which specific genes, in which combinations, actually produced smarter people.

But Moore's-law-style improvement in genetic testing means that we can now cheaply and effectively map entire genomes. And with that, geneticists have a lot of data to work with, a lot of correlations to begin sussing out. The good thing about finding strong correlations between genes and human traits is that we know which one is causative: the gene. Obviously, your reading ability doesn't cause you to have certain DNA; it must be the other way around. So “Big Data”-style screening is extremely useful, once we get a little better at it.


The problem is that, so far, the successes have been modest. There are millions of “ATCG” base pairs to check. As Plomin points out, we can pinpoint only about 20% of the specific genetic influence for something as simple as height, which we know is about 90% heritable. Complex traits like schizophrenia are going to take a lot of work:

We've got to be able to figure out where the so-called missing heritability is, that is, the gap between the DNA variants that we are able to identify and the estimates we have from twin and adoption studies. For example, height is about 90 percent heritable, meaning, of the differences between people in height, about 90 percent of those differences can be explained by genetic differences. With genome-wide association studies, we can account for 20 percent of the variance of height, or a quarter of the heritability of height. That's still a lot of missing heritability, but 20 percent of the variance is impressive.

With schizophrenia, for example, people say they can explain 15 percent of the genetic liability. The jury is still out on how that translates into the real world. What you want to be able to do is get this polygenic score for schizophrenia that would allow you to look at the entire population and predict who's going to become schizophrenic. That's tricky because the studies are case-control studies based on extreme, well-diagnosed schizophrenics, versus clean controls who have no known psychopathology. We'll know soon how this polygenic score translates to predicting who will become schizophrenic or not.
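The polygenic score Plomin mentions is, at its core, just a weighted sum: for each measured DNA variant, count a person's copies of the risk allele (0, 1, or 2) and weight it by the effect size estimated in a genome-wide association study. A toy sketch, where the SNP names, weights, and genotypes are all invented for illustration:

```python
# Toy polygenic score: a weighted sum of risk-allele counts.
# Effect weights would come from a GWAS; these are made up.
gwas_weights = {"snp_a": 0.12, "snp_b": -0.05, "snp_c": 0.30}

def polygenic_score(genotype: dict) -> float:
    """genotype maps SNP id -> number of risk alleles carried (0, 1, or 2)."""
    return sum(gwas_weights[snp] * count for snp, count in genotype.items())

person = {"snp_a": 2, "snp_b": 1, "snp_c": 0}
print(f"{polygenic_score(person):.2f}")  # 2*0.12 + 1*(-0.05) + 0*0.30 = 0.19
```

The prediction difficulty Plomin describes also shows up here: a score built from weights estimated on extreme cases and clean controls may rank-order the general population much less cleanly.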

It brings up an interesting question that gets us back to the beginning of the piece: If we know that genetics have an influence on some complex behavioral traits (and we do), and we can, with the continuing progress of science and technology, sequence a baby's genome and predict to a certain extent their reading level, facility with math, facility with social interaction, etc., do we do it?

Well, we can't until we get a general recognition that genes do indeed influence behavior and do have predictive power as far as how children perform. So far, the track record on getting educators to see that it's all quite real is pretty bad. As with the Freudians before, there's a resistance to the “nature” side of the debate, probably influenced by some strong ideologies:

If you look at the books and the training that teachers get, genetics doesn't get a look-in. Yet if you ask teachers, as I've done, why they think children are so different in their ability to learn to read, they know that genetics is important. When it comes to governments and educational policymakers, the knee-jerk reaction is that if kids aren't doing well, you blame the teachers and the schools; if that doesn't work, you blame the parents; if that doesn't work, you blame the kids because they're just not trying hard enough. An important message for genetics is that you've got to recognize that children are different in their ability to learn. We need to respect those differences because they're genetic. Not that we can’t do anything about it.

It's like obesity. The NHS is thinking about charging people to be fat because, like smoking, they say it's your fault. Weight is not as heritable as height, but it's highly heritable. Maybe 60 percent of the differences in weight are heritable. That doesn't mean you can't do anything about it. If you stop eating, you won't gain weight, but given the normal life in a fast-food culture, with our Stone Age brains that want to eat fat and sugar, it's much harder for some people.

We need to respect the fact that genetic differences are important, not just for body mass index and weight, but also for things like reading disability. I know personally how difficult it is for some children to learn to read. Genetics suggests that we need to have more recognition that children differ genetically, and to respect those differences. My grandson, for example, had a great deal of difficulty learning to read. His parents put a lot of energy into helping him learn to read. We also have a granddaughter who taught herself to read. Both of them now are not just learning to read but reading to learn.

Genetic influence is just influence; it's not deterministic like a single gene. At government levels—I've consulted with the Department for Education—I don't think they're as hostile to genetics as I had feared; they're just ignorant of it. Education just doesn't consider genetics, whereas teachers on the ground can't ignore it. I never get static from them because they know that these children are different when they start. Some just go off on very steep trajectories, while others struggle all the way along the line. When the government sees that, they tend to blame the teachers, the schools, or the parents, or the kids. The teachers know. They're not ignoring this one child. If anything, they're putting more energy into that child.

It's frustrating for Plomin because he knows that eventually DNA mapping will get good enough that real, and helpful, predictions will be possible. We'll be able to target kids early enough to make real differences — earlier than problems actually manifest — and hopefully change the course of their lives for the better. But so far, no dice.

Education is the last backwater of anti-genetic thinking. It's not even anti-genetic. It's as if genetics doesn't even exist. I want to get people in education talking about genetics because the evidence for genetic influence is overwhelming. The things that interest them—learning abilities, cognitive abilities, behavior problems in childhood—are the most heritable things in the behavioral domain. Yet it's like Alice in Wonderland. You go to educational conferences and it's as if genetics does not exist.

I'm wondering about where the DNA revolution will take us. If we are explaining 10 percent of the variance of GCSE scores with a DNA chip, it becomes real. People will begin to use it. It's important that we begin to have this conversation. I'm frustrated at having so little success in convincing people in education of the possibility of genetic influence. It is ignorance as much as it is antagonism.

Here's one call for more reality recognition.


Still Interested? Check out a book by John Brockman with a curated collection of articles published on genetics.

Maria Konnikova on How We Get Conned

There's a scene in the classic Paul Newman film The Sting, where Johnny Hooker (played by a young Robert Redford) tries to get Henry Gondorf (played by Newman) to finally tell him when they're going to pull the big con. His response tells the tale:

You gotta keep his con even after you take his money. He can't know you took him.

It's this same subject that our friend Maria Konnikova — whom we interviewed a few years ago upon the release of her book Mastermind: How to Think Like Sherlock Holmes — has mined with her new book The Confidence Game: Why We Fall for It… Every Time.

It's a good question: Why do we fall for it every time? Confidence games (cons for short) are a wonderful arena to study the Psychology of Human Misjudgment.

In fact, you could call a good con artist — you have to love the term artist here — a master of human psychology. They are, after all, in the game of manipulating people into parting with their money. They are so good that a successful con is a lot like a magic trick:

When we step into a magic show, we come in actively wanting to be fooled. We want deception to cover our eyes and make our world a tiny bit more fantastical, more awesome than it was before. And the magician, in many ways, uses the exact same approaches as the confidence man—only without the destruction of the con’s end game. “Magic is a kind of a conscious, willing con,” Michael Shermer, a science historian and writer who has devoted many decades to debunking claims about the supernatural and the pseudoscientific, told me one December afternoon. “You’re not being foolish to fall for it. If you don’t fall for it, the magician is doing something wrong.”

Shermer, the founder of the Skeptics Society and Skeptic magazine, has thought extensively about how the desire to embrace magic so often translates into susceptibility to its less savory forms. “Take the Penn and Teller cup and balls. I can explain it to you and it still would work. It’s not just knowing the secret; it’s not a trick. It’s the whole skill and art of presentation. There’s a whole narrative—and that’s why it’s effective.” At their root, magic tricks and confidence games share the same fundamental principle: a manipulation of our beliefs. Magic operates at the most basic level of visual perception, manipulating how we see—and don’t see—and experience reality. It changes for an instant what we think possible, quite literally taking advantage of our eyes’ and brains’ foibles to create an alternative version of the world. The con does the same thing, but can go much deeper. Tricks like three-card monte are identical to a magician’s routine—except the intent is more nefarious.

Psychology and show magic have more in common than you'd think: As Shermer says, there are many magic tricks that you can explain ahead of time and they will still work, and still baffle. But…wait…how?

The link between everyday psychological manipulation and show magic is so close that the magician Harry Houdini spent a good portion of his later life trying to sniff out cons in the form of mediums, mystics, and soothsayers. Even he couldn't totally shake free of the illusions:

Mysticism, [Houdini] argued, was a game as powerful as it was dangerous. “It is perfectly rational to suppose that I may be deceived once or twice by a new illusion,” he wrote, “but if my mind, which has been so keenly trained for years to invent mysterious effects, can be deceived, how much more susceptible must the ordinary observer be?”

Such is the power of the illusion. The same, of course, goes for the mental tricks in our psychological make-up. A great example is the gambling casino: Leaving out the increasingly rare exceptions, who ever walks in thinking they have a mathematical edge over the house? Who would be surprised to find out the casino is deliberately manipulating them into losing money with social proof, deprival super-reaction, commitment bias, over-confidence bias, and other tricks? Most intelligent folks aren't shocked or surprised by the concept of a house edge. And yet casinos continue to do healthy business. We participate in the magic trick. In a perverse sense, we allow ourselves to be conned.

In some ways, confidence artists like Demara have it easy. We’ve done most of the work for them; we want to believe in what they’re telling us. Their genius lies in figuring out what, precisely, it is we want, and how they can present themselves as the perfect vehicle for delivering on that desire.

The Beginning of a Con: The “Put-Up” & The “Mark”

Who makes a good mark for a con artist? Essentially, it could be anyone. Context trumps character. Konnikova wisely refrains from trying to pinpoint exactly who is easiest to con: The truth is, in the right time and place, we can all get hit by a good enough con man. In fact, con artists themselves often make great marks. This is probably linked, in part, to over-confidence. (In fact, you might call conning a con man an… Over-confidence game?)

The con artist starts by getting to know us at a deep level. Konnikova argues that con artists combine excellent judgment of character with a honed ability to show the mark exactly what he wants to see. An experienced con artist has been steeped in positive and negative feedback about what works and what doesn't; through practice, he's learned what lands. That's why we end up letting him in, even if we're on guard:

A con artist looks at everyone at that fine level. When it comes to the put-up, accuracy matters—and con men don’t just want to know how someone looks to them. They want to correctly reflect how they want to be seen.

What’s more, confidence artists can use what they’re learning as they go in order to get us to give up even more. We are more trusting of people who seem more familiar and more similar to us, and we open up to them in ways we don’t to strangers. It makes a certain sense: those like us and those we know or recognize are unlikely to want to hurt us. And they’re more likely to understand us.

There are a few things at play here. The con is triggering a bias from liking/loving, which we all have in us. By getting us committed and then drawing us in slowly, they also trigger commitment bias — in fact, Konnikova explains that the term Confidence Game itself comes from a basic trust exercise: Get into a conversation with a mark, commit them to saying that they trust you, then ask them if they'll let you hold their wallet as a show of that trust. Robert Cialdini — the psychology professor who wrote the wonderfully useful book Influence — would certainly not be surprised to see that this little con worked pretty frequently. (Maria smartly points out the connection between con artists and Cialdini's work in the book.)

The “Play,” the “Rope,” the “Tale,” and the “Convincer”

Once the con artist decides that we're a mark, the fun begins.

After the mark is chosen, it is time to set the actual con in motion: the play, the moment when you first hook a victim and begin to gain her trust. And that is accomplished, first and foremost, through emotion. Once our emotions have been captured, once the con artist has cased us closely enough to identify what it is we want, feeling, at least in the moment, takes over from thinking.


What visceral states do is create an intense attentional focus. We tune out everything else and tune in to the in-the-moment emotional cues. It’s similar to the feeling of overwhelming hunger or thirst—or the need to go to the bathroom—when you suddenly find yourself unable to think about anything else. In those moments, you’re less likely to deliberate, more likely to just say yes to something without fully internalizing it, and generally more prone to lapses that are outside the focus of your immediate attention.

As far as the context of a good con, emotion rules the day. People in financial straits, or who find themselves in stressful or unusual situations, are the easiest to con. This is probably because these situations trigger what Danny Kahneman would call System 1 thinking: Fast, snap judgments, often very bad ones. Influenced by stress, we're not slowing down and thinking things through. In fact, many people won't even admit to having been conned after the fact because they feel so ashamed of their lack of judgment in the critical moments. (Cult conversions use some of the same tactics.)

Now begins the “Tale”

A successful story does two things well. It relies on the narrative itself rather than any overt arguments or logical appeals to make the case on its own, and it makes us identify with its characters. We’re not expecting to be persuaded or asked to do something. We’re expecting to experience something inherently pleasant, that is, an interesting tale. And even if we’re not relating to the story as such, the mere process of absorbing it can create a bond between us and the teller—a bond the teller can then exploit.

It’s always harder to argue with a story, be it sad or joyful. I can dismiss your hard logic, but not how you feel. Give me a list of reasons, and I can argue with it. Give me a good story, and I can no longer quite put my finger on what, if anything, should raise my alarm bells. After all, nothing alarming is ever said explicitly, only implied.

This is, of course, the con artist preying on our inherent bias for narrative. It's how we sense-make, but as Cialdini knows so well, it can be used for nefarious purposes to cause a click, whirr automatic reaction where our brain doesn't realize it's being tricked. Feeding the illusion, the con artist reinforces the narrative we've been building in our head:

One of the key elements of the convincer, the next stage of the confidence game, is that it is, well, convincing: the convincer makes it seem like you’re winning and everything is going according to plan. You’re getting money on your investment. Your wrinkles are disappearing and your weight, dropping. That doctor really seems to know what he’s doing. That wine really is exceptional, and that painting, exquisite. You sure know how to find the elusive deal. The horse you bet on, both literal and figurative, is coming in a winner.

The “Breakdown” and the “Send”

And now comes the breakdown. We start to lose. How far can the grifter push us before we balk? How much of a beating can we take? Things don’t completely fall apart yet—that would lose us entirely, and the game would end prematurely — but cracks begin to show. We lose some money. Something doesn’t go according to plan. One fact seems to be off. A figure is incorrectly labeled. A wine bottle is “faulty.” The crucial question: do we notice, or do we double down? High off the optimism of the convincer, certain that good fortune is ours, we often take the second route. When we should be cutting our losses, we instead recommit—and that is entirely what the breakdown is meant to accomplish.

A host of biases are being triggered at this point, turning our brains into mush. We're starting to lose a little, but we feel if we hang in long enough, we can probably at least come out even, or ahead. (Deprival super-reaction tendency, so common at the roulette table, and sunk-cost fallacies.) We've already put our trust in this nice fellow, so any new problems can probably be rationalized as something we “knew could happen all along,” so no reason to worry. (Commitment & consistency, hindsight bias.) And of course, this is where the con artist really has us. It's called The Send.

The send is that part of the con where the victim is recommitted, that is, asked to invest increasingly greater time and resources into the con artist’s scheme—and in the touch, the con finally comes to its fruition and the mark is completely, irrevocably fleeced.

The End of the Line

Of course, all things eventually come to an end.

The blow-off is often the final step of the con, the grifter’s smooth disappearance after the game has played out. Sometimes, though, the mark may not be so complacent. If that happens, there’s always one more step that can be taken: the fix, when a grifter puts off the involvement of law enforcement to prevent marks from making their complaints official.

As in the scene from The Sting, the ideal con ends without trouble for the con man: Ideally, the mark won't even know it was a con. But if they do, Konnikova makes an interesting point that the blow-off and the fix often end up being unnecessary, for reputational reasons. This self-preservation mechanism is one reason so many frauds never come to light, and why there are so few prosecutions relative to the amount of fraud really going on:

The blow-off is the easiest part of the game, and the fix hardly ever employed. The Drake fraud persisted for decades—centuries, in fact—because people were too sheepish about coming forward after all that time. Our friend Fred Demara was, time and time again, not actually prosecuted for his transgressions. People didn’t even want to be associated with him, let alone show who they were publicly by suing him. The navy had only one thing to say: go quietly—leave, don’t make a scene, and never come back.

Besides the reputational issue, there are clearly elements of Pavlovian mere association at play. Who wants to be reminded of their own stupidity? Much easier to sweep it away as soon as possible, never to be reminded again.


The Confidence Game is an enjoyable read with tales of cons and con artists throughout history – a good reminder of our own fallibility in the face of a good huckster and the power of human misjudgment.

Attentional Blink

Despite my experiments with meditation, I have difficulty focusing on my breath if I take a few days off meditating or yoga.

The world is distracting: there are texts coming in, fire trucks going by, an ache in my back, and an itch on my nose.

This, however, is the way we move forward. After a few days of regular meditation, I'm back. My ability to concentrate and focus is markedly better. I read with greater ease and retain more information.

This passage by Winifred Gallagher in Rapt: Attention and the Focused Life, talking about attentional blink, is worth flagging.

… different types of attentional training affect the brain and behavior in different ways. Practices that feature neutral, single-pointed concentration, such as mindfulness meditation, particularly improve your ability to focus as you go about your daily life. ‘Attentional blink’ experiments suggest why. If you’re shown two letters flashed a half-second apart in a series of twenty numbers, for example, you’ll almost certainly see the first letter but miss the second one. The glitch is caused by ‘sticky’ attention, which keeps you glued to the first cue, preventing you from catching it the next time. After three months of breath-centered meditation, however, you’re able to ‘let go’ of the first letter quickly and be ready to focus on the second.

No mere psych-lab curiosity, the blink research, which offers yet more proof that the world you experience is much more subjective than you assume, has important real-life implications. Even when you think you’re focused on what’s going on, these data show, you miss things that occur in quick succession, including fleeting facial and vocal cues. … ‘Sensitive attention is a key substrate of successful social interactions, and the consequences of missing that kind of information can be quite significant.’ Indeed, research done by Paul Ekman, a psychologist at the University of California at San Francisco, shows that slight, rapid changes in a person’s expression are highly meaningful, if unspoken, indications of what’s really on his or her mind. Most people don’t read these cues well, he finds, but attentional training can greatly improve this interpretive ability.

Because the blink phenomenon has long been regarded as relatively fixed, the fact that it can be modified helps prove that attention is indeed a trainable skill.

Rapt: Attention and the Focused Life is filled with tips and strategies on how to improve your ability to concentrate and pay attention.