Saturday, September 18, 2021

Lying and Society

 


In large, complex societies, honesty takes on added weight because so many transactions take place between strangers. At the same time, we’re descended from those who felt little need to tell the truth to those they didn’t recognize. As we saw in the last chapter, for most of human history, lying to outsiders was not only condoned but sanctioned. Integrity helped us deal with our own, duplicity with outsiders. As a result, the will to deceive unfamiliar human beings may be one of our genetic memories. Paradoxically, a will to be honest may also be an inherited trait. Darwin himself believed that because it strengthened communal bonds, truthfulness had survival value.


Members of tribes that emphasized honesty were most likely to have heirs. Us. From our distant ancestors we have inherited complementary urges to be honest and dishonest. Depending on the context, both tendencies helped us survive: honesty for our own kind, deception for everyone else. Just as dealing with familiar faces promotes a tendency to tell the truth to this very day, so may contact with a stranger trigger an ancient impulse to lie. This bifurcated heritage is reflected in the variable ethical standards that are still commonplace. Subjects in one of Bella DePaulo’s studies said they lied during 28 percent of their conversations with friends, 48 percent with acquaintances, and 77 percent with strangers. Even in Tipton, Iowa, Alan Wolfe found an insider-outsider ethical dichotomy alive and well. The strong emphasis residents placed on being honest with each other weakened when questions arose about telling the truth to those who lived elsewhere, or to large corporations, or the IRS. “The moral instinct of Tiptonites is to value honesty more when the recipient of one’s honesty is a close neighbor or friend,” Wolfe discovered, “than when it is a stranger.”


If it is true that dealing with people we don’t know, or don’t know well, triggers a tendency to deceive inherited from our ancestors on the savanna, then as more and more of us deal with rising numbers of strangers (or those who feel like strangers), an urge to tell lies is increasingly unleashed. At the very least our inhibitions about being dishonest are lowered. On the receiving end, anyone we meet could be lying at any time about anything, and we would have no way to know. (As we’ll see in a later chapter, the human capacity to detect lies is quite limited.) Paul Ekman—who has devoted his career to studying deception—believes it is unwise to trust one’s assessment of another person’s honesty without having some knowledge about that person. One proven enhancer of lie-detection accuracy is knowing how a suspected liar has behaved in two or more situations.


Ekman has an interesting theory about why most of us can’t detect lies very well. In our ancestral environment, he speculates, there was not much opportunity to deceive one another. We lived cheek by jowl in groups where consequences for being deceitful were severe. As a result, there was little incentive to be dishonest, or opportunity to unmask those who were. Lies would not have been told often enough for lie catchers to hone their skills. In such a context the adaptive value of a lie-telling talent or a complementary ability to catch liars would be low. The context in which most of us now live is just the opposite. Opportunities to lie are constant today, the means to disguise lies plentiful, and the penalties for being caught meager. At worst, those revealed as liars can simply move on—to another place, a different spouse, new friends, who have no idea that they are known liars.

One could hypothesize that the looser human ties are in any social context, the more likely it is that those who live there will deceive and be deceived. And even if we aren’t being hoodwinked in such settings, it is easy to suspect that we are, because we just don’t know.




Lie Regulations

 


All societies must reconcile the fact that lying is socially toxic with the fact that nearly all their members engage in this practice. Every belief system does its best to regulate dishonesty with taboos, sanctions, and norms. Few such systems claim that every lie is always wrong. This would put them too far out of synch with facts on the ground.


Therefore a major task for all belief systems has been to determine when it’s permissible to tell a lie.


Those participating in this search have usually taken three basic approaches: (1) lying is wrong, period (Augustine, Wesley, Kant); (2) it all depends (Montaigne, Voltaire, Bacon); (3) there is something to be said for a good lie well told (Machiavelli, Nietzsche, Wilde).


Greek gods were celebrated for their skill at deceiving humans and each other. In Homer’s epics, Odysseus the dissembler is a far more intriguing character than Achilles the truth teller. When Athena, no slouch herself in the deception arts, approaches Odysseus upon his return to Ithaca in disguise, she is favorably impressed by the persuasive yarns he spins about himself. “Crafty must be he,” Athena tells Odysseus, “and knavish, who would outdo thee in all manner of guile.”


Even early ethicists who warned against telling lies seldom did so on absolute terms. Plato, who condemned lying on general principles, nonetheless thought it was crucial for the guardians of his ideal republic to propagate “noble lies” so that the masses would accept their place and not disturb social harmony. Across the Adriatic, Cicero’s On Duties emphasized the need for truth telling among free men. In Cicero’s world, lying to a slave was not considered dishonest.

Most societies leave the question of determining which lies are justified to their clergy. Over the millennia theologians of all stripes have occupied themselves with explaining why some lies are worse than others. Even though the fourth of Buddhism’s five precepts admonishes the faithful to abstain from lying, Buddhists distinguish between major lies (such as feigning enlightenment), minor lies (making things up), and lies told to benefit others (as when a doctor conceals the truth from a patient who is dying). The latter in particular are not considered much of a problem.


Like Buddhism’s fourth precept, Hindu ethics proscribe lying. The seminal text Laws of Manu admonishes Hindus never to “swear an oath falsely, even in a trifling matter.” That seems clear enough. In its next passage, however, Manu’s laws advise that “there is no crime in a [false] oath about women whom one desires, marriages, fodder for cows, fuel, and helping a priest.”


This is how it goes in most theology. Admonitions not to lie are followed by a list of circumstances in which lying is permissible. Muhammad said his followers should always be truthful, except when a lie was necessary to preserve domestic harmony, save their life, or keep the peace. The Talmud likewise cites the need to keep the peace as a justification for falsehood. According to Judaism’s civil and religious laws, a pious scholar is always to tell the truth except when asked about his marital relations, or to avoid sounding boastful, or when telling others how well he has been hosted might burden his host with too many other guests.


Both Testaments of the Bible, and the Old Testament especially, combine condemnations of dishonesty with admiring accounts of successful deception: Abraham claiming that Sarah was his sister, not his wife; Jacob passing as his brother Esau to win his father’s blessing (and inheritance); Egyptian midwives rescuing Hebrew children by telling Pharaoh that their mothers were so vigorous that—unlike Egyptian women—they gave birth before the midwives arrived. 

Friday, September 17, 2021

We do tell more lies

 


I think it’s fair to say that honesty is on the ropes. Deception has become commonplace at all levels of contemporary life. At one level that consists of “He’s in a meeting” or “No, that dress doesn’t make you look fat.” On another level it refers to “I never had sexual relations with that woman” or “We found the weapons of mass destruction.” High-profile dissemblers vie for headlines: fabulist college professors, fabricating journalists, stonewalling bishops, book-cooking executives and their friends the creative accountants. They are the most visible face of a far broader phenomenon: the routinization of dishonesty. I’m not talking just about those who try to fib their way out of a tight spot (“I wasn’t out drinking last night; I had to work late”) but about casual lying done for no apparent reason (“Yes, I was a cheerleader in high school”).


Ludwig Wittgenstein once observed how often he lied when the truth would have done just as well. This Viennese philosopher has many modern disciples. The gap between truth and lies has shrunk to a sliver. Choosing which to tell is largely a matter of convenience. 


We lie for all the usual reasons, or for no apparent reason at all. It’s no longer assumed that truth telling is even our default setting. When Monica Lewinsky said she’d lied and been lied to all her life, few eyebrows were raised. 


Our attitudes toward lying have grown, to say the least, tolerant. “It’s now as acceptable to lie as it is to exceed the speed limit when driving,” observed British psychologist Philip Hodson. “Nobody thinks twice about it.” The tattered condition of contemporary candor is suggested by how often we use phrases such as “quite frankly,” “let me be frank,” “let me be candid,” “truth be told,” “to tell you the truth,” “to be truthful,” “the truth is,” “truthfully,” “in all candor,” “in all honesty,” “in my honest opinion,” and “to be perfectly honest.” Such verbal tics are a rough gauge of how routinely we deceive each other. If we didn’t, why all the disclaimers?


**


Even though there have always been liars, lies have usually been told with hesitation, a dash of anxiety, a bit of guilt, a little shame, at least some sheepishness. Now, clever people that we are, we have come up with rationales for tampering with truth so we can dissemble guilt-free. I call it post-truth. We live in a post-truth era. Post-truthfulness exists in an ethical twilight zone. It allows us to dissemble without considering ourselves dishonest. When our behavior conflicts with our values, what we’re most likely to do is reconceive our values. Few of us want to think of ourselves as being unethical, let alone admit that to others, so we devise alternative approaches to morality. Think of them as alt.ethics. This term refers to ethical systems in which dissembling is considered okay, not necessarily wrong, therefore not really “dishonest” in the negative sense of the word.


Even if we do tell more lies than ever, no one wants to be considered a liar. That word sounds so harsh, so judgmental. Men in particular are extremely careful to avoid giving other men any opportunity to say “You callin’ me a liar?” Once those fatal words are spoken, it’s hard for dialogue to continue without fists being thrown, or worse. The word lie itself is both a description and a weapon. 


According to the Oxford English Dictionary, this term “is normally a violent expression of moral reprobation, which in polite conversation tends to be avoided.” That’s why we come up with avoidance mechanisms: rationales for dishonesty, reasons why it’s okay to lie, not nearly as bad as we once thought, maybe not so bad after all. The emotional valence of words associated with deception has declined. We no longer tell lies. Instead we “misspeak.” We “exaggerate.” We “exercise poor judgment.” “Mistakes were made,” we say. The term “deceive” gives way to the more playful “spin.” At worst, saying “I wasn’t truthful” sounds better than “I lied.” Nor would we want to accuse others of lying; we say they’re “in denial.” That was sometimes said even of Richard Nixon, the premier liar of modern times, who went to his grave without ever confessing to anything more than errors of judgment. Presidential aspirant Gary Hart admitted only to “thoughtlessness and misjudgment” after reporters revealed his dishonesty (not only about his sex life but about his age). When, during a primary debate, John Kerry referred to a nonexistent poll that put his popularity well above Hillary Clinton’s, an aide later said Kerry “misspoke.” And it isn’t just male politicians who parse words this way. In the course of writing The Dance of Deception, Harriet Lerner asked women friends what lies they’d recently told. This request was invariably greeted with silence. When Lerner asked the same friends for examples of “pretending,” they had no problem complying. “I pretended to be out when my friends called,” said one without hesitation.


A direct admission of lying (“I lied”) is rare to nonexistent. Those willing to make such a bold statement cast doubt on anything they have said in the past and anything they will say in the future. This is why, rather than open the floodgates and accept lying as a way of life, we manipulate notions of truth. We “massage” truthfulness, we “sweeten it,” we tell “the truth improved.” Britain’s cabinet secretary Sir Robert Armstrong once created an uproar with his droll admission that he’d been “economical with the truth” (a phrase he borrowed from Edmund Burke). Since then, all manner of creative phrasemaking has been devoted to explaining why lies are something else altogether. My favorite depicts a liar as “someone for whom truth is temporarily unavailable.”


**

Honesty was once considered an all-or-nothing proposition. You were either honest or dishonest. In the post-truth era this concept has become more nuanced. We think less about honesty and dishonesty per se and more about degrees of either one. Ethics are judged on a sliding scale. If our intentions are good, and we tell the truth more often than we lie, we consider ourselves on firm moral ground. If we add up truths and lies we’ve told and find more of the former than the latter, we classify ourselves honest. This is ledger-book morality. Conceding that his magazine soft-pedaled criticism of advertisers, one publisher concluded, “I guess you could say we’re 75 percent honest, which isn’t bad.”


In terms of values, this approach denotes a significant shift. Previous generations tended to think you were virtuous or you weren’t. Morality was not assessed by tallying assets and debits on a spreadsheet of virtue and hoping to come out ahead. Another analogy would be that we have shifted from set-menu to buffet-style ethics: picking and choosing which ones to abide by. This approach allows for the “compartmentalizing” at which Bill Clinton was said to excel. Abraham Lincoln would not be impressed.












Wednesday, September 15, 2021

Loss Aversion




Tastes are not fixed; they vary with the reference point.


The disadvantages of a change loom larger than its advantages, inducing a bias that favors the status quo.


Loss aversion implies only that choices are strongly biased in favor of the reference situation (and generally biased to favor small rather than large changes).


Threats are privileged above opportunities


The brain responds quickly even to purely symbolic threats. Emotionally loaded words quickly attract attention, and bad words (war, crime) attract attention faster than do happy words (peace, love).


The psychologist Paul Rozin, an expert on disgust, observed that a single cockroach will completely wreck the appeal of a bowl of cherries, but a cherry will do nothing at all for a bowl of cockroaches. As he points out, the negative trumps the positive in many ways, and loss aversion is one of many manifestations of a broad negativity dominance. Other scholars, in a paper titled “Bad Is Stronger Than Good,” summarized the evidence as follows: “Bad emotions, bad parents, and bad feedback have more impact than good ones, and bad information is processed more thoroughly than good. The self is more motivated to avoid bad self-definitions than to pursue good ones. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones.” They cite John Gottman, the well-known expert in marital relations, who observed that the long-term success of a relationship depends far more on avoiding the negative than on seeking the positive. Gottman estimated that a stable relationship requires that good interactions outnumber bad interactions by at least 5 to 1. Other asymmetries in the social domain are even more striking. We all know that a friendship that may take years to develop can be ruined by a single action.


Loss aversion refers to the relative strength of two motives: we are driven more strongly to avoid losses than to achieve gains. A reference point is sometimes the status quo, but it can also be a goal in the future: not achieving a goal is a loss, exceeding the goal is a gain. As we might expect from negativity dominance, the two motives are not equally powerful. The aversion to the failure of not reaching the goal is much stronger than the desire to exceed it.
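A rough illustrative sketch, not part of the excerpt above: in Kahneman and Tversky’s prospect theory, loss aversion is usually expressed through a value function defined over gains and losses relative to the reference point, with a coefficient λ > 1 that weights losses more heavily than equivalent gains:

\[
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \quad \text{(gain)} \\
-\lambda(-x)^{\alpha} & \text{if } x < 0 \quad \text{(loss)}
\end{cases}
\]

Their commonly cited 1992 estimates are roughly α ≈ 0.88 and λ ≈ 2.25, a compact way of saying that a loss stings a bit more than twice as much as an equal gain pleases.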




Inside Our Body

 

Lucy Jones, Lithub

Inside your body, there are probably more microbial cells than human cells. Symbiotic organisms colonize various areas of the body—the mouth, skin, vagina, pancreas, eyes and lungs—and many reside in the gut microbiota. You almost certainly have microscopic mites living on your face in the hundreds, or even thousands—mating, laying eggs and, at the end of their lives, exploding, unbeknown to you.

You may have heard the incredible fact that the resident microbes in your body outnumber your own human cells ten to one. That figure has been downgraded to three to one or an equal number, which is still astonishing. They mostly resemble mini jumping beans or Tic Tacs on a much smaller scale. These organisms aren’t simply parasitic freeloaders: they are intricate networks that intertwine and interconnect, influencing our health and well-being through complex ecological processes. They are involved in the workings of the immune system, the gut-brain axis, protection against harmful organisms and, indirectly, they have some relationship to our mental health.

When we breathe, we suck different species of microorganisms into the body. Studies suggest 50 different species of mycobacteria would be normal in the upper airways of healthy individuals, making their way into the teeth, oral cavity and pharynx. The environment around you might look clear and empty, but it will be swarming with microscopic organisms, depending on where you are.

Our microbiota are healthiest when they are diverse—and a diverse microbiota is influenced positively by an environment filled with organisms, which are found more abundantly in outside spaces than inside. We imagine our skin and our bodies to be armored, a shell impenetrable to the outdoors, as if we had somehow transcended our biological origins. But the human epidermis is more like a pond surface or a forest soil, as Paul Shepard, the late American environmentalist, suggested. Even if we don’t yet understand or know exactly how many of the abundant microorganisms in our bodies arrived with us through exposure to nature—and, indeed, how they affect our mental and physical health—we are woven into the land, and wider ecosystems, more than we realize.

Crucially, these “old friends” that we have evolved with are able to treat or block chronic inflammation. There are two types of inflammation: the good, normal, protective type, whereby the immune system fires up to respond to an injury, with fever or swelling or redness; then there is the chronic, systemic kind you don’t want. This is the simmering, low-level constant inflammation within the body which can lead to cardiovascular disease, inflammatory disorders, decreased resistance to stress and depression. This kind of raised, background inflammation is common in people who live in industrialized, urban environments and is associated with the unhealthy habits of the modern world: our diets, poor sleep, smoking and alcohol consumption, stress and sedentary lifestyles. As we age, our bodies become more inflamed. Scientists can measure levels of inflammation by looking at biomarkers such as proteins in the blood.

It should be no surprise, then, to learn that the gut microbiota of people who live in urban areas and developed countries are less biodiverse than those who still have profound contact with the land, such as hunter-gatherers and traditional farming communities.

Scientists are starting to understand more deeply the role inflammation may also play in our mental health. Evidence that bodily inflammation can affect the brain and have a direct effect on mood, cognition and behavior is relatively new. But it is strong and compelling. Depression may well be all in the mind, the brain and the body. This view runs counter to the dominant view of Western medicine that our bodies and minds are separate and thus should be treated apart from each other, a view dating back to 17th-century French philosopher René Descartes’ concept of dualism. As the neuropsychiatrist professor Edward Bullmore has said, “In Britain in 2018, the NHS is still planned on Cartesian lines. Patients literally go through different doors, attend different hospitals, to consult differently trained doctors, about their dualistically divided bodies.”

But perhaps we are not as dualistically divided as the Cartesian orthodoxy our health systems are still built on would lead us to believe. A study of 15,000 children in England found that those who were inflamed at the age of nine were more likely to be depressed a decade later, as 18-year-olds. People with depression, anxiety, schizophrenia and other neuropsychiatric disorders have been found to have higher levels of inflammation biomarkers. European people have higher levels of cytokines in the winter months, which is also a time of increased risk of depression. Levels of cytokines are higher in sufferers of bipolar disorders during their manic episodes, and lower when they’re in remission. Early findings suggest anti-inflammatory medicines may improve depressive symptoms. People with a dysregulated immune system are more likely to have psychiatric disorders.

In his book The Inflamed Mind, Bullmore argued that some depressions may be a symptom of inflammatory disease, directly related to high levels of cytokines in the blood, or a “cytokine squall,” as he puts it.

Could our lack of contact with the natural world be a contributing factor to high levels of inflammation, which could be related to depression and other mental health disorders? Studies show that just two hours in a forest can significantly lower cytokine levels in the blood, soothing inflammation. This could partly be caused by exposure to important microorganisms.

There are multiple reasons why babies born in the rich, developed world have a less diverse population of mycobacteria—for example, the use of antibiotics, diet, lack of breastfeeding and reduced contact with the natural environment. We live inside, often in air-conditioned buildings cleaned with antibacterial sprays, with reduced exposure to organisms from the natural environment via plants, animals and the soil. Our food is sprayed and wrapped in plastic. We don’t live alongside other species of animals, as we did for millennia. The opportunities to be exposed to diverse microorganisms are much fewer—which might explain why my daughter liked to eat soil.






Tuesday, September 7, 2021

Hedgehogs and Foxes

 


Tetlock also found that experts resisted admitting that they had been wrong, and when they were compelled to admit error, they had a large collection of excuses: they had been wrong only in their timing, an unforeseeable event had intervened, or they had been wrong but for the right reasons. Experts are just human in the end. They are dazzled by their own brilliance and hate to be wrong. Experts are led astray not by what they believe, but by how they think, says Tetlock. He uses the terminology from Isaiah Berlin’s essay on Tolstoy, “The Hedgehog and the Fox.” Hedgehogs “know one big thing” and have a theory about the world; they account for particular events within a coherent framework, bristle with impatience toward those who don’t see things their way, and are confident in their forecasts. They are also especially reluctant to admit error. For hedgehogs, a failed prediction is almost always “off only on timing” or “very nearly right.” They are opinionated and clear, which is exactly what television producers love to see on programs. Two hedgehogs on different sides of an issue, each attacking the idiotic ideas of the adversary, make for a good show.


Foxes, by contrast, are complex thinkers. They don’t believe that one big thing drives the march of history (for example, they are unlikely to accept the view that Ronald Reagan single-handedly ended the cold war by standing tall against the Soviet Union). Instead the foxes recognize that reality emerges from the interactions of many different agents and forces, including blind luck, often producing large and unpredictable outcomes. It was the foxes who scored best in Tetlock’s study, although their performance was still very poor. But they are less likely than hedgehogs to be invited to participate in television debates.





Our almost unlimited ability to ignore our ignorance.

 


Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.


I have heard of too many people who “knew well before it happened that the 2008 financial crisis was inevitable.” This sentence contains a highly objectionable word, which should be removed from our vocabulary in discussions of major events. The word is, of course, knew. Some people thought well in advance that there would be a crisis, but they did not know it. They now say they knew it because the crisis did in fact happen. This is a misuse of an important concept. In everyday language, we apply the word know only when what was known is true and can be shown to be true. We can know something only if it is both true and knowable. But the people who thought there would be a crisis (and there are fewer of them than now remember thinking it) could not conclusively show it at the time. Many intelligent and well-informed people were keenly interested in the future of the economy and did not believe a catastrophe was imminent; I infer from this fact that the crisis was not knowable. What is perverse about the use of know in this context is not that some individuals get credit for prescience that they do not deserve. It is that the language implies that the world is more knowable than it is. It helps perpetuate a pernicious illusion.


The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do. Know is not the only word that fosters this illusion. In common usage, the words intuition and premonition also are reserved for past thoughts that turned out to be true. The statement “I had a premonition that the marriage would not last, but I was wrong” sounds odd, as does any sentence about an intuition that turned out to be false. To think clearly about the future, we need to clean up the language that we use in labeling the beliefs we had in the past.


** 


The mind that makes up narratives about the past is a sense-making organ. When an unpredicted event occurs, we immediately adjust our view of the world to accommodate the surprise. Imagine yourself before a football game between two teams that have the same record of wins and losses. Now the game is over, and one team trashed the other. In your revised model of the world, the winning team is much stronger than the loser, and your view of the past as well as of the future has been altered by that new perception. Learning from surprises is a reasonable thing to do, but it can have some dangerous consequences.


A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.


**

Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad. Consider a low-risk surgical intervention in which an unpredictable accident occurred that caused the patient’s death. The jury will be prone to believe, after the fact, that the operation was actually risky and that the doctor who ordered it should have known better. This outcome bias makes it almost impossible to evaluate a decision properly—in terms of the beliefs that were reasonable when the decision was made.


Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight. Based on an actual legal case, students in California were asked whether the city of Duluth, Minnesota, should have shouldered the considerable cost of hiring a full-time bridge monitor to protect against the risk that debris might get caught and block the free flow of water. One group was shown only the evidence available at the time of the city’s decision; 24% of these people felt that Duluth should take on the expense of hiring the monitor. The second group was informed that debris had blocked the river, causing major flood damage; 56% of these people said the city should have hired the monitor, although they had been explicitly instructed not to let hindsight distort their judgment.


The worse the consequence, the greater the hindsight bias. In the case of a catastrophe, such as 9/11, we are especially ready to believe that the officials who failed to anticipate it were negligent or blind. On July 10, 2001, the Central Intelligence Agency obtained information that al-Qaeda might be planning a major attack against the United States. George Tenet, director of the CIA, brought the information not to President George W. Bush but to National Security Adviser Condoleezza Rice. When the facts later emerged, Ben Bradlee, the legendary executive editor of The Washington Post, declared, “It seems to me elementary that if you’ve got the story that’s going to dominate history you might as well go right to the president.” But on July 10, no one knew—or could have known—that this tidbit of intelligence would turn out to dominate history.


Because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions—and to an extreme reluctance to take risks. As malpractice litigation became more common, physicians changed their procedures in multiple ways: ordered more tests, referred more cases to specialists, applied conventional treatments even when they were unlikely to help. These actions protected the physicians more than they benefited the patients, creating the potential for conflicts of interest. Increased accountability is a mixed blessing.


Although hindsight and the outcome bias generally foster risk aversion, they also bring undeserved rewards to irresponsible risk seekers, such as a general or an entrepreneur who took a crazy gamble and won. Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them are seen in hindsight as mediocre, timid, and weak. A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.


**

The CEO of a successful company is likely to be called flexible, methodical, and decisive. Imagine that a year has passed and things have gone sour. The same executive is now described as confused, rigid, and authoritarian. Both descriptions sound right at the time: it seems almost absurd to call a successful leader rigid and confused, or a struggling leader flexible and methodical.


Indeed, the halo effect is so powerful that you probably find yourself resisting the idea that the same person and the same behaviors appear methodical when things are going well and rigid when things are going poorly. Because of the halo effect, we get the causal relationship backward: we are prone to believe that the firm fails because its CEO is rigid, when the truth is that the CEO appears to be rigid because the firm is failing. This is how illusions of understanding are born.










Monday, September 6, 2021

Surprise!

 


People who are taught surprising statistical facts about human behavior may be impressed to the point of telling their friends about what they have heard, but this does not mean that their understanding of the world has really changed. The test of learning psychology is whether your understanding of situations you encounter has changed, not whether you have learned a new fact. There is a deep gap between our thinking about statistics and our thinking about individual cases. Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal information. But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience. On the other hand, surprising individual cases have a powerful impact and are a more effective tool for teaching psychology because the incongruity must be resolved and embedded in a causal story. That is why this book contains questions that are addressed personally to the reader. You are more likely to learn something by finding surprises in your own behavior than by hearing surprising facts about people in general.

Availability Effect and Emotions

 


Students of risk were quick to see that the idea of availability was relevant to their concerns. Even before our work was published, the economist Howard Kunreuther, who was then in the early stages of a career that he has devoted to the study of risk and insurance, noticed that availability effects help explain the pattern of insurance purchase and protective action after disasters. Victims and near victims are very concerned after a disaster. After each significant earthquake, Californians are for a while diligent in purchasing insurance and adopting measures of protection and mitigation. They tie down their boiler to reduce quake damage, seal their basement doors against floods, and maintain emergency supplies in good order. However, the memories of the disaster dim over time, and so do worry and diligence. The dynamics of memory help explain the recurrent cycles of disaster, concern, and growing complacency that are familiar to students of large-scale emergencies.


Kunreuther also observed that protective actions, whether by individuals or governments, are usually designed to be adequate to the worst disaster actually experienced. As long ago as pharaonic Egypt, societies have tracked the high-water mark of rivers that periodically flood—and have always prepared accordingly, apparently assuming that floods will not rise higher than the existing high-water mark. Images of a worse disaster do not come easily to mind.


**


The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed. 


**


People’s emotional evaluations of outcomes, and the bodily states and the approach and avoidance tendencies associated with them, all play a central role in guiding decision making. Damasio and his colleagues have observed that people who do not display the appropriate emotions before they decide, sometimes because of brain damage, also have an impaired ability to make good decisions. An inability to be guided by a “healthy fear” of bad consequences is a disastrous flaw.


**

“The emotional tail wags the rational dog.” The affect heuristic simplifies our lives by creating a world that is much tidier than reality. Good technologies have few costs in the imaginary world we inhabit, bad technologies have no benefits, and all decisions are easy. In the real world, of course, we often face painful tradeoffs between benefits and costs.

 

**

Sunstein and a collaborator, the economist Timur Kuran, invented a name for the mechanism through which biases flow into policy: the availability cascade. They comment that in the social context, “all heuristics are equal, but availability is more equal than the others.” They have in mind an expanded notion of the heuristic, in which availability provides a heuristic for judgments other than frequency. In particular, the importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind.


An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried. This emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement. The cycle is sometimes sped along deliberately by “availability entrepreneurs,” individuals or organizations who work to ensure a continuous flow of worrying news. The danger is increasingly exaggerated as the media compete for attention-grabbing headlines. Scientists and others who try to dampen the increasing fear and revulsion attract little attention, most of it hostile: anyone who claims that the danger is overstated is suspected of association with a “heinous cover-up.” The issue becomes politically important because it is on everyone’s mind, and the response of the political system is guided by the intensity of public sentiment. The availability cascade has now reset priorities. Other risks, and other ways that resources could be applied for the public good, all have faded into the background.


**


Consider the Alar incident, known to detractors of environmental concerns as the “Alar scare” of 1989. Alar is a chemical that was sprayed on apples to regulate their growth and improve their appearance. The scare began with press stories that the chemical, when consumed in gigantic doses, caused cancerous tumors in rats and mice. The stories understandably frightened the public, and those fears encouraged more media coverage, the basic mechanism of an availability cascade. The topic dominated the news and produced dramatic media events such as the testimony of the actress Meryl Streep before Congress. The apple industry sustained large losses as apples and apple products became objects of fear. Kuran and Sunstein quote a citizen who called in to ask “whether it was safer to pour apple juice down the drain or to take it to a toxic waste dump.” The manufacturer withdrew the product and the FDA banned it. Subsequent research confirmed that the substance might pose a very small risk as a possible carcinogen, but the Alar incident was certainly an enormous overreaction to a minor problem. The net effect of the incident on public health was probably detrimental because fewer good apples were consumed.


The Alar tale illustrates a basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight—nothing in between. Every parent who has stayed up waiting for a teenage daughter who is late from a party will recognize the feeling. You may know that there is really (almost) nothing to worry about, but you cannot keep images of disaster from coming to mind. As Slovic has argued, the amount of concern is not adequately sensitive to the probability of harm; you are imagining the numerator—the tragic story you saw on the news—and not thinking about the denominator. Sunstein has coined the phrase “probability neglect” to describe the pattern. The combination of probability neglect with the social mechanisms of availability cascades inevitably leads to gross exaggeration of minor threats, sometimes with important consequences.