"We have met the enemy; and he is us."
— Walt Kelly, Pogo
I always thought of the epigraph above as the greatest comic book quotation of all time. Google, however, tells me I'm not even close. Pogo's words rank nowhere near such gems as The Thing's "It's clobberin' time." Therein lies the problem.
Perhaps the biggest danger to our own safety and well-being, and to that of our children, comes not from adult predators, environmental hazards, or the class bully, but from traits common to us all. The enemy is us, not least because we jump too quickly to strategies like clobbering. Writers from fields as varied as business ethics, behavioral economics, and neuroscience seem to produce fresh evidence daily that our own thinking does us in more often than any external enemy could.
Daniel Kahneman's Thinking, Fast and Slow is the best compendium of this research to date. Kahneman, a Nobel Prize winner and, with Amos Tversky, a co-parent of behavioral economics, describes no fewer than 44 ways in which our intuitions and biases put us at risk of making poor, even life-threatening, decisions. Because of what he calls the "availability bias," for example, we generalize about the world from the first instances that come to mind. Add to that the "overestimate of rare events," throw in "probability neglect" and "denominator neglect" (we hear of two deaths in a week and forget that they happened 10,000 miles apart, and are therefore two in a population of a billion or more), and we prepare for a statistically infinitesimal danger while ignoring far greater risks right in front of us. We may cancel our beach trip for fear of a shark attack, for instance, but then drive many hours to a new vacation site inland, risking the far more common auto accident.
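To make the denominator concrete (the arithmetic here is mine, not Kahneman's): two deaths in a population of a billion amount to a risk of

\[
\frac{2}{1{,}000{,}000{,}000} = 2 \times 10^{-9},
\]

about one chance in 500 million, orders of magnitude below the everyday risks, like a long drive, that we shrug off.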
But it isn't only the average person who blunders. According to research cited by Kahneman, a vast study of brokerage trades revealed that the stocks traders sold went on to outperform the ones they then bought by an average of more than 3 percent a year, enough to leave a retirement account 69 percent larger over 30 years had its owner simply left it alone. More disturbing, 40 percent of doctors who were "completely certain" of their diagnoses of critically ill patients were shown by later autopsies to have been wrong. And if you find yourself in court, be prepared for the fact that, as Christopher Chabris and Daniel Simons note in The Invisible Gorilla: How Our Intuitions Deceive Us, highly confident eyewitnesses are wrong in their identifications 30 percent of the time.
Despite all this bad news, Kahneman holds out hope: we can train ourselves to be aware of these errors in our reasoning, and even putting a name to them helps. He ends each chapter with questions we can ask to see whether we or others have fallen prey to such an error.
Kahneman describes a time he and some expert colleagues used such a question. Given the task of developing a decision-making curriculum for high schools, he assembled a team and asked its members how long each thought the project would take to complete. The estimates "were narrowly centered around two years; the low end was 1½, the high end 2½." He then asked a team member who had worked on similar projects to recall the success rate of comparable teams. After careful thought, the expert replied that 40 percent of such teams had never completed their task, and that the ones that did had taken seven to ten years. Kahneman's group took eight. Remember that the next time you join a committee.
But if Kahneman ends with hope that we can apply careful reasoning and avoid some of these traps, others, particularly the neuroscientists, are more pessimistic. To many of them, our rational side is no match for our biases, and especially for the unconscious motivations and deficiencies that lie entirely outside our control.
A relatively benign version of this perspective was offered a decade ago by University of Virginia psychologist Timothy D. Wilson. In Strangers to Ourselves: Discovering the Adaptive Unconscious, he argues: "People possess a powerful, sophisticated, adaptive unconscious that is crucial for survival in the world. Because this unconscious operates so efficiently out of view, however, and is largely inaccessible, there is a price to pay in self-knowledge. There is a great deal about ourselves that we cannot know directly, even with the most painstaking introspection."
Among the tasks our adaptive unconscious undertakes is the quick assessment of risk, which unfortunately includes stereotyping other people as well as recognizing snake-like objects. Wilson cites studies showing that many people who consciously report being unbiased exhibit numerous aversive reactions (shifting eye contact, fidgeting) when interviewed by a person of a different race, although they later insist they felt no discomfort.
Other studies show that we can be unconsciously primed by influences of which we have no recollection. College students who read word lists including many references to aging and debility left the lab more slowly than others who read neutral lists, while people with brain injuries that separate their vision centers from their verbal ones will pick an object they have just seen out of a random group, even though they are not conscious of having seen it.
Wilson is optimistic that "we retain some ability to influence how our minds work." He suggests, for example, that since our unconscious affects our actions, we should look carefully not at what we think, but at what we actually have done. We can also be helped by reading about typical human behaviors to see if they apply to us. He notes, too, that our own self-image is usually far less accurate than the image others have of us, and that they can predict our actions better than we can. He does not, however, propose asking everyone we know to describe us. "Discovering our friends' true opinions about us," he warns, "might puncture some adaptive illusions…. People are often better off having an inflated view of how others feel about them." Describing his own experience on a local baseball team, he says, "It is life's positive illusions that make us show up for the next game."
More recent writers, however, seem even more determined to drive out those positive illusions. In On Being Certain: Believing You Are Right Even When You're Not, neurologist Robert A. Burton aims to "strip away the power of certainty by exposing its involuntary neurological roots." His starting point is a remarkable study of the Space Shuttle Challenger disaster, in which over 100 college students were asked to write down their experiences and feelings within a day of hearing the news. Thirty months later, 25 percent of the students had "strikingly different" memories, and only a tiny fraction agreed in all details with their original accounts. More surprising, many students preferred their current version of events to the one they had written at the time, one even saying, "That's my handwriting, but that's not what happened."
Books Noted
Thinking, Fast and Slow, Daniel Kahneman (Farrar, Straus and Giroux)
The Invisible Gorilla: How Our Intuitions Deceive Us, Christopher Chabris and Daniel Simons (Crown Publishing)
Strangers to Ourselves: Discovering the Adaptive Unconscious, Timothy D. Wilson (Harvard University Press)
On Being Certain: Believing You Are Right Even When You're Not, Robert A. Burton (St. Martin's Griffin)
Lest we think this an anomaly, Burton takes the reader through innumerable examples, from unconscious conditioning to placebos to phantom limb syndrome among amputees, to develop his thesis that being certain is an emotional state entirely outside our rational control. Although some of Burton's examples are of the everyday kind, others depend on very precise scientific analysis. One of the oddest is that baseball players, however confident they are that they see the ball as they hit it, cannot actually be doing so. Our vision and reflexes simply do not allow for it; the batter who believes he sees the ball is, in reality, "developing a probabilistic profile" of the next pitch's speed and trajectory, and he actually sees only the first third of its flight. "During the swing he could close his eyes and it would not make any difference." Why, then, do ballplayers think they can see the ball? Because the brain "reorders the batter's perception of time," giving him an illusion of the experience that he firmly believes.
Burton makes no bones about the universality of his conclusions: "The message at the heart of this book is that feelings of knowing, correctness, conviction, and certainty aren't deliberate conclusions and conscious choices. They are mental sensations that happen to us." Although he concedes that he "cannot imagine a world in which we fully accepted and felt" his conclusions to be right, he offers a reason to try: "We must learn (and teach our children) to accept the unpleasantness of uncertainty…. We do not need and cannot afford the catastrophes born out of a belief in certainty."
So, if we accept the views of a growing number of scientific experts, we face not only the dangers that our ordinary habits of mind, conscious and unconscious, pose to our successful navigation of the world, but also the danger of accepting that we are powerless to improve them. Sorry for the bad news. In my next column, I will look at some authors who take a brighter view of our possibilities.