Tricky questions reveal pitfalls we all face in thinking logically
Key figures in this new science of stupidity
Daniel Kahneman’s new book, Thinking, Fast and Slow (Farrar, Straus and Giroux)
An exciting piece in Vanity Fair this month (January issue) by Michael Lewis, “The King of Human Error”
Michael Lewis has taken a great interest over the years in people he has met who explore the sources of human stupidity, including one neighbor in Berkeley in particular, who has just brought out a book which sounds like a must-buy. What’s interesting is that Lewis, a sharp cookie who is unfazed by the complexities of Wall Street thought processes, seems a little befuddled by the ideas he is explaining, or conveying from his source to the reader. At least, the piece is in need of better editing for clarity and precision where it summarizes the analysis of his hero.
For example, is the second choice in the question about the bank teller literally “impossible”? Clearly it is less probable than the first choice, so the people – the 85% of them – who choose it are not thinking straight. Or are they? The question characterizes her in terms which make it very likely she is a feminist, so the second choice is not impossible, and indeed quite probable – just not the more probable of the two, since “bank teller” alone is a strictly larger category of possibility than “feminist bank teller.”
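The underlying rule is simple set arithmetic: every feminist bank teller is a bank teller, so the conjunction can never be more probable than either of its parts. A minimal simulation makes the point (the two probabilities below are invented purely for illustration, not taken from the article):

```python
import random

random.seed(0)

# Simulate a population: each person is independently a bank teller
# with probability 0.05 and a feminist with probability 0.6
# (illustrative numbers only).
N = 100_000
teller = [random.random() < 0.05 for _ in range(N)]
feminist = [random.random() < 0.6 for _ in range(N)]

p_teller = sum(teller) / N
p_both = sum(t and f for t, f in zip(teller, feminist)) / N

# The conjunction can never be more probable than either conjunct.
assert p_both <= p_teller
print(f"P(teller) = {p_teller:.3f}, P(teller and feminist) = {p_both:.3f}")
```

Whatever numbers you plug in, the joint event comes out no more probable than "bank teller" alone, which is exactly the point of the Linda question.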
LETTER FROM BERKELEY
The King of Human Error
Billy Beane’s sports-management revolution, chronicled by the author in Moneyball, was made possible by Israeli psychologists Daniel Kahneman and Amos Tversky. At 77, with his own new book, Thinking, Fast and Slow, the Nobel Prize-winning Kahneman reveals the built-in kinks in human reasoning—and he’s Exhibit A.
Related: “The Quiz Daniel Kahneman Wants You to Fail.”
By Michael Lewis Photograph by Patrick Ecclesine
We’re obviously all at the mercy of forces we only dimly perceive and events over which we have no control, but it’s still unsettling to discover that there are people out there—human beings of whose existence you are totally oblivious—who have effectively toyed with your life. I had that feeling soon after I published Moneyball. The book was ostensibly about a cash-strapped major-league baseball team, the Oakland A’s, whose general manager, Billy Beane, had realized that baseball players were sometimes misunderstood by baseball professionals, and found new and better ways to value them. The book attracted the attention of a pair of Chicago scholars, an economist named Richard Thaler and a law professor named Cass Sunstein (now a senior official in the Obama White House). “Why do professional baseball executives, many of whom have spent their lives in the game, make so many colossal mistakes?” they asked in their review in The New Republic. “They are paid well, and they are specialists. They have every incentive to evaluate talent correctly. So why do they blunder?” My book clearly lacked a satisfying answer to that question. It pointed out that when baseball experts evaluated baseball players their judgment could be clouded by their prejudices and preconceptions—but why? I’d stumbled upon a mystery, the book reviewers noted, and I’d failed not merely to solve it but also to see that others already had done so. As they put it:
Lewis is actually speaking here of a central finding in cognitive psychology. In making judgments, people tend to use the “availability heuristic.” As Daniel Kahneman and Amos Tversky have shown, people often assess the probability of an event by asking whether relevant examples are cognitively “available” [i.e., can be easily remembered]. Thus [because they more readily recall words ending in “ing” than other words with penultimate “n”s, such as “bond” or “mane”], people are likely to think that more words, on a random page, end with the letters “ing” than have “n” as their next to last letter—even though a moment’s reflection will show that this could not possibly be the case. Now, it is not exactly dumb to use the availability heuristic. Sometimes it is the best guide that we possess. Yet reliable statistical evidence will outperform the availability heuristic every time. In using data rather than professional intuitions, Beane confirmed this point.
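The reviewers' parenthetical reasoning can be checked mechanically: every word ending in "ing" necessarily has "n" as its next-to-last letter, so the "-ing" words form a subset of the penultimate-"n" words. A quick sketch over a small, made-up word list:

```python
# Any word ending in "ing" has "n" in the next-to-last position,
# so the "-ing" words can never outnumber the penultimate-"n" words.
words = ["running", "bond", "mane", "sing", "thing", "wind", "song",
         "spring", "hand", "king"]

ing_words = {w for w in words if w.endswith("ing")}
pen_n_words = {w for w in words if len(w) >= 2 and w[-2] == "n"}

assert ing_words <= pen_n_words           # subset relation always holds
assert len(ing_words) < len(pen_n_words)  # here, strictly fewer
```

People guess the opposite because "-ing" words are easier to call to mind, which is the availability heuristic at work.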
Kahneman and Tversky were psychologists, without a single minor-league plate appearance between them, but they had found that people, including experts, unwittingly use all sorts of irrelevant criteria in decision-making. I’d never heard of them, though I soon realized that Tversky’s son had been a student in a seminar I’d taught in the late 1990s at the University of California, Berkeley, and while I was busy writing my book about baseball, Kahneman had apparently been busy receiving the Nobel Prize in Economics. And he wasn’t even an economist. (Tversky had died in 1996, making him ineligible to share the prize, which is not awarded posthumously.) I also soon understood how embarrassed I should be by what I had not known.
Between 1971 and 1984, Kahneman and Tversky had published a series of quirky papers exploring the ways human judgment may be distorted when we are making decisions in conditions of uncertainty. When we are trying to guess which 18-year-old baseball prospect would become a big-league all-star, for example. To a reader who is neither psychologist nor economist (i.e., me), these papers are not easy going, though I am told that compared with other academic papers in their field they are high literature. Still, they are not so much written as constructed, block by block. The moment the psychologists uncover some new kink in the human mind, they bestow a strange and forbidding name on it (“the availability heuristic”). In their most cited paper, cryptically titled “Prospect Theory,” they convinced a lot of people that human beings are best understood as being risk-averse when making a decision that offers hope of a gain but risk-seeking when making a decision that will lead to a certain loss. In a stroke they provided a framework to understand all sorts of human behavior that economists, athletic coaches, and other “experts” have trouble explaining: why people who play the lottery also buy insurance; why people are less likely to sell their houses and their stock portfolios in falling markets; why, most sensationally, professional golfers become better putters when they’re trying to save par (avoid losing a stroke) than when they’re trying to make a birdie (and gain a stroke).
When you wander into the work of Kahneman and Tversky far enough, you come to find their fingerprints in places you never imagined even existed. It’s alive in the work of the psychologist Philip Tetlock, who famously studied the predictions of putative political experts and found they were less accurate than predictions made by simple algorithms. It’s present in the writing of Atul Gawande (Better, The Checklist Manifesto), who has shown the dangers of doctors who place too much faith in their intuition. It inspired the work of Terry Odean, a finance professor at U.C. Berkeley, who examined 10,000 individual brokerage accounts to see if stocks the brokers bought outperformed stocks they sold and found that the reverse was true. Recently, The New York Times ran an interesting article about a doctor and medical researcher in Toronto named Donald Redelmeier, whose quirky research projects upended all sorts of assumptions you might not know you even had. He’d shown that changing lanes in traffic is pointless, for instance, and that an applicant was less likely to be admitted to a medical school if he was interviewed on a rainy day. More generally he had demonstrated the power of illusions on the human mind. The person who had sent him down this road in life, he told the Times reporter, was his old professor Amos Tversky.
It didn’t take me long to figure out that, in a not so roundabout way, Kahneman and Tversky had made my baseball story possible. In a collaboration that lasted 15 years and involved an extraordinary number of strange and inventive experiments, they had demonstrated how essentially irrational human beings can be. In 1983—to take just one of dozens of examples—they had created a brief description of an imaginary character they named “Linda.” “Linda is thirty-one years old, single, outspoken, and very bright,” they wrote. “She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they went around asking people the same question:
Which alternative is more probable?
(1) Linda is a bank teller.
(2) Linda is a bank teller and is active in the feminist movement.
The vast majority—roughly 85 percent—of the people they asked opted for No. 2, even though No. 2 is logically impossible. (If No. 2 is true, so is No. 1.) The human mind is so wedded to stereotypes and so distracted by vivid descriptions that it will seize upon them, even when they defy logic, rather than upon truly relevant facts. Kahneman and Tversky called this logical error the “conjunction fallacy.”
Their work intersected with economics in the early 1970s when Tversky handed Kahneman a paper on the psychological assumptions of economic theory. As Kahneman recalled:
I can still recite its first sentence: “The agent of economic theory is rational, selfish, and his tastes do not change.”
I was astonished. My economic colleagues worked in the building next door, but I had not appreciated the profound difference between our intellectual worlds. To a psychologist, it is self-evident that people are neither fully rational nor completely selfish, and that their tastes are anything but stable.
The paper that resulted five years later, the abovementioned “Prospect Theory,” not only proved that one of the central premises of economics was seriously flawed—the so-called utility theory, “based on elementary rules (axioms) of rationality”—but also spawned a sub-field of economics known as behavioral economics. This field attracted the interest of a Harvard undergraduate named Paul DePodesta. With a mind prepared to view markets and human decision-making as less than perfectly rational, DePodesta had gone into sports management, been hired by Billy Beane to work for the Oakland A’s, and proceeded to exploit the unreason of baseball experts. A dotted line connected the Israeli psychologists to what would become a revolution in sports management. Outside of baseball there had been, for decades, an intellectual revolt, led by a free thinker named Bill James, devoted to creating new baseball knowledge. The movement generated information of value in the market for baseball players, but the information went ignored by baseball insiders. The market’s willful ignorance had a self-reinforcing quality: the longer the information was ignored, the less credible it became. After all, if this stuff had any value, why didn’t baseball insiders pay it any attention? To see the value in what Bill James and his crowd were up to you had first to believe that a market as open and transparent as the market for baseball players could ignore valuable information—that is, that it could be irrational. Kahneman and Tversky had made that belief reasonable.
Coffee with Kahneman
Kahneman is a professor emeritus at Princeton, but, as it turned out, he lived during the summers with his wife, Anne Treisman, another well-known psychologist, near my house in Berkeley. Four years ago I summoned the nerve to write him an e-mail, and he invited me for a safe date, a cup of coffee. I found his house on the top of our hill. He opened the door wearing hiking shorts and a shirt not tucked into them, we shook hands, and I said something along the lines of what an honor it was to meet him. He just looked at me a little strangely and said, “Ah, you mean the Nobel. This Nobel Prize stuff, don’t take it too seriously.” He then plopped down into a lounge chair in his living room and began to explain to me, albeit indirectly, why he took such an interest in human unreason. His laptop rested on a footstool and a great many papers and books lay scattered around him. He was then 73 years old. It was tempting to describe him as spry, but the truth is that he felt more alert and alive than most 20-year-olds.
He was working on a book, he said. It would be both intellectual memoir and an attempt to teach people how to think. As he was the world’s leading authority on his subject, and a lot of people would pay hard cash to learn how to think, this sounded promising enough to me. He disagreed: he was certain his book would end in miserable failure. He wasn’t even sure that he should be writing a book, and it was probably just a vanity project for a washed-up old man, an unfinished task he would use to convince himself that he still had something to do, right up until the moment he died. Twenty minutes into meeting the world’s most distinguished living psychologist I found myself in the strange position of trying to buck up his spirits. But there was no point: his spirits did not want bucking up. Having spent maybe 15 minutes discussing just how bad his book was going to be, we moved on to a more depressing subject. He was working, equally unhappily, on a paper about human intuition—when people should trust their gut and when they should not—with a fellow scholar of human decision-making named Gary Klein. Klein, as it happened, was the leader of a school of thought that stressed the power of human intuition, and disagreed with the work of Kahneman and Tversky. Kahneman said that he did this as often as he could: seek out people who had attacked or criticized him and persuade them to collaborate with him. He not only tortured himself, in other words, but invited his enemies to help him to do it. “Most people after they win the Nobel Prize just want to go play golf,” said Eldar Shafir, a professor of psychology at Princeton and a disciple of Amos Tversky’s. “Danny’s busy trying to disprove his own theories that led to the prize. It’s beautiful, really.”
Over the next few years I followed the progress of Daniel Kahneman’s attempt to explain to other people what he knew about their minds. In the bargain I learned a bit more about who he was and where he had come from, though he tends to be reticent, even uninterested in his own life story. He was born in 1934 and grew up a Jew in France during the German occupation. His boyhood had been punctuated by dramatic examples of the unpredictability of human behavior and the role of accident in life. His father was captured in a German dragnet that sent many French Jews to die in concentration camps—but then, at the last moment, he was mysteriously released. With his parents and his sister, Danny fled from Paris to the South of France and then to Limoges, where they lived in a chicken coop at the back of a rural pub. One evening he violated the curfew for Jews, and found himself face-to-face in the street with a man in the black uniform of the German SS. The man picked him up and hugged him, then showed him a picture of his own little boy and gave him money. Later in the war, after his family had disguised their Jewish identity, he watched a young Frenchman, a Nazi collaborator and passionate anti-Semite, be well enough fooled by his sister’s disguise to fall in love with her. (“After the Liberation she took enormous pleasure in finding him and letting him know he had fallen in love with a Jew.”) For a time his father held a job, but it was a long bus ride from the chicken coop, and he was away during the week. On weekends Danny and his mother would watch the bus stop from their house, waiting for his father’s bus to arrive. Each time was a cliff-hanger: he knew his father was in constant danger and was never sure that he would come home. “I remember waiting with my mother, and as we waited we darned socks,” he said. 
“And so darning socks for me has always been an incredibly anxious activity.” His relationship to his father, whom he adored, was further queered by a mere typo; in the phony identity papers his father procured for them there was a mistake. Danny’s last name had been printed as “Godet”; his father’s had been printed as “Cadet.” Because of this typo Danny was required to address his father as “uncle.”
Through it all his father suffered from diabetes, which, after the Germans arrived, went untreated. On the day of his death, in 1944, he took Danny, then 10 years old, out for a walk. “He must have known he was dying,” says Kahneman. “I remember him saying it was now time for me to become the man of the family. I was really angry about him dying. He had been good. But he had not been strong.”
After the war his mother moved the family to what was then Palestine and would soon become Israel, where he became first a platoon commander in the Israeli Defense Forces and then a professor of psychology. It apparently never seriously occurred to him to become anything else. He was always bookish, precocious, and curious about what made people tick. His wartime experience may or may not have heightened his curiosity about the inner workings of the human mind; at any rate, he’s reluctant to give the Germans too much credit for his career choice. “People say your childhood has a big influence on who you become,” he says. “I’m not at all sure that’s true.”
When I first met Kahneman he was making himself more miserable about his unfinished book than any writer I’d ever seen. It turned out merely to be a warm-up for the misery to come, the beginning of an extraordinary act of literary masochism. In effect, the psychologist kept trying to trick himself into doing things he didn’t want to do and failing to fall for the ruse. “I had this idea at first that I could do it easily,” he said. “I thought, you know, that I could talk it” to a ghostwriter, but then he seized on another approach: a series of lectures, delivered to Princeton undergraduates who knew nothing about the subject, that he could transcribe and publish more or less as spoken. “I paid someone to transcribe them,” he says. “But when I read them I could see that they were very bad.” Next, he set out to write the book by himself, as he suspected he should have done all along. He quit and re-started so many times he lost count, and each time he quit he seemed able to convince himself that he should never have taken on the project in the first place. Last October he quit for what he swore was the last time. One morning I went up the hill to have coffee with him and found that he was no longer writing his book. “This time I’m really finished with it,” he said.
Then, after I left him, he sat down and reviewed his own work. The mere fact that he had abandoned it probably raised the likelihood that he would now embrace it: after all, finding merit in the thing would now prove him wrong, and he seemed to take pleasure in doing that. Sure enough, when he looked at his manuscript his feelings about it changed again. That’s when he did the thing that I find not just peculiar and unusual but possibly unique in the history of human literary suffering. He called a young psychologist he knew well and asked him to find four experts in the field of judgment and decision-making, and offer them $2,000 each to read his book and tell him if he should quit writing it. “I wanted to know, basically, whether it would destroy my reputation,” he says. He wanted his reviewers to remain anonymous, so they might trash his book without fear of retribution. The endlessly self-questioning author was now paying people to write nasty reviews of his work. The reviews came in, but they were glowing. “By this time it got so ridiculous to quit again,” he says, “I just finished it.” Which of course doesn’t mean that he likes it. “I know it is an old man’s book,” he says. “And I’ve had all my life a concept of what an old man’s book is. And now I know why old men write old man’s books. My line about old men is that they can see the forest, but that’s because they have lost the ability to see the trees.”
Hold That Thought
The book was originally titled Thinking About Thinking. Just arriving in bookstores from Farrar, Straus and Giroux, it’s now called Thinking, Fast and Slow. It’s wonderful, of course. To anyone with the slightest interest in the workings of his own mind it is so rich and fascinating that any summary of it would seem absurd. Kahneman walks the lay reader (i.e., me) through the research of the past few decades that has described, as it has never been described before, what appear to be permanent kinks in human reason. The story he tells has two characters—he names them “System 1” and “System 2”—that stand in for our two different mental operations. System 1 (fast thinking) is the mental state in which you probably drive a car or buy groceries. It relies heavily on intuition and is amazingly capable of misleading and also of being misled. The slow-thinking System 2 is the mental state that understands how System 1 might be misled and steps in to try to prevent it from happening. The most important quality of System 2 is that it is lazy; the most important quality of System 1 is that it can’t be turned off. We pass through this life on the receiving end of a steady signal of partially reliable information that we only occasionally, and under duress, evaluate thoroughly. Through these two characters the author describes the mistakes your mind is prone to make and then explores the reasons for its errors.
Along the way the reader has the dawning sense that he is in the presence of an unusual author, who perhaps does not fully realize how unusual he is. Any number of passages would do the trick but here is one:
Amos and I once rigged a wheel of fortune. It was marked from 0 to 100, but we had it built so that it would stop only at 10 or 65. One of us would stand in front of a small group, spin the wheel, and ask them to write down the number on which the wheel stopped, which of course was either 10 or 65. We then asked them two questions:
Is the percentage of African nations among UN members larger or smaller than the number you just wrote?
What is your best guess of the percentage of African nations in the UN?
The spin of a wheel of fortune had nothing to do with the question and should have had no influence over the answer, but it did. “The average estimates of those who saw 10 and 65 were 25% and 45%, respectively.”
This is known as “the anchoring effect.” As Kahneman explains in his book, “We were not the first to observe the effects of anchors, but our experiment was the first demonstration of its absurdity.” It’s unsettling to know that your judgment can be so heavily influenced by some random number, and disturbing to realize it is probably happening all the time. The anchoring effect turns out to explain all sorts of strange phenomena in the world around us—why, for instance, when German judges, before mock-sentencing a shoplifter, were asked to roll a pair of dice rigged to come up either three or nine, those who rolled nine said on average eight months, while those who rolled three said five months.
Kahneman knows how interesting all of this is. What he doesn’t seem to notice is the natural question that springs into the mind of the lay reader: Who rigs up a wheel of fortune to show how people can be deceived by a number? How does that occur to anyone to do? And why? There’s a quality both impish and joyous to Kahneman’s work, and it is most on display in his collaboration with Amos Tversky. They had a rule of thumb, he explains: they would study no specific example of human idiocy or irrationality unless they first detected it in themselves. “People thought we were studying stupidity,” says Kahneman. “But we were not. We were studying ourselves.” Kahneman has a phrase to describe what they did: “Ironic research.”
Kahneman has always insisted that the ideas for which he’s best known, along with his Nobel Prize in Economics, are not his but theirs. Once upon a time he collided almost by accident with another curious person, Amos Tversky, and the collision wound up defining them both.
At some point in my conversations with Kahneman I wanted to know more about his other half. My former student Oren Tversky put me in touch with his mother, Barbara, who put me onto the papers left behind by her husband. As I paged through these a pattern presented itself to my mind. Perhaps I was just seeing what my mind expected to see, but it seemed to me that anyone who had become deeply aware that our species often did not make a lot of sense eventually found their way to Kahneman and Tversky.
Then one afternoon I came upon a letter dated June 4, 1985, from Bill James. The baseball analyst whose work was then being blithely ignored by professional baseball people had wanted help answering a question that vexed him: Why were baseball professionals forever attempting to explain essentially random and therefore inexplicable events? “Baseball men, living from day to day in the clutch of carefully metered chance occurrences, have developed an entire bestiary of imagined causes to tie together and thus make sense of patterns that are in truth entirely accidental,” James wrote. “They have an entire vocabulary of completely imaginary concepts used to tie together chance groupings. It includes ‘momentum,’ ‘confidence,’ ‘seeing the ball well,’ ‘slumps,’ ‘guts,’ ‘clutch ability,’ being ‘hot’ and ‘cold,’ ‘not being aggressive’ and my all time favorite the ‘intangibles.’ By such concepts, the baseball man gains a feeling of control over a universe that swings him up and down and tosses him from side to side like a yoyo in a high wind.” It wasn’t just baseball he was writing about, James continued. “I think that the randomness of fate applies to all of us as much as baseball men, though it might be exacerbated by the orderliness of their successes and failures.”
Bill James was clearly troubled that the human mind settled so easily on false explanations when the truth was readily at hand. He wondered if students of the human mind might help him to explain why.
What Daniel Kahneman now swears is the last book he will ever write does exactly this. But in a funny way it is not his book. It’s theirs.
If you want to test yourself, Vanity Fair provides a quiz, as follows:
The Quiz Daniel Kahneman Wants You to Fail
In the December 2011 issue of Vanity Fair, Michael Lewis profiles Nobel Prize–winning psychologist Daniel Kahneman, who pioneered research into “heuristics,” or the shortcuts humans use when making decisions. Below, take our quiz to see how your own mind works.
By Jaime Lalinde
Plainly put, a “heuristic” is a tool we use to simplify the decision-making process. For example, if you’re driving in the United Kingdom for the first time and don’t know the traffic laws, heuristics might help you correctly assume that a green light means go and a red light means stop. By applying what you already know about driving in America, you won’t have to waste hours reading up on England’s traffic laws. However, that same heuristic could prove harmful if you start driving in the right-hand lane, against traffic. Research psychologist Daniel Kahneman—Nobel Prize winner, and the subject of Michael Lewis’s article in this month’s issue, “The King of Human Error”—spent a great part of his life’s work discovering and cataloging the heuristics people use. Specifically, he concentrated on the situations where they lead us astray. By nature, heuristics are both useful and inaccurate; our minds have developed them to deal with a wide-ranging set of problems. In Kahneman’s forthcoming book, Thinking, Fast and Slow, he separates the thinking process into two types—System 1, in which efficiency comes at the cost of accuracy, and System 2, which requires a lot of focus and can sometimes prevent System 1 from making mistakes. When you’re asked what “2 + 2” equals, System 1 takes over, but when you’re asked what “17 x 24” equals, System 2 takes the reins. The questions in this quiz are designed to trigger System 1, which relies heavily on intuition to provide us with answers that we perceive to be correct. Whenever you find yourself “going with your gut,” that’s System 1—often standing in the way of rational thought. It’s no wonder that the word “heuristic” has its root in the word “eureka.” Go ahead and take this quiz, based (loosely) on Kahneman’s four decades of research; follow your gut and see just how wrong you are.
1. A town has two hospitals: one large and one small. Assuming there is an equal number of boys and girls born every year in the United States, which hospital is more likely to have close to 50 percent girls and 50 percent boys born on any given day?
A. The larger
B. The smaller
C. About the same (say, within 5 percent of each other)
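The quiz’s explanatory note for this question did not survive the page capture, but the statistics are straightforward: day-to-day proportions fluctuate more in small samples, so the larger hospital is the one more likely to sit near 50/50 on any given day. A short simulation, using made-up daily birth counts of 45 and 15 purely for illustration:

```python
import random

random.seed(1)

def days_near_fifty_fifty(births_per_day, n_days=10_000, tol=0.05):
    """Fraction of simulated days on which the girl ratio lands
    within tol (here 5 percentage points) of 50 percent."""
    hits = 0
    for _ in range(n_days):
        girls = sum(random.random() < 0.5 for _ in range(births_per_day))
        if abs(girls / births_per_day - 0.5) <= tol:
            hits += 1
    return hits / n_days

large = days_near_fifty_fifty(45)   # hypothetical large hospital
small = days_near_fifty_fifty(15)   # hypothetical small hospital

# Larger samples cluster more tightly around the true 50% rate.
assert large > small
```

System 1 wants to answer “about the same,” because the question mentions an equal overall birth rate; the law of large numbers says otherwise.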
2. A team of psychologists performed personality tests on 100 professionals, of which 30 were engineers and 70 were lawyers. Brief descriptions were written for each subject. The following is a sample of one of the resulting descriptions:
Jack is a 45-year-old man. He is married and has four children. He is generally conservative, careful, and ambitious. He shows no interest in political and social issues and spends most of his free time on his many hobbies, which include home carpentry, sailing, and mathematics.
What is the probability that Jack is one of the 30 engineers?
A. 10–40 percent
B. 40–60 percent
C. 60–80 percent
D. 80–100 percent
If you answered anything but A (the correct response being precisely 30 percent), you have fallen victim to the representativeness heuristic again, despite having just read about it. When Kahneman and Tversky performed this experiment, they found that a large percentage of participants overestimated the likelihood that Jack was an engineer, even though mathematically, there was only a 30-in-100 chance of that being true. This proclivity for attaching ourselves to rich details, especially ones that we believe are typical of a certain kind of person (i.e., all engineers must spend every weekend doing math puzzles), is yet another shortcoming of the hyper-efficient System 1.
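The arithmetic behind the note is just Bayes’ rule: if the description is not actually diagnostic, the posterior collapses to the 30 percent base rate. A minimal sketch (the likelihood values passed in are illustrative assumptions, not data from the experiment):

```python
# With 30 engineers and 70 lawyers, Bayes' rule weighs the
# description's likelihood under each group against the base rates.
def posterior_engineer(p_desc_given_eng, p_desc_given_law,
                       n_eng=30, n_law=70):
    prior_eng = n_eng / (n_eng + n_law)
    prior_law = n_law / (n_eng + n_law)
    num = p_desc_given_eng * prior_eng
    den = num + p_desc_given_law * prior_law
    return num / den

# A non-diagnostic description (equally likely for both groups)
# returns exactly the 30 percent base rate:
assert abs(posterior_engineer(0.5, 0.5) - 0.30) < 1e-12

# Even a mildly diagnostic description shifts the odds only modestly:
print(posterior_engineer(0.6, 0.4))
```

The representativeness heuristic ignores the prior entirely and judges from the vividness of the description alone.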
3a. How many dates did you have last month?
3b. On a scale of 1 to 5, how happy are you these days (5 being the happiest)?
Regardless of how you answered, it is likely that your answer to question (a) is positively correlated to your answer to question (b)—that is, you rated your happiness higher if you had more dates and lower if you had fewer dates. However, when the order of these questions was reversed, as was done by two German researchers, people’s happiness became untethered from their dating life. This experiment demonstrates the brain’s deferral to System 1, the faster and easier of the two processes. When faced with an objective question (in this case, How many dates did you have last month?), followed by a subjective one (How happy are you these days?), people often simply carry over their answer for the first to the second. This heuristic is called substitution.
4. Imagine that you decided to see a play and you paid $10 for the admission price of one ticket. As you enter the theater, you discover that you have lost the ticket. The theater keeps no record of ticket purchasers, so the ticket cannot be recovered. Would you pay $10 for another ticket to the play?
5a. Choose between getting $900 for sure or a 90 percent chance of getting $1,000.
A. Getting $900
B. 90 percent chance of getting $1,000
5b. Choose between losing $900 for sure or a 90 percent chance of losing $1,000.
A. Losing $900
B. 90 percent chance of losing $1,000
The results of this simple problem set, for which most participants answer A and then B, were used to develop the thesis that would make Kahneman and Tversky famous: prospect theory. In a 1979 paper, they documented a peculiar behavioral tendency: when people faced a gain, they became risk averse; when they faced a loss, they became risk seeking. As a result of their discovery, Kahneman and Tversky debunked Bernoulli’s utility theory, a cornerstone of economic thought since the 18th century. (Bernoulli first proposed that a person’s willingness to gamble a certain amount of money was a product of how that amount related to his overall wealth—that is, $1 million means more to a millionaire than it does to a billionaire.)
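The A-then-B pattern falls out of a value function that is concave for gains and convex for losses. A sketch using the parameter values Tversky and Kahneman estimated in later work (alpha ≈ 0.88 for diminishing sensitivity, lambda ≈ 2.25 for loss aversion; those numbers are from their 1992 follow-up, not this quiz, and decision weights are simplified to raw probabilities here):

```python
# Prospect-theory value function, simplified sketch.
ALPHA, LAMBDA = 0.88, 2.25   # assumed parameter values (1992 estimates)

def value(x):
    """Concave over gains, convex and steeper over losses."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def prospect(outcomes):
    """Probability-weighted value of a gamble [(prob, payoff), ...].
    The full theory also warps probabilities; omitted for brevity."""
    return sum(p * value(x) for p, x in outcomes)

# 5a: sure $900 beats a 90% chance of $1,000 -> risk aversion (answer A)
assert prospect([(1.0, 900)]) > prospect([(0.9, 1000)])

# 5b: a 90% chance of losing $1,000 beats a sure loss of $900
#     -> risk seeking in losses (answer B)
assert prospect([(0.9, -1000)]) > prospect([(1.0, -900)])
```

Note that both gambles have the same expected value as their sure counterparts ($900 either way); it is the curvature of the value function, not the arithmetic, that flips the preference between the gain and loss frames.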
Along with playing a large role in Kahneman’s being awarded the Nobel Prize in 2002, the theory also spawned a new academic pursuit, the field of behavioral economics. Prospect theory, Michael Lewis writes, explains “why people are less likely to sell their houses and their stock portfolios in falling markets; why, most sensationally, professional golfers become better putters when they’re trying to save par (avoid losing a stroke) than when they’re trying to make a birdie (and gain a stroke).”
We haven’t seen the book yet, but it looks very promising as a guide to how scientists in crowds can go astray as far as they have in, for example, HIV/AIDS research and its paradigm.