The Bromance that Turned Economics Upside Down
Who would guess that the modern sciences of behavioral economics and the psychology of decision-making owe their origin to a love affair (no, not sexual) between two men born in the 1930s and so different that one could barely imagine them speaking to each other? Yet that is the story chronicled by the extraordinary nonfiction writer Michael Lewis in The Undoing Project: A Friendship That Changed Our Minds, which, despite some quirks, is a compelling and worthwhile read.
Amos Tversky, whom Lewis describes as a “swaggering Sabra,” or native-born Israeli, and Daniel Kahneman, a shy “Holocaust kid,” teamed up in the psychology department of Jerusalem’s Hebrew University in the late 1960s, and their names have been linked ever since. When Kahneman, a psychologist with no formal economics training, won the Nobel Prize in Economics in 2002, it was for work that the two men had accomplished years earlier, always working as a team. (Tversky died young, in 1996; Kahneman is hale and hearty at 82. The Nobel is given only to the living.)
Lewis’ narrative starts with a long digression on basketball. The author of Moneyball and The Blind Side (both of which were made into excellent movies, especially the latter), Lewis devotes the long first chapter of The Undoing Project to the quantitative side of basketball and does not even mention Tversky or Kahneman. Lewis notes, for example, that basketball players who look like champions are often overrated and overpaid. Those who are less attractive or charismatic are more likely to be a good deal for the team owner – they are not only cheaper, but may score more points or be better team players. To figure this out, one must look at the statistics, as Lewis’ hero Billy Beane, the general manager of the Oakland Athletics baseball team, did to great advantage in Moneyball and in real life. (All teams now perform the analysis pioneered by Beane, reducing its effectiveness.)
Some readers may wonder if they’ve opened the right book. But, with the basketball digression, Lewis slyly prepares us for a book-length lesson in the psychology of decision-making and its application to economics. If the tendency to overrate and overpay players who look good – even if they do not play particularly well – is a fundamental property of human nature, perhaps we should be aware of that. Maybe we can profit from that knowledge. And maybe such observations can be parlayed into a theory that can be applied more generally to human behavior and that can be used to improve the way that decisions are made.
“Two radically different personalities”
Lewis introduces his main characters separately, with Kahneman first; consequently, so will I. Kahneman spent much of his French childhood hiding from the Nazis. Later, the young refugee settled in Israel. Lewis describes him as “gentle, detached, disorganized, conflict-avoiding, physically inept… [not] anyone’s idea of a soldier.” But, in Israel then and now, everyone’s a soldier. (Kahneman even single-handedly invaded Jordan by mistake, almost provoking an international incident.) And in a young country full of young people it’s possible to advance startlingly quickly. At the age of 20, Kahneman, still an undergraduate, was assigned the responsibility of psychologically scoring every Israeli soldier using a test of his own design. This “Kahneman number” is still in use more than 60 years later.
Kahneman displayed a quiet diligence, which Tversky would correctly describe as brilliance concealed by shyness. Tversky, in contrast, was cartoonish in his self-confidence and bravado. He once jumped off a high-diving board without knowing how to swim, having arranged for a swimmer to fish him out. As a soldier, he repeatedly placed himself in harm’s way when others were ducking it. When Tversky, an experienced paratrooper, flew on a commercial airliner for the first time, he was fascinated by the process of landing – he had taken off countless times but never landed, having always jumped out of the plane before that point.
Tversky exuded the exact opposite of quiet diligence. He was noisy, brash and so smart that one psychologist, Adam Alter, designed an “intelligence test” around him: “The faster you realized Tversky was smarter than you, the smarter you were.” His wit could be cutting: he told the physicist Murray Gell-Mann, no slouch in the sky-high IQ department, “There's no one in the world as smart as you think you are.” His students admiringly, or sarcastically (it’s hard to tell which), called him “Famous Amos.”
The odd couple
Tversky and Kahneman were the perfect odd couple: each filled the holes in the other’s personality.
Amos was the life of the party. Danny didn’t go to…parties… Amos was tone-deaf but would nevertheless sing Hebrew folk songs with great gusto. Danny was the sort of person who might be in possession of a lovely singing voice that he would never discover…. ‘Danny was always eager to please,’ [a colleague said]. ‘Amos couldn’t understand why anyone would be eager to please.’
In combination they made one extraordinary human being. Separately, each was a little peculiar. The same observation has been made about other successful working pairs; the author Joshua Wolf Shenk studied them in that context in Powers of Two, comparing partnerships such as Lennon and McCartney, Wozniak and Jobs, and many others.
What can you say about two men who sat together at a single typewriter, co-authoring journal articles?
Sadly, the odd couple would eventually break up. Jealousy over professional status was the primary cause. Although Kahneman had the greatest imaginable respect for Tversky’s intellect, he grew to resent Tversky’s dominance in the intellectual partnership and in its public perception. These feelings arose even though, as Michael Lewis tells it, Tversky bent over backwards to share credit. By the mid-1980s, when Tversky was awarded a MacArthur Foundation genius grant, the partnership was over.
The love letter
The depth of emotion felt by the collaborators is evident in a letter, highlighted by Lewis, that Kahneman wrote in 1975 when he received a package that he thought Tversky had coldly sent without a personal note. Kahneman was devastated. Then he realized he had overlooked a brief note that Tversky had in fact included. “I saw the words ‘Yours, as ever’ and I had goose bumps from emotion,” wrote Kahneman.
This is the dramatic highlight of the book. While Lewis conveys a great deal of scientific information, his narrative never strays far from the ever-changing relationship between the two men. As in The Blind Side, feeling and science are close allies in Lewis’ unique way of educating and captivating the reader.
A new method of psychological inquiry
In the late 1960s, when Tversky and Kahneman met, psychology was a highly disorganized field of study. Some researchers were studying rats in mazes; some, the relationship between fathers, daughters, and Greek mythology; some, “electrochemical processes in the retina of the walleyed pike.” A few hardy souls, called behavioral psychologists, asked what factors caused people to behave as they did.
Tversky and Kahneman followed a thread within the behavioral strand, seeking to understand how people made decisions, and reinvented the field of decision sciences. The pair adopted unorthodox methods. One was to conduct “experiments” on groups of students by asking them to make hypothetical choices: Do you prefer outcome A or outcome B?
At the time, many psychologists thought this practice worthless, and I can see why: Students have no skin in the game, and can answer seriously or frivolously depending on their mood. Even in experiments where dollars change hands, the amounts are intentionally small enough to be meaningless (so that students willingly participate). Yet experimental economics, which is grounded in the early work of Tversky and Kahneman, has become so respectable that Kahneman’s 2002 Nobel was shared with Vernon Smith, whose name is virtually synonymous with the method.
Tversky and Kahneman’s foundational work is called “prospect theory,” referring originally to the prospect or hope that one has when taking a gamble. They positioned prospect theory in opposition to the classical expected-utility theory that Daniel Bernoulli set forth in 1738, which, under the influence of John von Neumann and Oskar Morgenstern’s Theory of Games and Economic Behavior in 1944, had become the established paradigm for understanding decision-making in economics. By putting their new hypothesis in the mathematical language of utility theory, they achieved a level of respect among economists that had eluded other psychologists seeking to describe economic behavior. Their landmark paper, “Prospect Theory: An Analysis of Decision Under Risk,” is the second-most-cited economics paper of the last quarter of the twentieth century.
Prospect theory says that:
- For decision-makers, losses loom larger than gains;
- Decision-makers focus more on changes in their well-being, or utility, than on their absolute level of utility; and
- Their subjective estimates of the probabilities of events are biased by anchoring.
The first two observations seem obvious, but they are still hypotheses (and violations of rationality) that needed to be tested and found either valid or wanting. Tversky and Kahneman found them to be valid. Anchoring is more complex, and says that people “anchor,” or focus on the first piece of information that they encounter when solving a problem, whether or not the first piece of information is helpful.
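The first two tenets can be made concrete with the value function that Tversky and Kahneman later estimated in their 1992 cumulative prospect theory paper. The sketch below is illustrative and not drawn from Lewis’ book; the parameters (curvature of 0.88, loss-aversion coefficient of 2.25) are the median estimates the two authors published.

```python
# Prospect-theory value function (Tversky & Kahneman, 1992 median estimates).
# Value is defined over gains and losses relative to a reference point,
# and losses are weighted more heavily than equal-sized gains.

ALPHA = 0.88   # curvature for gains (diminishing sensitivity)
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient

def value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0) vs. the status quo."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

# A $100 loss hurts roughly 2.25 times as much as a $100 gain pleases:
gain, loss = value(100), value(-100)
print(gain, loss, abs(loss) / gain)
```

Note that the function is defined over *changes* in wealth, not wealth levels – exactly the second tenet in the list above.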
People sometimes even anchor on data points that obviously have no relevance at all. In Tversky and Kahneman’s classic experiment, subjects watched a wheel of fortune – secretly rigged to stop at either 10 or 65 – before being asked what percentage of United Nations member countries are African. Those who saw the higher number gave much higher estimates. If people are that easy to fool, how can economics assume that everyone acts rationally?
The predictable errors that people make
Tversky and Kahneman’s research then led them to document the errors that people make systematically or predictably, and that therefore don’t cancel each other out; such errors matter in an economic sense. They contrasted these with errors that people make randomly and that do cancel each other out. Systematic errors include representativeness bias, endowment bias and framing effects. We’ll go over each of these, but there are many more, catalogued in a delightful diagram on page 4 of James Montier’s article, “Darwin’s Mind: The Evolutionary Foundations of Heuristics and Biases.”
Representativeness bias is the belief that a small sample is representative of a large one. In other words, people are not natural statisticians; they do not understand Bayesian inference and jump to conclusions based on very limited evidence. Here’s an example:
Steve is very shy and withdrawn, invariably helpful, but…little interested in people, or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.
Is Steve more likely to be a salesman or a librarian? Most people say “a librarian,” because the description seems to capture the very soul of librarianship. But they overlook the fact that the overall population contains a great many salesmen and few librarians. The base rate, the proportion of librarians in the population (about 0.1%), is overlooked – and is tremendously important to answering the question.
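The correct reasoning is a Bayes’ rule calculation. In the sketch below, the base rates and likelihoods other than the 0.1% librarian figure are assumptions invented for illustration; the point is the structure of the argument, not the particular numbers.

```python
# Bayesian base-rate illustration (likelihoods are assumed, for illustration).
# Even if the "meek and tidy" description is 10x more likely to fit a
# librarian than a salesman, the posterior can still favor "salesman,"
# because salesmen vastly outnumber librarians.

p_librarian = 0.001            # base rate: ~0.1% of the population
p_salesman = 0.05              # assumed: ~5% of the population works in sales
p_desc_given_librarian = 0.50  # assumed likelihood of fitting the stereotype
p_desc_given_salesman = 0.05   # assumed: 10x less likely than for a librarian

# Unnormalized posteriors (the numerators of Bayes' rule):
post_librarian = p_desc_given_librarian * p_librarian   # 0.0005
post_salesman = p_desc_given_salesman * p_salesman      # 0.0025

# "Salesman" is about five times more likely, despite the description:
print(post_salesman / post_librarian)
```

The base rate enters as a multiplier on the likelihood, which is why ignoring it is so costly.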
Endowment bias is the idea that you treat a good that you already have as more valuable than an equivalent good that you do not have. Why else would owners of vacation homes be so slow to sell properties that, at the current price and given their current circumstances, they would never buy?
Endowment bias shows up in almost every aspect of economics and finance. The sunk-cost fallacy is a special case of it: Investors make decisions about assets based on what they paid for the asset, not what it is worth. “Cut your losses and let your profits ride,” a foolish trading rule that compares an asset’s price to the irrelevant datum of what you paid for it, is just endowment bias at work.
Framing is the idea that people’s answers to questions are affected by the way the question is phrased or “framed”:
When you told people that they had a 90 percent chance of surviving [lung cancer] surgery, 82 percent of patients opted for surgery. But when you told them that they had a 10 percent chance of dying from the surgery – which was of course just a different way of putting the same odds – only 54 percent chose the surgery.
And patients weren’t the only ones who did this – Lewis says that “doctors did it too!” We are all equally human, and if errors such as these are wired into our brains, it takes very specific training (not included in the 1950s medical-school curriculum, although maybe it is now) to counteract our wiring. And, if we can predict people’s cognitive errors, we may be able to help them act more effectively in their own interest, a topic to which I will return.
“If only”: The undoing project
When something terrible happens, people are inclined to say or think that, “if only” someone had done something differently, the disaster would not have occurred. For example, a man commuting to work takes an unusual route and is killed by a drunken driver. We almost automatically say to ourselves, “If only he had gone the usual way.”
It is revealing that we do not say: “If only the drunken driver had taken a different route,” or “If only the commuter had a job closer to home.” We almost never think of the time dimension: “If only the commuter (or the drunken driver) had arrived at the intersection thirty seconds earlier (or later).”
Psychologists call these mental gymnastics “undoing,” and the undoing project is the effort undertaken by Tversky and Kahneman to ascertain the rules by which the mind undoes bad outcomes. Among the rules are:
- “The more consequences an event has, the larger the change that is involved in eliminating [it].” Thus, consequential events are less likely to be undone. The mind is more likely to undo a car crash than a major earthquake.
- “The further the event recedes into the past, the less likely it is to be undone.” Kennedy’s assassination is more easily undone than Caesar’s.
- With some exceptions, we undo the actions of an individual and keep the situation unchanged: “We don’t invent a gust of wind to deflect Oswald’s bullet.”
- The more surprising the event (in this case, the unusual route the commuter took), the more easily it is undone.
Fascinating! Yet these observations did not overturn any economic theory, or have a practical use in engineering that would save or even improve lives. The undoing project was an intellectually satisfying dead end.
Lewis named his book after this uncertain venture. The project was also the undoing of Tversky’s and Kahneman’s collaboration, which did not survive the former’s pomposity and the latter’s insecurity. But Lewis may have been hinting that the entire collaboration of Tversky and Kahneman was an undoing project, stripping away illusions and laying bare the essentials of human nature: “headshrinking” is what psychologists are often described as doing. Lewis doesn’t say.
Richard Thaler and the link to investing
The work of Tversky and Kahneman might have remained a bright spot in the history of psychology – without any spillover to other disciplines – if, in the late 1970s, a young economist named Richard Thaler had not spotted the potential importance of the new psychological discoveries to his own field.
Thaler was, by his own account, an unexceptional Ph.D. student in the University of Rochester’s economics department, and he was unable to land an academic job when he graduated. Instead, he was hired by a consulting firm and was soon let go. At that point he did secure an academic job, albeit a temporary one. But, rather than limp through a career as a mediocre conventional economist, Thaler wisely decided to carve out new territory that would become his own. He would apply the decision sciences of Tversky and Kahneman to both economics and investment management, calling into question the very foundation of much economic theory: the assumption that people are rational.
“As if”: Why behavioral economics may not be transformative
We all know that people aren’t always rational. What we don’t know is whether their irrationality functions so as to overturn the economist’s assumption that people are utility maximizers who act in their own interest (and who know what that interest is). For economics to “work,” that is, to explain and predict observed phenomena with reasonable accuracy, the economy needs only to function as if people were perfectly rational; they do not have to conform to the assumption exactly.
“As if” has been a challenge to behavioral economics since the field was a baby. In 1953, Milton Friedman wrote:
Consider the problem of predicting the shots made by an expert billiard player…. [E]xcellent predictions would be yielded by the hypothesis that the billiard player made his shots as if he knew the complicated mathematical formulas [that govern the] travel...of the balls… Our confidence in this hypothesis is not based on the belief that billiard players…can or do go through the process described; it derives rather from the belief that, unless… they were capable of reaching essentially the same result [without using the formulas], they would not in fact be expert billiard players.
It is only a short step from these examples to the economic hypothesis that, under a wide range of circumstances, individual firms behave as if they were seeking rationally to maximize their expected returns…and had full knowledge of the data needed to succeed in this attempt…
…when we know full well that most businesses do not attempt these calculations at all. Thus, economics can tolerate quite a bit of deviation from its fundamental assumptions of rationality and perfect and costless information. So can investment management, which seems more thoroughly governed by William Sharpe’s arithmetic of active management doctrine (which says that active managers cannot in aggregate outperform, before fees, their properly constructed benchmarks) than it is by any advantage given to those who understand psychology and behavior.
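Sharpe’s arithmetic is a pure accounting identity, which a few lines of arithmetic make plain. The return and fee figures below are assumptions chosen for illustration, not data from Sharpe’s paper.

```python
# Sharpe's "arithmetic of active management" as an accounting identity.
# All numbers are illustrative assumptions.

market_return = 0.08   # the market's return this year
passive_share = 0.40   # fraction of market cap held by indexers
active_share = 1 - passive_share

# Before costs, passive investors earn the market return by construction.
passive_gross = market_return

# The market return is the value-weighted average of everyone's return,
# so the aggregate of active investors must also earn the market before costs:
active_gross = (market_return - passive_share * passive_gross) / active_share
print(active_gross)  # identical to the market return, before costs

# After fees, the average active dollar must lag the average passive dollar
# by exactly the fee gap:
active_fee, passive_fee = 0.010, 0.001   # assumed expense ratios
print(passive_gross - passive_fee)
print(active_gross - active_fee)
```

No assumption about skill, information, or rationality is needed – only the fact that the market return is the weighted average of its participants’ returns.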
Thus, while most economists have been influenced by behavioral science, this infiltration did not spark the revolution in economic thinking that some behaviorists anticipated and hoped for. The work of Tversky, Kahneman, and Thaler had the potential to create an “everything you know is wrong” situation, like that of Ptolemaic astronomers after Copernicus and Galileo.
But it turns out that much economic reasoning is robust to the observation that human beings are, well, human. While some investors may be able to exploit market inefficiencies created by cognitive errors, neoclassical economics is still quite effective at explaining the ways that economic agents behave in the real world. And investors, even those armed with behavioral insights, still find it hard to beat the market.
Behavioral economics and investment management
Applying behavioral economics to the practice of investing, Thaler found some spectacular examples of irrationality – enough to give pause to advocates of the generally well-respected efficient market hypothesis. (See my review of Thaler’s autobiography, Misbehaving: The Making of Behavioral Economics, here.) One is the fact that the stocks of Royal Dutch and Shell trade at different prices, after adjusting for currency and the number of shares outstanding, even though they are two names for the same company! Another is that, in 2000, Palm, a partially owned subsidiary of 3Com, traded so expensively that one could theoretically buy the rest of 3Com at a negative price (by buying 3Com and shorting Palm).
Active managers can benefit from situations such as these, but the implications of behavioral economics are not limited to stock picking. Market timing is another application of it. Representativeness bias suggests that investors will regard a sequence of positive returns as predictive of more positive returns. We’ve seen this recently, as the U.S. equity market has surged to new highs, prompting reluctant savers to buy stocks despite reasonable concerns about their valuation levels. They’re likely to do this even if the market’s win-loss record is statistically indistinguishable from a random sequence. People are not natural statisticians.
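That last claim – that a purely random sequence of up and down days produces streaks that look like momentum – is easy to check by simulation. The sketch below is mine, not from the book; it tosses a fair coin for 250 "trading days" and measures the longest winning streak.

```python
# Streaks in a random sequence: a fair coin regularly produces runs that
# a pattern-seeking mind reads as "momentum." Seeded for reproducibility;
# no market data is involved.
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(250)]  # one "year" of up/down days

longest, current = 0, 0
for up in flips:
    current = current + 1 if up else 0
    longest = max(longest, current)

# The expected longest run in 250 fair flips is on the order of
# log2(250), i.e., roughly 7-8 consecutive "up days" -- long enough
# to feel like a trend, by pure chance.
print(longest)
```

Representativeness bias is precisely the failure to appreciate how streaky randomness is.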
Another example of a cognitive error is that almost no one buys annuities even if they are fairly priced, because people hate the idea that they will lose most of their capital if they die young. This aversion appears to outweigh the important need to generate a lifetime income. Ignoring the bequest motive – say, because one has no heirs – a rational person would realize that, if they die young, they won’t care about having lost money because they’ll be dead. And, if they live to be very old, they’ll be glad they bought the annuity because they’ll need the money. An annuity provider, cognizant of these behavioral weaknesses, may be able to overcome them and help investors secure a lifetime income with part of their assets.
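The rational case for the annuity rests on how its fair price is computed: each payment is discounted and weighted by the probability of being alive to collect it, so an early death is already “priced in.” The sketch below is a deliberately simplified actuarial calculation; the flat mortality hazard, discount rate, and payout are assumptions for illustration, not real annuity pricing.

```python
# Actuarially fair price of a simple life annuity: the present value of
# each payment, weighted by the probability the buyer survives to receive it.
# The flat 4% annual mortality hazard is a crude simplification.

rate = 0.03                 # assumed discount rate
payment = 10_000            # annual payout
max_years = 40              # horizon: e.g., age 65 to 105
annual_death_prob = 0.04    # assumed flat hazard, for illustration only

price = 0.0
survival = 1.0
for t in range(1, max_years + 1):
    survival *= (1 - annual_death_prob)          # probability alive at year t
    price += survival * payment / (1 + rate) ** t

# The fair premium is well below the undiscounted sum of 40 payments,
# because the chance of dying early is already reflected in the price.
print(round(price))
```

Seen this way, “losing the capital by dying young” is an ex-post outcome, not an ex-ante cost; the mortality discount is what makes the lifetime income affordable.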
Advice to investors
What can we learn from Lewis’ heroes, Amos Tversky and Daniel Kahneman? We already know that we’re irrational, but we can use their work to adjust our behavior for the fact that we’re irrational so that we make better decisions in investment management and in every other field.
We share much of our DNA with lizards – and with bananas. Be smarter than they are. Aspects of our DNA that are adaptive for lizards may be maladaptive for people making complex decisions using incomplete information.
So, be a Bayesian. When evaluating any situation, realize that you are probably looking at a small sample that is unrepresentative of the larger population. Ask, “What is the base rate?” (That is, what is the percentage of the population having the characteristic you are concerned with?) If you modify your first impressions using the information in the base rate, you will be way ahead of most people, who don’t know that there is a base rate or that it matters.
If you’re an active manager of investments, emphasize value. Investors overreact to negative information, creating bargains; value investing takes advantage of this behavioral trait. Investors also overreact to positive information, forming bubbles that later burst, but that phenomenon is harder to exploit.
If you’re on the fence between active investing and index funds, favor index funds. A rational investor knows that the average active manager will be beaten by his or her benchmark by the amount of fees and other costs. While it’s tempting to try to pick winning active managers, that is very hard to do.
Michael Lewis’ ability to make difficult concepts easy makes his books consistently worth reading. He is more than a storyteller – he’s a great teacher. A journey through Lewis’ oeuvre is an education in investment banking, short selling, high-frequency trading, quantitative evaluation of baseball players and the conversion of a rough-hewn football prospect into a gentlemanly college freshman.
Now, Lewis uses the story of Tversky and Kahneman to educate the reader on psychology and its impact on the economy and markets. He weaves together human feeling and scientific knowledge so tightly that they are inextricable, making his stories immensely appealing. While this book is a little more disjointed than his others and would have benefited from a smoother flow between chapters, that is a minor complaint. Like the rest of Lewis’ books, The Undoing Project exemplifies how complex ideas should be taught. Teachers, take note.
Laurence Siegel is the Gary P. Brinson director of research for the CFA Research Foundation. Prior to that, he was director of research in the investment division of the Ford Foundation. He is a member of the editorial boards of The Journal of Portfolio Management and The Journal of Investing and serves on the board of directors and the program committee of the Q Group. He may be reached at [email protected].
 Quoted in Gladwell, Malcolm, David and Goliath: Underdogs, Misfits, and the Art of Battling Giants, New York: Little, Brown, 2013.
 Shenk, Joshua Wolf. 2014. Powers of Two: Finding the Essence of Innovation in Creative Pairs. London: John Murray.
Laibson, David, and Richard Zeckhauser. 1998. “Amos Tversky and the Ascent of Behavioral Economics.” Journal of Risk and Uncertainty, Volume 16, Issue 1 (April).
 The number of librarians, library technicians, and library assistants in the U.S. in 2015 was 295,000, which is just under 0.1% of the total population or 0.2% of the working population.
 Friedman, Milton. “The Methodology of Positive Economics,” in Milton Friedman, ed., Essays in Positive Economics, University of Chicago Press, http://socjologia.amu.edu.pl/isoc/userfiles/40/friedman-1953.pdf
 The Royal Dutch/Shell Group is traded on Euronext Amsterdam as Royal Dutch and on the New York Stock Exchange as Shell; “based on the 1907 merger agreement, all cash flows are split so that Royal Dutch shares receive 60 percent and Shell shares receive 40 percent” (Lamont and Thaler 2003). Lamont, Owen A., and Richard H. Thaler. 2003. “Anomalies: The Law of One Price in Financial Markets.” Journal of Economic Perspectives, Volume 17, Number 4 (Fall).
In “A Solution to the Palm-3Com Spin-Off Puzzles,” Martin Cherkes, Charles Jones, and Chester Spatt pointed out that, at the time, it was very expensive to sell Palm short, making the purchase of “rump” 3Com at a negative price impractical. Thus, this episode is not firm evidence of an irrational market. Claims of irrationality are often subjected to this kind of scrutiny by efficient-market advocates. https://www.princeton.edu/bcf/newsevents/events/seminar/Charles-Jones-paper.pdf