Death, Taxes, and Let’s Add Overconfidence to the List
On May 10, 2012, Jamie Dimon, the head of JPMorgan, the biggest bank in the United States by assets, hastily assembled a conference call to announce that traders in his company's London office, led by Bruno Iksil, had made a series of badly timed bets on an upswing in the world economy. The unit had sold billions of dollars of esoteric financial instruments known as credit derivatives, only to have the world economy stall as uncertainties in Europe grew. The term “credit derivative” may sound familiar: these complex, somewhat synthetic financial securities were at the heart of the economic meltdown of 2008.
What Jamie Dimon had to report was that JPMorgan had been forced to liquidate its position at a loss of at least $2 billion, with some claiming that much higher losses would eventually surface. For a financial economy that had had a near-death experience in 2008, the episode carried a sudden feeling of déjà vu, and economists and politicians alike marveled that anyone could make a mistake of such magnitude after the calamities of 2008.
Sometimes, mistakes can be a surprise. We can marvel at the hubris of Bruno Iksil, whose financial bets were so big he became known as the London Whale. How could he wander into the quicksand of complex credit derivatives without thinking that he, too, would ultimately be stuck with no way out? Further, his mistake was not a single event but a series of repeated bets, carried on for months, that only got him in deeper. Why did he not stop before his potential losses crossed from the millions column into the billions?
One of the most pervasive biases discussed in psychology, yielding unintended outcomes that can range from the costly to the catastrophic, is overconfidence. For a variety of reasons, people tend to overestimate the likelihood that their beliefs and judgments are correct. When they reach a conclusion, they are usually much more sure of it than the evidence before them should allow.
Hundreds of studies have demonstrated this phenomenon. In a classic early study, college students were asked the likelihood that they could spell rather difficult words (e.g., fibrous). Their estimates typically exceeded their accuracy. For example, participants thinking they had a 50-50 chance of spelling a particular word correctly proved to be right less than 40% of the time. When they were 80% sure, they were right about 50% of the time. At around 90% confidence, they were right 60% of the time. And when they were absolutely 100% sure, they were still wrong in 1 out of every 5 attempts. These data are typical of any number of tasks, from answering general knowledge questions to asking doctors to give diagnoses to asking CIA analysts to predict future world events.
Some examples of overconfidence are far more extreme and serious. It permeates planning. In one famous instance, builders in 1956 predicted that they would complete the Sydney Opera House in six years at a cost of $6 million. Final tally when the building was completed in 1973: $102 million. Historians have also implicated false confidence as a cause of World War I. Military strategists at the time misread technical innovations as favoring offense over defense, and so rushed into a war that each side thought it could not only win but win swiftly, famously by Christmas. Yet the first year of the war ended with each side staring at the other's trenches, over 300,000 soldiers killed and another 600,000 wounded, in a conflict that military planners had never anticipated would grow into another four years of blood, mud, and grinding stalemate.
Some dissenters, however, dismiss overconfidence as a major problem when it comes to economic or social choice. People are too rational, they say, when money is involved. Or, they concede the existence of overconfidence, but claim that a little training or experience will beat it out of the professional. If not, those professionals still clinging to their overconfident ways will soon find themselves out of the profession.
However, let me argue that overconfidence is inevitable. And let me start that argument in an unusual place: for the moment, let us assume that the critics are right, that overconfidence is not a systematic feature of human judgment; that, at least when it comes to the stock market, no individual shows a chronic and mistaken tendency to rate stocks as too attractive or too unattractive. Over the long haul, each individual's judgments of a stock's prospects prove exactly accurate, as the “invisible hand” of economics eventually removes systematic biases.
Now, if you grant me such a world, in which systematic bias has been banished from human judgment, I can guarantee you that you will have one basic, and pervasive problem. That problem will be…wait for it… overconfidence.
Say what? How can I argue for overconfidence in a world in which people show no bias? Easy. Even if people show no bias toward unwarranted optimism or pessimism over the long haul, their individual judgments will still contain error. It is the inevitable presence of judgmental error that leads to the inevitability of overconfidence, for this reason: when stock pickers make an occasional overly optimistic error and rate a stock too highly, they will be prone to buy that stock. In that action, they will be overconfident and will not earn what they anticipate. The same goes for those times when stock pickers are too pessimistic about a stock: their judgment will lead them to take a pass, forgoing profits they could have had in their grasp. Ultimately, this means that people willing to buy a stock do so, at least in part, out of overconfidence. Their judgment contains good information plus some error that misleads them into overvaluing the stock. Those who refuse to buy also do so, at least in part, out of overconfidence: their errors on this stock lead them to be too sure it is not worth the asking price.
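The selection logic in this argument can be sketched in a short simulation. All the numbers below (a true value of 100, an asking price of 100, an error spread of 10) are illustrative assumptions, not market data; the point is only that a picker whose judgments are unbiased overall still ends up systematically too optimistic about the particular stocks bought, because purchases happen precisely on the occasions when the random error is positive.

```python
import random

random.seed(7)

N = 100_000          # number of stock-picking decisions to simulate
TRUE_VALUE = 100.0   # every stock's actual long-run worth (assumed)
PRICE = 100.0        # asking price equals true value: no real bargains exist
NOISE = 10.0         # spread of judgmental error, with mean zero (unbiased)

bought_estimates, bought_values = [], []
for _ in range(N):
    estimate = TRUE_VALUE + random.gauss(0, NOISE)  # unbiased judgment
    if estimate > PRICE:                            # buy only when it looks good
        bought_estimates.append(estimate)
        bought_values.append(TRUE_VALUE)

avg_est = sum(bought_estimates) / len(bought_estimates)
avg_val = sum(bought_values) / len(bought_values)
print(f"average estimate among stocks bought: {avg_est:.2f}")
print(f"average true value of stocks bought:  {avg_val:.2f}")
# The picker's judgments are unbiased across all stocks considered,
# yet every purchase occurs exactly when the error term is positive,
# so the stocks bought are worth less than the buyer believes.
```

Run this and the average estimate among purchased stocks comes out several points above their true value, even though no estimate was biased to begin with. Statisticians know this selection effect as the winner's curse.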
And sometimes the error people make, and the belief they hold onto too tightly, can become quite extreme. This is likely what happened with Mr. Iksil and his colleagues. They had actually been “winning” for several months, temporarily making money, but that streak masked a global political condition that was bound to deteriorate, bringing economic forecasts down with it.
Thus, overconfidence is inevitable. It is not a function of the person; it is a function of acting on a decision. Good thinking pushes us toward our decisions, but so do errors in information or reasoning, and so, sooner or later, we or someone relevant to us will be caught in a decision that turns out to be folly.
What to do in the face of such inevitability? First, one should not rely on “the market” to correct for inevitable overconfidence. In the JPMorgan case, the market worked perfectly: many players, including another unit of JPMorgan itself, made much money punishing the London Whale for his outlandish bets. That still left an embarrassing and costly hole in JPMorgan's balance sheet. Just imagine if the position had been larger, or the circumstances much more like 2008. The market may have worked, but the destruction could have been vast: JPMorgan might have failed, or the economy could have been put at risk as it was in 2008.
Instead, the inevitability of overconfidence, and the fact that it can crop up in anyone, anywhere, suggests that individuals and firms should adopt “repairs” that seek out possible signs of overconfidence and then rush to mend them. For example, architects take great pains to estimate how much concrete to use in a building, and then multiply that number by eight to insure against overconfident judgments they are likely not to detect until too late.
For myself, it is unclear just how large, how frequent, and how severe a problem overconfidence is in the financial world. I simply believe it is a problem that demands constant vigilance. I can only hope that most financial decisions are made with care and accuracy, but I would urge people not only to trust that sentiment but to verify it as well. Now, let me count up my Facebook IPO money to see if I have enough for that trip to Greece.
Heath, C., Larrick, R. P., & Klayman, J. (1998). Cognitive repairs: How organizational practices can compensate for individual shortcomings. In B. M. Staw & L. L. Cummings (Eds.), Research in Organizational Behavior (Vol. 20, pp. 1–37).