Since the beginning of the financial crisis, there have been two principal explanations for why so many banks made such disastrous decisions. The first is structural. Regulators did not regulate. Institutions failed to function as they should. Rules and guidelines were either inadequate or ignored. The second explanation is that Wall Street was incompetent, that the traders and investors didn’t know enough, that they made extravagant bets without understanding the consequences. But the first wave of postmortems on the crash suggests a third possibility: that the roots of Wall Street’s crisis were not structural or cognitive so much as they were psychological.
In “Military Misfortunes,” the historians Eliot Cohen and John Gooch offer, as a textbook example of this kind of failure, the British-led invasion of Gallipoli, in 1915. Gallipoli is a peninsula in southern Turkey.
…
Cohen and Gooch ascribe the disaster at Gallipoli to a failure to adapt—a failure to take into account how reality did not conform to British expectations. And behind that failure to adapt was a deeply psychological problem: the British simply couldn’t wrap their heads around the fact that they might have to adapt. “Let me bring my lads face to face with Turks in the open field,”
At Gallipoli, the British acted as if their avowed superiority over the Turks gave them superiority over all aspects of the contest. They neglected to take into account the fact that the morning sun would be directly in the eyes of the troops as they stormed ashore. They didn’t bring enough water. They didn’t factor in the harsh terrain. “The attack was based on two assumptions,” Cohen and Gooch write, “both of which turned out to be unwise: that the only really difficult part of the operation would be getting ashore, after which the Turks could easily be pushed off the peninsula; and that the main obstacles to a happy landing would be provided by the enemy.”
Most people are inclined to use moral terms to describe overconfidence—terms like “arrogance” or “hubris.” But psychologists tend to regard overconfidence as a state as much as a trait. The British at Gallipoli were victims of a situation that promoted overconfidence. Langer didn’t say that it was only arrogant gamblers who upped their bets in the presence of the schnook. She argued that this is what competition does to all of us; because ability makes a difference in competitions of skill, we make the mistake of thinking that it must also make a difference in competitions of pure chance. Other studies have reached similar conclusions. As novices, we don’t trust our judgment. Then we have some success, and begin to feel a little surer of ourselves. Finally, we get to the top of our game and succumb to the trap of thinking that there’s nothing we can’t master. As we get older and more experienced, we overestimate the accuracy of our judgments, especially when the task before us is difficult and when we’re involved with something of great personal importance. The British were overconfident at Gallipoli not because Gallipoli didn’t matter but, paradoxically, because it did; it was a high-stakes contest, of daunting complexity, and it is often in those circumstances that overconfidence takes root.
…
It makes sense that there should be an affinity between bridge and the business of Wall Street. Bridge is a contest between teams, each of which competes over a “contract”—how many tricks they think they can win in a given hand. Winning requires knowledge of the cards, an accurate sense of probabilities, steely nerves, and the ability to assess an opponent’s psychology. Bridge is Wall Street in miniature, and the reason the light bulb went on when Greenberg looked at Cayne, and Cayne looked at Spector, is surely that they assumed that bridge skills could be transferred to the trading floor—that being good at the game version of Wall Street was a reasonable proxy for being good at the real-life version of Wall Street.
It isn’t, however. In bridge, there is such a thing as expertise unencumbered by bias. That’s because, as the psychologist Gideon Keren points out, bridge involves “related items with continuous feedback.” It has rules and boundaries and situations that repeat themselves and clear patterns that develop—and when a player makes a mistake of overconfidence he or she learns of the consequences of that mistake almost immediately. In other words, it’s a game. But running an investment bank is not, in this sense, a game: it is not a closed world with a limited set of possibilities. It is an open world where one day a calamity can happen that no one had dreamed could happen, and where you can make a mistake of overconfidence and not personally feel the consequences for years and years—if at all. Perhaps this is part of why we play games: there is something intoxicating about pure expertise, and the real mastery we can attain around a card table or behind the wheel of a racecar emboldens us when we move into the more complex realms. “I’m good at that. I must be good at this, too,” we tell ourselves, forgetting that in wars and on Wall Street there is no such thing as absolute expertise, that every step taken toward mastery brings with it an increased risk of mastery’s curse.