Black Swans

Taleb (2005) suggests that hindsight bias and availability bias bear primary responsibility for our failure to guard against what he calls Black Swans. Black Swans are an especially difficult version of the problem of fat tails: sometimes most of the variance in a process comes from exceptionally rare, exceptionally huge events. Consider a financial instrument that earns $10 with 98% probability, but loses $1000 with 2% probability; it's a poor net risk (its expected value is about -$10 per play), but it looks like a steady winner. Taleb (2001) gives the example of a trader whose strategy worked for six years without a single bad quarter, yielding close to $80 million - then lost $300 million in a single catastrophe.
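To make the asymmetry concrete, here is a minimal simulation sketch. The payoffs are the hypothetical ones from the paragraph above; the 100-play track records and the 10,000 simulated traders are arbitrary illustration choices, not figures from the text. It computes the expected value per play and estimates how often a track record shows no loss at all, which is how a net-losing instrument comes to look like a steady winner.

    import random

    # Payoffs as described in the text: +$10 with 98% probability,
    # -$1000 with 2% probability.
    P_WIN, WIN, LOSS = 0.98, 10, -1000

    # Expected value per play: 0.98 * 10 + 0.02 * (-1000) = -$10.20.
    expected_value = P_WIN * WIN + (1 - P_WIN) * LOSS
    print(f"Expected value per play: {expected_value:.2f} dollars")

    # How often does a 100-play track record contain no loss at all?
    # (100 plays and 10,000 simulated records are illustration choices.)
    def spotless(plays=100):
        return all(random.random() < P_WIN for _ in range(plays))

    records = 10_000
    fraction = sum(spotless() for _ in range(records)) / records
    print(f"Fraction of 100-play records with no loss: {fraction:.1%}")
    # Analytically, 0.98**100 is about 0.13, so roughly one trader in eight
    # shows a spotless record despite the negative expected value.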

Another example is that of Long-Term Capital Management, a hedge fund whose founders included two winners of the Nobel Prize in Economics. During the Asian currency crisis and Russian bond default of 1998, the markets behaved in a literally unprecedented fashion, assigned a negligible probability by LTCM's historical model. As a result, LTCM began to lose $100 million per day, day after day. On a single day in 1998, LTCM lost more than $500 million (Taleb 2005).

The founders of LTCM later called the market conditions of 1998 a "ten-sigma event". But obviously it was not that improbable: under the Gaussian assumptions implicit in that phrase, a genuine ten-sigma event has a probability on the order of 10^-23, so when such an event seems to occur, the sensible conclusion is that the model was wrong, not that the markets did something astronomically unlikely (the short calculation after the quotation below makes this concrete). Mistakenly believing that the past was predictable, people conclude that the future is predictable. As Fischhoff (1982) puts it:

When we attempt to understand past events, we implicitly test the hypotheses or rules we use both to interpret and to anticipate the world around us. If, in hindsight, we systematically underestimate the surprises that the past held and holds for us, we are subjecting those hypotheses to inordinately weak tests and, presumably, finding little reason to change them.
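As a rough check on the "ten-sigma" label (an illustration added here, not part of the original argument), the sketch below shows what a ten-sigma daily move would mean if market returns really were normally distributed; the figure of roughly 250 trading days per year is an assumed round number.

    from math import erfc, sqrt

    # One-sided tail probability of a ten-sigma move under a standard
    # normal model: P(Z > 10) = 0.5 * erfc(10 / sqrt(2)).
    sigma = 10
    tail_probability = 0.5 * erfc(sigma / sqrt(2))
    print(f"P(Z > {sigma} sigma) = {tail_probability:.2e}")  # about 7.6e-24

    # With an assumed ~250 trading days per year, the expected wait for a
    # single genuine ten-sigma day is on the order of 10^20 years, so a
    # model that labels an observed loss "ten sigma" has mismeasured the
    # tail - which is the point about fat tails.
    trading_days_per_year = 250
    expected_wait_years = 1 / (tail_probability * trading_days_per_year)
    print(f"Expected wait for one such day: {expected_wait_years:.1e} years")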

The lesson of history is that swan happens. People are surprised by catastrophes lying outside their anticipation, beyond their historical probability distributions. Then why are we so taken aback when Black Swans occur? Why did LTCM borrow leverage of $125 billion against $4.72 billion of equity, almost ensuring that any Black Swan would destroy them? Because of hindsight bias, we learn overly specific lessons. After September 11th, the U.S. Federal Aviation Administration prohibited box-cutters on airplanes. The hindsight bias rendered the event too predictable in retrospect, permitting the angry victims to find it the result of 'negligence' - such as intelligence agencies' failure to distinguish warnings of Al Qaeda activity amid a thousand other warnings. We learned not to allow hijacked planes to overfly our cities. We did not learn the lesson: "Black Swans occur; do what you can to prepare for the unanticipated."

Taleb (2005) writes:

It is difficult to motivate people in the prevention of Black Swans... Prevention is not easily perceived, measured, or rewarded; it is generally a silent and thankless activity. Just consider that a costly measure is taken to stave off such an event. One can easily compute the costs while the results are hard to determine. How can one tell its effectiveness, whether the measure was successful or if it just coincided with no particular accident? ... Job performance assessments in these matters are not just tricky, but may be biased in favor of the observed "acts of heroism". History books do not account for heroic preventive measures.
