# The conjunction fallacy

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Rank the following statements from most probable to least probable:

1. Linda is a teacher in an elementary school.

2. Linda works in a bookstore and takes Yoga classes.

3. Linda is active in the feminist movement.

4. Linda is a psychiatric social worker.

5. Linda is a member of the League of Women Voters.

6. Linda is a bank teller.

7. Linda is an insurance salesperson.

8. Linda is a bank teller and is active in the feminist movement.

89% of 88 undergraduate subjects ranked (8) as more probable than (6). (Tversky and Kahneman 1982.) Since the given description of Linda was chosen to be similar to a feminist and dissimilar to a bank teller, (8) is more representative of Linda's description. However, ranking (8) as more probable than (6) violates the conjunction rule of probability theory, which states that p(A & B) ≤ p(A). Imagine a sample of 1,000 women; surely more women in this sample are bank tellers than are feminist bank tellers.
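The sample-of-1,000 argument can be made concrete in a few lines of Python. The counts below are hypothetical, chosen only to illustrate the subset relation; any counts whatsoever would satisfy the conjunction rule:

```python
# Hypothetical counts in a sample of 1,000 women. Feminist bank tellers
# are a subset of bank tellers, so the conjunction can never be more
# frequent than either conjunct, whatever the numbers happen to be.
women = 1000
bank_tellers = 20             # hypothetical count
feminist_bank_tellers = 5     # necessarily a subset of the bank tellers

p_teller = bank_tellers / women
p_feminist_teller = feminist_bank_tellers / women

# Conjunction rule: p(A & B) <= p(A)
assert p_feminist_teller <= p_teller
print(p_teller, p_feminist_teller)  # 0.02 0.005
```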

Could the conjunction fallacy rest on subjects interpreting the experimental instructions in an unanticipated way? Perhaps subjects think that by "probable" is meant the probability of Linda's description given statements (6) and (8), rather than the probability of (6) and (8) given Linda's description. Or perhaps subjects interpret (6) to mean "Linda is a bank teller and is not active in the feminist movement." Although many creative alternative hypotheses have been invented to explain away the conjunction fallacy, it has survived all experimental tests meant to disprove it; see e.g. Sides et al. (2002) for a summary. For example, the following experiment excludes both of the alternative hypotheses proposed above:

Consider a regular six-sided die with four green faces and two red faces. The die will be rolled 20 times and the sequence of greens (G) and reds (R) will be recorded. You are asked to select one sequence, from a set of three, and you will win \$25 if the sequence you chose appears on successive rolls of the die. Please check the sequence of greens and reds on which you prefer to bet.

1. RGRRR

2. GRGRRR

3. GRRRRR

125 undergraduates at UBC and Stanford University played this gamble with real payoffs. 65% of subjects chose sequence (2). (Tversky and Kahneman 1983.) Sequence (2) is most representative of the die, since the die is mostly green and sequence (2) contains the greatest proportion of green rolls. However, sequence (1) dominates sequence (2) because (1) is strictly included in (2): to get (2) you must roll (1) preceded by a green face.
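A short Monte Carlo sketch makes the dominance concrete (the seed and trial count are arbitrary choices for illustration). Because any run of rolls containing GRGRRR necessarily contains RGRRR, sequence (1) wins at least as often as sequence (2) in every simulation, not just on average:

```python
import random

random.seed(0)

SEQ1, SEQ2 = "RGRRR", "GRGRRR"

def roll_run(n=20):
    # Die with four green faces and two red faces: P(G) = 2/3, P(R) = 1/3.
    return "".join(random.choice("GGGGRR") for _ in range(n))

trials = 50_000
wins = {SEQ1: 0, SEQ2: 0}
for _ in range(trials):
    run = roll_run()
    for seq in wins:
        if seq in run:
            wins[seq] += 1

# Strict inclusion: every occurrence of GRGRRR contains RGRRR,
# so this inequality holds in every possible run of the simulation.
assert wins[SEQ1] >= wins[SEQ2]
print(wins[SEQ1] / trials, wins[SEQ2] / trials)
```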

In the above task, the exact probabilities for each event could in principle have been calculated by the students. However, rather than go to the effort of a numerical calculation, it would seem that (at least 65% of) the students made an intuitive guess, based on which sequence seemed most "representative" of the die. Calling this "the representativeness heuristic" does not imply that students deliberately decided that they would estimate probability by estimating similarity. Rather, the representativeness heuristic is what produces the intuitive sense that sequence 2 "seems more likely" than sequence 1. In other words, the "representativeness heuristic" is a built-in feature of the brain for producing rapid probability judgments, rather than a consciously adopted procedure. We are not aware of substituting judgment of representativeness for judgment of probability.

The conjunction fallacy similarly applies to futurological forecasts. Two independent sets of professional analysts at the Second International Congress on Forecasting were asked to rate, respectively, the probability of "A complete suspension of diplomatic relations between the USA and the Soviet Union, sometime in 1983" or "A Russian invasion of Poland, and a complete suspension of diplomatic relations between the USA and the Soviet Union, sometime in 1983". The second set of analysts responded with significantly higher probabilities. (Tversky and Kahneman 1983.)

In Johnson et al. (1993), MBA students at Wharton were scheduled to travel to Bangkok as part of their degree program. Several groups of students were asked how much they were willing to pay for terrorism insurance. One group of subjects was asked how much they were willing to pay for terrorism insurance covering the flight from Thailand to the US. A second group of subjects was asked how much they were willing to pay for terrorism insurance covering the round-trip flight. A third group was asked how much they were willing to pay for terrorism insurance that covered the complete trip to Thailand. These three groups responded with average willingness to pay of \$17.19, \$13.90, and \$7.44 respectively.

According to probability theory, adding additional detail onto a story must render the story less probable. It is less probable that Linda is a feminist bank teller than that she is a bank teller, since all feminist bank tellers are necessarily bank tellers. Yet human psychology seems to follow the rule that adding an additional detail can make the story more plausible.

People might pay more for international diplomacy intended to prevent nanotechnological warfare by China, than for an engineering project to defend against nanotechnological attack from any source. The second threat scenario is less vivid and alarming, but the defense is more useful because it is more general. More valuable still would be strategies which make humanity harder to extinguish without being specific to nanotechnological threats, such as colonizing space; see also Yudkowsky (this volume) on AI. Security expert Bruce Schneier observed (both before and after the 2005 hurricane in New Orleans) that the U.S. government was guarding specific domestic targets against "movie-plot scenarios" of terrorism, at the cost of taking away resources from emergency-response capabilities that could respond to any disaster. (Schneier 2005.)

Overly detailed reassurances can also create false perceptions of safety: "X is not an existential risk and you don't need to worry about it, because A, B, C, D, and E"; where the failure of any one of propositions A, B, C, D, or E potentially extinguishes the human species. "We don't need to worry about nanotechnological war, because a UN commission will initially develop the technology and prevent its proliferation until such time as an active shield is developed, capable of defending against all accidental and malicious outbreaks that contemporary nanotechnology is capable of producing, and this condition will persist indefinitely." Vivid, specific scenarios can inflate our probability estimates of security, as well as misdirecting defensive investments into needlessly narrow or implausibly detailed risk scenarios.

More generally, people tend to overestimate conjunctive probabilities and underestimate disjunctive probabilities. (Tversky and Kahneman 1974.) That is, people tend to overestimate the probability that, e.g., seven events of 90% probability will all occur. Conversely, people tend to underestimate the probability that at least one of seven events of 10% probability will occur. Someone judging whether to, e.g., incorporate a new startup, must evaluate the probability that many individual events will all go right (there will be sufficient funding, competent employees, customers will want the product) while also considering the likelihood that at least one critical failure will occur (the bank refuses a loan, the biggest project fails, the lead scientist dies). This may help explain why only 44% of entrepreneurial ventures survive after 4 years. (Knaup 2005.) (Note that the 44% figure is for all new businesses, including e.g. small restaurants, rather than, say, dot-com startups.)
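The arithmetic behind the seven-event example is simple but counterintuitive, which is the point. Assuming the events are independent:

```python
# Seven independent steps, each 90% likely to go right: the conjunction
# (all seven succeed) is less likely than a coin flip.
p_all_succeed = 0.9 ** 7               # ~ 0.478

# Seven independent failure modes, each only 10% likely: the disjunction
# (at least one occurs) is more likely than not.
p_at_least_one_failure = 1 - 0.9 ** 7  # ~ 0.522

print(round(p_all_succeed, 3), round(p_at_least_one_failure, 3))
```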

Dawes (1988) observes: 'In their summations lawyers avoid arguing from disjunctions ("either this or that or the other could have occurred, all of which would lead to the same conclusion") in favor of conjunctions. Rationally, of course, disjunctions are much more probable than are conjunctions.'

The scenario of humanity going extinct in the next century is a disjunctive event. It could happen as a result of any of the existential risks discussed in this book, or some other cause which none of us foresaw. Yet for a futurist, disjunctions make for an awkward and unpoetic-sounding prophecy.
