Survivorship bias focuses only on successful examples
During World War II, analysts studied returning planes to determine where to add armor. They noticed damage patterns on planes that survived and recommended reinforcing those areas. Mathematician Abraham Wald pointed out the survivorship bias: damage on returning planes showed non-critical areas, while planes that didn’t return were likely hit in critical areas. Armor was needed where returning planes were not hit.
Illustration of a hypothetical damage pattern on a WWII bomber, loosely based on data from an unillustrated report by Abraham Wald (1943). The report showed that a comparable plane survived a single hit to the engine about 60% of the time, but survived a hit to the fuselage or fuel system closer to 95% of the time. The pattern only tells you where planes can get shot and still come back to base. Survivorship bias: your only information is what has survived.[1]
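The inversion Wald identified can be seen in a toy simulation. This is a minimal sketch, not from the source: it assumes each plane takes exactly one hit, split evenly between two areas, and uses the caption's rough survival rates (60% for an engine hit, 95% for a fuselage hit). Counting damage only among returning planes makes the fuselage look like the problem area, precisely because engine hits remove planes from the sample.

```python
import random

random.seed(0)

# Rough survival rates from the caption: an engine hit is survived
# ~60% of the time, a fuselage hit ~95% of the time (assumed values).
SURVIVAL = {"engine": 0.60, "fuselage": 0.95}

def fly_missions(n=10_000):
    """Simulate n planes, each taking one hit in a random area.

    Returns the list of hit locations for planes that made it back.
    """
    returned = []
    for _ in range(n):
        area = random.choice(["engine", "fuselage"])
        if random.random() < SURVIVAL[area]:
            returned.append(area)
    return returned

returned = fly_missions()
counts = {a: returned.count(a) for a in ("engine", "fuselage")}

# Among survivors, fuselage damage dominates, even though engine and
# fuselage hits were equally common -- engine hits just don't come home.
print(counts)
```

An analyst looking only at `counts` would see far more fuselage damage and might reinforce the fuselage; the lethal engine hits are invisible because those planes never entered the data.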
In the social sciences, survivorship bias is “the common mistake of noting only those things that made it through some selection process while overlooking those that didn’t.”[2] In project planning and management, this phenomenon occurs frequently.
Goal setting suffers from a serious case of survivorship bias. We concentrate on the people who end up winning—the survivors—and mistakenly assume that ambitious goals led to their success while overlooking all of the people who had the same objective but didn’t succeed.
See also:
- Confirmation Bias defends one's assumptions
- Selection Bias happens when the sample is not representative
Illustration and caption made available under CC BY-SA. ↩︎
How Big Things Get Done – Flyvbjerg and Gardner (2023), ch. 7, § “Stories vs. Data.” ↩︎