Thanks to Tim Harford's latest column for pointing me at lessons from the story of the American mathematician and statistician Abraham Wald. In 1943, Wald was part of a group asked to advise the US Air Force on how it might reinforce its planes so that fewer were lost to enemy fire on bombing raids.
The challenge was that only a limited amount of armour plating could be added if the plane was still to fly. Research by the military into damage to planes that had returned from missions revealed that bombers were often riddled with bullet holes in the wings, the centre of the fuselage and around the tail gunner. So they proposed reinforcing these areas, where they could see the most damage.
Wald told them that this was a big mistake, and that they should in fact do the opposite. He recognised that what the bullet holes were in fact showing was where a plane could be hit and still survive to make it home. The reinforcement actually needed to be placed in the areas where the surviving planes looked unscathed, since planes hit in these areas never returned.
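Wald's logic can be made concrete with a small toy simulation. Everything here is invented for illustration (the sections, the hit counts, the rule that only engine hits are fatal), but it shows how the damage pattern on returning planes inverts the true picture:

```python
import random

random.seed(42)

# Hypothetical model: each plane takes three hits in random sections.
# Only a hit to the engine downs the plane; other hits are survivable.
SECTIONS = ["engine", "fuselage", "wings", "tail"]

returned_hits = {s: 0 for s in SECTIONS}  # damage seen back at base
losses = 0                                # planes that never returned

for _ in range(10_000):
    hits = [random.choice(SECTIONS) for _ in range(3)]
    if "engine" in hits:
        losses += 1          # fatal damage is never observed at base
    else:
        for h in hits:
            returned_hits[h] += 1

print(returned_hits)         # engine count is zero
print("lost:", losses)
```

The military's dataset was, in effect, `returned_hits`: plenty of holes in the wings, fuselage and tail, and none in the engine. Read naively, the engine looks like the safest place to be hit; in fact it is the only fatal one.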
Survivorship bias leads us to focus on the successful and overlook what we can learn from the unsuccessful, and it can produce false conclusions in several ways. Wald went on to use a complex series of equations to work out the vulnerability of individual parts of a plane, calculations that are still in use today.
I think organisations tend to be far better at drawing lessons from successes than from failures. We're particularly bad at building in reflection time to encourage the kind of learning culture that has been shown to create significant advantage. It's easy, says Tim, to look at life's winners. Yet if we don't look at life's failures as well, we may end up drawing the wrong conclusions entirely.