Recently I have been researching examples of problem-solving gone wrong. I wanted to understand specific cases, some of which were epic disasters, where the data used to solve the problem pointed one way while other data points were obviously missed. From these examples, a pattern emerged of problematic habits set at the beginning of an investigation. Routinely, the investigations that went awry came from experienced problem solvers who allowed their previous experiences to bias how they connected the evidence. The term commonly referenced for this is survivorship bias, a form of cognitive bias.
A classic example of cognitive bias
The classic example of survivorship bias is the story of Abraham Wald, a highly regarded statistician in World War II’s Statistical Research Group (SRG). His team was tasked with finding the right balance of armor on bombers flying missions over Germany. Applying intense statistical analysis to the bullet holes on the sampled bombers, the team found repeatable patterns in the density of bullet holes across different areas of the plane. They quickly determined that the holes were not distributed evenly, and the evidence seemed overwhelmingly obvious: anyone could see that specific locations were riddled with bullet holes while many other areas had little to none.
You don’t want your planes to get shot down by enemy fighters, so you armor them. But armor makes the plane heavier, and heavier planes are less maneuverable and use more fuel. Armoring the planes too much is a problem; armoring the planes too little is a problem. Somewhere in between, there’s an optimum. - Jordan Ellenberg, How Not To Be Wrong
With this convincing evidence, the team recommended reinforcing the armor in areas such as the fuselage, keeping the overall weight balanced by shifting armor away from areas that showed little to no evidence of bullet holes. There was only one problem: what about the planes that didn’t return and were never part of the analysis?
The armor, said Wald, doesn’t go where the bullet holes are. It goes where the bullet holes aren’t: on the engines. - Medium
We have to understand the source of the data, the assumptions that went into it, and sometimes even the story the individual is telling with it. We must use our training to solve problems, not the answers from previous investigations. In The Data Detective: Ten Easy Rules to Make Sense of Statistics, Tim Harford digs deep into the cognitive biases of problem-solving. He stresses that we should test the commonalities with previous investigations while disrupting the assumption that those commonalities imply a direct correlation.
So the problem is not the algorithms, or the big datasets. The problem is a lack of scrutiny, transparency, and debate. - Tim Harford, The Data Detective: Ten Easy Rules to Make Sense of Statistics
There is an example of cognitive bias in season seven of The Office, in the episode “The Search.” Michael Scott inadvertently goes missing, and Holly, Dwight, Pam, and Erin go looking for him. With a comedic touch of overconfidence, Erin, Dwight, and Pam begin to problem solve, displaying their own cognitive biases by normalizing the situation. With high fives embodying their confidence, they assume that Michael most likely walked back toward the office. Holly quietly disrupts this logic by following the data rather than the assumption that Michael would have done the sensible thing and started walking back to the office.
“Hey hey hey. Let me answer this. Stupid question. He went back to the office, obviously, which is that way,” says Dwight. Holly responds, “Oh really? You don’t think he walked by the bakery just for the smell of it?” - The Office, “The Search”
In this episode, Dwight uses his presumed talent for tracking to select only the information that supports the narrative he has constructed. Holly, in contrast, looks beyond the scene to understand the surrounding circumstances. She calmly avoids being absorbed by normalized assumptions and instead focuses on the information itself. Soon she finds Michael on a rooftop, solving the problem by resisting the gravitational pull of assumptions.
In these examples of survivorship bias, the lesson is not just to look at the tangible data in front of you but to look curiously at all the information. We have to disrupt the cognitive bias that creates an abstraction designed to fill the void within us that wants to know exactly what happened. We are problem solvers during investigations, not storytellers. We have to understand what is producing the data and have the intuition to explore the areas lacking tangible evidence as much as the areas with overwhelming similarities. We must develop habits that keep us from rewriting a previous experience’s narrative, and instead use our skills to let each investigation play out. When we succeed at doing this, we will become better problem solvers.