
Understanding the prerequisite to a failure

Updated: Mar 12

Years ago, a mentor of mine told me: if you find yourself bored, just wait five more minutes, something will blow up. Of course, this quote wasn’t meant to glorify putting things back together after an incident or a breakdown. It wasn’t intended to instill the anxiety that a counselor would call out years later as the root cause of your sleeping issues. Nor was it a suggestion that your team doesn’t deserve a well-needed break after a long, successful outage. Instead, the message was to constantly instill a sense of curiosity about potential disruptions to the plan.

By fostering a culture of innovation and always challenging the status quo, you can help your business thrive and avoid being broken – or worse, actually broke. - Evan Goldberg

Be a little uncomfortable

I have shared my mentor’s message for decades to instill habits that seek curiosity and avoid complacency. The message is meant to drive a culture that asks us to be a little uncomfortable so we can anticipate and recognize what might happen. Because if we don’t, we will face consequences that are unfavorable to the plan. There will be that undesired call on a Friday night. There will be breakdowns without an available spare.

From this instilled form of strategically gifted anxiety, I learned that we should manage risk more effectively by managing our priorities and resources. We should become instinctual at putting actions in place to prevent an incident or a breakdown. However, it wasn’t until I read Robert K. Merton’s (1936) famous Law of Unanticipated Consequences that I embraced the full picture of what my mentor was saying.

It’s kind of like a millwright sitting down in the breakroom with a thick wallet of cash from recent production incentives on one cheek. This uncomfortable break reminds them that they feel the cash they have recently made but realize that they only feel it when they are sitting down. - Unknown response to training on this subject ~2005

Failure Patterns and the PF Curve

Before we dive into Robert K. Merton’s Law of Unanticipated Consequences, I want to set the stage by looking at the academic approach to categorizing failures. Reliability training literature for manufacturers typically goes in depth to understand failure patterns. Failure patterns are usually taught visually as curves, which we can relate to through personal examples. From these patterns, we are trained to correlate the curves with key proactive actions that address the signs preceding a functional failure, or the failure in its entirety. We tend to frame this approach with the PF Curve (Potential-Failure Curve) so we can balance the cost of acting to prevent the failure against the cost of experiencing it.
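That cost balance can be sketched as a simple expected-value comparison. The numbers below are purely illustrative assumptions, not figures from any real operation: a planned intervention on the left side of the PF Curve versus the expected cost of running the defect to functional failure.

```python
# Illustrative sketch of the PF Curve cost trade-off. All dollar figures and
# the failure probability are hypothetical assumptions for demonstration.

def expected_failure_cost(repair_cost, downtime_hours, cost_per_hour, probability):
    """Expected cost of letting a detected defect progress to functional failure."""
    return probability * (repair_cost + downtime_hours * cost_per_hour)

proactive_cost = 5_000          # planned inspection plus early repair (assumed)
reactive = expected_failure_cost(
    repair_cost=20_000,         # emergency repair premium (assumed)
    downtime_hours=12,          # unplanned outage length (assumed)
    cost_per_hour=4_000,        # lost production per hour (assumed)
    probability=0.6,            # chance the defect progresses to failure (assumed)
)

print(f"Proactive: ${proactive_cost:,.0f}  Reactive (expected): ${reactive:,.0f}")
print("Act early" if proactive_cost < reactive else "Run to failure")
```

With these assumed inputs the proactive option wins easily, but the same arithmetic can justify a run-to-failure strategy when the consequence side is small, which is exactly the balance the PF Curve asks us to weigh.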

  • Bathtub Curve – accounts for approximately 4% of failures

  • Wear Out – accounts for approximately 2% of failures

  • Fatigue – accounts for approximately 5% of failures

  • Initial Break-In – accounts for approximately 7% of failures

  • Random – accounts for approximately 14% of failures

  • Infant Mortality – accounts for approximately 68% of failures
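The shapes behind these patterns can be sketched with a Weibull hazard function, a common tool in reliability training. This is a minimal illustration, not a method from the article: a shape parameter below 1 gives the falling hazard of infant mortality, exactly 1 gives the constant hazard of random failure, and above 1 gives the rising hazard of wear-out and fatigue.

```python
# Sketch of how Weibull shape parameters reproduce the classic failure-pattern
# curves. Shape values here are illustrative assumptions; identifying a real
# asset's pattern requires actual failure data.

def weibull_hazard(t, shape, scale=1.0):
    """Instantaneous failure rate h(t) for a Weibull distribution."""
    return (shape / scale) * (t / scale) ** (shape - 1)

for label, shape in [("infant mortality", 0.5),
                     ("random", 1.0),
                     ("wear-out", 3.0)]:
    early, late = weibull_hazard(0.1, shape), weibull_hazard(2.0, shape)
    trend = "falling" if late < early else ("flat" if late == early else "rising")
    print(f"{label:16s} shape={shape}: hazard is {trend} over time")
```

Plotting these hazards against time recovers the familiar curve shapes from the list above, including why a strictly time-based overhaul strategy does little for the roughly two-thirds of failures that follow the infant-mortality pattern.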

Robert K. Merton's Law of Unanticipated Consequences

We must accept that we are not going to solve all failures; in some cases, our strategy might even be to run an asset to failure. However, to implement the proactive initiatives that stretch the range of the x-axis by moving the failure further to the right, we need to explore Robert K. Merton’s five factors that drive unanticipated consequences, their human attributes, and how they relate to asset management.

Ignorance - This isn’t the negative connotation of the word; instead, it is rooted in a lack of training. We may have removed a class from the apprenticeship program to save cost, only to later experience an event that the training would have prevented. We may have placed an individual on the closing shift at a coffee shop without proper training, and the next morning the opening crew struggles to get the store running. The lack of training creates ignorance.

The most obvious limitation to a correct anticipation of consequences of action is provided by the existing state of knowledge. The extent of this limitation may be best appreciated by assuming the simplest case where this lack of adequate knowledge is the sole barrier to a correct anticipation. - Robert K. Merton, American Sociological Review, Vol.1, No. 6, (Dec. 1936)

Unintended stubbornness - Have you ever walked through an operating unit and heard a knock that sounded just like something you had experienced before? You mentally matched the knock to a previous, insignificant scenario and postponed a thorough inspection until next month’s periodic outage. Then the knock turned instantly into a bang on the midnight shift and an unexpected, lengthy delay. In this example, the error occurred because we allowed a previous experience to steer us toward a specific action, or neglect. We allowed our ingrained historical reference to override the natural curiosity to consider that this knock might not have a similar cause.

Error - Face it, we all make mistakes. Some people, like me, make dozens if not hundreds of errors per day. But within each error, we have failed in a process. The error isn’t the fault of the employee; blaming them only leaves us in a state of complacency. Consider looking at your historical safety incidents for the cause category of “operator error.” Assigning “operator error” is a lazy approach to problem-solving the process and methods that allow complacency. Instead, Merton frames error as a lack of mistake-proofing in the process, or a lack of empowerment to influence the desired action or decision.

Pretend we didn’t see it - This is where a person’s immediate interest overrides consideration of consequences. We may see someone not tied off with proper fall protection and choose to look the other way because we don’t want to write up an incident. I hope you can’t relate to that example, but I assume you can relate to walking by a full trashcan at home. We see it and know it needs to be emptied, but we walk past it. This avoidance then leads to a consequence like a fall or a ripped trash bag. This isn’t entirely an embodiment of laziness, but rather an avoidance of accepting that we can address the issue.

Inevitable influence - These are cases where human engagement has minimal influence on the fault. Markets are going to be cyclical, and intense preparation for the downturns will not prevent them from occurring. It’s inevitable: we will have strong markets and we will have weak markets. No specific action will keep a team out of these market swings; we can only prepare for them.

The prediction of the return of Halley's comet does not in any way influence the orbit of that comet; but, to take a concrete social example, Marx's prediction of the progressive concentration of wealth and increasing misery of the masses did influence the very process predicted. - Robert K. Merton, American Sociological Review, Vol.1, No. 6, (Dec. 1936)


So the next time you are working and find yourself bored, think about where you are on the x-axis of the PF Curve. Then consider which of Merton’s unanticipated consequences is about to surface, and what actions and habits could change the trajectory of the curve. Because if we understand the drivers of unanticipated consequences more effectively, we will significantly decrease our failures.


