Too frequently, I have seen departments measure the effectiveness of their asset management by trumpeting their PM Compliance. Disregarding other measurements, the organization stakes its preventive maintenance (PM) effectiveness on presenting this key performance indicator (KPI) on the highest pedestal within a time-series plot. They state this god-like numeric achievement in no more than five syllables, month over month, proud of their ability to stay above their benchmarked target. They all but beg for someone in the audience to gush about their remarkable feat. But, as in previous months, the audience passes on giving the desired adoration.
Then something fails mid-month, a failure that becomes the foundation of future war stories. The organization responsible for the reliability of the assets (shhh… not saying maintenance) has hung its hat on PM Compliance for so long, but this time it must defend that its PMs are not weak as the failure is interrogated during a root cause failure analysis. Of course, in the back of the defenders’ minds, the failure is someone else’s fault and surely not the result of ineffective or untimely PMs.
Then the next meeting comes, and the same PM Compliance calculation is shown, this time with a value that may challenge the historical performance ceiling. The team gloats about the high PM Compliance and highlights that it leveraged the unplanned downtime to get ahead on its PMs. But this time, the presentation changes course when a leader in the group asks a question engulfed by the emotions of the last failure: what makes doing a PM compliant? That is where it gets sticky.
I think we have all been there. We have seen an organization become complacent with month-over-month performances of good PM Compliance and fail to explain how the KPI connects to other KPIs. We may have seen a defender of this KPI attempt to explain the calculation, only to leave the audience more confused. The organization that hangs its hat on PM Compliance will inevitably steer itself toward excess cost, higher failure rates, and an inability to connect to a litany of other performance indicators. Instead, we should aspire to have an organization that understands the calculation and how it is intertwined with dozens of others. The effectiveness of asset management is more of a story of where we have been and what the data tells us the future may look like.
To help with this, I want to challenge this all-governing KPI to determine what “compliance” means when doing preventive work. I want to accomplish this by walking through some examples and then evaluating how we may show the results on an x-axis. While working through a PM transformation example, you may see some commonalities with your own organization’s genealogy of PM Compliance.
To begin, on paper, we will most likely agree that PM Compliance appears to be a simple concept: the percentage of PMs completed on time. An asset management novice can quickly understand that higher must be better, because it would indicate that we are completing well-intended, predetermined work proactively. It is typically shown on a time-series plot with months on the x-axis and either a bar or a line showing monthly performance. There are typically three ways that compliance is then defined.
Completed before it was due
The first definition falls into a bucket that asks, did you finish the PM before it was due? This is the simplest of the three example calculations because it only asks whether the work was done before a specific date. But have you asked which date lands on the x-axis? Is it the date the PM was completed or the date it was due? And the due date itself can float: if I complete a monthly PM one week early, and the succeeding PM is due 30 days after the previous one’s completion date, I will complete this “monthly” PM roughly 16 times in a year. This isn’t compliance; this is roughly a third more grease, a third more labor hours, and a third more cost than planned. This is waste. Typically, when this blemish is found, the calculation will experience a maturity transition as the team alters it. This is a healthy change that should be accepted and one that shows the organization is maturing.
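To make the drift concrete, here is a minimal Python sketch (the start date and seven-day head start are illustrative assumptions, not from any standard) of how completing early compounds when the next due date floats off the last completion:

```python
from datetime import date, timedelta

INTERVAL_DAYS = 30   # "monthly" PM: next due 30 days after the last completion
EARLY_DAYS = 7       # technician finishes one week before each due date

completed = date(2024, 1, 1)   # illustrative first completion of the year
completions = [completed]

while True:
    next_due = completed + timedelta(days=INTERVAL_DAYS)
    completed = next_due - timedelta(days=EARLY_DAYS)  # done a week early again
    if completed.year > 2024:
        break
    completions.append(completed)

print(f"PMs completed in one year: {len(completions)}")        # 16 instead of 12
print(f"Extra work versus plan: {len(completions) / 12 - 1:.0%}")  # 33%
```

Every completion is “before it was due,” so this chart would read 100% compliant while quietly burning a third more resources than planned.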
Completed in the month it was due
The maturity of PM Compliance may then transition into the next version, which asks, did you get the PM done in the month it was due? This is a universally accepted, simple solution until its flaws get exposed. Consider a PM whose due date is the last day of every month. If we do a monthly PM on October 31 for the October due date and quickly rebound to get November’s completed on November 2, would it show compliance for both months? My numbers look great, but someone will eventually discover that defining compliance this way is flawed. Now consider that the next PM is not completed until December 31, a full 59 days after November’s completion; we would show compliance in all three months, but is this truly compliance? I will assume that you will say, nope.
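Here is a minimal Python sketch of this month-bucket rule applied to the example dates above (the dates are illustrative):

```python
from datetime import date

def month_bucket_compliant(due: date, completed: date) -> bool:
    """Naive rule: compliant if completed in the same month it was due."""
    return (completed.year, completed.month) == (due.year, due.month)

# Monthly PM due at each month end; completions from the example
pms = [
    (date(2024, 10, 31), date(2024, 10, 31)),  # October: done on the 31st
    (date(2024, 11, 30), date(2024, 11, 2)),   # November: done on the 2nd
    (date(2024, 12, 31), date(2024, 12, 31)),  # December: done 59 days later
]

for due, completed in pms:
    print(due.strftime("%B"), month_bucket_compliant(due, completed))
# All three print True, despite wildly uneven intervals between services.
```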
This flaw might be uncovered in a performance review, revealing that the organization will have to mature the calculation again. The historical performance has now been tarnished, because it is a lot easier to make a chart than to interrogate and explain the calculation behind it. Yet again, we are maturing.
Completed within a window
The organization may now look externally for guidance, as it has transitioned through two definitions already. SMRP (the Society for Maintenance & Reliability Professionals) has a metric called 5.4.14, Preventive Maintenance (PM) & Predictive Maintenance (PdM) Work Order Compliance. The metric recommends that compliance is achieved if the PM is completed within a window of plus or minus a percentage of the frequency interval. For a monthly PM with a target completion date of October 31 and a range of +/- 10% of the frequency interval, the variance window would typically be +/- 3 days, so the October 31 PM would be compliant if it was completed between October 28 and November 3. I will assume that, after working through the previous examples, you would agree this sounds the most logical. Beware, it is a trap.
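A sketch of the windowed rule under these assumptions (a 30-day interval and a 10% tolerance; the function is my illustration, not SMRP’s published formula):

```python
from datetime import date, timedelta

def window_compliant(due: date, completed: date, interval_days: int,
                     tolerance: float = 0.10) -> bool:
    """Windowed rule: compliant if completed within
    +/- (tolerance * interval) days of the due date."""
    slack = timedelta(days=round(interval_days * tolerance))
    return due - slack <= completed <= due + slack

due = date(2024, 10, 31)                              # monthly PM, 30-day interval
print(window_compliant(due, date(2024, 10, 28), 30))  # True  (window opens Oct 28)
print(window_compliant(due, date(2024, 11, 3), 30))   # True  (window closes Nov 3)
print(window_compliant(due, date(2024, 11, 4), 30))   # False (one day late)
```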
Consider two independent monthly PMs that are due on October 31. One gets completed on October 29 and the other on November 2. We have agreed that per 5.4.14 both are compliant, but in which month on the x-axis do you show this compliance? Which month holds the denominator and which holds the numerator? What would October show when the results are published on November 1? Would October’s results change, based on our selected calculation, when we publish the results again on December 1? The simplistic ease of presenting a time-series plot overshadows the importance of educating the audience on the definition of what is being shown.
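To see the trap in data rather than prose, here is a small Python sketch (the dates and the bucketing function are my illustrative assumptions) showing how both the publish date and the bucketing choice move the same two compliant PMs around on the x-axis:

```python
from datetime import date

# Two independent monthly PMs, both due October 31; per the +/- 3-day window
# (October 28 through November 3), both completions below are compliant.
pms = [
    {"due": date(2024, 10, 31), "completed": date(2024, 10, 29)},
    {"due": date(2024, 10, 31), "completed": date(2024, 11, 2)},
]

def october_compliance(bucket_by: str, as_of: date) -> str:
    """Count October's compliant PMs, bucketing by the 'due' or 'completed'
    month and seeing only completions that exist as of the publish date."""
    in_october = [p for p in pms if p[bucket_by].month == 10]
    done = [p for p in in_october if p["completed"] <= as_of]
    return f"{len(done)} of {len(in_october)} compliant"

print(october_compliance("due", date(2024, 11, 1)))        # 1 of 2: Nov 2 not done yet
print(october_compliance("due", date(2024, 12, 1)))        # 2 of 2: October just changed
print(october_compliance("completed", date(2024, 12, 1)))  # 1 of 1: Nov 2 moved to November
```

Same two work orders, same standard, three different Octobers, depending on when we publish and which date we bucket by.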
In this example of PM Compliance, we saw how one KPI can be interrogated and matured over time, yet still require education to fully understand its malleable results. Even when we finally arrived at a KPI that is referenced in a standard and can be benchmarked, we still had to accept that history may indeed change. This is okay as long as the audience is aware of the definition and how it maps into other KPIs. PM Compliance should provide an agreed-to, well-understood calculated value that feeds into other KPIs such as backlog, schedule compliance, and cost compliance. It is the collection of asset management KPIs, presented and explained together, that truly represents the asset management health of the organization.