Process optimization projects have been legion for many years, but the backdrop can vary substantially. Whether driven by environmental concerns, digital transformation, the virtualization of teams and workspaces, or the diversification of service offerings, process optimization is and will remain part of the landscape of organizational transformation. The metamorphosis thus takes place on two levels, and the impacts, risks, and consequences are all the more substantial. The exercise therefore requires circumspection, analysis, and validation in fine detail and at every stage of the process.
It is therefore no longer sufficient to map existing processes and optimize them through Kaizen workshops, however well directed and inclusive those may be. Optimized processes, even when they are the result of consensus, must be scrutinized not only to determine their feasibility but also to anticipate potential slippages, identify possible causes, and weigh likely consequences.
Quantity and quality must go hand in hand
In an era where instantaneity reigns supreme, it is certainly harder to take the time to ensure safety, consistent results, and continuous quality. Just as grammatical, syntactic, or spelling errors seem permissible, even justified, in hyper-fast text messages, quality elsewhere can be eroded. However, this is not the first time we have seen the pendulum swing back: a reactive movement, often amplified and accelerated by management methods, always generates a certain amount of collateral damage, the importance of which varies substantially. And as time passes, the pace only quickens. As a direct consequence, when the frequency of a potential failure is high and its consequences can be serious, the risk is significant. Car recalls... Quebec highway interchanges... oil spills... thalidomide... the Boeing 737 Max... That is a lot of work that seems to come from the Improvisation League.
Improvisation is an exercise in balance where there is a risk of too many falls.
Michel Déon
(Les trompeuses espérances - 1956)
Too many people saw Six Sigma coming and scrapped their Total Quality programs. Yet the Six Sigma methodology simply went further, offering a production context in which the margin of error is even narrower than in Total Quality. Six Sigma embraced the principles of Total Quality and extended them; it did not oppose them. But once again, it is a question of methodological interpretation.
Of course, it would be absurd to promote over-quality everywhere. But, at the very least, it is essential to weigh the balance of inconveniences in order to identify the sectors, fields, or niches of activity where risk analyses, adapted to their own context, should be carried out. This means that speed cannot ignore caution, a caution calibrated to the importance of the possible consequences and the probability of failure.
The rampart of FMECA
Failure Mode, Effects, and Criticality Analysis (FMECA) is a method for determining, among other things, the level of risk associated with a process. At the process level, FMECA covers both product manufacturing processes and service processes, with different impact considerations. The objective is to quantify the level of risk associated with each specific task of a process. There are many possible variations here, notably in the rating scales: some use scales from 1 to 4, others from 1 to 10. The choice of scale matters less than applying it consistently.
In the sphere of product manufacturing processes, the risk index of each step is established by multiplying the frequency index (ranging from implausible (1) to frequent (10)) × the severity index (ranging from insignificant (1) to catastrophic (10)) × the non-detection index (ranging from certainly detectable (1) to absolutely non-detectable (10)). It goes without saying that the higher the score, the greater the risk; and the greater the risk, the greater the need for mitigation and control measures. But beware: mitigation and control measures will not necessarily make the optimized process more cumbersome, because there are many possible avenues for automation. There is also the balance of inconveniences and the analysis of return on investment to weigh, without losing sight of the initial objectives. We will not go into detail here; nevertheless, the identification of potential risks must be taken with the utmost seriousness.
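For illustration only, here is a minimal sketch of that calculation in Python; the function name, validation, and sample step are assumptions made for the example, not part of any standard FMECA tooling:

```python
def manufacturing_risk_index(frequency: int, severity: int,
                             non_detection: int) -> int:
    """Risk index for one manufacturing process step.

    Each index is rated on a 1-10 scale:
      frequency:     implausible (1) .. frequent (10)
      severity:      insignificant (1) .. catastrophic (10)
      non_detection: certainly detectable (1) .. absolutely non-detectable (10)
    """
    for name, value in (("frequency", frequency),
                        ("severity", severity),
                        ("non_detection", non_detection)):
        if not 1 <= value <= 10:
            raise ValueError(f"{name} must be between 1 and 10, got {value}")
    return frequency * severity * non_detection


# Hypothetical step: an occasional (4), serious (8), hard-to-detect (7)
# failure scores 224 out of a possible 1000.
print(manufacturing_risk_index(4, 8, 7))  # 224
```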
In the service sphere, the risk index for each step is established strictly by multiplying the frequency index (ranging from implausible (1) to frequent (10)) × the severity index (ranging from insignificant (1) to catastrophic (10)). Again, the higher the score, the greater the risk; and the greater the risk, the greater the need for mitigation and control measures.
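The service-side variant simply drops the detection factor; a sketch under the same assumptions:

```python
def service_risk_index(frequency: int, severity: int) -> int:
    """Risk index for one service process step: frequency times severity,
    each on the same 1-10 scale, for a maximum score of 100."""
    if not (1 <= frequency <= 10 and 1 <= severity <= 10):
        raise ValueError("indices must be between 1 and 10")
    return frequency * severity


# Hypothetical step: a frequent (7) but moderately severe (5) failure scores 35.
print(service_risk_index(7, 5))  # 35
```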
At first glance, four variables need to be identified for each step in a process (a sketch of how they might be recorded follows the list):
[Potential cause of failure]
[Potential failure]
[Potential failure frequency]
[Severity level of the consequence of a potential failure]
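As a minimal, hypothetical sketch, these four variables and the resulting risk index might be recorded per process step as follows; the field names and sample values are illustrative assumptions:

```python
from dataclasses import dataclass


@dataclass
class ProcessStepRisk:
    """One row of a hypothetical FMECA worksheet for a service process."""
    step: str
    potential_cause: str    # potential cause of failure
    potential_failure: str  # the potential failure itself
    frequency: int          # implausible (1) .. frequent (10)
    severity: int           # insignificant (1) .. catastrophic (10)

    @property
    def risk_index(self) -> int:
        # Service-sphere calculation: frequency x severity.
        return self.frequency * self.severity


row = ProcessStepRisk(
    step="Validate customer request",
    potential_cause="Incomplete intake form",
    potential_failure="Request processed with missing information",
    frequency=6,
    severity=7,
)
print(row.risk_index)  # 42
```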
But despite the great rigor and real validity of the approach, a central element is conspicuous by its absence: the element par excellence that justifies the very existence of any organization, the element around which the mission statement is articulated. The customer. Or rather: the impact on the customer, the impact on the customer experience.
This customer must remain at the center of our concerns: the one for whom, and according to whom, processes must be optimized and quality maximized. The place reserved for the customer may well determine that of your market share.
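As a purely illustrative extension, and not part of standard FMECA, that missing element could be operationalized by weighting the service-side index with a customer-impact rating; every name and value below is a hypothetical assumption:

```python
def customer_weighted_risk_index(frequency: int, severity: int,
                                 customer_impact: int) -> int:
    """Hypothetical extension of the service risk index:
    frequency x severity x customer impact, each rated from
    negligible (1) to critical (10)."""
    for value in (frequency, severity, customer_impact):
        if not 1 <= value <= 10:
            raise ValueError("indices must be between 1 and 10")
    return frequency * severity * customer_impact


# The step above scored 35 on frequency x severity alone; a strong
# impact on the customer experience (8) raises it to 280 and, with it,
# the step's priority for mitigation and control measures.
print(customer_weighted_risk_index(7, 5, 8))  # 280
```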