Uncontrolled variation is the enemy of quality. It is what experts such as W. Edwards Deming and Joseph M. Juran spent their professional lives stressing to industrial leaders. It remains the enemy of good management because the concept of variation is still misunderstood.

A quality engineer whose job was to produce quality metric charts for monthly management meetings was disturbed that senior managers would react positively when a data point moved the ‘right’ way, but when a point moved the other way, the reaction was painful for everyone. Eventually, the quality engineer added control limits to each chart, showing how much variation was expected and when a point actually signaled something worth reacting to. The result had a calming effect on the entire team. Morale improved as effort was focused on identifying and eliminating special cause variation, which in turn led to business improvement.
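As a rough sketch of what that engineer did, the Python snippet below computes individuals-chart control limits (center line at the mean, limits at the mean plus or minus 2.66 times the average moving range) and flags only the points that fall outside them; the monthly scrap figures are made up purely for illustration.

    # Minimal individuals (X) control chart: center line at the mean, limits at
    # mean +/- 2.66 * (average moving range), the standard Shewhart constant.
    # The monthly scrap-rate data below are hypothetical.

    def control_limits(values):
        """Return (lcl, center, ucl) for an individuals chart."""
        center = sum(values) / len(values)
        moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
        mr_bar = sum(moving_ranges) / len(moving_ranges)
        return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

    def special_cause_points(values):
        """Flag only points outside the control limits; these are worth reacting to."""
        lcl, _, ucl = control_limits(values)
        return [(i, v) for i, v in enumerate(values) if v < lcl or v > ucl]

    monthly_scrap_pct = [1.8, 2.1, 1.9, 2.3, 2.0, 1.7, 2.2, 4.5, 1.9, 2.0]
    print(control_limits(monthly_scrap_pct))        # the limits the managers see
    print(special_cause_points(monthly_scrap_pct))  # only the 4.5% point is flagged

Everything else on such a chart is routine common cause variation, which is exactly the message that calms the meetings down.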

The approach had positive results because those managers had a basic knowledge of statistical process control (SPC), which unfortunately many managers don’t have. A barrier many quality engineers run into is managers who have been trained to think of quality as “in spec.” Some will actually say, “If it’s in spec it’s good and if it’s out of spec it’s bad. So why do I need SPC?”

One way to explain the benefits of SPC is through Dr. Genichi Taguchi’s Loss Function (sometimes just referred to as the Quality Loss Function). This tool can often teach front-line managers that “in spec and out the door” can be the same as throwing money down the drain.

To make the point with managers, consider the following example. Let’s say your company produces widgets. What is the cost incurred by your company if a widget is far outside of the specification limit?

There are only four things that can happen. (1) The widget can be scrapped, losing all the expense (material, labor, etc.) put into producing it to that point. (2) The widget can be reworked, incurring the additional time, material, and labor needed to bring it back into compliance. (3) The customer can be contacted for approval to ship the widget on deviation. The customer may then have to institute special handling to use the widget in its out-of-spec condition; the further the widget is out of spec, the faster those costs grow, eventually outweighing any benefit of using the widget in the first place. (4) The widget can be derated (if that is possible) and sold at a lower price for use in a less demanding application.

Now ask those same managers: if the widget were just barely outside the spec limit, would the same costs occur? Their answer would likely be no, although some cost would still be incurred. It would be unlikely for the part to be scrapped at a total loss. It is more likely the customer would accept the widget on deviation, though they may ask for a price reduction if they have to tweak their process to accommodate the out-of-spec widget. If a deviation is not accepted, the widget could undergo minimal (light) rework to bring it back into spec. The point is that there is always a continuum of incurred costs, and the closer an out-of-spec part is to the spec limit, the lower the loss.

The ideal is for a company to deliver the widget where the customer wants it: on target, every time. Once the spec limit is exceeded, there is an increasing cost to the part. But what costs do we experience while still inside the specifications?

As variability increases, so does the need for infrastructure to detect out-of-spec conditions. For example, as the process spread approaches the spec limits, the sampling rate must increase and inspection overhead must be added. In-process part variation may also drive greater internal scrap rates and process costs.
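To make that concrete, here is a small illustration, assuming a normally distributed characteristic centered on target, of how quickly the out-of-spec fraction (and with it the inspection burden) climbs as the spread grows toward the limits; the dimensions and sigma values are hypothetical.

    # Fraction of parts falling outside the spec limits for a normally
    # distributed characteristic centered on target. All dimensions are
    # hypothetical; the point is how fast the fraction grows with sigma.
    from math import erf, sqrt

    def normal_cdf(x, mu, sigma):
        return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

    def out_of_spec_fraction(mu, sigma, lsl, usl):
        """Probability that a part falls below the LSL or above the USL."""
        return normal_cdf(lsl, mu, sigma) + (1.0 - normal_cdf(usl, mu, sigma))

    lsl, usl, target = 9.5, 10.5, 10.0   # spec limits and nominal (mm)
    for sigma in (0.10, 0.15, 0.25):
        ppm = out_of_spec_fraction(target, sigma, lsl, usl) * 1_000_000
        print(f"sigma = {sigma:.2f} mm -> about {ppm:,.0f} parts per million out of spec")

On this hypothetical part, going from a sigma of 0.10 mm to 0.25 mm turns a negligible reject rate into tens of thousands of parts per million, which is why sampling, sorting, and inspection costs follow variability upward.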

A part that is in spec can, therefore, incur cost on a continuum as well. As variation increases, so do the costs, and at an ever-increasing rate. If a part is made out of spec, it has NO quality because the customer is not getting the promised product.

How does variation even within the spec limits affect the cost of the process? Dr. Taguchi concluded that these quality losses can be modeled. He taught that a part sitting at a spec limit is already poor quality, and that quality gets better the closer one gets to the target.
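A minimal sketch of that model, with hypothetical numbers, is below. Taguchi’s “nominal is best” loss function is L(y) = k(y - T)², where T is the target and k is set from the loss known to occur at a spec limit. The same model gives the average loss per part for a whole process: k times the process variance plus the squared distance of the process mean from target, which is how variation inside the spec limits shows up as cost.

    # Taguchi's quadratic quality loss: L(y) = k * (y - target)**2.
    # k is chosen so the loss at the spec limit matches the cost known to occur
    # there. All dollar figures and dimensions below are hypothetical.

    def loss_constant(cost_at_limit, target, spec_limit):
        """k such that the loss at the spec limit equals the known cost there."""
        return cost_at_limit / (spec_limit - target) ** 2

    def quality_loss(y, target, k):
        """Loss for a single part with measured value y."""
        return k * (y - target) ** 2

    def average_loss(mu, sigma, target, k):
        """Expected loss per part: k * (sigma**2 + (mu - target)**2)."""
        return k * (sigma ** 2 + (mu - target) ** 2)

    target, usl = 10.0, 10.5                      # nominal and upper spec limit (mm)
    k = loss_constant(cost_at_limit=20.0, target=target, spec_limit=usl)

    for y in (10.0, 10.2, 10.4, 10.5, 10.6):      # individual parts
        print(f"y = {y:4.1f} mm -> loss ${quality_loss(y, target, k):5.2f}")

    # Two centered, fully in-spec processes: the noisier one still costs more.
    print(f"sigma 0.15 mm -> average loss ${average_loss(10.0, 0.15, target, k):.2f} per part")
    print(f"sigma 0.05 mm -> average loss ${average_loss(10.0, 0.05, target, k):.2f} per part")

On target the loss is zero, a part just inside the limit already carries most of the limit cost, and a centered process with three times the spread costs nine times as much per part; none of this is visible if quality is judged only as “in spec” versus “out of spec.”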

Quality staff looking to influence their management on the benefits of using SPC should consider the following:

  • SPC reduces costs by ensuring that process adjustments happen only when they need to happen.
  • SPC reduces the variability of the product or service, thus improving customers’ quality experience.
  • Reducing variability around the target reduces cost.
  • SPC data can be used to capture further cost reductions.

By using SPC, companies can minimize the variation of their processes by identifying, reacting to, and eliminating its sources; SPC can therefore be an effective tool for saving a company money.