One of the greatest issues facing quality management professionals today is identifying problems and quantifying their impact on two critical metrics: profit and customer satisfaction. It wasn’t too long ago that the primary method for detecting quality problems was inspection, a time-consuming, expensive and laborious process.

Despite the extreme level of effort involved in inspecting goods throughout the production process, with repeated inspections from the time components hit the receiving dock until finished goods left the shipping dock, inspection as a quality management methodology was an expensive failure: it did not identify problems until late in the process, when the costs of remediation were much higher.


History And Evolution

In the mid-to-late 1920s, an explosion of innovation occurred in the quality control field. Walter Shewhart of Bell Labs developed the concept of statistical control, and the work of William Ernest Johnson enabled the logical application of statistically significant sampling methods within quality management methodologies. Central to this thinking was the idea that every process produces variation: some of it is inherent to the process, while other variation comes from causes that aren’t always present. A process exhibiting only inherent variation is said to be in control, while a process showing other variation that affects conformance to specification is said to be out of control.
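
To make the distinction concrete, here’s a minimal sketch (in Python) of the kind of check a Shewhart-style control chart performs. The baseline measurements, the part being measured and the three-sigma rule shown here are illustrative assumptions, not a prescription from Shewhart’s original work.

```python
# Minimal Shewhart-style "individuals" control check (illustrative sketch).
# A point outside the center line +/- 3 standard deviations is flagged as a
# possible sign that the process is no longer in control.

from statistics import mean, stdev

def control_limits(baseline):
    """Compute (lower limit, center line, upper limit) from historical data."""
    center = mean(baseline)
    sigma = stdev(baseline)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control_points(measurements, limits):
    """Return indices of measurements that fall outside the control limits."""
    low, _, high = limits
    return [i for i, x in enumerate(measurements) if x < low or x > high]

# Hypothetical shaft diameters (mm) from a run known to be stable.
baseline = [25.01, 24.98, 25.02, 25.00, 24.99, 25.03, 24.97, 25.01]
limits = control_limits(baseline)

# New production data; the last value drifts outside the limits.
new_run = [25.00, 25.02, 24.99, 25.18]
print(out_of_control_points(new_run, limits))  # -> [3]
```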

Throughout his life, W. Edwards Deming promoted and popularized the ideals of statistical process control, and his teachings became a bedrock of modern quality practices. His devotion to quality improvement kicked off a golden age of new quality assurance (QA) tools and methods.

However, as the adage says, when you only have a hammer in your toolbox, every problem looks like a nail. Until this time, QA pros only had hammers.

But suddenly, QA pros found their toolboxes overflowing with new tools and methods for managing quality. As those additional methods of identifying quality issues became available, the challenge facing the QA team shifted from finding problems and isolating questionable materials to quantifying the impact of a problem and identifying its root cause.

They went from wielding their hammers at every problem to overusing their new tools and overanalyzing many problems, without a clear understanding of which tool would yield the most informative results in a specific situation.


Clarifying The Source Of A Problem

Now that quality practitioners could easily identify quality and process problems, they were faced with a near-infinite number of quality “issues” to be addressed. The challenge shifted from identifying a problem to prioritizing it.

If you’re familiar with the evolution of the automobile, you may remember when cars were simple enough for a “shade tree mechanic” to perform simple repairs like changing the bulb in a headlight.

As automobiles became more complex, and in many cases smaller, changing a light bulb became more and more difficult. Most amateur mechanics no longer attempted to change their own bulbs, and many people avoided taking their cars to the pros because of the expense and complexity of the repair. As a result, many people adopted unsafe practices so they could continue using the product: they drove their cars with burned-out headlamps until they were forced to correct the problem, either by failing an annual inspection or by being cited by law enforcement.

When the automotive industry first began looking at the problem, it tried to invent ways to make the bulbs easier to replace by creating even more complex contraptions, such as the specialized headlamp winch. Rather than solving the problem, these designs only added time, cost and complexity. It wasn’t until years later that the real problem was satisfactorily resolved, and not by making the bulbs easier to change: it was through the introduction of bulbs with longer lifespans. If you don’t need to change the bulbs, you don’t have to worry about making them easier to change.

Had the engineers of that era had access to tools such as root cause analysis, kaizen or even a simple “Five Whys” exercise, the real problem might have been identified long before it was, saving auto companies the bundles of cash spent on ineffective redesigns, improving safety for drivers and passengers, and improving all-around customer satisfaction.


Clarifying The Cost Of A Problem

Another solution automotive engineers might have considered early in their quest to solve the burned-out bulb problem would have been to include a few spare bulbs with every car. The cost of the extra bulbs pales in comparison to the cost of the design changes that made the headlamp part of a major assembly.

Consumers of the time were used to changing light bulbs in their homes. Far from viewing the spare bulbs as an admission of poor quality, most consumers would likely have seen them as complimentary safety equipment preemptively provided by the auto manufacturers for their convenience. For very little additional expense, automakers could have solved a problem and improved their image in the customer’s eyes. Instead, they spent millions to solve a problem involving an extremely inexpensive part.

This is where cost of quality analysis can be helpful, and in fact, it becomes essential to an effective quality program.

Not every quality problem must be solved. That sounds like heresy, but in many cases, it can be true. If an identified problem doesn’t cause a safety issue or render a product completely unfit for purpose, it may not need to be “fixed.”

There may be a point where the cost of changing a problem component exceeds the cost of repairing the affected units under warranty. Changing a process may keep some poor-quality units from reaching finished goods and, ultimately, the customer, but it may also add time and expense that renders the product non-competitive in the market.
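
As a rough illustration of that trade-off, here’s a small Python sketch that compares the cost of making a change against the expected warranty cost of leaving the defect in place. Every figure and the simple expected-cost model are hypothetical assumptions, not data from any real program.

```python
# Hypothetical cost-of-quality comparison: pay for a design/process change,
# or absorb the expected warranty repairs. All figures are illustrative.

def expected_warranty_cost(units_sold, failure_rate, repair_cost_per_unit):
    """Expected total warranty spend if the defect is left in place."""
    return units_sold * failure_rate * repair_cost_per_unit

def cost_of_change(engineering_cost, added_unit_cost, units_sold):
    """One-time redesign cost plus the added per-unit cost of the change."""
    return engineering_cost + added_unit_cost * units_sold

units = 100_000
warranty = expected_warranty_cost(units, failure_rate=0.02, repair_cost_per_unit=150.0)
change = cost_of_change(engineering_cost=250_000.0, added_unit_cost=1.75, units_sold=units)

print(f"Expected warranty cost: ${warranty:,.0f}")  # $300,000
print(f"Cost of the change:     ${change:,.0f}")    # $425,000
print("Make the change" if change < warranty else "Warranty route is cheaper")
```

A calculation like this leaves out safety, reputation and time to market, which is exactly why the judgment calls described below still have to be made.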

It’s important to realize that quality improvements offer diminishing returns: as overall quality improves, the incremental value of additional improvements goes down. Every company needs to decide where to draw the line between quality and cost, quality and time to market, and quality and reputation as measured by market share.


Avoid Temptation

With such a variety of tools available to them, quality professionals need to be careful not to bring a sledgehammer to a problem that can be solved with a nutcracker. It’s easy to become enamored with the latest tools, fancy colored charts and the illusion that they will deliver the best results, but a complex, sophisticated tool may add an unnecessary layer of complication to a problem.


The Role Of An EQMS

This is exactly where an enterprise quality management system (EQMS) comes in. The best EQMS solutions integrate seamlessly with existing business systems and production processes, helping to identify and prioritize issues and to analyze the costs of both the problem and the proposed solutions. Built-in reports and data analytics help QA report to management in language they understand. A modern QA department has important needs, and the role of the EQMS solution is to serve those needs.