Six Signs an Organization’s SPC is Outdated
Aggregation, consistency keys to quality improvements.
Today’s cloud-based SPC software can help manufacturers find the perspective and data to realize transformative savings and efficiency gains.
The cloud provides a single location to which data is saved from all plants and production lines, giving access to everyone from C-suite executives to operators. Those gains can be real and immediate, says Doug Fair, chief operating officer at InfinityQS. After more than 30 years as a statistician working with manufacturers to optimize efficiency and find cost savings, Fair has seen first-hand the difference between organizations operating with a modern approach to SPC and those lagging behind.
Fair describes several common signs that indicate an organization needs to upgrade its SPC system.
1. The Company Views SPC as a Control Chart
A control chart is a great tool to use on a shop floor, and it is a vital part of an SPC system. But a control chart generates only about 15% of the improvements that are attainable, Fair says.
“A control chart by itself does not define successful use of SPC,” he explains. “And the companies that have relegated SPC to just the shop floor are missing out on massive opportunities to reduce costs and become more profitable and more competitive in the marketplace. So, if I go out to a company to talk about SPC, and they take me down to the shop floor first, that’s great, because that’s generally where data collection occurs and where we use control charts to control processes.
“But if that’s all they show me, then they’re missing out on 85% of the improvement ability of a real SPC system.”
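To make the shop-floor piece concrete, the following is a minimal sketch of how the control limits behind a Shewhart individuals (I-MR) chart are computed. It uses only the Python standard library; the fill-weight readings are invented for illustration, and real SPC software handles subgrouping, chart selection, and run rules beyond this.

```python
# Minimal sketch of individuals-chart (I-MR) control limits.
# Sample readings are hypothetical.
from statistics import mean

def imr_limits(values):
    """Center line and 3-sigma limits for an individuals chart,
    estimating sigma from the average moving range."""
    center = mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = mean(moving_ranges) / 1.128   # d2 constant for subgroups of size 2
    return center - 3 * sigma, center, center + 3 * sigma

# Hypothetical fill-weight readings (grams)
readings = [50.2, 49.8, 50.5, 50.1, 49.9, 50.3, 50.0, 49.7]
lcl, cl, ucl = imr_limits(readings)

# Points outside the limits would signal an out-of-control process
out_of_control = [x for x in readings if x < lcl or x > ucl]
```

Flagging points against these limits is the "15%" Fair describes: useful for controlling a single process, but silent about how that process compares across lines and plants.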
2. Managers Don’t Have the Ability to Summarize and Report Data
Fair says that if managers don’t have access to shop floor data, or if all they get is a control chart when they ask for a report, then the SPC system is outdated.
The most impressive organizations that Fair has worked with use data from the shop floor to control processes, but they also aggregate, summarize, and analyze that data on a regular basis.
“These organizations recognize that when data is collected on the shop floor, there’s a cost associated with it, and they want to get a return on their data collection investment,” he says. “But other organizations gather data only to relegate it to a database somewhere, just in case they need to see it sometime in the future: that’s wasteful and short-sighted.
“Maybe a company has one plant with several production lines—maybe the company has 600 plants around the globe—but if that company is making a product on multiple production lines or in multiple plants, they have to be sure the product is consistent, regardless of which plant or which production line it’s being run on. Aggregating data across production lines and plants provides insight into where that’s not happening. Plus, it gives managers the ability to see the big picture of quality and identify opportunities for improvement and cost reduction across the entire enterprise. My experience is that data aggregation is where 85% of an SPC system’s improvements are identified.”
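The aggregation Fair describes can be sketched in a few lines: group the same quality metric by plant and production line, then compare each site's average to the enterprise-wide average. The plant names, line names, and readings below are all invented for illustration.

```python
# Hedged sketch: roll up one quality metric across plants and lines
# to spot inconsistency. All records are hypothetical.
from collections import defaultdict
from statistics import mean, pstdev

# (plant, line, fill_weight_g) records as they might arrive from the floor
records = [
    ("Austin", "Line1", 50.1), ("Austin", "Line1", 49.9),
    ("Austin", "Line2", 50.3), ("Austin", "Line2", 50.2),
    ("Gdansk", "Line1", 51.8), ("Gdansk", "Line1", 52.0),  # runs heavy
]

by_site = defaultdict(list)
for plant, line, weight in records:
    by_site[(plant, line)].append(weight)

# Per-site average and spread, plus the enterprise-wide average
summary = {site: (mean(w), pstdev(w)) for site, w in by_site.items()}
grand_mean = mean(w for _, _, w in records)

# Flag any plant/line whose average drifts more than 1 g from the fleet
drifting = [site for site, (m, _) in summary.items()
            if abs(m - grand_mean) > 1.0]
```

A control chart at each site might show every line in control on its own terms; only the rollup reveals that one line is making a measurably different product.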
3. Six Sigma Teams and Other Groups Don’t Have Access to the Data
Another indication of outdated SPC is when a company doesn’t put the gathered data in a place where Six Sigma teams can access, extract, and analyze it.
“We’ve got Software-as-a-Service (SaaS) systems out there that support data collection from multiple production lines and multiple plants around the globe—and they save all that quality data to one database—a centralized data repository. That means Six Sigma teams around the world have access to the same information. It’s a ‘one-stop shop’ for data,” Fair says. “As a result, they have easy access to data that can help identify where they need to focus their efforts in order to make the greatest impact on quality in the shortest amount of time.”
4. The SPC Software is Localized on Premises
When installing on-premises software, an organization with five plants needs five different installations.
“It could be the same software provider, but the result is usually unique deployments and different naming conventions at each plant,” Fair says. “Plus, because on-premises software regularly requires upgrades, you’ve got potentially unique software versions at each of these plants. That’s because companies are loath to upgrade software. It’s a pain. It’s time-consuming, and it’s hard to do.”
In addition to maintaining five separate systems, users could assign the same product a different code at each plant. The result is that, even if companies wanted to roll that data up, doing so would be either impossible or extremely inefficient.
“SaaS systems don’t require users to upgrade because updates are performed automatically by the software vendor. Plus, because SaaS systems house all plants’ data in a single, centralized repository, naming conventions are standardized, so there’s no confusion.”
5. The Organization Does Not Review Data on a Regular Basis
“That’s a massive mistake,” Fair says, “because 85% of the improvements in quality, productivity, and costs are due to data aggregation and analysis at a higher level than the shop floor. So, if companies aren’t regularly taking the time to analyze summaries of data, they are missing out.
“When companies summarize their data and look at it regularly, they uncover previously unknown information. Typically, they say, ‘We had no idea we had these issues.’ They can then apply their time and resources in the right places to get things fixed—and made dramatically better—in a very short period of time.”
6. The Organization Only Reviews Red Flags and Ignores Insights Hidden in On-Spec Data
If companies are only using SPC to focus on data that triggered an alarm, they’re missing out on opportunities for improvement.
“Companies need to look at the data that’s within specifications,” says Fair. “I have had many experiences where multimillion-dollar bottom-line improvements have been made because companies closely scrutinized in-spec data. Don’t just look at the exceptions. Don’t just look at the stuff that’s bad. Analyze the data that is ‘good.’”
Fair recalls an example of a food manufacturer where he encouraged the company to aggregate in-spec data across 20 product codes for the year. Careful analysis enabled the company to identify several million dollars in raw-material savings tied to the weight of product going into each package. The company had been so focused on not underfilling packages that it was overfilling to a costly degree.
“By summarizing lots of critical weight data, we were able to see where they had issues, which production lines needed help, and which product codes were most problematic,” he says. “But again, the huge savings they enjoyed were a result of focusing on and analyzing data that was in-spec.”
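The arithmetic behind this kind of giveaway analysis is simple, which is part of Fair's point: the data was in spec all along, so no alarm ever fired. The sketch below uses invented spec limits, weights, costs, and volumes; it is an illustration of the technique, not the manufacturer's actual figures.

```python
# Hedged sketch: quantifying overfill "giveaway" hiding in in-spec data.
# Spec limits, weights, cost, and volume are all hypothetical.
from statistics import mean

TARGET, LSL, USL = 500.0, 495.0, 510.0      # grams
cost_per_gram = 0.002                        # dollars of raw material
annual_packages = 40_000_000

fill_weights = [506.1, 507.4, 505.8, 506.9, 507.0, 506.5]

# Every package passes inspection -- no alarms, no red flags
in_spec = all(LSL <= w <= USL for w in fill_weights)

# Yet the line runs well above target, and the giveaway compounds
giveaway_per_pkg = mean(fill_weights) - TARGET          # grams given away
annual_cost = giveaway_per_pkg * cost_per_gram * annual_packages
```

Here roughly 6.6 grams of giveaway per package, at a fraction of a cent per gram, scales to hundreds of thousands of dollars a year across the invented volume, exactly the kind of finding that only shows up when "good" data is analyzed.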