Modern cloud-based SPC software should not just aggregate and analyze data; it should also present the relevant information in a streamlined way for factory floor operators to make efficient and immediate decisions.

In a practical sense, that means charts and graphs computed by robust algorithms in the backend should not bog down the operator’s decision-making. Rather, the software should trigger easy-to-follow alerts and notifications in an intuitive dashboard.

This is where user experience and advanced statistical methods meet face-to-face, and where SPC can provide efficiency gains not just for quality management and Six Sigma teams, but for workers on the front lines of productivity.

InfinityQS Vice President of Statistical Methods Steve Wise recently spoke to Quality about the six ways modern SPC software can realize efficiency results for operators and help workers maintain quality and reporting consistency as they juggle myriad tasks.

1. At-a-Glance Command

When InfinityQS developed its latest generation of software, the initial instinct was to provide operators with the charts and graphs generated by the back-end software. But that obscured the most important information needed by the operator, which really comes down to “Do I do something, or do I do nothing?”

The software company quickly realized that rather than having to dig through charts to make a go or no-go decision, the software’s dashboard should simplify that decision-making process for the operator, at a glance.

“So, forget the charts, just have all the number crunching going on under the hood and then send out notifications if they need to do something, such as collect data,” Wise explains.

The dashboard can guide workers through their day, providing a path and schedule of data collection and other tasks that they can use to plan their lunch breaks and remaining work.

[Image: data collection]

“They collect the data, and they can see the numbers going in in real time,” he explains. “If there's a problem with that, they’re notified. Or if everything goes in just fine, then nothing happens.”
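The go-or-no-go logic Wise describes can be sketched in a few lines: a measurement goes in, and the operator is notified only when action is needed. This is an illustrative sketch, not InfinityQS's actual API; the spec limits and messages are hypothetical.

```python
# Sketch: evaluate an incoming measurement and decide whether the
# operator needs a notification. Limits and wording are illustrative.

def evaluate_measurement(value, lsl, usl):
    """Return an alert message if action is needed, else None."""
    if value < lsl:
        return f"ALERT: {value} is below the lower spec limit {lsl}"
    if value > usl:
        return f"ALERT: {value} is above the upper spec limit {usl}"
    return None  # in spec: "do nothing," so no notification fires

# Only out-of-spec readings surface to the operator.
for reading in [10.2, 10.4, 11.1]:
    alert = evaluate_measurement(reading, lsl=9.8, usl=10.8)
    if alert:
        print(alert)
```

The design point is the asymmetry: in-spec data produces silence, so the dashboard only demands attention when a decision is actually required.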

2. No More Missed Opportunities

Operators have many simultaneous responsibilities, and sometimes they miss their data checks.

“It’s amazing how often those get missed,” he says. “If there’s nothing telling them they missed it, they just miss it. They’re not going to miss those opportunities anymore, because this workdesk is telling them things are coming up. If the company says, ‘Pay attention to three different statistical rules and pay attention to these types of specification violations or boundary violations,’ then none of that gets missed either, because there are algorithms in the background. It tells them, ‘OK, there are open tickets that you need to deal with, there are 12 of them, and there are seven that you need to do something with.’ There are no more missed opportunities.”
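The workdesk behavior described above, surfacing overdue checks and tallying open tickets, can be sketched as below. The field names, grace period, and ticket structure are assumptions for illustration only.

```python
from datetime import datetime, timedelta

# Sketch: flag scheduled data checks that are now overdue, and count
# open tickets that still need operator action. Fields are hypothetical.

def overdue_checks(scheduled_times, now, grace=timedelta(minutes=10)):
    """Return the scheduled checks that were missed (past due plus grace)."""
    return [t for t in scheduled_times if now > t + grace]

def tickets_needing_action(tickets):
    """Count open tickets, and the subset requiring an operator response."""
    open_tickets = [t for t in tickets if t["status"] == "open"]
    actionable = [t for t in open_tickets if t["needs_action"]]
    return len(open_tickets), len(actionable)

now = datetime(2024, 1, 1, 12, 0)
schedule = [datetime(2024, 1, 1, h, 0) for h in (9, 10, 11, 12)]
print(len(overdue_checks(schedule, now)))  # 3 hourly checks are overdue
```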

3. The Software Knows What's Coming

Several years back, while Wise was visiting a chicken processing plant, a food engineer told him, “This chicken plant is a conversion process. All we’re doing is converting grain to meat. The more efficiently we can convert grain to meat, the better off we are.”

Considering this, Wise says manufacturers must look at every stage on the operating line as a conversion point. If no conversion happens and nothing has been transformed, there is nothing to collect.

For data collection to be truly successful at these points, the software must know the recipe ahead of time. Bills of materials come together to form these transformations, and software should be preloaded with what is supposed to happen at any given moment in the schedule.

“Let’s say we’re a beverage bottler filling cans of cola,” he says. “At that filling operation you’ve got CO2, syrup, water and other ingredients that come together to make that transformation happen. So the software knows what I’m making right now. There’s a recipe. It knows what the input parts are automatically, and then it knows on these input parts, ‘What do I have to check?’”

[Image: checklist]

When it’s time to collect data, the software automatically knows what needs to be checked based on what is being made at that moment, and which data collections the operator should perform at a given workstation.

“It knows what shift you’re on, so when you collect the data, you’re tagging it to the correct shift,” he says. “The shift is very important. All of this happens in the background, and it all comes together when I click that button to collect the data.”

In previous software, operators had to manually sift through input data and manually tag it to the current data collection if they wanted good analysis on the backend. All that metadata is now preloaded. “Factories are doing this stuff anyway; they have to do it to run the factory. So now the SPC system picks up metadata from other systems and automatically populates the SPC environment. It’s greater efficiency for everyone.”
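The recipe-driven tagging Wise describes might be sketched like this, with the record assembled from preloaded metadata rather than manual operator entry. The recipe table, field names, and values here are hypothetical, loosely following the cola example above.

```python
# Sketch: look up required checks from a preloaded recipe and tag the
# collection with metadata (shift, part, inputs) pulled from other
# systems. The recipe table and field names are illustrative.

RECIPES = {
    "cola-12oz": {
        "inputs": ["CO2", "syrup", "water"],
        "checks": ["fill volume", "CO2 pressure", "brix"],
    },
}

def build_collection(part_number, shift, value):
    """Assemble a data record pre-tagged with the active recipe's metadata."""
    recipe = RECIPES[part_number]
    return {
        "part": part_number,
        "shift": shift,  # tagged automatically, not typed in by the operator
        "required_checks": recipe["checks"],
        "inputs": recipe["inputs"],
        "value": value,
    }

record = build_collection("cola-12oz", shift="2nd", value=12.02)
print(record["required_checks"])  # the software already knows what to check
```

The operator supplies only the measurement; everything else rides along from the schedule and the bill of materials.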

4. Self-Regulating Compliance

This point ties in with “No More Missed Opportunities,” Wise says, and can help prevent operators from skirting important compliance reporting metrics required by customers.

“If you’ve played this game, you know that if they’re supposed to collect data once an hour, and it’s an eight-hour shift, they collect eight data points in the first hour of their shift,” he says. “Or they wait and collect those eight at the end. They aren’t in compliance.”

To counteract that and to ensure that compliance is legitimate, the software alerts operators that their collections count toward compliance only if they are performed at the required intervals.

“So there are different statuses for when the data was collected—whether it was early, late or missed. Each one has its own ramifications in this reporting. So, we have to keep track of that, and then we have compliance reporting based on shift, or by-part or by-process. But mostly we find by-shift compliance reporting is the one that customers are most interested in.”
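The early/late/missed statuses and by-shift compliance reporting described above can be sketched as follows. The tolerance window is an assumed parameter, not a documented InfinityQS setting.

```python
from datetime import datetime, timedelta

# Sketch: classify each collection against its scheduled time, then
# compute a by-shift compliance rate. The window is an assumption.

def collection_status(scheduled, actual, window=timedelta(minutes=15)):
    """Return 'early', 'on-time', 'late', or 'missed' for one check."""
    if actual is None:
        return "missed"
    if actual < scheduled - window:
        return "early"
    if actual > scheduled + window:
        return "late"
    return "on-time"

def shift_compliance(pairs, window=timedelta(minutes=15)):
    """Fraction of scheduled checks collected on time during the shift."""
    statuses = [collection_status(s, a, window) for s, a in pairs]
    return statuses.count("on-time") / len(statuses)

sched = datetime(2024, 1, 1, 9, 0)
print(collection_status(sched, datetime(2024, 1, 1, 9, 5)))   # on-time
print(collection_status(sched, datetime(2024, 1, 1, 10, 0)))  # late
```

Under this scheme, collecting all eight points in the first hour of an eight-hour shift yields one on-time check and seven early ones, so the gaming Wise describes shows up directly in the shift's compliance rate.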

5. Standardized Consistency

Businesses that make the same product in various factories in multiple states or countries need products to be consistent.

“There might be some local recipe changes depending on the location. But by and large, if you’re producing a certain type of product, there is an operation diagram that raw materials flow through in order to get finished goods out on the other end,” Wise says. “Since this is a shared system across the globe, the efficient way to do this is to build one process map that shows all the different steps going from raw materials to finished goods.

[Image: data collection]

“If there are little nuances that happen at each factory, no problem; you provide a way at each step so they can tweak it slightly for their own use. And you tie that to regional differences or process differences, where these process models automatically adjust slightly for those needs. But, still, you build the backbone once rather than having to build the backbone or framework in every factory.”
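The build-the-backbone-once idea can be sketched as a shared process map with small per-site overrides merged on top. The step names, parameters, and sites here are entirely hypothetical.

```python
# Sketch: one global process map ("backbone") plus per-site tweaks.
# Steps, parameters, and site names are illustrative only.

BACKBONE = {
    "mixing":  {"temp_c": 20, "check": "brix"},
    "filling": {"target_oz": 12.0, "check": "fill volume"},
    "capping": {"torque_nm": 1.8, "check": "cap torque"},
}

SITE_OVERRIDES = {
    "plant-mx": {"mixing": {"temp_c": 22}},  # a local recipe tweak only
}

def process_map_for(site):
    """Return the shared backbone with this site's tweaks applied."""
    merged = {step: dict(params) for step, params in BACKBONE.items()}
    for step, tweaks in SITE_OVERRIDES.get(site, {}).items():
        merged[step].update(tweaks)
    return merged

print(process_map_for("plant-mx")["mixing"]["temp_c"])  # 22, local tweak
print(process_map_for("plant-us")["mixing"]["temp_c"])  # 20, backbone default
```

Each factory inherits the full map and overrides only what differs, so a change to the backbone propagates everywhere while local nuances stay local.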

6. Software Maintains a Personal Relationship with Each User

At each workstation where data are collected, specs and part numbers might be the same, but the measurement devices might be different. Some may be automated, some handheld.

“It’s a personalized experience when an operator is at a particular workstation,” he says. “It knows what instruments they need at each operation to do their checks.”
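The per-workstation instrument awareness Wise mentions amounts to a lookup keyed on station and check. A minimal sketch, with hypothetical stations and instruments:

```python
# Sketch: resolve which measurement instrument applies at a given
# workstation for a given check. All names are hypothetical.

STATION_INSTRUMENTS = {
    ("line-1", "fill volume"): "inline flow meter (automated)",
    ("line-2", "fill volume"): "graduated cylinder (handheld)",
}

def instrument_for(station, check):
    """Return the instrument for this station and check, if one is mapped."""
    return STATION_INSTRUMENTS.get((station, check), "manual entry")

print(instrument_for("line-1", "fill volume"))
```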

On the other end of the personal relationship are color themes. Operators can choose from a dozen themes, including two designed to be visible to colorblind users. The software also lets users sign in securely with a few keystrokes rather than an arduous username and password. It knows where the operator left off and what comes next. A bell icon, similar to those found on social media and other consumer software, intuitively shows operators how many tasks or alerts have gone unchecked.

The personal relationship provided by the dashboard simplifies the complex backend work that automatically stitches together multiple data flows, operation diagrams and process information that was previously siloed. Aggregation that would once have been performed manually, or not at all, becomes actionable information for every operator.