Reporting in manufacturing is not a perfect science. Let’s start with that. It can be painful. It can be unreliable. It may provide information that is not specifically what you need. And, often, reports do not “slice and dice” the data in ways that are helpful to users at different levels of the organization.

If these problems sound familiar, it might be time to take a closer look at what you need from your reports.

Smoothing the Data, Diluting Meaning

The way we go about reporting in the manufacturing world is essentially flawed. Data is incrementally summarized and reduced over a series of steps:

  1. When an operator on your manufacturing team is looking at data on the factory floor, they're looking at individual data points. And they're responding to each signal that comes into the control charts or from a sensor or some other feedback system. What operators like about this is that it’s an opportunity for an immediate response. They can quickly determine whether to correct whatever the issue might be or to just let it be because it's okay.
  2. In another part of the organization, the management team is looking for shift reports. So, the operator gathers all those individual data points—the data points they were managing during their shift—rolls them up, and gives the supervisor anything and everything they ever wanted: the average, the standard deviation, how many points were out-of-spec, and a whole host of summary statistics. They may even make a histogram of that shift and send that in with the shift report. The end result is an eight-hour summary—a “smoothing” of the data, so to speak.
  3. Now, the supervisors must send monthly reports “upstairs.” They're taking all the shift reports from the floor and averaging them up to send to their executive team. Then, at the corporate level, they are doing quarterly reports because they must satisfy the folks above them: the board, the stockholders, and the like. 

As you start taking the exact same amount of data and rolling it up—and then up again—you are “smoothing it out.”
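To see how much this smoothing can hide, here is a minimal sketch in Python. The numbers are invented for illustration: one shift contains a clear out-of-spec spike, yet the shift average looks fine, and the monthly average erases the event almost entirely.

```python
# Hypothetical illustration: the measurements, spec limit, and shift counts
# below are invented to show how repeated averaging hides a real event.
measurements = [10.0] * 95 + [14.0] * 5  # one shift; five points spike high

upper_spec = 11.0

# Operator's view: individual points flag the problem immediately.
out_of_spec = [m for m in measurements if m > upper_spec]
print(len(out_of_spec))           # 5 points clearly out of spec

# Shift report: a single average smooths the spike away.
shift_average = sum(measurements) / len(measurements)
print(shift_average)              # 10.2 -- looks fine, well under 11.0

# Monthly report: averaging 90 such shift averages smooths it further still.
shift_averages = [10.0] * 89 + [shift_average]
monthly_average = sum(shift_averages) / len(shift_averages)
print(round(monthly_average, 3))  # ~10.002 -- the event is invisible
```

Each roll-up is arithmetically correct, yet each one moves the report further from the signal the operator actually saw.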

The results end up showing managers and executives blips on a spreadsheet that are just a couple of pixels higher than last quarter’s. Yet managers are making critical business decisions based on those quarterly reports.

The corporate folks know that all they're seeing is the tip of the iceberg and something major had to happen downstream or upstream to create that blip. However, as they're looking at it, they really have no idea what that event was. 

Even if the data were all accurate, the very act of rolling things up (and up again)—averaging, computing 12-month moving averages, and so on—conceals what is actually happening. It truly smooths out the data. And with the fine detail of each data point stripped away, those blips are ripe for misdiagnosis.

Keep it Rough

Depending on your audience (and your position) you're going to generate different types of reports, right? It's just the nature of organizations. As you move up the food chain, the people looking at the reports have less and less time (and reason) to look at all the individual data points—they just don’t have the bandwidth. 

So, you need to convey meaningful data so the corporate folks can make better informed business decisions…without smoothing things out. To determine how to do that, you need to take one important step: 

Find out what information those people really need. 

Here’s an example:

A quality professional started a new job with a manufacturing company, and his role included generating and distributing reports. As a careful and forward-thinking person, he began by checking the reports that were currently (and probably always had been) going out to corporate. He did a double-take and immediately thought, “I don’t know if this stuff in the reports is any good or not.” He was skeptical.

So, he told his people, “Don’t send out this report. Just don’t send it. Let’s wait and see who calls.” When he did get a few calls, he asked questions that helped him get to the heart of things:

  • What do you need in this report? 
  • What parts of the report do you use? 
  • What parts work for you? 

With this feedback, he pared the report—which had run to multiple pages (about an eighth of an inch thick)—down to just a few. Users of the report were very happy.

Essentially, he stopped smoothing the data. Rather than roll up the data that had already been rolled up—and thus create a warmed-over average of an average that was of no use to anyone—he instead pinpointed the specific pieces that the users of the report found helpful and concentrated on those. He delivered what the users wanted and needed.

Tips for Meeting Reporting Challenges

In most cases, the less human intervention there is with the data as it is being collected and compiled, the more reliable that data will tend to be. A “hands-off” approach limits the potential for errors and alterations. So start with automating data collection to the most practical extent possible. Then apply a few additional best practices.

Sampling strategies—A sampling strategy is regimented, not based on whims like an operator simply saying, “It’s the end of my shift; it’s time to collect my data.” A sampling strategy does a better job of picking up the true personality of the data population—as opposed to just what happened in the middle or at the end of a shift.
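As a small illustration (the shift length and sample counts are invented), compare an end-of-shift grab with a systematic sample spread evenly across the shift:

```python
# Hypothetical sketch: a regimented sampling plan vs. an end-of-shift grab.
# Numbers are invented; parts are represented by their production index.
parts_made = list(range(480))  # parts produced over an 8-hour shift

# End-of-shift grab: the last 10 parts only -- blind to the rest of the shift.
end_of_shift = parts_made[-10:]
print(end_of_shift[0], end_of_shift[-1])  # 470 479

# Systematic sample: every 48th part, spread evenly across the whole shift.
systematic = parts_made[::48]
print(systematic)  # [0, 48, 96, 144, 192, 240, 288, 336, 384, 432]
```

Both plans collect ten data points, but only the systematic one can reflect conditions across the entire shift.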

Tagging data—Tagging the data appropriately is key to solid data collection. At a minimum, every piece of data is tagged with a time stamp, as well as the employee, feature, part, and process to which it is tied. By tagging data with these identifiers, you enhance the level of analysis that can be performed on the data—and increase the ways in which you can compare, slice and dice, and roll up the data.
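A tagged record might look something like the sketch below. The field names and values are illustrative, not taken from any particular system; the point is that every reading carries the identifiers listed above.

```python
# Hypothetical sketch of a tagged measurement record; names are invented.
from dataclasses import dataclass, asdict
from datetime import datetime

@dataclass
class Measurement:
    value: float
    timestamp: datetime   # when the reading was taken
    employee: str         # who collected it
    feature: str          # which feature was measured
    part: str             # which part it belongs to
    process: str          # which process produced it

reading = Measurement(
    value=10.37,
    timestamp=datetime(2024, 3, 1, 14, 5),
    employee="op-117",
    feature="bore-diameter",
    part="PN-4482",
    process="CNC-lathe-3",
)

# With these tags, the same raw data can be grouped ("sliced and diced")
# by any identifier -- by part, by process, by operator, by time window.
print(asdict(reading)["part"])   # PN-4482
```

Because every point carries all of its tags, roll-ups can be computed along any dimension later, without re-collecting anything.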

Keep it simple—Keep the data collection interface and process as simple as possible. Your reporting is really only as good, and reliable, as your data collection. The person who is involved in data collection doesn’t want to have to worry about confusing instructions, an ambiguous interface, or difficult sequencing of the data. Those issues can add time to their data collection that they just don’t have.

For example, let’s say you’ve got a part that requires ten different tests. The engineer who established those checks never really had to measure that part—they just laid sampling instructions out alphabetically, or something equally expedient. You then send that sequence of tests down to the shop floor and the operator gets one look at it and blanches. “This is crazy,” she says. “This adds 15 minutes to my data collection time just because of the way I have to orient the parts to take the measurements. This can’t be right!”

Resequencing the checks according to how the work is performed saves a lot of time and makes it easier for the operator. It also makes the data more reliable. Cutting back on the number of times the operator needs to manipulate the instrumentation during testing enables them to move through the checks faster. And they are happier.
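The resequencing idea can be sketched in a few lines. The check names and orientations below are invented; the point is simply that grouping checks by how the part must be held cuts down re-orientations:

```python
# Hypothetical sketch: reorder checks so the operator re-orients the part
# as few times as possible. Check names and orientations are invented.
checks = [
    ("flatness",      "face-down"),
    ("bore-diameter", "face-down"),
    ("chamfer",       "face-up"),
    ("thread-depth",  "face-up"),
]

def count_reorientations(seq):
    """Count how often consecutive checks need a different orientation."""
    return sum(1 for a, b in zip(seq, seq[1:]) if a[1] != b[1])

# Alphabetical order (expedient for the engineer) vs. grouped by orientation.
alphabetical = sorted(checks, key=lambda c: c[0])
by_orientation = sorted(checks, key=lambda c: c[1])

print(count_reorientations(alphabetical))    # 3 re-orientations
print(count_reorientations(by_orientation))  # 1 re-orientation
```

The same four checks get done either way; sequencing them around the physical handling of the part is what saves the operator time.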

Quality Intelligence Takes the Pain Out of Reporting

Reporting can be challenging, and painful, for manufacturers. You want to be able to slice and dice the data any way that suits your needs. You want to avoid smoothing the data when reporting to supervisors and corporate personnel. And you want your data to be as reliable as possible—so your reports are as accurate as possible. Great reports will help your organization make decisions that will transform the operations.

InfinityQS quality intelligence systems take the pain out of reporting. Sampling and standardization go a long way toward making reporting an easy task that doesn’t take up all your time.

We invite you to visit our website to find out more about InfinityQS’ quality intelligence solutions, Enact® and ProFicient™.