Quality’s continuing conversation with Jason Chester, Director of Global Channel Programs for InfinityQS, on SPC and the smart factory. This is part two of our three-part conversation.

Quality: What would you say to the statement that the average factory currently creates one terabyte of data every day, yet 99 percent of that data is never used or analyzed by manufacturers?

Chester: What is the difference between one terabyte of data, of which 99% is unused, and one gigabyte of data, of which 99% is used? It is not the volume of data collected, or the percentage of it that is utilized, that determines value. For example, take an ambient temperature and humidity sensor, which could be configured to capture and record a stream of two data values every millisecond. Along with metadata such as sensor ID, location, and a date/time stamp, that single sensor could be creating up to hundreds of megabytes of data per day. But 99% of that data would not be valuable, because the ambient temperature and humidity in a facility vary over much slower physical time scales. Creating a statistical summary of that data every five minutes, for example, would produce significantly less data but provide much more valuable insight, especially if the temperature and humidity of the production facility have a causal impact on the manufacturing process. If the ambient temperature and humidity had no causal effect on the production process, then perhaps none of the data collected is valuable.
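To make that reduction concrete, here is a minimal sketch of the kind of summarization described above, assuming a pandas-based pipeline; the simulated values, column names, and five-minute window are illustrative assumptions, not anything specific to InfinityQS:

```python
# Minimal sketch: collapse a simulated 1 ms temperature/humidity stream into
# 5-minute statistical summaries. All values and names are hypothetical.
import numpy as np
import pandas as pd

# Simulate one hour of readings at 1 ms intervals (3.6 million rows).
idx = pd.date_range("2024-01-01 08:00", periods=3_600_000, freq="ms")
raw = pd.DataFrame(
    {
        "temp_c": 21.0 + np.random.normal(0, 0.05, len(idx)),
        "rh_pct": 45.0 + np.random.normal(0, 0.20, len(idx)),
    },
    index=idx,
)

# Five-minute summaries: 3.6 million raw rows collapse to 12 summary rows,
# while keeping the slow-moving signal that could actually affect the process.
summary = raw.resample("5min").agg(["mean", "min", "max", "std"])
print(f"raw rows: {len(raw):,}   summary rows: {len(summary):,}")
print(summary.head())
```

The same idea applies to any slow-moving ambient parameter: retain the summary statistics that carry process-relevant information, and discard or cheaply archive the raw stream.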

Just because we “can” collect the data does not mean that we “should.” Given the low cost of sensors, and the fact that most modern equipment can collect data across a myriad of parameters, it is tempting to feel that we need to utilize all of that data. The era of big data leads us to believe that data is king and that the more we collect, the better off we will be. But collecting bad or useless data is worse than not collecting it at all.

All data incur a cost to the organization, and that cost is often grossly underestimated: the physical sensor or equipment used to capture the data at the source, the transport (or transmission) of the data from the sensor to a storage solution, the computing power to perform calculations on the data, long-term storage and backups, the management and maintenance of all of those systems, and so on. The cost associated with a single sensor's data stream may seem too small to worry about. But even with the low-cost sensors, networking, storage, and compute time available today (especially in the cloud), that cost becomes significant when multiplied across the thousands, or hundreds of thousands, of sensors and associated data streams a facility may have. There is also a significant cost from a workforce perspective, where data fatigue can erode effective decision-making and thus hurt performance and productivity.
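As a rough illustration of how those per-stream costs multiply, the back-of-envelope calculation below compares raw millisecond-rate retention with five-minute summaries across a large, hypothetical sensor estate; the byte sizes, sensor count, and storage price are assumptions made purely for the arithmetic:

```python
# Hypothetical back-of-envelope: monthly storage volume and cost for raw
# millisecond-rate streams versus 5-minute summaries. Every figure here is an
# assumption for illustration, not vendor or cloud pricing.
BYTES_PER_ROW = 40                        # two values plus timestamp/metadata (assumed)
RAW_ROWS_PER_DAY = 1000 * 60 * 60 * 24    # one row per millisecond
SUMMARY_ROWS_PER_DAY = 12 * 24            # one row per 5-minute window
SENSORS = 100_000                         # assumed size of the sensor estate
PRICE_PER_GB_MONTH = 0.02                 # assumed storage price, $/GB per month

def monthly_gb(rows_per_day: int) -> float:
    """New data generated per month across the whole estate, in GB."""
    return rows_per_day * BYTES_PER_ROW * SENSORS * 30 / 1e9

for label, rows in [("raw", RAW_ROWS_PER_DAY), ("summary", SUMMARY_ROWS_PER_DAY)]:
    gb = monthly_gb(rows)
    print(f"{label:8s} {gb:>14,.2f} GB/month  ~${gb * PRICE_PER_GB_MONTH:,.2f}/month to store")
```

The point is not the specific numbers, which are invented, but the shape of the comparison: per-stream costs that look negligible in isolation become material at fleet scale.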

Therefore, an organization should not be obsessed with either the volume of data it can or does collect, or the rate of utilization. Instead, organizations should approach data as a strategic operational asset: build up an inventory of data assets, assess their relative value and applicability to the organization's objectives, and work out if, how, and where that data can be used to minimize cost, maximize value, and mitigate risk. As part of that data-intelligence transformation process, data cleansing and transformation may mean that some of the data originally captured is deliberately underutilized, pulling down this hypothetical utilization rate, which is why aiming for 100% utilization of collected data is folly.

Quality: It is said that some manufacturing sites run as many as 150 software applications. How can manufacturers make that data accessible and actionable?

Chester: Gone are the days when the use of single, monolithic applications to run entire business operations was a desirable (if even achievable) goal. Instead, manufacturing organizations are increasingly recognizing that deploying an integrated portfolio of best-of-breed applications and solutions is the only practical approach to delivering effective business outcomes. The downside is that it can lead to a proliferation of applications with many overlapping or duplicate capabilities, and to duplicate data and non-standardized data silos that hinder the organization's objectives. Many manufacturers do not even know what applications are being used, and where, especially when departments, sites, regions, or divisions have the autonomy to deploy applications at will without alignment to any corporate objective. The long-term risks of this can be severe. Such proliferation of applications can also be a significant impediment to a digital transformation strategy. In such cases, those manufacturers first need to embark on a process of discovery, audit, and rationalization before any such projects can even begin.

Just like the treatment of data discussed above, an organization's application portfolio should be carefully considered as an operational asset. Organizations should also not “put the cart before the horse” and derive an operational strategy from the existing application portfolio. Instead, they should build an application and data architecture that provides the capabilities required to support their strategic operational objectives, whether that be digital transformation, Industry 4.0, smart factories, or any other objective. Once those capabilities are identified, they can approach the market to identify the solutions and applications whose functionality best matches those required capabilities. There is no fixed right number of applications to deploy; the number of discrete applications is simply an outcome of that process. It may be 10, 50, or 100, but the number matters less than the provision of the required capabilities. Just because an incumbent ERP solution may have a quality module, or an MES solution may have an SPC module, does not mean that quality or process control objectives should be shoehorned into those modules' capabilities simply to keep the application count down. If more capable best-of-breed quality management or SPC solutions are available that provide a much stronger fit to the required capabilities, then they should be seriously considered as part of the application architecture.

For more information, visit www.infinityqs.com.