Electronic temperature compensation has been applied successfully to shop-floor gages for over 25 years and is now a mature technology, yet it remains poorly understood and commonly undervalued. It has proven to be one of the most easily cost-justified means of achieving gage correlation and eliminating the most common cause of high-resolution gage error: temperature. By maintaining gage R&R in real time as temperatures vary, so that production processes remain under control, it can save many times its investment cost within months.

Gages are used to control quality during the production or rework of metalworked components, ensuring compliance with part specifications, so it is vitally important that they can be relied upon to check dimensions accurately. Gages are tested for accuracy and repeatability, but these tests are usually performed at stable temperatures, where results may look acceptable. If thermal changes are not taken into account, gages cease to be repeatable or accurate as temperatures fluctuate: run the same Gage R&R tests while varying the temperature of the part, master, gage fixture or gage head, and a very different, unacceptable result will be obtained.

The international reference standard, ISO 1-1975, states: “The standard reference temperature for industrial length measurements is fixed at 20°C.” The correct dimension, by international convention, is therefore the one obtained when PART, SETTING MASTER AND GAGE (the “elements” of a measurement system) are all at 68°F / 20°C, unless otherwise specified. Keeping all three elements at a stable 68°F / 20°C can be impractical and expensive, but a properly specified and configured electronic temperature compensation system can sense the temperature of each element and correct the gage so that dimensions are displayed as if all three were at 68°F / 20°C.
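For a single element, the idea of “displaying as if at 20°C” can be sketched as a simple linear-expansion correction. This is a minimal illustration, not any vendor's algorithm; the function name and the coefficient value are assumptions for the example.

```python
REF_TEMP_C = 20.0  # ISO 1 reference temperature for length measurement

def length_at_reference(measured_mm, temp_c, alpha_per_c):
    """Convert a length measured at temp_c to its 20 degC equivalent,
    assuming simple linear thermal expansion."""
    return measured_mm / (1.0 + alpha_per_c * (temp_c - REF_TEMP_C))

# A nominally 75 mm steel dimension measured at 30 degC, using an
# illustrative coefficient of ~11.5e-6 per degC:
corrected = length_at_reference(75.0086, 30.0, 11.5e-6)
# corrected is very close to 75.000 mm
```

In a real system the measured temperatures of part, master and gage each feed their own correction, as described below.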

Manufacturers of metal components used in industries such as automotive, transport, aerospace, railroads and mechanical machinery operate in highly competitive markets. Driven by goals such as improving mileage, environmental awareness, and reducing wear, warranty expense, rework and scrap costs, tolerances on critical dimensions of moving parts and their enclosures continue to tighten. It is not uncommon to find tolerances expressed to three (metric) or four (Imperial) decimal places. At this point the laws of physics must be addressed, and achieving the required accuracy inevitably becomes more expensive. When such tolerances apply to dimensions in excess of 50 to 75 mm (2 to 3 inches), dimensional measurements can display considerable variation due to temperature fluctuations.
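The magnitude of the problem is easy to work out. The sketch below uses a handbook-style coefficient for steel of roughly 11.5e-6 per °C; the exact value varies by alloy, so the numbers are illustrative only.

```python
# Rough thermal error on a 75 mm (~3 in) steel dimension when the shop
# drifts 10 degC away from the 20 degC reference temperature.
alpha_steel = 11.5e-6   # per degC (illustrative handbook-style value)
length_mm = 75.0
delta_t_c = 10.0        # e.g. shop at 30 degC instead of 20 degC

error_um = length_mm * alpha_steel * delta_t_c * 1000.0  # micrometres
# roughly 8.6 um of growth, several times a 1-2 um tolerance band
```

An error of this size swamps a micron-level tolerance entirely, which is why uncompensated readings wander with the weather.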

Many CMMs have been equipped with temperature compensation capability, but it is usually restricted to a thermal range of about 15°F, since three-dimensional compensation is particularly challenging. Single-axis compensation can be effective over a much greater range, such as 45°F to 130°F, a span of 85°F (about 47°C). There are other ways to reduce thermal effects in measurements, but they can be very costly: air conditioning, air tempering and coolant temperature control are expensive and usually less successful. Waiting for thermal stabilization, perhaps in a controlled environment such as a gage room, takes time. Ignoring the problem ultimately takes its toll in other ways, such as customer rejects, warranty issues and end-user dissatisfaction.

Temperature compensation systems are a cost-effective way to solve the problem. They can help to squeeze out the much-needed last few microns, or tenths of a thousandth, of accuracy and repeatability in fluctuating environmental conditions. However, they are not necessarily simple to define and set up, and it is too easy to over-simplify the solution. It is not sufficient to tell a gage supplier that “temperature compensation is required”; that leaves too much to interpretation and has led to instances where the technology earned itself a bad reputation. A good system measures the temperatures of workpiece, master and gage and corrects for each of them that is not at reference temperature, using a customized correction coefficient. The sensors respond quickly (or as quickly as physics allows), and a compensation algorithm corrects the measurements made by the gaging system so that they display as if all temperatures were at reference temperature.
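The three-element correction described above can be sketched as follows. The structure (a comparative gage zeroed on a master, with separate empirically fitted coefficients for part, master and gage fixture) follows the paragraph, but the function, the sign of the gage-frame term, and all values are illustrative assumptions; real systems determine signs and coefficients from the gage geometry and empirical tests.

```python
REF = 20.0  # degC reference temperature (ISO 1)

def compensate(reading_mm, nominal_mm,
               t_part, t_master, t_gage,
               k_part, k_master, k_gage):
    """Correct a comparative gage reading so it displays as if part,
    master and gage fixture were all at 20 degC.

    reading_mm : deviation from the master as measured
    nominal_mm : nominal size represented by the master
    k_*        : empirically fitted correction coefficients (per degC)
    """
    part_growth   = nominal_mm * k_part   * (t_part   - REF)
    master_growth = nominal_mm * k_master * (t_master - REF)
    gage_growth   = nominal_mm * k_gage   * (t_gage   - REF)
    # The part's own expansion inflates the reading; the master's
    # expansion was absorbed when the gage was zeroed, so it is handed
    # back.  The sign of the gage-frame term depends on geometry and is
    # shown here for one typical configuration.
    return reading_mm - (part_growth - master_growth) - gage_growth

# Example: 75 mm bore, part warm at 28 degC, master and gage at 20 degC.
# A 7.5 um raw deviation corrects to ~0.6 um of true deviation.
corrected = compensate(0.0075, 75.0, 28.0, 20.0, 20.0,
                       11.5e-6, 11.5e-6, 11.5e-6)
```

The point of the sketch is that each element carries its own temperature and its own coefficient; lumping them into one term is exactly the over-simplification the paragraph warns against.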

When considering the use of temperature compensation, it is worth spending some time defining the job to be done and the specific expected outcomes. With simple part shapes (such as a cylinder liner or short shaft) the solution may be relatively straightforward, and a single temperature sensor may suffice to pick up all relevant workpiece temperatures so that repeatable accuracy (i.e. displaying the true dimension at reference temperature) can be achieved over a specified temperature range. When part geometries are more complex, it can become necessary to identify zones on the part that are prone to exhibiting different temperatures at the time of gaging. Zones of differing mass, possibly separated by some distance, may have differing rates of thermal conductivity and so react to temperature exposure in different ways. For example, in a parts washer, during a machining operation, or while sitting idle for some time after such an operation, different zones of the part may respond at different rates to their recent thermal exposure. It then becomes expedient to perform more empirical testing to ascertain the best sensor locations and the best correction coefficients. This may be more expensive, but the benefits have proven to easily outweigh the cost.
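One simple way to handle multiple zones, sketched below, is to form an effective part temperature as a weighted average of the zone sensors, with the weights found empirically. This is a hypothetical scheme for illustration; the zone temperatures and weights are made up, and real systems may weight zones per-feature rather than globally.

```python
def effective_temperature(zone_temps_c, zone_weights):
    """Weighted-average part temperature from several zone sensors.
    Weights are empirically determined and need not sum to 1."""
    total_w = sum(zone_weights)
    return sum(t * w for t, w in zip(zone_temps_c, zone_weights)) / total_w

# Three zones fresh from a washer: a thin flange cools fastest, a heavy
# boss stays warm.  Temperatures and weights are illustrative.
t_eff = effective_temperature([31.0, 27.5, 24.0], [0.5, 0.3, 0.2])
```

The weighting is what the empirical testing mentioned above establishes: warming and cooling the part while logging each sensor against the measured error reveals how much each zone actually influences the dimension being gaged.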

Note that the term “correction coefficients” is used rather than “coefficients of expansion.” Typical handbook-derived coefficients are usually only good to within +/-15 percent or so, and other factors, such as geometry, mixed materials and inserts, can affect the rate of thermal change. Consequently it is best to perform empirical tests to determine the best coefficient fit.
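Finding that best-fit coefficient empirically amounts to measuring the same feature at several temperatures and fitting the error against the departure from 20°C. The least-squares sketch below uses made-up data points; the function name is an assumption for the example.

```python
def fit_coefficient(temps_c, errors_mm, nominal_mm, ref_c=20.0):
    """Fit the slope of measurement error vs (T - ref) by ordinary
    least squares, normalized by nominal size -> coefficient per degC."""
    xs = [t - ref_c for t in temps_c]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(errors_mm) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, errors_mm))
    den = sum((x - mean_x) ** 2 for x in xs)
    return (num / den) / nominal_mm

# Synthetic data: a 75 mm feature measured at four temperatures, with
# errors that happen to follow ~12e-6 per degC.
temps  = [20.0, 25.0, 30.0, 35.0]
errors = [0.0000, 0.0045, 0.0090, 0.0135]  # mm
k = fit_coefficient(temps, errors, 75.0)   # ~12e-6 per degC
```

A coefficient fitted this way automatically absorbs the geometry, material-mix and insert effects that make the handbook value unreliable.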

The most recent generations of compensation systems are user-friendly and transparent. While password protection may be advisable so that programs cannot be altered without authorization, authorized users should have access to the key variables, such as the correction coefficients and dimensions used in the correction algorithm. Once the system is installed and operational, good documentation must be maintained. A manual should explain the purpose of the system and the need to maintain the sensors (i.e. keeping them clean and correctly positioned to make good contact). Too often, new operators or managers unfamiliar with the system allow it to be ignored, neglected, unplugged or otherwise taken out of service in the years that follow implementation. An easy-reference manual can prevent this.

While recognizing that budget managers are constantly striving to minimize capital expenditures, it is worth noting that users have studied the payback on temperature compensation investments. Payback is usually measured in weeks or months, once savings are counted from preserved Gage R&R as temperatures change, reduced scrap and rework, improved gage correlation, fewer rejections, and avoidance of the much larger investment in temperature controls such as environmental tempering or coolant temperature control.

For example, one major car maker installed a new piston line and applied temperature compensation to its gages at a cost of $40,000; the rejected alternative was a temperature-controlled accumulating facility around the entire gaging operation, at a cost of $1 million. The line was tested and subsequently went into production. The tests, and subsequent monitoring over a period of seven years, showed that the compensation system consistently corrected over 97 percent of thermal errors. Pistons could be inspected to better than +/-1 micron repeatability regardless of ambient temperature changes and process variations.

The examples that follow show the benefits obtained by other users.

Temperature compensation of engine cylinder bores at a major US auto engine plant:

• Engine block production rate increased: by over 50 percent, due to increased accuracy and speed of gaging.

• Cylinder bore dimensional accuracy increased: 1 micron repeatability and accuracy held.

• Temperature sensing at multiple locations: three sensors in the gage head to account for variations within the bore.

Example of test data from the cylinder bore gage: curves compare uncompensated and compensated dimensions with the room-temperature (“reference”) size while the cylinder block is warmed.

Temperature compensation of an in-process grinding gage:

• Cp and Cpk improved: by over 200 percent.

• Production rate improved: through reduction of “spark-out” and dressing times.

• Dimensional accuracy improved: held +/-2 microns while grinding.

Finally, data from a study performed after installation on an aluminum transmission housing and gage at another major auto maker. Note that 8 different bores were measured repeatedly while the part temperature varied by up to nearly 25°F (14°C).

As these cases demonstrate, properly implemented temperature compensation systems can do an excellent job of minimizing the loss of accuracy and repeatability as temperatures change. It is a cost-effective technology that helps meet the increasing challenges posed by ever-tightening tolerance requirements.