
Chasing Microns: Getting the Most from Dimensional Gages

If you’re trying to measure to microns, every micron counts.

November 1, 2013


Most gages on the shop floor deliver their specified accuracy only under the conditions for which they were designed. But in the production environment, where tight tolerances are a way of life, it’s critical to think through gaging requirements before putting instruments out there and possibly having them fall short of expectations.

In short, if you’re trying to measure to microns, every micron counts. Thus, it is very important to ensure that proper thought is given to the gaging process. Ask yourself: “Where may I lose a micron here or there that I should be accounting for, and how do I prevent that from happening?”

When you first start on the path to improved gaging performance, it’s very easy to pick up microns, but as your gaging performance improves, those microns get harder to find.

Selecting a Gage

The first step, of course, is to select a gaging system that is right for the application. Many factors enter into this decision, the most important being the inherent accuracy of the gage itself. A good approach is to think of this in terms of levels of increased performance. Below we have classified the levels of performance of shop floor handheld gaging used by operators to qualify parts.

Level 1: Basic Shop Tools.

Versatile, low precision and low cost

At the low-precision, high-flexibility end of the spectrum are handheld gages such as calipers and micrometers. Both are extremely versatile and useful tools for making a wide range of distance measurements (both ODs and IDs). Of the two, the micrometer is more popular and offers a small step up in accuracy and performance—but with a shorter measuring range. However, both require time and operator skill for positioning the tool and interpreting the measurement result. So while they can measure a wide range of different parts, both are at the mercy of the operator, and measuring system accuracy leaves much to be desired.

Level 2: Comparative Gages.

Increased performance and throughput

Once tolerances reach the 12 micron (µm) level, you begin to enter the region where a comparative gage such as a bench stand, snap gage, or ID/OD gage is required.

Insert a snap gage onto a workpiece and you’ll quickly understand how these extremely effective, fairly simple OD gages got their name. Once you’ve overcome the “locking” spring tension, the part suddenly slips in against the backstop, making contact with a good, healthy “snap.”

Snap gages can be handheld to measure workpiece ODs while still on the machine, or can be mounted on stands for use with small parts. The heart of the tool is a simple C-frame casting, and measurements rely upon a direct in-line 1:1 transfer of motion. These factors make snap gages simple, reliable and fairly inexpensive.

With a standard dial indicator installed, the measuring range of an adjustable snap gage is typically 0.5 mm. When combined with a digital indicator, a high precision snap gage can resolve and accurately measure to 1 µm. This provides the performance needed to reliably measure those ±12 µm tolerances.
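The pairing of indicator resolution and part tolerance is a simple ratio check. As a minimal sketch (the 10:1 ratio used here is a common gagemaker’s rule of thumb, an assumption rather than a figure from the gage specifications):

```python
def resolution_ok(resolution_um: float, total_tolerance_um: float,
                  ratio: float = 10.0) -> bool:
    """Rule-of-thumb check: the gage should resolve at least `ratio`
    times finer than the total tolerance band it is asked to police."""
    return total_tolerance_um / resolution_um >= ratio

# A +/-12 um tolerance is a 24 um total band; a 1 um digital indicator
# gives a 24:1 ratio, comfortably past the 10:1 rule of thumb.
print(resolution_ok(1.0, 24.0))   # True
# A 10 um dial indicator on the same job fails the check.
print(resolution_ok(10.0, 24.0))  # False
```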

Level 3: Dedicated Comparative Gages.

Highest performance and speed

The comparative gages mentioned above offer some adjustability, allowing them to be set for comparative measurements on a number of particular sizes. For example, a mechanical snap gage may have an inch of range adjustment, but its actual measuring range around the master setting would be ±0.5 mm. This combination of adjustability to size and short measuring range provides a good balance of versatility and performance.

But as is the case with most dimensional measurement, the shorter the measuring range, the higher the performance. This is the case with “fixed plug” gaging. This type of gage is made specifically to measure a certain size. Examples include air gages and mechanical plug gages. (Air gages, by the way, were the first high precision gages brought to the shop floor, back in the 1940s.)

These gages are made to simulate the dimension being measured to within ±0.5 mm or so. The sensing unit is set to a master and the operator compares the measured part to the standard. Very high resolutions can be obtained with this type of gaging, down to 0.1 µm, with very good accuracy. In addition to excellent performance, this type of gaging is also the easiest and fastest method for an operator to use. Because the gage fits so closely to the part, the operator simply has to put the gage on the part and the gage will set itself. So there is virtually no operator influence, and thus, very high throughput.

In terms of utility, air gaging or fixed plug gaging is quite capable of measuring a shaft tolerance to ±5 µm, on the machine and in a machine shop environment.

Level 4: Linear Length Gages.

Highest performance, lowest throughput

As noted above, the focus of this piece is shop floor dimensional gaging. But there is one more level of gaging that offers the ultimate in precision and good versatility, though it may not be the fastest method. If the goal is the best possible performance, then a shop-oriented linear length machine may be the solution.

A linear length gage can be thought of as a bench-mounted digital comparator, but designed for rigidity, stability and accuracy. These bench gages are capable of measuring parts at critical levels directly off masters, as long as the environment can support the measurement. Like the comparator, the linear gage offers versatility. For certain parts, such as reference pins, valve stems and the like, this type of gage offers a good balance of performance and versatility.

Considering Uncertainty

But there are other factors that influence gage performance—especially with higher performance gages—such as gage linearity, long-term stability, and even bias from the gage design. Combined, these measuring system-based factors are called the “measuring uncertainty.” When chasing microns, you need to understand how the uncertainty of your gaging system is affecting its performance.

Measurement uncertainties are always determined and disclosed by facilities that calibrate measurement standards such as gage blocks, master rings and discs. But it is also not uncommon for measuring uncertainty to be taken into consideration on instruments used for the inspection of products. Only by determining the uncertainty of an inspection system for production parts can you determine what part of the tolerance band is “left over” for actual production.

Thus, every measuring instrument or system has an uncertainty budget that you need to know in order to determine if it meets your needs. Drawing tolerances—which are often extremely close—are narrowed even more if the measuring uncertainty is too high. The upshot is that imprecise measuring devices increase production effort and, therefore, cost.
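The narrowing effect can be sketched numerically. This is a simplified guard-banding model, in which the acceptance zone is assumed to shrink by the full uncertainty U at each tolerance limit; the function and figures are illustrative, not quoted from any standard:

```python
def production_tolerance_um(total_tolerance_um: float,
                            uncertainty_um: float) -> float:
    """Tolerance left for the process after guarding each limit by U."""
    leftover = total_tolerance_um - 2.0 * uncertainty_um
    if leftover <= 0:
        raise ValueError("Measurement uncertainty consumes the whole tolerance")
    return leftover

# A +/-12 um drawing tolerance (24 um band) measured with U = 2 um
# leaves only 20 um for the production process itself.
print(production_tolerance_um(24.0, 2.0))  # 20.0
```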

The present internationally approved standard for the determination of measuring uncertainty by calibration labs is known as the GUM method (the “Guide to the Expression of Uncertainty in Measurement”; for more information on the NIST version, go to www.nist.gov/pml/pubs/tn1297/index.cfm). These uncertainty factors have also long been known by the acronym SWIPE, as follows:

  • Reference Standard: Whether an internal reference or an external standard such as a gage block, master ring/disc, etc., standards can influence the measurement by being out of calibration, dirty, nicked, or by having some form of surface finish issue.
  • The Workpiece being measured: Fixturing method, alignment, distortion through measuring force and dead weight, size and type of the gaging and datum surface(s), roughness of the gaging surface, undetected form errors of the gaging surface, form errors of the datum surface(s).
  • The Measuring Instrument: Gage design, robustness for the measurement, alignment issues, sensor linearity, deviations of the measuring axes, irregular movement during measurement, errors in the electronic indicating and control system (e.g., rounding errors), software errors.
  • People: Misinterpretation of the drawing specifications; excessive clamping force when fixturing the part under measurement; selection of the wrong probes; selection of the wrong parameters (e.g., wrong profile filter, excessive measuring speed); programming and computation errors; body heat radiated onto the part or gage; physical shocks from handling. For this reason, it is important that operators have comprehensive training in metrology, and in the correct adjustment and operation of their measuring instruments.
  • Environment: Temperature (fluctuations), radiant heat (e.g., from the operator or the lighting), air refraction/gradients (influencing optical based systems, including lasers), humidity, vibrations and shocks.
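When the SWIPE contributions are independent, they roll up into a combined uncertainty by root-sum-square, which is the GUM approach for uncorrelated terms. A minimal sketch, with component values invented purely for illustration:

```python
import math

def combined_uncertainty_um(components_um: dict, k: float = 2.0) -> float:
    """Root-sum-square of uncorrelated standard uncertainties, expanded
    by coverage factor k (k = 2 approximates 95% coverage)."""
    u_c = math.sqrt(sum(u * u for u in components_um.values()))
    return k * u_c

swipe = {  # hypothetical standard uncertainties, in microns
    "standard":    0.3,
    "workpiece":   0.4,
    "instrument":  0.5,
    "people":      0.2,
    "environment": 0.6,
}
# Expanded uncertainty of roughly 1.9 um for this made-up budget.
print(round(combined_uncertainty_um(swipe), 2))
```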

However you determine uncertainty, whether by doing all the math according to the standard or by a rougher estimate, think of it in terms of a budget. Once set, this becomes a fixed cost, like your rent or mortgage payment: something you can’t really change unless you take a drastic step, like moving, or in this case, changing the gaging system.

But the cost of acquiring a gaging process with a lower uncertainty may be well outside of your dollar budget for gaging. With such tight tolerances today, it is not uncommon that even the best gaging process will use up a significant part of the overall tolerance. On the other hand, lowering the gaging cost with a process having a high uncertainty to tolerance ratio may be too risky and cost more by letting bad parts pass through.

A Case in Point

How does all this work in practice? Here’s a case in point: a manufacturer recently came to us with a requirement to inspect a wide variety of hole sizes on a line of valve bodies.

Some of the relevant parameters for this gaging situation included:

  • Throughput. With literally hundreds of thousands of parts to measure, inspection had to be fast and foolproof.
  • Output. The manufacturer required the capability of automatically collecting data for SPC.
  • Ease of Use. The parts being gaged were large, so the gage had to come to the parts, not vice versa.
  • Accuracy. Most hole tolerances were ±25 µm, but some were as tight as ±12 µm.

Hand instruments, such as calipers and micrometers, just didn’t have the performance required for this particular application: they simply used up too many microns of error. By the same token, adjustable bore gaging wouldn’t do the job either. While it had the flexibility to cover a large range of hole sizes, it was rejected because of the time-consuming operation of sweeping through the part, and the high skill required of the operator.

Moreover, the manufacturer specified a GR&R (gage repeatability and reproducibility) requirement of 20% or better on holes with tolerances of ±25 µm. This meant that the gaging system had to perform to 4 µm or better. This requirement meant using standard gage plugs, and standard digital indicators with resolution of 1 µm: the GR&R achieved with this setup was less than 16%.
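One common convention, assumed here since the article does not spell out its formula, expresses GR&R as the measurement system’s six-sigma spread divided by the total tolerance band:

```python
def grr_percent(sigma_grr_um: float, total_tolerance_um: float) -> float:
    """GR&R as a percentage of tolerance, using a 6-sigma spread."""
    return 100.0 * (6.0 * sigma_grr_um) / total_tolerance_um

# On a +/-25 um hole (50 um band), a measurement-system sigma of about
# 1.3 um keeps GR&R under the 16% figure cited above.
print(round(grr_percent(1.3, 50.0), 1))  # 15.6
```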

On holes with tolerances of ±15 µm and ±10 µm, however, the manufacturer required a GR&R of 10%, which translated to gaging system performance of 1 µm. Given the other parameters of the application, mechanical plug gages remained the only practical approach, but we had to find a way to stop the flow of lost microns to “squeeze” more accuracy out of the situation.

Plug gages are typically engineered for 50 µm of material clearance in the holes they are designed to measure, to accommodate undersize holes and to ease insertion. But the greater the clearance, the greater the centralizing error: the gage measures a chord of the circle, not its true diameter. By reducing the designed clearance, centralizing error can be minimized, saving fractions of a micron, albeit with some tradeoff against ease of insertion.
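The geometry is easy to check: a plug offset from the hole’s center by e measures a chord of length 2·√(r² − e²) rather than the full diameter 2r. A small sketch, assuming the worst-case offset is half the diametral clearance:

```python
import math

def chordal_error_um(hole_diameter_mm: float, clearance_um: float) -> float:
    """Worst-case centralizing error: true diameter minus the chord
    measured when the plug sits offset by half the clearance."""
    r_um = hole_diameter_mm * 1000.0 / 2.0
    e_um = clearance_um / 2.0
    return 2.0 * r_um - 2.0 * math.sqrt(r_um**2 - e_um**2)

# 10 mm hole: cutting the clearance from 50 um to 15 um shrinks the
# worst-case chordal error by roughly a factor of ten --
# tenths of a micron recovered, consistent with the text above.
print(round(chordal_error_um(10.0, 50.0), 3))  # ~0.125
print(round(chordal_error_um(10.0, 15.0), 3))  # ~0.011
```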

We engineered a special set of plug gages, with minimum material clearance of 15 µm. The standard digital indicators were also replaced with high-resolution units, capable of 0.5 µm resolution. This combination satisfied the requirements, generating a GR&R of less than 8.5%.

Thus, the first step in selecting a gaging system is to choose one with the appropriate level of performance for the application. Second, planning decisions should take into account the total tolerance band and the uncertainty budget that you can afford to live with. Finally, when you are trying to squeeze more accuracy out of a gaging situation, look for opportunities to reduce or eliminate one or more of these SWIPE factors. In this way, all those microns will be used in improved gaging accuracy, not lost on the floor somewhere.

 

By George Schuetz
