More companies than ever are taking measurement uncertainty into account in gage calibration, and for that matter, even in product measurements. The reason is simple: a measurement without an uncertainty statement is basically a “reading” with nothing to indicate how good or bad it might be.
The gage making industry has been grappling with this subject for some time with respect to the calibration of gages. Users of calibration services are also starting to pay attention to the uncertainty statements on the reports they receive. Too often, instead of enlightening the reader, those reports add to the confusion.
Simply speaking, when we measure something we can never get the real or “true” value for that feature. It doesn’t matter who you are; there will always be a difference between your reading and the true value. In days gone by we’d use the term “close,” as in, “How close do you think your reading is?” The answer came from experience, so it could vary significantly between people measuring the same item with the same equipment.
Uncertainty values are determined by taking every element that affects a given measurement and combining them at a common level using simple mathematics even I can understand. The key here is that you have to know your metrology to get those values in the first place.
When the number crunching is done, you have an uncertainty value for a particular measurement made with specific equipment; let’s say it is 0.0001 inch or 0.0025 millimeter. This means the true value lies somewhere within your “reading” plus or minus the uncertainty. It doesn’t mean your reading is incorrect by that amount, only that it could be.
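As a rough sketch of that number crunching, independent uncertainty contributors are commonly combined by root-sum-of-squares and then multiplied by a coverage factor to get the expanded uncertainty quoted on a report. The component values below are made up purely for illustration; they do not come from any standard or real calibration.

```python
import math

def combined_uncertainty(components):
    """Combine independent standard-uncertainty components
    by root-sum-of-squares (the usual approach)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical contributors for one gage measurement, in inches:
# instrument resolution, temperature effect, repeatability, master error.
components = [0.00002, 0.00003, 0.00004, 0.00002]

u_c = combined_uncertainty(components)  # combined standard uncertainty
U = 2 * u_c                             # expanded uncertainty, coverage factor k = 2

reading = 0.25000                       # hypothetical reading, inches
print(f"True value expected between "
      f"{reading - U:.5f} and {reading + U:.5f} inch")
```

The point of the interval printed at the end is exactly the interpretation above: the reading is not wrong by that amount, but it could be anywhere in that band.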
You’re probably asking yourself where there could be a problem with all of this, provided everyone follows the rules. People who calibrate gages ran into it first, as can be expected, since they are attempting measurements to very fine limits. The problem arises when the measurement uncertainty represents an uncomfortable percentage of the tolerance on the gage being measured.
It is not uncommon for the uncertainty to equal the tolerance on the gage being measured, which makes it impossible to determine whether the gage is within specification.
This situation pops up a lot where thread gages are involved because of the specifications the gages are made to: specifications created before the limitations of measurement, as we now understand them, were known at the industrial level.
The problem has now mutated into a “How can we fix this?” situation.
One way to bring the planets into alignment on this issue is to borrow some of the threaded product tolerance and give it to the gage makers. As a gage maker, I think this is a great idea; of course, if I were a fastener manufacturer, I might not be quite as enthusiastic, if not outright hostile.
This would allow us to apply uncertainty more properly: the reading plus the uncertainty would have to be within the gage tolerance, as the European standards call for. As things stand, if a reading of size is right on the top limit, people accept the gage even though, due to uncertainty, it could be oversize by that amount.
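The acceptance rule described here can be sketched as a simple guard-banded check: pass the gage only if the entire reading-plus-or-minus-uncertainty interval fits inside the tolerance. The limits and uncertainty values below are hypothetical, chosen only to show how a reading sitting on the top limit fails under this rule.

```python
def accept_gage(reading, uncertainty, low_limit, high_limit):
    """Guard-banded acceptance: pass only if the whole
    reading +/- uncertainty interval lies within tolerance."""
    return (reading - uncertainty) >= low_limit and \
           (reading + uncertainty) <= high_limit

# Hypothetical gage with limits 0.2500-0.2502 inch.
# A reading at the top limit with 0.0001 inch uncertainty is rejected,
# because the true value could be oversize.
print(accept_gage(0.2502, 0.0001, 0.2500, 0.2502))   # rejected
# A reading with enough margin for its uncertainty passes.
print(accept_gage(0.2501, 0.00005, 0.2500, 0.2502))  # accepted
```

Note how this rule effectively shrinks the usable tolerance by the uncertainty on each side, which is exactly why gage makers would like some of the product tolerance transferred to them.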
A brief look at the situation in this column won’t solve the problem, but we must remember that there is a problem and it won’t be going away any time soon. And while I used thread gages in my example, the plain gage people aren’t getting off any easier: the tolerances many plain plug and ring gages are made to present the same uncertainty problems, and worse.
While the battles will rage on for a while, you should be aware of the uncertainties noted on calibration reports and other measurements so you can appraise your situation and your exposure to disputes. Discuss the situation with your gage supplier or calibration service provider so you know where you stand.
The booklet “Searching for Zero” published by the American Measuring Tool Manufacturers Association has more information on this subject. Visit www.amtma.com or e-mail firstname.lastname@example.org for more information.