Whose Acceptance Limits?
Part 1 of 2: Learn more about these boundaries of precision.
More companies than ever are downloading pass/fail decisions to their calibration laboratories, and the result is that perfectly acceptable gages and instruments get ‘failed’ by them. How could such a thing happen?
‘Acceptance limits’ are usually the culprit.
To avoid any confusion, you have to realize that calibration laboratories that do this are responding to customer requests; the ISO 17025 standard does not require calibration laboratories to make such decisions, although it does have some requirements for those that do. Some other standards may call for such pass/fail decisions to be made by calibration facilities, but this is just downloading decision-making.
Making such decisions is not rocket science when you have the data, but if the data you’re working with is inaccurate or irrelevant, the ultimate decision will be too. Assuming the calibration data is reliable, the only numbers that can cause problems are the ‘acceptance limits,’ and this is where the trouble begins. So let’s take a look at what they are, what they mean and why they can be such a problem.
Acceptance limits are the boundaries of precision that a gage or measuring device must stay within to be suitable for your work. The most forgotten words in that simplified definition are “suitable for your work” because, as you will read later, that should be the key consideration.
Taking a simple cylindrical pin as an example, let’s say it is supposed to be within +/-.002”. That is your working tolerance, or limits, for the pin. If the device you are using to measure it has a level of precision of only .001”, you can see the odds are stacked against you getting reliable results. In a perfect world, you could follow the old 10% rule, which says the device doing the measuring should not be in error by more than 10% of the tolerance being confirmed. In this example, that means the device has to be within .0002”.
This gives you wiggle room and does not eat up too much of your manufacturing tolerance. It also means that while a new micrometer’s specification calls for a level of precision on the order of .0001” or less, recalibrating a used micrometer to ensure it is still within that ‘new’ specification is a waste. All other factors being equal, you can keep using the instrument for this example until its error exceeds the .0002” required for your work, or double what the ‘new’ specification stipulates.
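The 10% rule arithmetic above can be sketched in a few lines of code. This is only an illustration of the rule as described here; the function name and the numbers are mine, not any lab’s software:

```python
def max_allowed_device_error(part_tolerance: float, ratio: float = 0.10) -> float:
    """Old 10% rule: the measuring device's permissible error is a
    fraction (here 10%) of the tolerance it is used to confirm."""
    return part_tolerance * ratio

# Cylindrical pin toleranced at +/-.002" (per side)
pin_tolerance = 0.002
limit = max_allowed_device_error(pin_tolerance)
print(f'device must be within +/-{limit:.4f}"')  # +/-0.0002"

# A used micrometer with .00015" of error still qualifies for this job,
# even though a 'new' micrometer spec may be .0001" or better.
worn_mic_error = 0.00015
print("suitable for this work:", worn_mic_error <= limit)  # True
```

The point the sketch makes is that the acceptance limit comes from the part tolerance being checked, not from the instrument’s as-new specification.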
The same general rules apply to fixed limit gages such as thread or plain plug and ring gages.
But this is where a lot of trouble starts. The tolerances shown for ‘new’ fixed limit gages are what the gage manufacturer works to, but too many people adopt these tolerances as acceptance limits for used gages, so when a gage wears outside of them it gets scrapped. A gage can be right on the bottom limit for size and be considered good because it is within the gage maker’s specification or published standards, but it won’t be long before it wears outside those limits.
When measuring instruments are involved, the actual specification they are made to is not always known, because some manufacturers don’t state what their ‘new’ specification is. In other cases, the device is a private-labeled item and you don’t know what its specification is, let alone what country it came from, so applying someone’s ‘national’ standards won’t help much either.
Even when you know what the manufacturer’s specification is, you have to remember that it applies to the instrument as it came from the manufacturer. If you want it calibrated to the manufacturer’s specification, realize that an instrument has many elements that would have to be calibrated to confirm it meets spec, which means very high calibration costs. What you really need to know is how precise it is at doing what you want to do with it, or more simply put: How precise is it from a functional point of view? The only way to answer that question is to have a current calibration report on the instrument and to know the tolerances it will be used to verify. Your calibration lab may have some neat software for producing acceptance decisions on reports, but unless it is fed the correct tolerance information, those decisions will be of little value.
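To see why the tolerance information fed to the lab matters so much, here is a minimal sketch of a functional pass/fail decision. Everything in it is an illustrative assumption (the function, the limits, the calibration result), not actual lab software:

```python
def acceptance_decision(observed_error: float, acceptance_limit: float) -> str:
    """Pass/fail an instrument against the limit derived from the work it
    must do, not from the gage maker's 'new' specification."""
    return "pass" if abs(observed_error) <= acceptance_limit else "fail"

# Illustrative calibration result: the micrometer is off by .00015".
observed = 0.00015

# Judged against the .0002" functional limit (from the +/-.002" pin job):
print(acceptance_decision(observed, acceptance_limit=0.0002))  # pass

# Judged against a 'new'-instrument spec of .0001", the same data fails:
print(acceptance_decision(observed, acceptance_limit=0.0001))  # fail
```

Same calibration data, opposite decisions; the only thing that changed was whose acceptance limit was used.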
In my next column I’ll take this a step further and provide some hints to help you sort through it all.