Acceptance criteria, like purchasing criteria, begin with workpiece tolerance.

One of the major problems facing dimensional gage and measuring instrument operators is how to determine acceptance criteria for the instruments. When they are new, it's the specification to which they were made. But when it comes to how far you let them wear before you replace them, the rules are quite a bit different. Or are they?

The easy way out for many people is incorrect and expensive. Whether it's thread gages or gage blocks, they simply list the "new" tolerance as their acceptance criteria for items already in use. In both cases the items may have been made at the bottom limit of tolerance, so after very little use they are out of tolerance by this method.

"New" tolerances are for new products, not used ones. If you apply new product tolerances for acceptance criteria of used ones, you'll be throwing away a lot of perfectly good equipment and wasting money in the process.

I recall a conversation with a customer regarding calibration of his gage block set. He insisted that they be calibrated in strict accordance with the standard and the new tolerance was his acceptance criteria. All my brilliant explanations why this was foolish fell on deaf ears.

In frustration I decided to talk about something he would understand. I explained that the cost of doing so would be many times higher than the customary calibration cost, and that caught his attention. Money beats technical details every time. So I offered to save him some money.

I asked him if the blocks had been used at all. He said they had been wrung together for various applications but, of course, had seen little use. I told him that he was wasting his time getting them calibrated because they wouldn't pass the surface finish requirements of the specification he was using for acceptance criteria. Now he started to listen.

We run into the same problems with fixed limit gages all the time. Fortunately the ANSI committee that produces the standards for thread gages is looking into providing wear criteria in the future. But the problem prevails for plain fixed limit gages.

We get similar calibration instructions from customers when it comes to measuring instruments. Once I decided to work out a price for calibrating a 0 to 1 inch micrometer in accordance with all the requirements of the standard for a regular mechanical model. It came out around 10 times the cost of a functional calibration. I had the customer's attention when I mentioned that number.

In the case of measuring instruments, you have to follow the old rules respecting calibration: What's it supposed to do? Is it doing it? How accurately is it doing it? A functional check in this case would mean measuring some masters you know the size of and little more. It's nice to know the anvils are parallel within 40 microinches or a micron, or that the measuring force is so much, and all those other details that enable the micrometer to be such a reliable instrument. But in the end, you want to know how well it is doing what you want it to do. That answer, compared to the workpiece tolerances it will be verifying, is your starting point.

If you find the micrometer error is too close to the workpiece tolerances, you shouldn't be using a micrometer for that measurement. As a rough rule of thumb, the instrument error should be less than 25% of your tolerance.
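The functional check and the 25% rule of thumb above can be sketched in a few lines. This is a minimal illustration, not a calibration procedure: the master sizes, the micrometer readings, and the 25% ratio used here are illustrative assumptions (your own quality system may require a tighter ratio, such as 10%).

```python
# Hedged sketch: judge a micrometer's functional-check results against
# the workpiece tolerance it will verify. All values are illustrative.

def max_instrument_error(masters, readings):
    """Largest absolute deviation of readings from known master sizes."""
    return max(abs(r - m) for m, r in zip(masters, readings))

def acceptable(error, workpiece_tolerance, ratio=0.25):
    """Rough rule of thumb: instrument error should be less than
    25% of the workpiece tolerance being verified."""
    return error < ratio * workpiece_tolerance

# Masters of known size (inches) and the micrometer readings taken on them
masters  = [0.2500, 0.5000, 0.7500, 1.0000]
readings = [0.2501, 0.5001, 0.7499, 1.0002]

err = max_instrument_error(masters, readings)   # about 0.0002 in
print(acceptable(err, workpiece_tolerance=0.001))    # fine for a 0.001 in tolerance
print(acceptable(err, workpiece_tolerance=0.0005))   # too coarse for 0.0005 in
```

The point of the sketch is the comparison, not the arithmetic: the same 0.0002 in error passes for one job and fails for another, which is why acceptance criteria start with the workpiece tolerance rather than with the instrument's new-product specification.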

Another situation arises with fixed limit gages when customers order them to the tightest of tolerances because they want the best they can get. When it comes to calibration, the lab looks at the handle marking showing the tolerance, and that becomes the basis for the accept/reject criteria. A perfectly good gage will get the heave-ho even though it did not need to be that accurate in the first place and probably could wear down by several times the gagemaker's tolerance and still be satisfactory for the application.

Acceptance criteria, like purchasing criteria, begin with the workpiece tolerance.

Don't you wish you could set up a simple rule like that for your domestic life, instead of trying to work with criteria like, "It's right when I say I like it"?