Regular calibration intervals help ensure micrometer accuracy.

The standard used to calibrate a measuring gage must be at least four times more accurate than the gage being calibrated. A gage block is typically accurate to ±0.000002 inch and a micrometer to ±0.0001 inch, a 50:1 ratio that comfortably exceeds the 4:1 requirement. Photo: Mitutoyo America Corp.

The inventor of the modern micrometer, Jean Laurent Palmer, introduced the basic principle of magnification based on screw threads. His design used a single screw thread with a 1-millimeter pitch: one rotation, or 1 millimeter of travel, was divided into 20 equal parts, giving a smallest resolution of 0.05 millimeter. His design was so advanced and complete that nothing has been added to or subtracted from his original concept in terms of the principles of operation.
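The resolution of Palmer's design follows directly from the pitch and the number of thimble graduations. A minimal sketch of the arithmetic in Python (the variable names are illustrative):

```python
# Resolution of Palmer's original micrometer: a screw thread with a
# 1 mm pitch whose thimble is divided into 20 equal graduations.
pitch_mm = 1.0        # spindle travel per full rotation of the thimble
divisions = 20        # equal graduations on the thimble
resolution_mm = pitch_mm / divisions
print(resolution_mm)  # 0.05 mm, the smallest reading of Palmer's design
```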

To ensure accuracy, micrometers must be calibrated at regular intervals.

The 4 to 1 rule

• The standard used in calibrating measuring gages must be at least four times more accurate than the gage being calibrated. This common rule of thumb originated some 40 years ago from MIL-STD-45662A, which was first published in 1960. A gage block is used to calibrate a micrometer: the micrometer is the "unknown" and the gage block is the "known." The gage block must have a certificate of traceability to the National Institute of Standards and Technology (NIST).

• Gage blocks are always the best choice: no other standard is more accurate, and they sit at the top of the hierarchy of precision.

• There is a dedicated standard for large micrometers called a "standard bar." The bar is often supplied with large-size micrometers, and its ends are either spherical or flat.

• It is suggested that a bar supplied without a certificate be compared against a traceable gage block before use. This can be done on top of a granite surface plate.
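The 4:1 rule above is a simple ratio check. A minimal sketch in Python, using the accuracy figures quoted earlier (the function name is illustrative, not from any standard):

```python
def meets_accuracy_ratio(standard_tol, gage_tol, ratio=4.0):
    """Return True if the standard is at least `ratio` times more
    accurate (i.e. has a tighter tolerance) than the gage it calibrates."""
    return gage_tol / standard_tol >= ratio

# Figures quoted in the article: gage block +/-0.000002 in, micrometer +/-0.0001 in.
print(meets_accuracy_ratio(0.000002, 0.0001))  # True: a 50:1 ratio
```

A standard only twice as accurate as the gage, say ±0.00005 inch against ±0.0001 inch, would fail this check.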

Calibration guidelines for micrometers using a five-point calibration method

1. Inspect the frame for signs of damage, such as indications that the micrometer was dropped on the floor.

2. Observe both the spindle and anvil faces. They must be flat, clean and free of pinholes. Frosted surfaces may indicate wear.

3. If the micrometer needs repairs, they must be done first; calibration may proceed only when the gage is in good working order. Run the micrometer from 0 to 25 millimeters (0 to 1 inch), or over its entire range, and check that the spindle feels continuously smooth throughout. In short, check that it is good enough for its intended use. A hard-to-turn spindle is a sign of age or damage; do not force it to rotate, and have someone look into the cause of the malfunction.

4. The micrometer should turn smoothly and freely throughout the entire range, and it should not bind or freeze during the check.

5. Check five points. On inch micrometers, check 1.000, 0.500, 0.250, 0.125 and 0.0625 inch with traceable gage blocks. The zero point is repeatedly checked but never reported. The check points presented here are one suggestion: take the entire range and halve it successively four times. The points may be even better if replaced by values such as 0.997, 0.502, 0.246, 0.128 and 0.061, or any other numbers for that matter.

6. If a dimension such as 18.5 millimeters is checked on a daily basis, by all means include that number in the calibration procedure to safeguard the system.
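The successive-halving scheme in step 5 can be sketched in a few lines of Python (the function name is illustrative; real check points may be offset slightly, as the article notes):

```python
def halving_check_points(full_range, n=5):
    """Generate calibration check points by repeatedly halving the range.
    For a 1 inch micrometer this yields 1.0, 0.5, 0.25, 0.125, 0.0625."""
    points, value = [], full_range
    for _ in range(n):
        points.append(value)
        value /= 2
    return points

print(halving_check_points(1.0))  # [1.0, 0.5, 0.25, 0.125, 0.0625]
```

The same call with `full_range=25.0` produces the corresponding points for a 0 to 25 millimeter micrometer.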