Take a closer look at common mistakes made with regard to calibration.



No matter how careful we are at any given task, mistakes are often made; the calibration of gages and instruments is no exception. Sometimes the mistakes are masked by a run of repeatable numbers. At other times, the fact that two or more laboratories report the same calibrated values is taken as proof that the numbers are good, when both could be making the same mistake.

It is often assumed that newcomers to the calibration field are the greatest source of such problems, but this assumption is itself a mistake. Many experienced hands in the game are just as prone to making mistakes, yet because they’ve been at it for so long, it is assumed they know what they are doing.

There is a lot of “how to do it” literature out there but little “how not to do it” information, a void I’ll attempt to fill in the next few columns. Where will the information come from to fill this gap in our knowledge? From mistakes made by others, a very reliable source. Equipment makers and experienced calibration laboratories encounter these mistakes all the time, particularly in the heat of battle over measurements.

This column will deal with some general mistakes, while future columns will look at mistakes often encountered in the calibration of specific gages and instruments.

  • Skills. Too many companies assume that if someone knows how to use a particular device, they’re skilled enough to calibrate it. While some instruments lend themselves to this type of thinking, it doesn’t hold up for other equipment, as upcoming columns will show.

  • Environment. When it comes to calibration, we have to deal with the climate in which the work is done. Unfortunately, the overall or ambient temperature of the laboratory becomes the point of focus when the focus should be on the temperature of the instruments, masters and item being calibrated.

  • Specifications. Many items, particularly fixed limit gages, are made to rather detailed specifications, copies of which should be on hand for each type calibrated. Too often, laboratories do not want to spend the money on the specifications, and as a result they calibrate features they don’t need to, use procedures that conflict with the specifications, or fail to measure features that can have a dramatic impact on the results.

    Many companies purchase software to avoid buying the standards, but in the real world, at this time, the printed specification is the law, not the software. And there can be significant differences between the two. In a similar vein, some folks rely on a general-purpose handbook for such information. In either case, the source could be out of date, since the underlying documents are usually reviewed every five years.

    Another mistake calibration facilities make is to use the markings on gage handles as the “standard” to which the gages are calibrated. Those markings can be in error, or, where foreign specifications are involved, may show product dimensions rather than gage dimensions.

  • Equipment. Many everyday measurements can be made using more than one instrument type, but when it comes to calibration, the choices are limited. If you don’t have the right equipment for calibration, you won’t get anywhere close to the right measurements. This means your trusty digital micrometer or indicator with 50-microinch or 0.001-millimeter resolution won’t cut it. If either had working accuracy close to its resolution, laboratories would not spend thousands of dollars on equipment to do what appears on the surface to be a simple measurement.

  • Traceability. For measurements to have any validity, they have to be traceable to the national standard. When high accuracy is required, as is the case with gage and instrument calibration, this traceability cannot start several steps down the food chain. The higher the level of accuracy required, the closer you have to get to the national standard, for example, NIST. At the highest level for a commercial facility, this means its primary standards, and some working standards, have to be calibrated directly by NIST. You can plot and scheme all you want, but if you’re serious about calibration, this requirement cannot be ignored.