Squeezing the Most Life From a Zombie Technology
During his recent presentation at the 2009 Quality Measurement Conference, Ted Doiron of NIST called gage blocks "a zombie technology." He defined this as "a dead technology given a semblance of life, but awkward and inefficient, by a supernatural force (human inertia), usually with evil (costly) effects." His point was that gage blocks represent calibration standards of the past, indeed of the 19th century. Newer methods, such as linear scales, offer a more accurate, less problematic means of calibration and will, in time, render the venerable gage block obsolete.
That is correct. There are newer and better ways. But if gage blocks are zombies, they are very busy ones. The fact remains that a large percentage of gage calibration done today is still done with gage blocks; even glass scales need an initial setting. It also is a fact that part of what makes gage blocks so valuable as standards is that they have been around so long. So until such time as newer methods have aged sufficiently to allow gage blocks a graceful retirement, it behooves us to make the most of them.
A good way to start is by understanding the unique place gage blocks hold in the measurement spectrum: it is not just poetic to say they stand at the critical junction between a shaft of light and practical measurements on the shop floor. Let's review.
Converting Length from Light to Steel
While using the speed of light may be the most accurate way yet devised to define the meter, applying that definition to the real world of measurement is not so simple. For one thing, any application must necessarily be less accurate than the standard itself. It also must be highly repeatable, and if it is to be at all practicable, must be readily mass-producible. That is where gage blocks come in.
Measuring the speed of stable light sources in a vacuum can be done with measuring accuracy, or resolution, of one part in 10¹¹. When measuring the speed of light in air, accuracy degrades by three or four orders of magnitude. This is because the density of the molecules in air is neither homogeneous nor stable enough to allow the same precision. But one or two parts in 10⁸ is still a lot better than can be achieved in any manufacturing environment.
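To make these "parts in" figures concrete, here is a small arithmetic sketch converting a fractional accuracy into an absolute error over a gage block. The function name and the 100-millimeter block length are ours, chosen only for illustration:

```python
def absolute_error_mm(length_mm: float, parts: float, power: int) -> float:
    """Absolute error, in mm, for an accuracy of `parts` in 10**power."""
    return length_mm * parts / 10**power

# One part in 10^11 (interferometry in vacuum) over a 100 mm block:
vacuum = absolute_error_mm(100, 1, 11)   # on the order of a picometer
# Two parts in 10^8 (measurement in air) over the same block:
air = absolute_error_mm(100, 2, 8)       # on the order of nanometers
print(f"vacuum: {vacuum:.1e} mm, air: {air:.1e} mm")
```

Even the degraded in-air figure is a tiny fraction of the wavelength of visible light, which is why it remains far beyond anything a shop floor can hold.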
That is how the folks at NIST and other national labs measure their own gage blocks, which, by the way, are just standard commercial products. They use laser interferometry with a helium/neon laser. But they do not just do it on their own. They use a round-robin technique where multiple, rigidly controlled measurements are conducted by national metrology institutions from many countries, and the results are scrupulously analyzed and averaged. The process is so time consuming and costly, it is only done every few years. But it is done using the same gage blocks over and over so the accumulated measurement data set is very large.
These are the blocks NIST uses to calibrate artifacts for its commercial lab customers, and here is where the rubber meets the road in terms of real-world accuracy. To keep up with the volume of measurements required, this step cannot be done with interferometry. Instead, gage block comparators, the industry standard for many years, are used. By applying rigid process controls, accuracy on these measurements can be assured to one part in 10⁶.
But when we talk about "accuracy of" or "resolves to," what we are really saying is that there is a certain degree of uncertainty in these measurements. Uncertainty of measurement is caused by many factors in the measuring process. Assigning a value to the uncertainty for a given measurement involves identifying the sources that contribute to it, assigning appropriate values to them and combining the results into a value that accurately represents the variability of the process. The net result for gage block calibration is measurements that are known to be correct to within 1 microinch for most commonly used gage blocks. NIST's process controls attempt to minimize all of the controllable sources of uncertainty, such as temperature variation, wear and dust.
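The combining step described above is conventionally done by root-sum-square of the individual standard uncertainties (the GUM method, assuming uncorrelated sources). The component names below follow the article's examples, but every numeric value is invented purely for illustration:

```python
import math

# Hypothetical uncertainty budget for a gage block comparison, in microinches.
budget = {
    "comparator repeatability": 0.3,
    "reference block calibration": 0.5,
    "temperature variation": 0.4,
    "wear and dust": 0.2,
}

# Combined standard uncertainty: root-sum-square of uncorrelated components.
u_c = math.sqrt(sum(u**2 for u in budget.values()))

# Expanded uncertainty at roughly 95% confidence uses a coverage factor k = 2.
U = 2 * u_c
print(f"combined: {u_c:.2f} microinch, expanded (k=2): {U:.2f} microinch")
```

Note how the largest component dominates: shrinking the small contributors barely moves the combined figure, which is why labs attack the biggest uncertainty source first.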
So when commercial labs, in their turn, calibrate artifacts for their manufacturing customers, accuracy is again degraded, with uncertainties of 2 to 5 microinches being common. Thus, in the long trail from a meter defined by the speed of light to nine significant figures, the best accuracy you can achieve in the real world with the gage blocks you have is six decimal places.
The question then becomes, how do you keep that accuracy from slipping even further?
What is a Standard?
In the Town Hall in Braunschweig, Germany, there is a length of steel mounted on the wall. In the 1700s this type of bar served as a very public dimensional standard for the cloth merchants in town. If you bought an ell of wool in Braunschweig and suspected the merchant was using a short stick, you could go to the Town Hall and check. Ultimately, that is what a dimensional standard is for: the promotion of commerce.
Today, of course, we are much more sophisticated. Our dimensional standard (one of them anyway) is a meter, which is defined as the distance light can travel in a vacuum in 1⁄299,792,458 of a second. This is very precise and universal. It is the same in Japan as it is in Sweden, the United States and Timbuktu. However, a beam of light cannot be mounted at the town hall, and it is not so easy to check your wool against it.
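The arithmetic of the current definition is worth a glance: fixing the speed of light at exactly 299,792,458 m/s makes the meter an exact derived quantity. A trivial check, using exact rational arithmetic so no rounding intrudes:

```python
from fractions import Fraction

c = 299_792_458                  # speed of light in m/s, exact by definition
t = Fraction(1, 299_792_458)     # the defining time interval, in seconds

distance = c * t                 # distance light travels in that interval
print(distance)                  # exactly 1 (meter)
```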
How we get from a beam of light to practical measurements on a shop floor is where gage blocks come in. But there is more to the idea of a standard than just measurement, as a brief history of the meter will show.
The meter as a unit of measure was first introduced by Italian scientist Tito Livio Burattini in 1675. However, it was not until the late 18th century, during the Age of Enlightenment, that the French proposed it as the standard unit of length.
There were two schools of thought as to how this meter was to be defined. One suggested defining the meter as the length of a pendulum with a half-period of one second. The other suggested defining the meter as one ten-millionth of the length of the Earth's meridian along a quadrant running through Paris from the Equator to the North Pole.
In 1791 the French Academy of Sciences selected the latter, meridional definition, and an expedition by Jean-Baptiste Delambre and Pierre Méchain was commissioned to measure the length of the meridian from Dunkerque, through Paris, to Barcelona to serve as the basis of this measurement. A book by Ken Alder, "The Measure of All Things," details the trials of this expedition as it labored on through the midst of the French Revolution. But long before the work was finished, the French adopted a provisional result, and a brass meter bar was set as the standard. This is perhaps the hastiest thing that has ever happened to the meter.
In 1799 the meter bar was changed to platinum, which is more stable than brass. Nearly a century later, in 1889, the International Bureau of Weights and Measures substituted a platinum/iridium bar, which was even more stable, and defined the meter as the distance between two lines on this bar, measured at the melting point of ice, the first time temperature had been included in the definition. In 1927 the definition was amended to specify the supports for the bar, as any distortion would necessarily alter its length.
There things stayed until 1960, when light was first used in the definition (even though Albert Michelson had first measured the meter using interferometry back in 1893), and the meter was defined as equal to 1,650,763.73 wavelengths of the orange-red emission line in the electromagnetic spectrum of the krypton-86 atom in a vacuum. Finally, in 1983, the current definition was established.
The point is that in order to be truly effective, a dimensional standard must not only exist in space, it must also exist in time. If we are to make things, and match and mate them, a meter must be the same this year as it was last, and stay the same next year and the year after that. Change is not good for standards, or for measurement processes, as can be seen by looking at NIST's calibration process.
Care and Feeding of Gage Blocks
It should be clear by now that gage blocks will be with us for a considerable time to come. So if you have been hesitating about buying a set, don't. But you need not buy the absolute top Grade 00 blocks, even if your applications require ultra-high precision. These days, you can save considerable money, and even get better results, by buying a lesser grade of block and calibrating assiduously.
In January 2002, ASME issued a revised standard (B89.1.9-2002) with the objective of bringing U.S. gage block calibration practice in line with international standards, specifically ISO 3650. It also accommodates important shifts in the use of gage blocks and reflects current trends in the use of measurement uncertainty. Under the old standard, which reflected the traditional American approach, the idea was to make blocks as close to exact size as possible. Under the more European way of thinking, so long as a block falls within a certain size range, it does not matter exactly what its size is, provided it is measured well and characterized correctly. If you know the variation, you can allow for it.
Thus, calibration assumes even greater importance. You can buy a lesser grade gage block, and so long as you characterize it accurately, you can achieve lower uncertainty in your measurement and save money.
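The "characterize it accurately" idea can be sketched in a few lines. All the numbers below are invented; the point is only that the stored calibrated deviation, not the nominal marking, is what enters the measurement:

```python
# Nominal size of the block and its calibrated deviation, in inches.
# Both values are hypothetical, for illustration only.
nominal = 0.500000
calibrated_deviation = 0.000004    # from the calibration certificate

# The actual reference length used when this block sets a comparator:
actual = nominal + calibrated_deviation

# A comparator reading taken against the block is corrected the same way
# (here the part measured 2 microinches below the block):
reading_vs_block = -0.000002
part_size = actual + reading_vs_block
print(f"part size: {part_size:.6f} in")
```

So long as the deviation is known and applied this way, a well-characterized lesser-grade block can stand in for a far more expensive one.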
But while size tolerances are not as tight under the new ASME standard (see Figure 1), form tolerance, that is, how flat the blocks should be and how parallel their surfaces should be, can be very tight. Under the old standard, the size tolerance applied only at the center of the block, at what was known as the reference point. Under the new standard, the size tolerance applies everywhere. This means that the calibration lab should check a representative number of points on the block's surface, not just the central point.
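A calibration check under this everywhere-applies rule might look like the following sketch. The point labels, deviations and tolerance value are all invented for illustration:

```python
# Measured deviations from nominal at five sampled points, in microinches.
points = {
    "center": 0.8,
    "corner NE": 1.5,
    "corner NW": -1.2,
    "corner SE": 2.1,
    "corner SW": 0.4,
}

tolerance = 2.0  # microinches; hypothetical size tolerance for this grade

# Under the old standard only "center" would have been judged; under the
# new one, every sampled point must fall within tolerance.
out_of_tolerance = {p: d for p, d in points.items() if abs(d) > tolerance}
print(out_of_tolerance)   # here, only the SE corner fails
```

This is why a block that passed comfortably under center-point practice can fail when its corners are measured.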
The bottom line in purchasing gage blocks is to only buy products from reputable manufacturers, as stability, finish and calibration accuracy are all at-risk items. And protect the gage block investment, particularly from corrosion, which is the number one zombie killer.
Ensuring Good Calibration
The new standard does not require users of gage blocks to change anything. Many manufacturers have hundreds of sets of gage blocks, some many years old and still perfectly good. They also have processes based on these blocks, and the standards for calibrating them are still perfectly acceptable. However, even if they ignore the new standard, users will need to be aware of the new practice and make sure their blocks are being calibrated correctly.
For calibration labs, in addition to the requirements for clearer customer communication, the new standard requires basic changes in their calibration and reporting procedures. It also means that many labs will have to upgrade their equipment, as existing comparators in most American labs might not have the capability to measure blocks at the corners.
Zombies though gage blocks may be, dimensional measurement for gage block calibration is possibly the most precise mechanical measurement process on the planet. The environmental conditions under which it is done are as controlled as possible, and the equipment used is the absolute best that can be made. So whether buying calibration services from a commercial lab or calibrating in your own in-house lab, rigorous adherence to process is required to assure minimum uncertainty in measurement and optimum quality in manufacturing.