Coordinate measuring machines (CMMs) are often used to measure workpieces with tight manufacturing tolerances, so their "accuracy" is critically important. This term summarizes a variety of influences.
The resolution describes either the smallest step that the CMM or the sensor in use can discern (positional resolution) or the size of the smallest measurable feature (structural resolution). The resolution affects the reproducibility of the measurement results.
For users to be able to compare different CMMs, standards and guidelines were created that define characteristic parameters quantifying the measurement uncertainty under ideal environmental, workpiece, and user conditions. The definitive documents are the international ISO 10360 series of standards and the VDI/VDE 2617 series of guidelines. They specify essentially two characteristic parameters for CMMs, along with methods for testing them: the contact deviation and the length measurement deviation. These are determined using calibrated standards, which ensure comparability with internationally recognized values (traceability). Typical values are in the range of a few micrometers, or significantly lower for high-precision machines.
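The permissible length measurement deviation is typically stated on a data sheet as a maximum permissible error that grows with the measured length, commonly in the form MPE = A + L/K. The sketch below shows how such a specification is checked against an observed deviation; the values of A and K are purely illustrative assumptions, not taken from any particular machine.

```python
def mpe_length(length_mm: float, a_um: float = 1.5, k: float = 350.0) -> float:
    """Maximum permissible length measurement error in micrometers.

    Common data-sheet form MPE = A + L/K, with A in micrometers, L in mm,
    and K dimensionless. The values of A and K here are assumptions.
    """
    return a_um + length_mm / k

def within_spec(length_mm: float, deviation_um: float) -> bool:
    """True if the observed deviation stays inside the MPE envelope."""
    return abs(deviation_um) <= mpe_length(length_mm)

# A 300 mm calibrated gauge measured with a 1.9 um deviation;
# the envelope at that length is 1.5 + 300/350, roughly 2.36 um:
print(within_spec(300.0, 1.9))
```

Because the envelope widens with length, a deviation acceptable on a long gauge block can still violate the specification on a short one.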
Measurement Uncertainty and Measurement Process Capability
Various influencing factors cause measurement uncertainty in every measurement. For economic reasons (measurement process capability), the uncertainty should be smaller than the feature tolerance by about a factor of ten. This reduces the number of workpieces that must be rejected even though they actually fall within the tolerance. The economic consequences of imprecise measurement become clear when one considers that every manufacturer must base its process on the tolerances agreed in the contract, minus the measurement uncertainty of its own inspection equipment. Tightening manufacturing tolerances incurs very high additional costs, so minimizing the measurement uncertainty is economically beneficial.
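The effect described above can be sketched numerically: to prove conformance, the tolerance limits must effectively be tightened by the measurement uncertainty on each side, so a larger uncertainty consumes a larger share of the tolerance band. The numbers below are invented for illustration.

```python
def conformance_zone(lower: float, upper: float, u: float) -> tuple[float, float]:
    """Tolerance limits tightened by the measurement uncertainty U on each
    side, i.e. the zone within which conformance can actually be proven."""
    if upper - lower <= 2 * u:
        raise ValueError("uncertainty consumes the entire tolerance")
    return lower + u, upper - u

# Feature tolerance 20.0 +/- 0.05 mm (illustrative):
# with U = 0.005 mm (tolerance/10), 90% of the band remains usable;
# with U = 0.015 mm, only 70% remains.
print(conformance_zone(19.95, 20.05, 0.005))
print(conformance_zone(19.95, 20.05, 0.015))
```

Every micrometer of uncertainty saved is therefore tolerance band handed back to production.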
Many Factors Influence Measurement Uncertainty
The large number of influencing factors can be roughly divided into properties of the feature to be measured, of the workpiece, and of the CMM, as well as the effect of the operator and the environmental conditions.
Workpiece properties include the type of features to be measured (edges, holes, etc.). For example, measuring the radius of a circular segment is generally less precise than measuring that of a full circle. Other properties also play a role depending on the sensor being used. For optical sensors, this is the reflectivity, which is a property of the surface of the workpiece (e.g., roughness or color). Tactile measurements are compromised by workpieces that deform under the contact force, or by taking too few measurement points to determine shape deviations. The user can reduce these factors by selecting the most suitable measurement strategy and optimal sensors.
Three basic sensor types are tactile, optical, and X-ray tomography sensors. The achievable measurement uncertainty is influenced by the sensor properties and by the accuracy of the CMM itself (machine axes, geometric deviations). For tactile sensors, it depends on the shape of the probe sphere, the associated correction method, and the contact force, which bends the probe shaft. For optical sensors, magnification and illumination are critical factors. For X-ray tomography sensors, the size of the focal spot (determined by the current, voltage, and type of the X-ray tube) and the resolution of the detector must be considered.
Temperature as a Critical Environmental Condition
If the ambient temperature deviates from the standard 20 °C, thermally induced changes in the length of the workpiece and of the machine scales can falsify the results. On machines without temperature correction, measurement deviations of up to 300 µm per meter of measured length can occur for plastic parts at just 23 °C. The deviation cancels only if the machine scales and the workpiece have similar thermal expansion characteristics, which is seldom the case.
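The deviation quoted above follows directly from the linear expansion law ΔL = α · L · ΔT. A minimal sketch, assuming an expansion coefficient for plastics of roughly 100 · 10⁻⁶ per kelvin (an assumed, typical order of magnitude; actual coefficients vary widely by material):

```python
def thermal_expansion_um(length_m: float, alpha_per_k: float, delta_t_k: float) -> float:
    """Length change in micrometers from linear thermal expansion:
    delta_L = alpha * L * delta_T."""
    return alpha_per_k * length_m * delta_t_k * 1e6

# Plastic part (alpha ~ 100e-6 per K, assumed), 1 m long, measured at
# 23 C instead of 20 C -- about the 300 um/m deviation quoted above:
print(thermal_expansion_um(1.0, 100e-6, 3.0))
```

For a steel part (α around 11.5 · 10⁻⁶ /K) the same 3 K offset would cause only about a tenth of that deviation, which is why plastic parts are the worst case here.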
For precise inspection measurements on the shop floor, a CMM with temperature correction is required. The temperatures of the workpiece and the machine scales are measured and used to correct the length measurement. The limits of this method are the precision of the temperature measurement and knowledge of the expansion coefficients. In practice, the best choice is to apply a correction based on a high-end temperature measurement device (deviations of 0.1 to 0.5 K) and the expansion coefficients of the workpiece as taken from tables (deviations up to 10%). The effort of calibrating the expansion coefficients directly on the workpiece (approx. 0.1% deviation) is worthwhile only for very demanding measurement requirements.
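To first order, the correction described above can be sketched as follows: the raw reading is adjusted using the measured scale and workpiece temperatures together with their expansion coefficients. All numeric values below are illustrative assumptions.

```python
REF_TEMP_C = 20.0  # reference temperature for dimensional measurement

def corrected_length_mm(raw_mm: float,
                        t_scale_c: float, alpha_scale: float,
                        t_work_c: float, alpha_work: float) -> float:
    """First-order temperature correction of a length measurement.

    A warm scale under-reads (its graduations have expanded), while a warm
    workpiece reads longer than its 20 C length, so approximately:
      L_20 ~= L_raw * (1 + alpha_scale * dT_scale - alpha_work * dT_work)
    """
    dt_scale = t_scale_c - REF_TEMP_C
    dt_work = t_work_c - REF_TEMP_C
    return raw_mm * (1 + alpha_scale * dt_scale - alpha_work * dt_work)

# Steel scales (alpha ~ 11.5e-6 per K) and an aluminium workpiece
# (alpha ~ 23e-6 per K), both at 23 C, raw reading 500 mm -- all assumed:
print(corrected_length_mm(500.0, 23.0, 11.5e-6, 23.0, 23e-6))
```

The residual uncertainty of such a correction comes exactly from the two limits named in the text: how well the temperatures are measured and how well the expansion coefficients are known.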
To reduce the effect of temperature, simple but effective steps can also be taken. Avoid air drafts and direct light, enclose the measuring machine, thermally insulate the measurement lab, operate the electrical equipment on the CMM 24 hours a day, and keep as much distance as possible between the CMM and any heat sources or walls.
Determining the Measurement Uncertainty
Measurement uncertainty can be determined using several different methods. Measurement uncertainty budgets, prepared for coordinate metrology in the guideline VDI/VDE 2617 Part 11, are available only for the tactile measurement of individual points under certain conditions. For tactile sensors, estimates can also be obtained by computational simulation according to DIN EN ISO 15530 Part 4 or VDI/VDE 2617 Part 7; such simulations are not yet established for optical and X-ray tomography sensors. One proven method for determining the total uncertainty of measurements on CMMs is based on measuring calibrated, real workpieces. It is described in DIN EN ISO 15530 Part 3 and in numerous company standards (measuring machine capability analysis). In this method, several workpieces of the same type are measured repeatedly and the results are evaluated as a whole, so the effects of the environment, the workpiece, and the operator (setup and removal) are captured along with the deviations of the measuring machine.
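The evaluation of such repeated measurements of a calibrated workpiece can be sketched in a simplified form of the ISO 15530-3 approach: the expanded uncertainty combines the calibration uncertainty of the reference workpiece, the scatter of the repeated measurements, and the systematic deviation from the calibrated value. The measurement values are invented, and further contributions defined in the standard (e.g. workpiece-to-workpiece variation) are omitted here for brevity.

```python
import statistics

def expanded_uncertainty(measurements: list[float],
                         calibrated_value: float,
                         u_cal: float,
                         k: float = 2.0) -> float:
    """Simplified ISO 15530-3-style evaluation:
    U = k * sqrt(u_cal^2 + u_p^2) + |b|, where u_p is the standard
    deviation of the repeated measurements and b the systematic bias.
    (Contributions such as workpiece variation are omitted here.)
    """
    u_p = statistics.stdev(measurements)
    bias = statistics.mean(measurements) - calibrated_value
    return k * (u_cal**2 + u_p**2) ** 0.5 + abs(bias)

# Five repeat measurements (mm) of a gauge calibrated to 10.000 mm with
# u_cal = 0.5 um -- all values invented for illustration:
runs = [10.002, 10.004, 10.001, 10.003, 10.002]
print(expanded_uncertainty(runs, 10.000, 0.0005))
```

Because the bias and scatter are determined from measurements of the real workpiece on the real machine, the influences of environment, workpiece, and operator enter the result automatically, which is exactly the strength of this method.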