There is an element of truth in such accusations, and the response to them is usually a call for industry to become more competitive. Government bureaucracy, red tape and taxation are three key obstacles in many sectors, but increasingly, industry sectors have only themselves to blame, as they create similar problems that tilt the playing field.
I’m referring to the proliferation of quality standards, and in particular the sections of them—or of other standards—that deal with calibration requirements. I understand that the requirements for the aerospace industry may be significantly different from those of the automotive or nuclear sectors, but when you get down to the gages and instruments used by any sector, their calibration doesn’t vary that much, if at all.
ISO 17025 was intended to be the common denominator for calibration facilities, eliminating the need for multiple end-user audits of companies offering calibration services. But the “one standard used by all” is becoming “one standard modified by many.” The benefits to both providers and users of calibration services are being eroded, and if the trend continues, we’ll all be back to square one, with the increased costs of being there.
I see properly accredited calibration laboratories being audited because their accreditation is not recognized by a particular industry. Far too often, the audit is conducted by someone who is familiar with the standard but has little knowledge of the technology under review. A properly accredited laboratory will already have demonstrated competence to a technically experienced assessor, so user audits of this nature add little beyond increased costs for both parties.
Many of these calibration standards are essentially the same as ISO 17025, which has been used as the framework for them. Often, the only differences involve additional information a particular sector wishes to see on the calibration report. And some of these requirements are, in my opinion, of little value to the end user of the report.
Typical of this is a listing of the calibration due dates of the equipment used. Assuming they are all current (and who would show otherwise on a report?), what use is this information to anyone? Without an on-site audit, the recipient does not know whether the calibration frequencies are practical. For example, the report could show the masters involved are due for calibration in 12 months, so everyone is happy. An on-site audit could reveal that they are on a five-year cycle instead of the industry standard of one year.
Another item of this nature is traceability. ISO 17025 does not require it to be shown on a report, for good reason: to be accredited under the standard, the on-site assessor has to determine whether it exists and, more important, whether it is relevant. Listing a NIST number on the report to show traceability does not tell the reader whether the NIST report applies to the issuer of the report or to someone two or three steps down the food chain from NIST.
For example, NIST may calibrate gage blocks for Lab A who calibrates gage blocks for Lab B who uses them to calibrate a micrometer for Lab C who uses it to calibrate gage blocks for you. Traceability is assured as A’s NIST number is referenced on reports by B and C. Competency is another story.
It seems to me that instead of industry sectors reinventing the wheel, they should accept ISO 17025 as the basic standard for calibration activities and then have an add-on to cover any extras they want. This would reduce accreditation costs for laboratories, reduce auditing costs for all, and ensure that a larger pool of skilled technicians is available to do the work.
As an added benefit, commercial laboratories could quote what the “extras” are going to cost, which often has the effect of making them no longer needed.