Helium mass spectrometer leak detectors are designed to measure the partial pressure of trace amounts of helium passing through small leaks and to display that measurement as a leak rate. To ensure the accuracy of the measurement, the leak detector must be tuned and calibrated on a regular basis by introducing a precise flow of helium and comparing the displayed measurement with the actual flow.

This regulated flow, or leakage of tracer gas, is achieved with a calibrated helium leak standard. Many of today's leak detectors have integrated, or built-in, calibrated leak standards. Older models, and current models that do not provide automatic calibration, may require an external calibrated leak standard to perform the tuning and calibration functions.

Two common types of commercially available calibrated leak standards are glass permeation and capillary leaks. Glass permeation calibrated leak standards consist of a permeable glass membrane encapsulated in a sealed reservoir that is charged with helium. The helium in the reservoir diffuses through the glass membrane out of the leak and into the leak detector vacuum system.

An important factor to consider when using glass permeation style leaks is the effect of temperature on accuracy. Permeation leaks have a leakage temperature coefficient of 3 percent or more per degree Celsius: as the operating temperature of the calibrated leak increases, its leakage rate increases accordingly. The temperature at which the leak standard was originally calibrated is typically marked on the side of the leak. This allows the user to compare the ambient temperature at the time the leak is used with the original calibration temperature and to compensate for the difference as required.
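By way of illustration, the compensation arithmetic might look like the following Python sketch. It assumes a simple linear coefficient with a 3%-per-degree default; the actual correction curve and coefficient should come from the leak standard's data sheet, and the function name here is ours.

```python
# Temperature compensation for a permeation leak standard, assuming a
# simple linear temperature coefficient. Some manufacturers specify an
# exponential correction instead; consult the leak's data sheet.

def compensated_leak_rate(calibrated_rate_sccs, calibration_temp_c,
                          ambient_temp_c, temp_coeff_per_c=0.03):
    """Expected leak rate at the ambient temperature (3%/C default)."""
    delta_t = ambient_temp_c - calibration_temp_c
    return calibrated_rate_sccs * (1.0 + temp_coeff_per_c * delta_t)

# A 2.0E-8 sccs leak calibrated at 22 C and used at 25 C flows ~9% more:
print(f"{compensated_leak_rate(2.0e-8, 22.0, 25.0):.2e} sccs")  # 2.18e-08 sccs
```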

Glass permeation leaks are fragile and should be handled with care. If the glass membrane breaks from an impact, the helium supply will immediately deplete from the reservoir and the leak will be irreparable.



Capillary leak standards

Capillary-type leak standards consist of a thin metal or glass capillary tube built into a pressurized metal envelope, such that gas from the envelope flows through the capillary tube, out of the leak and into the vacuum system of the leak detector. One advantage of capillary-type leaks over permeation leaks is that they are less sensitive to temperature changes: the leakage temperature coefficient of a capillary leak is approximately 0.2 percent per degree Celsius. Although capillary leaks are not as fragile as permeation leaks, they can become plugged by solids or by vapor condensation.

The accuracy of calibrated leaks incorporating an integral gas supply depends on age. Gas-filled leaks decrease in value over time as the supply of helium leaks out; this applies to both permeation and capillary-type leaks. The depletion rate of a calibrated leak depends on its specified leakage rate: a leak in the 1.0E-7 or 1.0E-8 standard cubic centimeters per second (sccs) range will lose approximately 2% to 3% of its initial leak rate value per year.
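As a rough illustration, the projected value of an aging leak can be estimated by compounding the annual loss, as in this Python sketch. The 2.5% default is simply a midpoint of the range quoted above, not a manufacturer figure.

```python
# Projected helium depletion for a gas-charged calibrated leak,
# assuming a constant fractional loss per year (roughly 2% to 3%
# for leaks in the 1.0E-7/1.0E-8 sccs range).

def depleted_leak_rate(initial_rate_sccs, years, annual_loss=0.025):
    """Projected leak rate after the given number of years."""
    return initial_rate_sccs * (1.0 - annual_loss) ** years

# A 5.0E-8 sccs leak after a one-year recalibration cycle at 2.5%/yr:
print(f"{depleted_leak_rate(5.0e-8, 1.0):.2e} sccs")  # ~4.88e-08 sccs
```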

The depletion rate of a calibrated leak does not change when the leak is stored with its valve in the closed position. In fact, storing a gas-charged calibrated leak with its valve closed will result in helium buildup within the mechanism of the valve, causing the leak to become inaccurate. A gas-charged leak should always be stored with its valve in the open position.

Aged calibrated leaks can be recertified on a regular basis to comply with internal or external quality standards. Recertification involves measuring the leak on a calibrated system and relabeling it with the new leakage rate, calibration temperature and date of calibration. A typical recalibration cycle is one year.



Automatic or manual

Regardless of whether the leak detector calibration is done automatically or manually, and whether it is done with an internal or an external leak, the process is generally the same. The first step, tuning, optimizes the output signal of the leak detector for a constant in-flow of helium. This is generally performed by introducing a steady flow of helium into the leak detector vacuum system, then adjusting voltages within the analyzer cell while monitoring the output signal. The voltage values that produce the greatest signal output are set as the standard operating voltages until the calibration routine is re-initiated. Attempting to tune to a leak smaller than 1.0E-9 sccs may result in a "no signal" or "low signal" error message, an indication that the available helium flow is too low for the leak detector to perform a successful tune adjustment.
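The peak-seeking logic behind tuning can be illustrated with a minimal Python sketch. The `read_signal` callable is a hypothetical stand-in for the instrument's measurement interface; real detectors adjust several analyzer voltages using the vendor's own routine, so this is only the shape of the idea.

```python
# Minimal sketch of the peak-seeking idea behind tuning: sweep one
# analyzer voltage and keep the value that maximizes the helium signal.
# `read_signal` is a hypothetical stand-in for the instrument interface.

MIN_TUNE_SIGNAL = 1.0e-9  # sccs; below this, expect a "low signal" error

def tune_voltage(read_signal, v_min, v_max, step):
    best_v, best_s = v_min, 0.0
    v = v_min
    while v <= v_max:
        s = read_signal(v)          # measured output at this voltage
        if s > best_s:
            best_v, best_s = v, s
        v += step
    if best_s < MIN_TUNE_SIGNAL:
        raise RuntimeError("low signal: helium flow too small to tune")
    return best_v                   # becomes the standard operating voltage
```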

After tuning is complete, the calibration routine is run. Calibration involves comparing the tuned output signal with the calibrated leak value and applying a multiplier, or gain factor, to make the leak detector output or display equal the calibrated leak value. It should be noted that applying additional gain to calibrate a leak detector may also amplify noise and background; higher gain settings could make the leak detector unstable in the higher sensitivity ranges.
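In essence, the gain factor is just the ratio of the known leak value to the raw tuned reading, as this minimal sketch shows (the numbers are illustrative):

```python
# The calibration gain is the ratio of the calibrated leak value to the
# raw tuned reading; subsequent readings are multiplied by it.

def calibration_gain(calibrated_leak_sccs, measured_signal_sccs):
    return calibrated_leak_sccs / measured_signal_sccs

gain = calibration_gain(2.0e-8, 1.6e-8)  # detector reads low -> gain = 1.25
display = gain * 1.6e-8                  # display now reads 2.0e-8 sccs
```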

The typical gain factor of a leak detector that is isolated from all external systems depends on the inherent sensitivity of the leak detector. This is a function of the design of the vacuum system and the analyzer cell components. A gain factor that is equal to or near 1.0 is the most appropriate because it indicates that the leak detector design provides enough inherent sensitivity to read a calibrated leak accurately without additional amplification of the tuned signal. A significant change in the typical gain factor of a leak detector may indicate a problem. Monitoring the gain factor on a regular basis is a good way to ensure that the leak detector is clean and working correctly.

It is possible for the gain value to vary as a result of a change in the test configuration. If the leak detector is connected to an external vacuum system during the calibration routine, the pumping action of the external system could divert helium from the calibrated leak away from the leak detector. This is referred to as a split-flow condition. In this situation, a greater gain value would be required to bring the measured value up to the fixed value of the calibrated leak. This is not an indication of a problem with the leak detector; it is the result of the application.
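To a first approximation, the helium divides in proportion to the pumping speeds at the junction, so the split fraction and the resulting gain can be estimated as in the sketch below. The pumping-speed values are illustrative assumptions, not figures from the text, and real conductance geometry complicates the picture.

```python
# First-order split-flow estimate: helium from the calibrated leak is
# assumed to divide in proportion to the pumping speeds at the tee.
# Speeds (liters per second) are illustrative values only.

def detector_fraction(s_detector_lps, s_external_lps):
    """Fraction of the leak's helium that reaches the leak detector."""
    return s_detector_lps / (s_detector_lps + s_external_lps)

frac = detector_fraction(2.0, 6.0)  # detector sees only 25% of the flow
required_gain = 1.0 / frac          # calibration must apply roughly 4x gain
```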

Calibration of modern leak detectors is relatively easy and fast. Many of today's leak detectors have an integrated calibrated leak standard and an automatic calibration feature. Calibrating a leak detector with an integrated leak often involves the push of a single button and a 3 to 5 minute wait period. It is important to warm up the leak detector for at least 30 minutes prior to performing a calibration routine.

The frequency of calibrating a leak detector can depend on the application or be mandated by quality control requirements. More frequent calibrations are recommended for applications that involve accurate, quantitative testing to a specific pass/fail value; less frequent calibrations suffice for qualitative or go/no go testing. A well-designed leak detector will hold its calibration within the tolerances of go/no go test requirements for more than a few weeks. In addition to scheduled calibrations, it is appropriate to perform a calibration after the leak detector has experienced an unexpected shock, such as a power outage, power surge or inadvertent venting of the test port while in high sensitivity mode.



Sidebar: Tech tips

• A regulated flow is achieved through the use of a calibrated helium leak standard.

• Many of today's leak detectors have built-in calibrated leak standards. Older models may require an external calibrated leak standard.

• Two common types of commercially available calibrated leak standards are glass permeation and capillary leaks.