Modern XRF has become a common analytical technique for the quality assurance of manufactured products, including outgoing product verification and production quality control as well as incoming inspection. The functional characteristics that make XRF so attractive for these quality applications are its broad applicability to many kinds of materials, the ease of operation of the equipment, the short analysis time (often less than 30 seconds), and the noncontact, nondestructive nature of the technique.

Overview of XRF

XRF is an atomic spectroscopic technique. The analysis is based on the interaction between incident X-rays, today mostly generated by an X-ray tube, and the material under inspection. British physicist Henry Moseley demonstrated that the frequencies of the characteristic X-ray emissions from ionized chemical elements vary systematically with atomic number (the square root of the emission frequency is proportional to Z)1. This is the basis of modern XRF spectrometry. The wavelength or energy of the fluoresced emissions acquired from a material is characteristic of the elemental composition of the sample. The energy of a fluoresced peak identifies the element; the number of photons emitted at that energy represents the number of atoms (the mass) of the emitting element present in the material.

Applying this basic physics in an analytical instrument requires a source of X-rays (as noted above, typically an X-ray tube), a means of determining the energy of the emitted X-rays, and a means of counting the fluoresced photons. In earlier instruments, and still today, natural crystals (now often synthetic layered devices as well) are placed between the fluorescing sample and a counter; originally a Geiger counter, now gas-filled or gas-flow proportional counters. The crystal or layered device disperses the emitted X-rays so they can be collected (counted) at specific angles that identify the wavelength, and hence the energy (E = hc/λ, where E is energy and λ is wavelength), via the Bragg equation nλ = 2d sin θ, where d is the crystal lattice spacing and θ is the angle of diffraction. XRF spectrometers using dispersing devices are known as Wavelength-Dispersive XRF Spectrometers (WDXRF).
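
The wavelength selection described above can be sketched numerically. The following is a minimal illustration, not vendor software; the LiF(200) crystal spacing (2d ≈ 4.027 Å) and the goniometer angle chosen to select the Mn Kα line are assumed example values:

```python
import math

# E [keV] = 12.398 / wavelength [angstrom] (hc in keV·angstrom units)
HC_KEV_ANGSTROM = 12.398

def bragg_wavelength(d_spacing_angstrom, two_theta_deg, order=1):
    """Wavelength selected by a dispersing crystal via Bragg's law: n*lambda = 2*d*sin(theta)."""
    theta = math.radians(two_theta_deg / 2.0)
    return 2.0 * d_spacing_angstrom * math.sin(theta) / order

def energy_kev(wavelength_angstrom):
    """Convert wavelength to photon energy via E = hc / lambda."""
    return HC_KEV_ANGSTROM / wavelength_angstrom

# Example: LiF(200) crystal (d = 2.0135 angstrom) driven to the Bragg angle
# that selects the Mn K-alpha line (about 5.9 keV).
wl = bragg_wavelength(d_spacing_angstrom=2.0135, two_theta_deg=62.97)
print(round(wl, 3), "angstrom ->", round(energy_kev(wl), 2), "keV")
```

In a sequential WDXRF system the goniometer sweeps 2θ, so each angle maps to one wavelength (and thus one element's emission) by exactly this relation.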

Evolution of XRF spectrometers for QA/QC Applications

As described in the previous section, the first XRF Spectrometers were what we now call WDXRF systems. WDXRF systems are now manufactured with a single detector (or tandem detectors, one to capture long wavelengths and the other shorter wavelengths) and switchable dispersing devices where the detector is driven to various Bragg angles θ around a goniometer circle. These are known as Sequential WDXRF systems. Or, they are configured with multiple, fixed dispersion devices and detectors specific to the desired wavelengths (elements) to be determined. These are known as Simultaneous WDXRF systems because multiple elemental spectra are acquired simultaneously.

Simultaneous systems are commonly used in quality control applications where high sample throughput is required. Common markets/applications for these kinds of spectrometers are steel manufacturing and cement manufacturing, where a complete, specific suite of elements can be determined simultaneously, with very good precision, in literally seconds per sample. Because of the optical benches of dispersion devices and detectors that make them up, these instruments are floor standing and can be quite large.

The remainder of this article focuses on Energy Dispersive XRF Spectrometers (EDXRF). EDXRF systems are typically smaller than WDXRF systems and can often be installed and used near production, shipping or receiving areas as opposed to in a laboratory. There are also portable handheld EDXRF systems available today.

The name is a bit of a misnomer in that there is no dispersion device; energies are determined electronically (pulse-height discrimination). The level of discrimination, or resolution, is a function of the type of detector and pulse processor used in the spectrometer. Today, three types of detectors are commonly used in EDXRF systems: gas-filled proportional counters and two kinds of solid-state silicon detectors, Si-PIN and silicon drift detectors (SDD). The choice of detector is driven by the application(s). From an X-ray spectrometry perspective, this means the material matrix being analyzed, i.e., the elemental composition of the sample material and the nominal concentrations, and for layered systems (thin-film applications) the thickness of the layer(s) as well.
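
Pulse-height discrimination amounts to sorting detector pulses, whose amplitudes are proportional to photon energy, into energy bins to build a spectrum. A toy sketch (illustrative only; real multichannel analyzers work on digitized pulse trains, and the channel width here is an assumed value):

```python
# Minimal multichannel-analyzer sketch: each pulse amplitude is converted to an
# energy and counted in the corresponding spectrum channel.

def build_spectrum(pulse_energies_kev, n_channels=1024, kev_per_channel=0.02):
    spectrum = [0] * n_channels
    for e in pulse_energies_kev:
        ch = int(e / kev_per_channel)   # which energy bin this pulse falls in
        if 0 <= ch < n_channels:
            spectrum[ch] += 1
    return spectrum

# Simulated pulses: mostly Mn K-alpha (5.90 keV) plus some Mn K-beta (6.49 keV).
pulses = [5.90] * 80 + [6.49] * 12
spec = build_spectrum(pulses)
print("total counts:", sum(spec), "peak channel:", spec.index(max(spec)))
```

The detector's resolution determines how sharply real pulses cluster around these ideal channels; poorer resolution smears each line over many channels.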

Resolution is expressed in terms of Full-Width-Half-Max (FWHM) at 5.9 keV, the energy of the Mn Kα emission. Typical resolutions for the detectors listed above are 900 eV for prop counters, 200 eV for Si-PIN detectors and 140 eV for SDDs. If the sample matrix is complicated—many element emissions and/or heavily overlapped emissions (adjacent atomic numbers)—then Si detectors are preferred or even necessary. If the emissions from the sample are well separated, then the prop counter will do a very good job, e.g., a Sn coating on a copper substrate. There is a significant cost benefit to the prop counter if it can do the job. The concentration of an analyte element, or a very thin layer, will impact detector selection as well. The better-resolution Si detectors provide much better peak-to-background (signal-to-noise) response than prop counters, and therefore much better detection limits, whether in ppm of concentration or nanometers of thickness.
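
The detector-selection logic above can be made concrete with a common rule of thumb: two lines are cleanly resolved when their separation exceeds roughly the detector FWHM. A sketch using the typical FWHM values quoted above (the rule itself is a simplification; real overlap handling uses peak deconvolution):

```python
# Rule-of-thumb resolvability check using the typical FWHM figures from the text.
DETECTOR_FWHM_EV = {"prop_counter": 900, "Si-PIN": 200, "SDD": 140}

def resolvable(line1_kev, line2_kev, detector):
    """True if the two emission lines are separated by more than the detector FWHM."""
    separation_ev = abs(line1_kev - line2_kev) * 1000.0
    return separation_ev > DETECTOR_FWHM_EV[detector]

# Sn K-alpha (25.27 keV) vs Cu K-alpha (8.05 keV): ~17 keV apart, so even a
# proportional counter resolves them -- the Sn-on-copper case in the text.
print(resolvable(25.27, 8.05, "prop_counter"))  # True
# Fe K-alpha (6.40 keV) vs Mn K-beta (6.49 keV): only ~90 eV apart, overlapped
# even for an SDD, so software deconvolution would be required.
print(resolvable(6.40, 6.49, "SDD"))  # False
```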

The SDD is one of the most recent technology advances commercialized in EDXRF. As noted above, the SDD provides the best resolution in today's commercial spectrometers, and therefore the best detection limits. Another enabling capability of the SDD is sensitivity to lower-energy emissions, which, as demonstrated by Moseley (where we began), are associated with lower atomic number elements. Two demanding applications that SDDs have improved, and in one case enabled, are in the quality assurance of functional coatings for electronics: RoHS compliance and ENEPIG (Electroless Ni-Electroless Pd-Immersion Au) coatings on PCBs and chip connectors.

The RoHS (or anti-RoHS, i.e., exempted defense contractor) applications can range from a very thin Sn-Pb solder finish to Cd-based pigments in polymers. Plating companies doing quality control on outgoing parts, and contract manufacturers or end users doing quality assurance on incoming components, want to assure that the parts are RoHS compliant. SDD-configured spectrometers manufactured for small areas (chip leads) can reliably and nondestructively measure Pb content down to 120 ppm levels2, even in thin finishes. Detection limits can also be very important to the defense contractor that needs to provide components with greater than 3% Pb content. This may seem like a relatively high concentration until one is measuring thin finishes, where the total areal mass of Pb (µg/cm2) is quite low. Figures 1 and 2 show spectra from a prop counter configured system and an SDD configured system, respectively. The spectra were acquired from the same sample, a 371 micro-inch thick Sn coating on a Cu substrate with Pb content at the RoHS action level of 1,000 ppm. The Pb is marginally detectable with the prop counter, but easily detected and measured with the SDD configured instrument.
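
The point about low absolute Pb mass in thin finishes can be illustrated with a quick areal-mass calculation. This is a back-of-envelope sketch assuming the coating density is approximately that of pure Sn (7.29 g/cm³); real samples and instrument calibrations will differ:

```python
# Illustrative: areal Pb mass = layer thickness x layer density x Pb mass fraction.
SN_DENSITY_G_CM3 = 7.29       # approximate density of tin (assumed for the alloy)
UM_PER_MICROINCH = 0.0254     # 1 microinch = 0.0254 micrometers

def pb_areal_mass_ug_cm2(thickness_microinch, pb_mass_fraction):
    thickness_cm = thickness_microinch * UM_PER_MICROINCH * 1e-4  # um -> cm
    areal_mass_g_cm2 = thickness_cm * SN_DENSITY_G_CM3 * pb_mass_fraction
    return areal_mass_g_cm2 * 1e6  # g -> micrograms

# The 371 microinch Sn coating at the 1,000 ppm RoHS action level:
print(round(pb_areal_mass_ug_cm2(371, 0.001), 2), "ug/cm2")
# The same coating at the defense contractor's 3% Pb requirement:
print(round(pb_areal_mass_ug_cm2(371, 0.03), 1), "ug/cm2")
```

Only a few micrograms of Pb per square centimeter are present at the action level, which is why detector resolution and peak-to-background response dominate whether the Pb peak is detectable at all.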

Measuring RoHS-compliant levels of Cd in a plastic or polymer is a challenge, even for Si-PIN configured spectrometers. Low average atomic number matrices have high scatter cross sections for the incident beam, the major contributor to XRF spectrum background. Often, primary beam filters are used to remove incident X-rays in the energy region of the analyte peak, thereby improving the analyte peak-to-background response. Beam filters capable of removing background in the energy region of the Cd K emissions are not practical, because they would have to be so thick that they would greatly reduce excitation efficiency and analysis times would become impractical. The better resolution of an SDD configured instrument is sufficient to enable Cd measurement at the 100 ppm RoHS compliance level in 300 seconds with 5% precision.

As with many functional film applications, ENi thickness is an important parameter for plating control and incoming inspection. Linear thickness measurements in XRF are based on converting the measured mass thickness (mass per unit area) to linear units (microns, microinches). This is achieved by dividing the mass thickness by the material density, which requires knowledge of the material composition. The phosphorus content of the ENi (NiP) layer has historically been assumed, i.e., 6%, 8% or 12% phosphorus, because P (Z=15) emits at a relatively low energy (2.01 keV) where air absorbs much of the signal, and the noise characteristics of Si-PIN detectors prohibit the direct measurement of P unless the sample is measured in a vacuum chamber, which takes time, limits the sample size, and adds cost. SDD detectors in a closely coupled X-ray tube-to-sample-to-detector design have sufficient sensitivity to measure P content even in air, so that P content, and therefore density, are no longer assumptions and thickness measurement accuracy is improved.
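
The conversion described above is straightforward to sketch. The NiP densities below are assumed illustrative values (instrument software uses calibrated density tables); the point is that the same measured mass thickness yields different linear thicknesses depending on the P content:

```python
# Sketch of the mass-thickness -> linear-thickness conversion from the text.
# Assumed illustrative densities for electroless NiP at three P fractions.
NIP_DENSITY_G_CM3 = {0.06: 8.2, 0.08: 8.0, 0.12: 7.7}

def linear_thickness_um(mass_thickness_mg_cm2, p_fraction):
    """Linear thickness = mass thickness / density, converted to micrometers."""
    density = NIP_DENSITY_G_CM3[p_fraction]
    # mg/cm2 -> g/cm2, divide by g/cm3 -> cm, then cm -> um
    return (mass_thickness_mg_cm2 * 1e-3 / density) * 1e4

# The same measured mass thickness converts to noticeably different linear
# thicknesses depending on the assumed (or, with an SDD, measured) P content:
mt = 4.0  # mg/cm2, an assumed example measurement
print(round(linear_thickness_um(mt, 0.06), 2), "um at 6% P")
print(round(linear_thickness_um(mt, 0.12), 2), "um at 12% P")
```

The spread between the two results is the accuracy penalty of guessing the P content rather than measuring it.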

ENi layers are often covered by other layers. Some are at thicknesses that will completely absorb the phosphorus signal, so if the P content is to be determined it must be done prior to adding the final layer(s). ENEPIG coatings, which are becoming common in the PCB industry to improve board shelf life and solderability, have a thin Pd or PdP layer on the ENi and then a final, very thin immersion Au layer. These layers are often thin enough to pass the P signal to be sensed by an SDD configured instrument, so that all three layers can be measured simultaneously with the accuracy and precision required by the recently released IPC-4556 Standard3 (Association Connecting Electronics Industries).

Computing, Software Algorithms & Summary

The technological basis for XRF has been around for a century, but it wasn’t until the introduction of compact and affordable computing (mini computers and PCs) in the 1970s and 80s that XRF spectrometry was able to become widely accepted and the preferred method for compositional analysis of many materials, including the characterization of functional and decorative thin-films—single layer and multi-layer. As a nondestructive technique, measurements are sample matrix dependent. There are physical X-ray matrix effects that can be complicated to deal with. Historically, correcting for these effects required many standards. Today, most spectrometers are supplied with what is termed fundamental parameters (FP) software. FP algorithms use well established X-ray physical constants to correct for matrix effects often without any calibration standards. With fast processing and FP software even complicated sample matrices like the ENEPIG functional coating system described above can be computed in seconds.
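
As a flavor of the kind of iterative matrix correction such software performs, here is a highly simplified sketch using the Lachance-Traill influence-coefficient model, a lighter-weight empirical relative of full fundamental-parameters computation. The two-element system, intensity ratios, and coefficients are invented for illustration only:

```python
# Simplified Lachance-Traill matrix correction: C_i = R_i * (1 + sum_j a_ij * C_j),
# solved by fixed-point iteration with renormalization. Not a full FP calculation.

def lachance_traill(relative_intensities, alphas, iterations=20):
    elements = list(relative_intensities)
    conc = dict(relative_intensities)  # first guess: concentration = intensity ratio
    for _ in range(iterations):
        new = {}
        for i in elements:
            correction = 1.0 + sum(alphas[i][j] * conc[j]
                                   for j in elements if j != i)
            new[i] = relative_intensities[i] * correction
        total = sum(new.values())      # renormalize so concentrations sum to 1
        conc = {i: c / total for i, c in new.items()}
    return conc

# Invented example: measured intensity ratios for a Ni-P binary plus assumed
# inter-element influence coefficients.
R = {"Ni": 0.85, "P": 0.15}
alpha = {"Ni": {"P": 0.10}, "P": {"Ni": -0.20}}
print({k: round(v, 3) for k, v in lachance_traill(R, alpha).items()})
```

Commercial FP software replaces the empirical coefficients with corrections computed from tabulated physical constants (absorption coefficients, fluorescence yields), which is what allows calibration with few or no standards.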

Today’s EDXRF spectrometers are compact and fast. FP software and current detector technology have enabled simple solutions to demanding applications, often a single button operation—ideal for many quality measurement requirements.


Jim Bogert is the director of western sales at Fischer Technology Inc. For more information, call (650) 513-1691, email or visit



1 H. G. J. Moseley, “The High-Frequency Spectra of the Elements,” Philosophical Magazine 26 (1913): 1024-34; 27 (1914): 703-13.

2 Fischer Technology Application Note (AN004en), “Determination of Harmful Substances in Very Small Concentrations – RoHS”

3 IPC-4556, “Specification for Electroless Nickel / Electroless Palladium / Immersion Gold (ENEPIG) Plating for Printed Circuit Boards”