Recently, I was looking at the new equipment being used for industrial ultrasonic inspection, and I was amazed at the technology advancements that have come to fruition over the past several years.
Back when I started in nondestructive testing (NDT), there were no computers, so we had to thoroughly understand the theory: the relationship between trigonometry and inspection-site geometry, and how the parts, weldments, materials, or items were manufactured, from the raw material through end use and service life.
All of these elements determined the inspection method and the selection of acceptance requirements; the type of discontinuity encountered was directly related to the material and the method of processing, such as forging, extrusion, or casting.
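The trigonometry mentioned above shows up constantly in angle-beam ultrasonic work, where Snell's law sets the refracted angle in the part and simple geometry gives the skip distance along the surface. As a minimal sketch (the function names are mine, and the sound velocities are typical nominal values for a Perspex wedge and shear waves in steel, not calibrated figures):

```python
import math

def refracted_angle_deg(incident_deg, v_wedge, v_part):
    """Snell's law: sin(theta_refracted) / v_part = sin(theta_incident) / v_wedge."""
    s = math.sin(math.radians(incident_deg)) * v_part / v_wedge
    if s >= 1.0:
        raise ValueError("beyond the critical angle: no refracted beam of this mode")
    return math.degrees(math.asin(s))

def skip_distance(thickness, refracted_deg):
    """Surface distance for one full V-path (beam down to the back wall and back up)."""
    return 2.0 * thickness * math.tan(math.radians(refracted_deg))

# Example: ~2730 m/s longitudinal in a Perspex wedge, ~3250 m/s shear in steel
theta = refracted_angle_deg(36.0, 2730.0, 3250.0)   # refracted shear angle in the steel
skip = skip_distance(10.0, theta)                    # skip distance in a 10 mm plate
```

Before computers, calculations like these were worked by hand from the wedge angle, the material velocities, and the part thickness, which is why a solid grasp of the trigonometry was non-negotiable.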
For example, the forging process produces characteristic discontinuities and defects, from the processing of the raw ingot all the way through to the finished high-temperature forged valve. The terminology used to identify a discontinuity found by ultrasonic inspection was extremely important in determining product acceptance.
Today, NDT inspection has moved beyond aerospace and defense, and is becoming increasingly important to the medical industry. One of the most recognized forms of NDT within the modern healthcare market is the sonogram, or ultrasound.
A sonogram is a computerized picture taken by bouncing sound waves off organs and other interior body parts. This combination of computer technology and industrial NDT methodology has enabled health professionals to detect and identify health issues that previously could be detected only through invasive surgery, and also has allowed many soon-to-be parents to experience the gift of life growing before their very eyes.
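The "bouncing sound waves" idea above is the same pulse-echo principle used in industrial ultrasonics: the instrument times how long a pulse takes to reach a reflector and return, and halves the round trip to get depth. A minimal sketch (the function name is mine; ~1540 m/s is the conventional calibration speed of sound for soft tissue):

```python
def reflector_depth_mm(round_trip_time_us, velocity_m_per_s=1540.0):
    """Pulse-echo ranging: the pulse travels to the reflector and back,
    so depth = velocity * time / 2."""
    distance_m = velocity_m_per_s * (round_trip_time_us * 1e-6)
    return (distance_m / 2.0) * 1000.0  # one-way depth, in millimetres

# A ~65 microsecond round trip in soft tissue corresponds to roughly 50 mm depth
depth = reflector_depth_mm(65.0)
```

A scanner repeats this timing for thousands of beam directions per second and maps the echo strengths into the familiar grayscale image.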
The predecessor to the sonogram was the X-ray, a technology that remains in use today, primarily for inspecting bone fractures and other injuries, while sonograms are used to examine tissue formations.
Dr. Wilhelm Conrad Roentgen discovered X-rays in 1895. Three days before Christmas, he brought his wife into his laboratory and photographed her hand and ring finger, revealing her bones. Three years later, in 1898, scientist Marie Curie and her husband Pierre published a paper announcing the existence of a new element, which they named polonium in honor of her native Poland. In December of that year, the Curies announced the existence of a second element, which they named radium for its intense radioactivity, a word they coined and from which the term radiography is derived.
Early in my career, when I began working in radiography, we had big, bulky X-ray machines that often broke down. Nobody wanted to use them. Instead, we used radiographic isotopes such as Iridium-192, Cobalt-60, and Cesium-137 because they were easier to transport and quicker to set up; we did not have to plug them in, so we could shoot at remote locations.
However, safety was a huge factor when using isotopes, and it was not uncommon to hear of an incident of workers becoming irradiated. There also were many records of isotopes being left by mistake in unsecured areas. As a result, many people were injured or died before strict rules and regulations were adopted and the Atomic Energy Commission (AEC), now known as the Nuclear Regulatory Commission (NRC), was established.
Thankfully, we have moved past the necessity of using isotopes, and thanks to the portability of today's sonogram and X-ray equipment, these technologies can be used safely by medical professionals virtually anywhere.
I can’t wait to see what the future holds for this industry.