The selection of an appropriate 3-D technique depends on the feature(s) to be quantitatively determined.

Figure 1. Illumination techniques used in 3-D imaging: white light and laser-generated lines of light incident at different angles. The rectangles at the conveyor show the areas on the object that are imaged onto different areas of the CMOS sensor in the camera. The four dashed lines from the camera show the field of view on the conveyor belt. (Photo courtesy of Sick)



In this article, 3-D imaging is considered primarily in the context of on-line product and process control. It can obtain data on three-dimensional objects for dimensional gauging, defect detection (including the depth and height of defects), part recognition, and assessment of structural and material integrity.

Applications continue to expand, including confirming the integrity of complex structures, such as engines and integrated circuits, determining the presence of required components and features, and robot-controlled recognition for pick-and-place operations.

3-D techniques are also used in high-volume consumer products, medical diagnostics (ultrasound and remote/battlefield surgery), and laser-guided weapons for ranging, guidance and targeting.

As will be evident from the examples that follow, the selection of an appropriate technique depends on the feature(s) to be quantitatively determined: the x-y-z resolutions, imaging distances and optical field required; the time available per individual inspection; and any special, useful characteristics of the object.

Structured lighting, special optical techniques and signal processing can provide significant advantages in these areas, as well as in speed and reliability.

Basic 3-D imaging techniques include:

triangulation

time-of-flight/range imaging

microscopy

coherent illumination (including interferometry, holography and tomography).



Variations in illumination techniques, such as structured laser light (lines, line arrays and dot arrays of laser light), are used in different imaging techniques.

In Figure 1, different light-source geometries are shown. The three arrows at the top point to three illumination sources. The top left arrow points to a white-light source that illuminates a broad area on the conveyor belt. The top middle arrow points to a laser that projects a plane of light, producing a line of light at the conveyor for 3-D measurement of surface height. The top right arrow points to another laser source, incident at a different angle, that provides scattered light from the tablets for presence/absence detection; the surface-height sensor can also be used for presence/absence detection.

Different signal-processing techniques are applied to different probes. In time-of-flight imaging, the direct time delay can be used. Some commercially available products measure the phase and amplitude of an RF-modulated illuminating beam in parallel. And there are variations within the variations.

Although 3-D imaging has been applied using optics, ultrasound, radar and other probes, we shall emphasize optical applications, but include atomic force microscopy. However, the basic physical concepts, which depend on geometry, wavelengths, frequency and timing, are applicable to non-optical techniques. For brevity, coherent optical techniques will not be discussed.



Figure 2. A 3-D imager using triangulation. In the model shown, there are modules at the left and right: one end module provides the illumination at the area/volume of interest, and the other end provides the photosensor; this geometry is common in automated vision units. In simpler models, the illumination can provide a focused spot or line and the photosensor detects the reflected optical signal.

Basic Techniques

1. Triangulation

Triangulation is used to detect the displacement in the apparent position of a narrow laser beam or line of light incident on a surface. It has been used to measure small displacements and the presence of objects, as well as variations in feature dimensions.

The technique is referred to as triangulation because, in a simple system with a narrow laser beam, three points form a triangle: the emitting light source on a bar, the point of incidence of the small-diameter light beam on the object, and the photosensor along the bar that images the narrow light beam at the object.

To measure a 3-D profile, a laser is mounted to project a laser line onto an object at an angle, again, as shown in Figure 1. The camera, which views the line from a different angle, sees a curve that is generated by the changes in surface height. The camera would see a straight line if the surface were flat.
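As a rough numerical sketch of that geometry (the angle, pixel scale and shift values below are illustrative assumptions, not parameters of any particular system), the height profile follows directly from the line's lateral shift in the image:

```python
import numpy as np

def height_from_line_shift(shift_px, mm_per_px, laser_angle_deg):
    """Convert the lateral shift of an imaged laser line into surface height.

    Assumes the camera looks straight down and the laser sheet is inclined
    at laser_angle_deg from the vertical, so a height change dz shifts the
    line laterally by dx = dz * tan(angle).
    """
    shift_mm = np.asarray(shift_px, dtype=float) * mm_per_px
    return shift_mm / np.tan(np.radians(laser_angle_deg))

# Illustrative line shifts (pixels) measured along one scan of the conveyor.
shifts = [0.0, 1.5, 3.2, 3.1, 0.2]
profile = height_from_line_shift(shifts, mm_per_px=0.05, laser_angle_deg=30.0)
print(profile)  # surface heights in mm along the line
```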

In triangulation, as with most of the 3-D imaging & gauging techniques, there are many geometrical and optical variations to improve different features of interest. Special characteristics of the objects should be used when possible to improve resolution, sensitivity and reliability.

The CMOS area sensor in the camera can provide 3-D gauging and enables multifunctional operation. Different sections of the sensor are configured in software, with appropriate lighting, to detect different characteristics, such as surface reflection, different wavelengths and laser scatter, as illustrated in Figure 1. Measurements are acquired in a line-scan manner.

The three rectangular areas in Figure 1 illustrate how different functions in the image are mapped onto different areas of the CMOS sensor in the camera. The narrow rectangle at the bottom left shows the grayscale image area in the field of view that is mapped onto the sensor. The broad rectangle shows where the 3-D area is mapped onto the sensor. The next thin rectangle shows where the scattered light is imaged onto the sensor. The field of view of the camera is indicated by the four dashed lines from the camera.
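In software, this mapping amounts to assigning different row bands of each captured frame to different measurements. The following is a minimal sketch; the frame size and band boundaries are invented for illustration:

```python
import numpy as np

# One frame from the CMOS area sensor (rows x columns); values are simulated.
frame = np.random.randint(0, 4096, size=(512, 2048), dtype=np.uint16)

# Software-configured row bands, one per function (boundaries are assumptions).
regions = {
    "grayscale":  frame[0:64, :],     # band imaging the white-light area
    "3d_profile": frame[64:448, :],   # band where the laser line falls
    "scatter":    frame[448:512, :],  # band imaging the scattered laser light
}

# For the 3-D band, the laser line's row position in each column gives one
# height sample; repeating this as the object moves yields a line-scan profile.
line_pos = regions["3d_profile"].argmax(axis=0)  # row of peak intensity per column
```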

In the camera, different areas on the sensor can be assigned to different image components by software. The software can be configured to provide several measurements at the same time, including 3-D profile, laser scatter, color and monochrome. As with other vendors' systems, it provides calibration tools. The 3-D capability was introduced in late 2010.

Figure 2 is an example of another triangulation system. The basic geometry is as described above: laser light illuminates the object, and a series of detectors senses the imaged area. This could be a simple triangulation system, as described above, using a narrow beam of light.

However, it uses the laser light to form an interference (fringe) pattern that illuminates the sample. With this structured lighting, the triangulation is said to provide depth (z) resolution down to 30 microns (1.2 mils), with lateral resolution down to 200 microns.
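One common way to turn a projected fringe pattern into depth (a sketch of generic four-step phase shifting, not necessarily the method used in the Figure 2 system) is to capture the scene under four known fringe shifts and recover the phase, which encodes local height, at every pixel:

```python
import numpy as np

def fringe_phase(i0, i1, i2, i3):
    """Four-step phase shifting: i0..i3 are frames captured with the fringe
    pattern shifted by 0, 90, 180 and 270 degrees. For I_k = A + B*cos(phi +
    k*pi/2), the wrapped phase is atan2(I3 - I1, I0 - I2); after unwrapping
    and calibration it is proportional to local surface height."""
    return np.arctan2(np.asarray(i3, float) - np.asarray(i1, float),
                      np.asarray(i0, float) - np.asarray(i2, float))
```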

Other 3-D vision systems use geometric pattern matching against stored images to provide real-time, three-dimensional position information for working with a variety of stacked or tilted parts.

It applies multiple sets of stored two-dimensional images and uses geometric pattern matching tools from its integrated 3-D-Locate software to determine an object’s 3-D orientation. Single or multiple cameras can be used for different resolutions.

Calibration tools provide the necessary corrections for optical distortion and camera position. It is said to tolerate non-uniform lighting and even partly covered parts.
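As a rough sketch of the underlying computation (using OpenCV's generic solvePnP routine rather than the vendor's 3-D-Locate software; all point values and camera intrinsics below are invented), matching known model points to their found image locations yields the part's 3-D pose:

```python
import numpy as np
import cv2  # OpenCV

# Known 3-D model points on the part (mm) and where pattern matching found
# them in the image (pixels). All values here are invented for illustration.
model_pts = np.array([[0, 0, 0], [50, 0, 0], [50, 30, 0], [0, 30, 0]], dtype=np.float64)
image_pts = np.array([[312, 240], [410, 245], [408, 301], [310, 298]], dtype=np.float64)

# Camera intrinsics from calibration (assumed focal length and principal point).
K = np.array([[1000.0, 0.0, 320.0],
              [0.0, 1000.0, 240.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(model_pts, image_pts, K, None)
# rvec/tvec give the part's 3-D orientation and position in the camera frame.
```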


3-D imaging is also used to play various games without any hand-held devices. I steered a boat on a white-water river by moving my body (shown on the video display) to avoid images of rocks. This consumer product is of interest because various groups are working both to understand how it works and to apply it to practical manufacturing problems.

It has the advantage of providing 3-D images interactively at a very low cost.

With this technology, an invisible (infrared) array of dots is projected from one end of a bar, and a camera at the other end of the bar images the dots. Dynamic triangulation is used to find distances.

The triangulation aspect is simply demonstrated by deliberately bending the bar, which changes the detected distances, as anticipated.[1]
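Stripped to its essentials (a sketch only; the focal length and baseline below are placeholders, not the product's actual parameters), each dot's shift relative to a stored reference pattern gives depth through the same triangulation relation used in stereo vision:

```python
def depth_from_disparity(disparity_px, focal_px=580.0, baseline_m=0.075):
    """Depth from the shift (disparity) of a projected dot relative to its
    position in a reference pattern: z = f * b / d, the standard
    triangulation relation. focal_px and baseline_m are placeholders."""
    return focal_px * baseline_m / disparity_px

print(depth_from_disparity(25.0))  # ~1.74 m for a 25-pixel dot shift
```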

2. Time-of-Flight Optical Imaging

Time-of-flight imaging measures the time delay between sending a beam to an object and receiving the reflected beam at a sensor. It is used in familiar applications, such as radar (to determine the 3-D distance to airplanes) and ultrasound medical imaging (to image fetuses within the pregnant mother).

LIDAR (Light Detection And Ranging) is the optical equivalent of radar. It is essentially a time-of-flight, range-imaging technique.

The distance to a surface can be measured using the time delay between emitted and returned pulses, or using the phase and amplitude of an RF-modulated illuminating beam relative to the return beam.

[Note: The speed of electromagnetic waves is 299,792,458 meters per second (3.00 x 10^8 m/sec, or about 3.3 nanoseconds per meter). Electronics with time responses 1,000 times shorter than a nanosecond are commercially available.]
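Both measurement styles reduce to short formulas. A minimal sketch using the constant from the note (the 20 MHz modulation frequency is an assumed example value):

```python
import math

C = 299_792_458.0  # speed of light, m/s (from the note above)

def distance_from_pulse(round_trip_s):
    """Direct time-of-flight: the pulse travels out and back, so halve it."""
    return C * round_trip_s / 2.0

def distance_from_phase(phase_rad, mod_freq_hz=20e6):
    """Phase-shift ranging with an RF-modulated beam: a 2*pi phase shift is
    one modulation wavelength of round trip, so range is unambiguous only
    up to C / (2 * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

print(distance_from_pulse(6.67e-9))      # ~1.0 m for a 6.67 ns round trip
print(distance_from_phase(math.pi / 2))  # ~1.9 m at 20 MHz modulation
```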

The seemingly relentless advances in electronics (Moore’s Law) have resulted in corresponding improvements in this technique.



3. Microscopy

Remarkable advances continue to be made in 3-D microscopy, including near-field optical microscopy, atomic force microscopy and confocal microscopy. Information on atomic positions can be obtained.

All of these techniques provide 3-D information at resolutions considerably better (e.g., nanometers) than is available from conventional microscopy, which is Rayleigh-criterion (diffraction) limited to about one-half the wavelength of the light used.
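For concreteness, the diffraction limit can be computed directly; the wavelength and numerical aperture below are typical example values, not tied to any instrument in the text:

```python
def rayleigh_limit_nm(wavelength_nm, numerical_aperture):
    """Rayleigh criterion: smallest resolvable separation is about
    0.61 * wavelength / NA."""
    return 0.61 * wavelength_nm / numerical_aperture

# Green light through a high-NA oil-immersion objective: roughly half the
# wavelength, versus nanometers for near-field and atomic force microscopy.
print(rayleigh_limit_nm(550.0, 1.4))  # ~240 nm
```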



Near-Field Optical Microscopy

This is done by placing the detector very close (distance much smaller than the light wavelength) to the specimen surface.

In near-field scanning optical microscopy, the resolution of the image is limited by the size of the detector aperture and not by the wavelength of the illuminating light. In particular, lateral resolution of 20 nm and vertical resolution of 2–5 nm have been demonstrated. Dynamic properties can also be studied at a sub-wavelength scale using this technique.

Near-field optical scanning is also used to determine the deflection of the cantilevered mechanical probe used in atomic force microscopy.



Atomic Force Microscopy

Atomic force microscopy can achieve resolutions of less than 1 nanometer. It is one of the primary tools for imaging, measuring and manipulating matter at the nanoscale.

The information is gathered by sensing the pull between the tip of a cantilevered mechanical probe, which has a very small radius, and nearby atoms on the surface. The probe is scanned very precisely near the surface using piezoelectric devices that make small but precise movements under computer control.

In some units, the deflection of the cantilevered mechanical probe is gauged using near-field optical detection (as above).
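Conceptually, the scan is a raster with a feedback loop that keeps the tip-surface interaction constant; the z corrections recorded at each point form the height image. The sketch below is a simplification with proportional-only feedback, and read_deflection and the gain are hypothetical stand-ins for real instrument I/O:

```python
def afm_scan(read_deflection, rows, cols, setpoint, gain=0.1, iters=20):
    """Raster-scan sketch: at each (x, y) pixel, nudge the z piezo until the
    cantilever deflection returns to the setpoint, then record z as height.

    read_deflection(x, y, z) is a hypothetical stand-in for reading the
    optically sensed cantilever deflection at piezo position (x, y, z).
    """
    image = [[0.0] * cols for _ in range(rows)]
    z = 0.0
    for y in range(rows):
        for x in range(cols):
            for _ in range(iters):                 # a few feedback steps
                error = read_deflection(x, y, z) - setpoint
                z += gain * error                  # sign convention is instrument-specific
            image[y][x] = z                        # height-map pixel
    return image
```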



Confocal Microscopy

In a confocal microscope, the illumination is focused on a small area of the sample, and the sensing optics simultaneously image the same focused area.

The imaging process uses both spatial and spectral filters, resulting in increased resolution and optical contrast. Light from nearby sample volumes and from off-axis diffraction peaks is virtually eliminated.

The 2-D image of a plane (a “slice” through the sample) is then formed by scanning: row after row, individual spots in the plane are illuminated and imaged.

The three dimensional image is formed by stacking these 2-D slices, similar to the way that biologists physically section a biological sample with a microtome and then reconstruct the 3-D physical sample.
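In software, that reconstruction is simply a stack of 2-D arrays indexed by focal depth. A minimal sketch, where acquire_slice is a hypothetical stand-in for the microscope's acquisition call:

```python
import numpy as np

def build_volume(acquire_slice, n_slices, step_um):
    """Stack confocal slices into a 3-D volume.

    acquire_slice(depth_um) is a hypothetical function that returns one 2-D
    image (a numpy array) at the given focal depth.
    """
    slices = [acquire_slice(i * step_um) for i in range(n_slices)]
    return np.stack(slices, axis=0)  # shape: (n_slices, rows, cols)

# A maximum-intensity projection is one common way to view the volume:
# mip = build_volume(acquire_slice, n_slices=50, step_um=0.5).max(axis=0)
```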

The slices in confocal microscopy are thin and do not disturb the sample.

The basic confocal microscope was initially used, and is still used, to image fluorescence from samples, but the technique has been extended to non-fluorescent applications.

Confocal microscopy offers several advantages over conventional optical microscopy, including the elimination of image degrading out-of-focus information, controllable depth of field and the ability to collect serial optical sections from thick specimens.

One key to the confocal approach is the use of spatial filtering to eliminate out-of-focus light or glare in specimens that are thicker than the plane of focus.

There has been a large expansion in the popularity of confocal microscopy in recent years.



References

1. Electronic Design, Vol. 55, No. 5, April 7, 2011, p. 28: “How Kinect Really Works.”







Tech Tips

In triangulation, there are many geometrical and optical variations to improve different features of interest.

The seemingly relentless advances in electronics (Moore’s Law) have resulted in corresponding improvements in time-of-flight optical imaging.

One key to the confocal microscopy approach is the use of spatial filtering to eliminate out-of-focus light or glare in specimens that are thicker than the plane of focus.