Manufacturing often involves the fabrication of products that are made up of multiple smaller parts or components. Assembling these parts into finished products can be complex and labor-intensive. Robots have been assembling parts into products since the mid-20th century [REF-Unimate], and the automotive industry is perhaps best known for robotic assembly, as automotive factories employ roughly 30% of all industrial robots worldwide [REF-WR 2019].

If you’re like me (a gadget and technology nerd), you can’t help staring at big industrial robots at work in an automotive factory, mesmerized by their ability to move so quickly and so precisely from one position to another, repeating the same tasks with no end in sight. The long conveyor belts, vibratory feeders, and complex jigs and fixtures all blend together into such a carefully orchestrated symphony that you almost forget about the final product.

If you’re an engineer who works with robots, vision systems, sensors, and algorithms (like me), you also can’t help thinking about how to improve upon this beautiful dance unfolding right before your eyes. And if you’re lucky enough that your job requires you to do just that, then you feel humbled by the countless decades of research and development that went into making what you are witnessing a reality, and you wonder whether you are even capable of improving upon perfection.

Arthur C. Clarke said that “any sufficiently advanced technology is indistinguishable from magic,” a sentiment echoed by others before and after him [REF-Clarke]. Unfortunately, this “magic” effect lasts for only a relatively short time: once an advanced technology becomes part of everyday life, we take it for granted. As magical as today’s manufacturing robots are, as a roboticist I can’t help but notice that the majority of industrial robots today are not as intelligent as we might imagine.

In fact, I can safely tell you that industrial robots, for the most part, don’t understand much about what is going on in the world around them. They are simply moving with very high precision along preprogrammed paths, which is what they are best known for. The robots’ lack of awareness of their surroundings makes them dangerous, but predictable. Humans, on the other hand, are less predictable, and that’s why, for the majority of their existence, robots have been caged and separated from people.


Figure 2: A collaborative robot with a 3D imaging system built into its gripper performing a peg-in-hole insertion task using a NIST prototype assembly task board. [REF-NIST1]

To take advantage of an industrial robot arm’s ability to move repeatedly from one point to another with very high precision, companies invest a lot of money in custom conveyance systems, jigs, and end-of-arm tooling (EOAT) that present the robot with the parts it needs to work on in the exact same position every time. In fact, for every dollar a company spends on a robot, it spends two to six times that amount on this type of support equipment and on integration costs [REF-engineering.com]. If a part is not where a robot expects it to be, the robot may blindly continue its task, unaware that welds are being applied in the wrong place or that the part it was supposed to move from one location to another was never actually grasped. In addition, robots often require periodic calibration to restore the repeatability and accuracy that degrade over their normal course of operation [REF-Marvel and Messina].

Many small and medium manufacturers (SMMs) with low production volumes face a high barrier to entry when implementing robots on their shop floors. Not only do they often lack the expertise needed to integrate, calibrate, and program the robots, but they also lack the production quantities that can justify large investments in integration services and in the fabrication of custom conveyance systems, jigs, and tooling. Collaborative robots promise to make integration easier (and safer) for these SMMs, but for assembly tasks that require precise fitting of small parts, collaborative robots (and even industrial robots) alone may not be able to perform the work without the added costs of conveyors, jigs, and tooling.

This is where three-dimensional (3D) imaging comes in. A 3D imaging system is “a non-contact measurement instrument used to produce a 3D representation (for example, a point cloud) of an object or a site” [REF-E2455]. Although machine vision has been used in robotic applications for a while, the majority of these applications use 2D sensing technologies, such as electro-optical (EO) cameras. Three-dimensional imaging systems (or 3D machine vision), such as LiDAR (light detection and ranging) sensors or RGB-D (red, green, blue, and depth) cameras, some of which can be had for as little as a few hundred dollars, promise to make robotic assembly more accessible to SMMs. For robotic tasks such as random bin picking, in which parts must be identified and grasped by a robot out of a bin of identical (or different) parts, 3D imaging provides the information needed to calculate the precise position and orientation (or pose) of the parts, a task that would otherwise be difficult with traditional 2D machine vision techniques.
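To make the pose-calculation step concrete, here is a minimal sketch of one common approach: matching a known part model against a point cloud of the bin using feature-based coarse registration followed by ICP refinement. It assumes the open-source Open3D library (v0.12 or later); the file names, voxel size, and thresholds are illustrative placeholders, not a definitive implementation.

```python
import open3d as o3d

VOXEL = 2.0  # downsampling voxel size, in the scan's units (e.g., mm)

def preprocess(pcd):
    """Downsample, estimate normals, and compute FPFH shape features."""
    down = pcd.voxel_down_sample(VOXEL)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=2 * VOXEL, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * VOXEL, max_nn=100))
    return down, fpfh

# Hypothetical inputs: a CAD-derived model of the part and a scan of the bin
model, model_fpfh = preprocess(o3d.io.read_point_cloud("part_model.ply"))
scene, scene_fpfh = preprocess(o3d.io.read_point_cloud("bin_scan.ply"))

# Coarse alignment: RANSAC over FPFH feature correspondences
coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
    model, scene, model_fpfh, scene_fpfh, True,
    3 * VOXEL,
    o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
    3,
    [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(3 * VOXEL)],
    o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

# Fine alignment: point-to-plane ICP starting from the coarse estimate
fine = o3d.pipelines.registration.registration_icp(
    model, scene, VOXEL, coarse.transformation,
    o3d.pipelines.registration.TransformationEstimationPointToPlane())

print(fine.transformation)  # 4x4 homogeneous pose of the part in the camera frame
```

The resulting 4×4 transformation maps the part model into the camera frame; a real cell would then convert that pose into the robot’s base frame using a hand-eye calibration before commanding a pick.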

However, relative to 2D machine vision cameras, 3D imaging systems are new to industrial robotic applications, and they rely on many different techniques to measure shape [REF-Sansoni, REF-Anderson]. The choice among the various active and passive measurement techniques, and among the software approaches for identifying objects and calculating their poses, can greatly affect the performance of a 3D imaging system. Add to that the material properties of the objects being imaged and the ambient conditions of the environments in which they are imaged, and choosing the right 3D imaging system for a specific robotic assembly application becomes a rather difficult problem.

If you are selecting a 3D imaging system for picking parts from an unorganized (i.e., random) bin of parts using a robot, then the primary task for the 3D imaging system is to identify the part (or parts) in the bin with the highest likelihood of being successfully picked by the robot. This means that the 3D imaging system must measure the pose of the part and provide that pose to the robot so that it can execute the pick. Any uncertainty in the part’s pose due to errors in the 3D imaging system has a direct impact on whether the robot will be able to execute a successful pick. Therefore, quantifying the performance of a 3D imaging system is crucial.
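As a rough illustration of that selection step, the hypothetical helper below ranks candidate detections (for example, one registration result per part instance from the sketch above) and keeps only those whose fit quality suggests a pick is likely to succeed; the thresholds are invented for illustration.

```python
# Each hypothetical candidate: (pose_4x4, fitness in [0, 1], inlier_rmse)
def best_pick(candidates, min_fitness=0.6, max_rmse=1.0):
    """Return the candidate most likely to yield a successful pick, or None."""
    viable = [c for c in candidates if c[1] >= min_fitness and c[2] <= max_rmse]
    return max(viable, key=lambda c: c[1], default=None)
```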

While interface standards, such as GigE Vision [REF-AIA], define protocols for how different machine vision sensors should transfer images over various types of cables, there are few standards that define how manufacturers should specify sensor performance. Different manufacturers have different ways of specifying the performance of their machine vision systems, which makes it challenging to compare different systems. When it comes to 3D machine vision systems that can be used for robotic assembly applications, there are even fewer applicable standards to choose from.

Of the machine vision standards that currently exist, only two address the ability of 3D imaging systems to measure objects within their work volume. The VDI/VDE 2634 Part 2 and Part 3 guidelines, as well as the ISO 10360-8 standard, define a 3D imaging system’s measurement performance using several metrics (such as “probing form error” and “flatness measurement error”) that are evaluated through a standard procedure [REF-2634, REF-10360]. The ASTM E2919 “Standard Test Method for Evaluating the Performance of Systems that Measure Static, Six Degrees of Freedom (6DOF), Pose” specifies a procedure for calculating a 3D imaging system’s error in measuring the pose of a single object under ideal conditions. These standards evaluate only small aspects of the performance of 3D imaging systems, and more standards are needed for a more complete picture.
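ASTM E2919 prescribes its own statistics and test procedure; purely as an illustration of the underlying quantity, the sketch below computes the translation and rotation components of the error between a ground-truth pose and a measured pose, each expressed as a 4×4 homogeneous matrix. This is a minimal example, not the standard’s method.

```python
import numpy as np

def pose_error(T_true, T_meas):
    """Translation error (same units as the poses) and rotation error (degrees)."""
    dt = np.linalg.norm(T_true[:3, 3] - T_meas[:3, 3])
    dR = T_true[:3, :3].T @ T_meas[:3, :3]  # residual rotation
    ang = np.degrees(np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)))
    return dt, ang

# Toy example: measured pose is 1 mm off in x and rotated 2 degrees about z
c, s = np.cos(np.radians(2.0)), np.sin(np.radians(2.0))
T_meas = np.array([[c,  -s,  0.0, 1.0],
                   [s,   c,  0.0, 0.0],
                   [0.0, 0.0, 1.0, 0.0],
                   [0.0, 0.0, 0.0, 1.0]])
print(pose_error(np.eye(4), T_meas))  # -> (1.0, ~2.0)
```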

The National Institute of Standards and Technology’s (NIST) Intelligent Systems Division is working with the 3D imaging industry to develop performance standards for systems that can be used for robotic assembly applications [REF-NIST2]. These standards are being developed through ASTM Committee E57 on 3D Imaging Systems [REF-E57]. As part of this effort, NIST conducted a market survey that identified over 100 commercial 3D imaging systems, from over 40 companies, that can be used for robotic assembly applications. The systems’ specifications used a wide range of non-standard terms to describe performance, and only one of the 100+ systems referenced a published performance standard (VDI/VDE 2634 Part 2) in its specifications.

NIST and ASTM E57 held a series of virtual meetings with stakeholders from the 3D imaging industry and from academia in 2019 in order to develop a list of the standards that are needed. These meetings culminated in a two-day workshop that was held at NIST in December 2019 in which the list of needed standards was refined and prioritized. Of the 39 standards identified, participants chose the six highest-priority standards and expanded them into work items for immediate development. The full list of 39 standards will be published in an upcoming NIST report in which the list will be incorporated into a roadmap of standards needed for 3D imaging systems that can be used for robotic assembly applications.


References:

[REF-Clarke] Clarke, Arthur C. (1973). Profiles of the Future: An Inquiry into the Limits of the Possible. Popular Library. ISBN 978-0-33023619-5

[REF-Unimate] https://en.wikipedia.org/wiki/Unimate

[REF-WR 2019] International Federation of Robotics, Executive Summary WR 2019 Industrial Robots, 2019, https://ifr.org/downloads/press2018/Executive%20Summary%20WR%202019%20Industrial%20Robots.pdf

[REF-engineering.com] “The Real Costs of an Industrial Robot Integration,” engineering.com, https://www.engineering.com/ResourceMain?resid=858

[REF-Marvel and Messina] Marvel, J., Messina, E., Helping Robots Stay on Target, Quality Magazine, December 2, 2019, https://www.qualitymag.com/articles/95828-helping-robots-stay-on-target

[REF-E2455] ASTM E2455-11a(2019), Standard Terminology for Three-Dimensional (3D) Imaging Systems, ASTM International

[REF-Sansoni] Sansoni, G., Trebeschi, M., and Docchio, F., State-of-The-Art and Applications of 3D Imaging Sensors in Industry, Cultural Heritage, Medicine, and Criminal Investigation, Sensors 2009, 9(1), 568-601; https://doi.org/10.3390/s90100568

[REF-Anderson] Anderson, J., A Closer Look at 3D Imaging, Quality Magazine, September 1, 2016, https://www.qualitymag.com/articles/93544-a-closer-look-at-3d-imaging

[REF-AIA] GigE Vision - True Plug and Play Connectivity, Automated Imaging Association, https://www.visiononline.org/vision-standards-details.cfm?type=5

[REF-2634] VDI/VDE 2634 Part 2, Optical 3-D measuring systems - Optical systems based on area scanning, 2012, and Part 3, Optical 3D-measuring systems - Multiple view systems based on area scanning, 2008, VDI – The Association of German Engineers, https://standards.globalspec.com/std/9914533/vdi-vde-2634-blatt-2 and https://standards.globalspec.com/std/9914423/vdi-vde-2634-blatt-3

[REF-10360] ISO 10360-8:2013, Geometrical product specifications (GPS) — Acceptance and reverification tests for coordinate measuring systems (CMS) — Part 8: CMMs with optical distance sensors, International Organization for Standardization, 2013, https://www.iso.org/standard/54522.html

[REF-NIST1] The NIST Assembly Performance Metrics and Test Methods, National Institute of Standards and Technology, 2020, https://www.nist.gov/el/intelligent-systems-division-73500/robotic-grasping-and-manipulation-assembly/assembly

[REF-NIST2] The NIST Perception Performance of Robotic Systems Project, National Institute of Standards and Technology, 2020, https://www.nist.gov/programs-projects/perception-performance-robotic-systems

[REF-E57] ASTM Committee E57 on 3D Imaging Systems, ASTM International, 2020, https://www.astm.org/COMMITTEE/E57.htm