The component choices available in the architecture and configuration of machine vision inspection solutions are more numerous than ever. From smart sensors to smart cameras to complex automated application-specific devices, the machine vision market offers a wide variety of choices. While no single inspection system, component, or tool will be appropriate for every application, digital cameras interfacing with a PC (sometimes called “tethered” cameras and “PC-based” systems) potentially make for one of the most flexible, powerful, and high-speed systems available to the machine vision integrator. This article discusses the basics of selecting and using digital cameras, and reveals some useful tips about real-world integration issues in industrial on-line applications.
Some engineers and end-users may be familiar only with machine vision smart camera components, but the PC-based or tethered camera system is by no means a new architecture; it was in fact the original machine vision system architecture, although early computing platforms were something other than a personal computer, and the tethered cameras were analog, not digital. Analog cameras require dedicated frame grabbers and can be difficult to integrate. Nonetheless, processor-based, tethered camera systems have always been a core option for wide-ranging inspection applications, and over the past eight to ten years one key technology driver has helped make this system architecture more accessible and viable than ever before: the development of the digital machine vision camera. Digital cameras now greatly surpass analog cameras in machine vision usage, with interfacing options growing yearly. For the remainder of this article, the term camera will refer to a digital camera tethered to a PC-type processor (as opposed to a smart camera).
Interface, interconnect, and programming standards
Cameras for machine vision applications connect to the host computer using different physical interfaces. The interface defines the hardware (electronics, wiring, connectors), firmware, and data transmission (image and control) protocol that the camera uses. Early digital interfaces were proprietary or sometimes application- or device-specific. In the early 2000s, however, generic standards emerged within the camera industry and were widely adopted by manufacturers. Probably the most familiar of these are GigE Vision, Camera Link, FireWire (IEEE 1394 DCAM), and USB. More recent interface standards include CoaXPress, USB3 Vision, Camera Link HS, and 10 GigE Vision. By adopting these protocols, manufacturers can offer users largely standardized choices for camera interface selection, with the general expectation of a simpler and more predictable integration of the camera with a PC and compatible machine vision software.
An important related standard for cameras is GenICam, which specifies a programming interface for industrial digital cameras covering camera configuration, image acquisition, data and event communications, and the camera GUI. This valuable standard helps to deliver a degree of interoperability between digital cameras provided by different manufacturers.
Selecting and implementing the right camera interface
Interface selection is highly application-specific, and for a given application there may be more than one viable protocol. Technical considerations for interface selection include speed (image data throughput and bandwidth), physical interconnect (connectors used, whether an independent card or frame grabber is required, availability of power over the cabling), cable type and maximum transmission length, CPU usage and load, software trigger latency and jitter, I/O support, and software interfacing flexibility and control functions. Other important but more subjective criteria include camera form factor and size, product availability (particularly in a required sensor format, resolution, and/or frame rate), and cost.
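A quick way to narrow the interface choices on the speed criterion is to estimate the sustained image-data bandwidth the application demands and compare it against each interface's rated throughput. The sketch below makes that estimate from resolution, frame rate, and bit depth; the 5 MP, 30 fps camera figures are illustrative assumptions, not a reference to any specific product.

```python
def required_bandwidth_mb_s(width, height, frame_rate, bits_per_pixel=8):
    """Approximate sustained image-data bandwidth in megabytes per second.

    Protocol overhead (headers, resends) is ignored, so in practice an
    interface should be rated well above this figure to leave headroom.
    """
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * frame_rate / 1e6

# Hypothetical example: a 5 MP (2448 x 2048) monochrome camera at 30 fps
bw = required_bandwidth_mb_s(2448, 2048, 30)
print(f"{bw:.0f} MB/s")  # ~150 MB/s -- more than a single GigE link
                         # (1 Gbit/s = 125 MB/s raw) can sustain
```

A result like this immediately rules out plain GigE Vision for the full frame rate and points toward faster interfaces (or a reduced frame rate, resolution, or region of interest).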
Each interface protocol has well-known technical capabilities and limitations, and detailed comparisons are available from a variety of sources outside this article. Ultimately, the decision rests heavily on the subjective considerations, but only once the technical requirements have been met.
Popular standard interfaces
Camera Link, GigE Vision and FireWire (IEEE 1394/DCAM/IIDC) are the most mature of the standards and are very well supported. Camera Link is a very high-speed, deterministic interface with virtually no CPU loading, though it offers only a short maximum cable length (without extenders) and requires a dedicated frame grabber. Camera Link is excellent for line-scan or other applications that require external triggering with low latency and high frame rates. It is a higher-cost solution, but a strong choice where dedicated speed and guaranteed image delivery are required. Camera Link cameras are the least standardized in terms of signal and command structure, and this can increase the complexity of an application, particularly if camera operation must be manipulated programmatically through direct serial communications. It is therefore extremely important with Camera Link to ensure that the selected camera is fully compatible with the targeted software package or library.
GigE Vision cameras have become highly popular and are considered by many to be the go-to component for general-purpose machine vision applications. Implementing GigE Vision is generally very easy if one carefully follows the recommendations for addressing and configuring network interface cards (NICs). Be very careful, too, of third-party software such as firewalls and anti-virus tools, which can block or disrupt the GigE interface with a camera. GigE Vision allows the longest native cable runs of any of these interfaces, and the interconnect is the familiar Ethernet RJ45 plug, with power over the cabling available if the NIC provides it. GigE cameras are relatively inexpensive and do not require a frame grabber, although it is highly recommended that each camera have a separate, dedicated Ethernet connection to the PC (no switches except for the slowest of applications).
This protocol by nature requires significant CPU load for image acquisition and memory storage, so specify a computing device accordingly. GigE Vision has somewhat poor software trigger latency and jitter, but this can be mostly solved by using a hardware trigger through the camera I/O instead of software triggering. GigE Vision would not be considered “deterministic” but is reasonable for many applications. Detractors point out that GigE Vision, unlike Camera Link or FireWire, does not have guaranteed image packet delivery. Again, true, but in practice not an issue for many applications.
FireWire is not a dedicated machine vision standard, but within the context of the IEEE 1394 standard, the specification for industrial and instrumentation digital cameras (IIDC) defines camera control capabilities that make FireWire cameras appropriate for machine vision applications using a basic PC connection (unlike basic USB cameras, which lack control and triggering specifications and are not well suited for machine vision). FireWire interfaces have lower bandwidth and speed than GigE Vision, and a very short maximum cable length, but otherwise offer similar features. FireWire image acquisition uses less CPU than GigE Vision, and the protocol is very standardized, even “plug-and-play.”
New additions: CoaXPress, USB3 Vision, 10 GigE Vision, Camera Link HS
Some emerging camera interface protocols bear special consideration. CoaXPress and USB3 Vision cameras are available, and the installed base appears to be expanding; 10 GigE Vision and Camera Link HS have not yet been widely implemented. The CoaXPress interface uses coaxial cable for image, data, signal, and power transfer. Its potential throughput is greater than that of any other protocol except 10 GigE Vision, and it provides for additional scaling using multiple cables. The cost-effective cabling might allow a CoaXPress camera to replace an old analog camera using existing infrastructure. CoaXPress does require a dedicated frame grabber card, but overall it is an option that deserves close consideration as more devices become available.
USB3 Vision takes advantage of the USB 3.0 port now standard on newer PCs. Speeds are considerably faster than the older USB and FireWire protocols. Cables are standard and can carry power, but still suffer from very short maximum lengths. Given its low cost and low system complexity, USB3 Vision may be useful for specific applications.
Other practical camera integration considerations
Sensors and resolution
When considering any imaging source for a machine vision application, resolution remains one of the most important selection criteria. The required pixel count is determined by the application, and it ultimately and fundamentally dictates the scope of imaging component selection. There may be trade-offs to consider regarding interface selection, but these must take a back seat to implementing the correct and appropriate pixel resolution over the desired field of view.
Keep in mind also that not all cameras are created equal. Within the marketplace one can find digital cameras based on exactly the same sensor, yet the remainder of the components and the firmware may not be at all similar. Select an appropriate sensor, but remain aware of other features and issues that might affect the ultimate delivery of a high-quality image from an easy-to-integrate component.
With higher resolution comes smaller pixel size, larger sensor size, or both. Smaller pixels gather less light (requiring more illumination) and increase the potential for noise and reduced dynamic range. Very small pixels may also contribute to loss of resolution due to diffraction, counter to the intent of having more pixels.
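The required pixel count can be estimated from the field of view and the smallest feature to be resolved. A minimal sketch follows; the pixels-per-feature value is a common rule of thumb (roughly 2 to 4 for detection, more for measurement), and the 0.2 mm defect over a 100 mm field of view is an illustrative assumption.

```python
import math

def pixels_needed(fov_mm, min_feature_mm, pixels_per_feature=3):
    """Minimum sensor pixels along one axis to resolve the smallest
    feature of interest across the field of view.

    pixels_per_feature is an application-dependent rule of thumb,
    not a universal constant.
    """
    return math.ceil(fov_mm / min_feature_mm * pixels_per_feature)

# Hypothetical example: detect 0.2 mm defects across a 100 mm field of view
print(pixels_needed(100, 0.2))  # 1500 -> e.g. a sensor at least
                                # 1500 pixels wide on that axis
```

Running this calculation on both axes, before any interface comparison, keeps resolution in the driver's seat where it belongs.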
Software and image acquisition
Camera functions can be highly flexible and programmable through parameter settings. Often cameras have the capability to perform a variety of image processing tasks internally, and can provide data as well as image information. The GenICam standard mentioned earlier details a generic programming interface designed to simplify the camera application programming interface (API) and make camera control standard regardless of manufacturer. Many cameras are GenICam “compliant” or “compatible.”
However, in commercial practice GenICam is something like the “pirate’s code” referred to in a familiar movie, where it was said that “the code [standard] is more what you’d call ‘guidelines’ than actual rules.” Many camera manufacturers do not fully implement GenICam, or modify it so that the implementation is unique to their specific products. This is not necessarily a shortcoming, and it might even result in a better product. From an integration point of view, however, the end result is that the user occasionally (or frequently) might have to access camera parameters manually, using explicit register addresses or command strings, in order to control advanced or sometimes even basic camera functions.
Machine vision software and libraries usually indicate GenICam compatibility, but native commands, even for image acquisition, may not always produce the correct or desired camera configuration. For certain applications it may be critical that the integrator be very familiar with the camera’s control set or register structure, and be able to manipulate parameters where needed, not just for basic functionality, but to get the best performance out of the camera and the application.
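The feature-by-name access pattern that GenICam defines can be illustrated with a toy stand-in. CameraNodeMap below is a stub written purely for this sketch, not a real library class; the feature names mirror common GenICam conventions, but the fallback behavior for a missing feature is an assumption about what a partially compliant camera might force the integrator to handle.

```python
class CameraNodeMap:
    """Toy stand-in for a GenICam-style feature node map (not a real API)."""

    def __init__(self):
        # A handful of commonly seen feature names, with example defaults.
        self._features = {
            "Width": 1920,
            "Height": 1080,
            "ExposureTime": 10000.0,  # microseconds
            "TriggerMode": "Off",
        }

    def get(self, name):
        return self._features[name]

    def set(self, name, value):
        if name not in self._features:
            # A partially compliant camera may simply not expose a feature
            # by its standard name; the real-world fallback is vendor-specific
            # register addresses or command strings, as discussed above.
            raise KeyError(f"feature {name!r} not implemented on this camera")
        self._features[name] = value

nodemap = CameraNodeMap()
# Configure hardware triggering by feature name, as GenICam intends:
nodemap.set("TriggerMode", "On")
nodemap.set("ExposureTime", 500.0)
print(nodemap.get("TriggerMode"))  # On
```

The point of the sketch is the failure path as much as the happy path: code that assumes every standard feature name exists will break on a camera that implements only part of the standard.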
The starting point of every machine vision application is the acquisition of an image. In an industrial setting, this acquisition commonly must be event-driven; that is, an image must be captured when a part or feature to be inspected is in place in front of the camera. The acquisition is “triggered” by an external signal. In some applications, this trigger is received by the PC or processor, and a software command is then sent to one or more cameras to initiate the acquisition. If this architecture is used, one must be aware of software triggering latency and jitter (non-determinism) in the image acquisition, a critically important point if the part is moving or the inspection involves high speeds or simultaneous imaging. Often a better approach is to trigger the camera directly through its available on-board I/O. This produces virtually zero latency (aside from that of the trigger signal itself) and allows precise synchronization of multiple cameras.
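The cost of trigger latency and jitter on a moving line can be estimated directly: the mean latency shifts the part’s imaged position by a fixed amount that can be calibrated out, while jitter adds shot-to-shot variation that cannot. A minimal sketch, using illustrative numbers rather than measured figures for any particular interface:

```python
def trigger_position_error_mm(line_speed_mm_s, latency_s, jitter_s):
    """Convert trigger timing error into positional error on a moving line.

    Returns (offset_mm, spread_mm): the systematic shift caused by the
    mean latency, and the shot-to-shot variation caused by jitter.
    """
    return line_speed_mm_s * latency_s, line_speed_mm_s * jitter_s

# Hypothetical example: a 500 mm/s conveyor with 2 ms software-trigger
# latency and 1 ms jitter. The fixed 1.0 mm offset can be calibrated
# out; the 0.5 mm spread cannot, which is one reason hardware
# triggering through the camera I/O is preferred for moving parts.
offset, spread = trigger_position_error_mm(500, 2e-3, 1e-3)
print(offset, spread)  # 1.0 0.5
```

If the spread approaches the size of the smallest feature being inspected, software triggering is ruled out for that application.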
Finally, a word about inspection software. The machine vision market offers both turnkey programs for PC-based inspection and libraries that must be used in a programming environment such as VB.NET, C#, or C++. In general, these tools provide a high level of flexibility, and often more processing and analysis options than are available on other platforms. Certainly capability brings some degree of complexity, but that should not be a deterrent. Packages are available that are highly user-friendly and still very powerful in providing access to the types of cameras discussed here.
A bright future for digital cameras in machine vision
Looking forward, it appears that ease of integration, power and capability have come together in the camera marketplace. End-users and integrators have embraced the “tethered” camera platform as a viable option in a growing application base. This is an architecture worth considering for your next inspection application.