Interfacing cameras to computers used to be a challenge. In the 1980s, when machine vision was in its infancy, most cameras used analog interfaces based on closed-circuit TV (CCTV) standards such as RS-170. Analog camera data was sent as composite video down coaxial cable and digitized by frame grabber boards built into dedicated equipment dubbed a “machine vision system.”
In the 1990s, digital cameras proliferated, but most used parallel data transmission, requiring cumbersome, expensive and quite short cables. Toward the turn of this century, PC-centric networking topologies like USB and IEEE-1394/FireWire became increasingly universal and offered a tantalizing taste of what easy machine vision might be like.
For machine vision applications, however, both delivered some frustration and disappointment. Mass-market video-to-PC technologies are used primarily with live video to make movies or to display live images for entertainment. Sometimes, the video must be saved to disk, but that is about the extent of the requirement. This means that USB and FireWire video have been optimized for transferring a stream of images from a camera directly to a display processor or directly to a mass storage device.
In machine vision, operators want a single image, or a set of images from a number of cameras, and they want the image pixel data to wind up in the host computer memory. The image data is considered one frame at a time, and it must be transferred to host computer memory as fast as possible so that image processing and analysis algorithms can start looking at the pixels and making decisions rapidly. It is not at all uncommon for a machine vision application to require that an image be fully analyzed within a few tens of milliseconds after it is acquired; indeed, the next image to analyze may already be on its way by that time.
USB cameras never really came close to providing good support for these types of transfers, and a workable FireWire interface took years to stabilize. Proprietary drivers were required, support for external trigger and strobe was camera-specific, and a huge amount of work had to be done before a reasonably stable set of solutions took hold.
Along the way, the machine vision industry became understandably impatient, and so embraced CameraLink, a high-speed combination serial-parallel digital interface based on the National Semiconductor ChannelLink chipset and low-voltage differential signaling (LVDS). CameraLink works well, with fast speeds up to 2.38 Gigabits per second (Gbps).
An Alternative
But now there is another general-purpose networking alternative for machine vision image data. Although not quite as fast as CameraLink at this time, the 1000BASE-T Ethernet standard, dubbed Gigabit Ethernet, or GigE, offers 1 Gbps transmission rates, a path forward to emerging higher-speed standards, and something surprisingly useful: conformance with the same networking infrastructure that is used in the bulk of today’s computer networks.
GigE is the same old Ethernet Local Area Networking (LAN) technology that has been around since the 1970s. It has gone through numerous upgrades from the old days of coaxial cable to the new twisted pair standards that run at up to 1 Gbps with cable lengths up to 100 meters. Emerging standards for 10 Gbps (10 GigE) and even 100 Gbps are in the design phases, although these may require single- or multi-mode fiber optic cable to achieve reasonable cable lengths.
A GigE vision system consists of entirely standard and familiar components. For example, it could use a 1/3-inch 1,024 x 768 color camera that runs at up to 20 frames per second (fps). In 8-bit YUV422 format, this works out to about 126 megabits per second (Mbps), about 1/8 of the GigE channel capacity. An accessory cable runs from the silver Hirose connector on the camera back to the connector lying atop the camera and provides convenient local access to trigger, strobe and camera power connections, while the blue CAT-5e or CAT-6 Ethernet cable carries all video commands and data and runs up to 100 meters to the GigE switch.
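The data-rate arithmetic above is easy to reproduce. A short sketch, using the example camera's figures (1,024 x 768, 20 fps, one byte per pixel; any other camera's numbers can be substituted):

```python
# Back-of-the-envelope data-rate check for the example GigE camera.
width, height = 1024, 768        # pixels
bytes_per_pixel = 1              # 8-bit data, per the example above
fps = 20                         # frames per second

frame_bytes = width * height * bytes_per_pixel       # 786,432 bytes per frame
data_rate_bps = frame_bytes * fps * 8                # bits per second on the wire
data_rate_mbps = data_rate_bps / 1e6                 # ~126 Mbps

gige_capacity_mbps = 1000
utilization = data_rate_mbps / gige_capacity_mbps    # ~0.13, about 1/8 of the channel

print(f"{data_rate_mbps:.0f} Mbps, {utilization:.0%} of GigE")
```

Running the same three lines of arithmetic for a candidate camera is a quick way to see whether it fits on a shared GigE port.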
The switch allows many cameras to be connected to the same port on a PC. A single GigE port can safely service several such cameras as long as the total data transfer requirement stays well below 1 Gbps.
A good rule of thumb with these networks is to pretend that bytes (B) contain 10 bits (b) instead of 8, and thereby rate the 1 Gbps channel as having a 100 megabyte per second (MBps) capacity. The other parameter to watch is latency: the time required to transfer the data from the camera to the PC. For a 1,024 x 768 camera, the frame size is 786 kilobytes (KB), which at 100 MBps will experience a transfer latency of about 8 milliseconds (ms). While this is not much time to wait, it is certainly something that needs to be understood in the design of the machine vision application. At 20 fps, acquisition time itself is 50 ms, so the transfer latency often is not significant for standard applications.
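The rule of thumb and the latency figure reduce to a few lines of arithmetic. A sketch (the four-camera budget at the end is an illustrative assumption, not a limit from the standard):

```python
# Latency rule of thumb: pretend a byte is 10 bits, so 1 Gbps ~= 100 MBps.
link_bps = 1_000_000_000             # GigE line rate
effective_Bps = link_bps / 10        # 100 MB/s effective, per the rule of thumb

frame_bytes = 1024 * 768             # 786,432 bytes for the example frame
latency_ms = frame_bytes / effective_Bps * 1000   # ~7.9 ms to move one frame

print(f"{latency_ms:.1f} ms")

# Budgeting several cameras on one switched port: keep the total well below 1 Gbps.
cameras = 4                          # hypothetical camera count
per_camera_mbps = 126                # from the example camera above
total_mbps = cameras * per_camera_mbps   # 504 Mbps, a comfortable margin
```

The same budgeting step tells you when a second GigE port, rather than a bigger switch, is the right answer.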
This analysis reveals the most significant advantage and challenge of GigE video: the GigE topology converts the problem of designing image acquisition hardware from a video problem to a networking problem. Fortunately, there are many low-cost and readily available products around to help with high-speed networking problems.
The cameras use UDP (the unacknowledged cousin of TCP), IP and the Ethernet MAC protocols for all of the camera-to-PC communications, handshaking and packet assembly/disassembly. Those who are not sure what this alphabet soup stands for should ask their IT person. They most likely know it all inside and out.
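To make the UDP layer concrete, here is a minimal sketch of a UDP receive loop with a loopback sender standing in for the camera. The port handling and the naive append-until-quiet reassembly are illustrative assumptions only: a real GigE Vision stream (the GVSP protocol) carries its own packet headers, block IDs and resend machinery, normally handled by the vendor's driver or SDK.

```python
import socket

# Receiver: where a camera's UDP image stream would land on the PC.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))            # let the OS pick a free port
rx.settimeout(1.0)
port = rx.getsockname()[1]

# Stand-in for the camera: send two "packets" of pixel data over loopback.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(b"\x10" * 1400, ("127.0.0.1", port))
tx.sendto(b"\x20" * 1400, ("127.0.0.1", port))
tx.close()

# Naive reassembly: append payloads until the stream goes quiet.
frame = bytearray()
try:
    while True:
        payload, _ = rx.recvfrom(9000)   # jumbo-frame-sized receive buffer
        frame.extend(payload)
except socket.timeout:
    pass
rx.close()

print(len(frame))   # bytes collected from both packets
```

Because UDP is unacknowledged, real systems must detect and recover lost packets; GVSP's sequence numbers and resend requests exist precisely for that.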
Vision Application
What about making this into a real machine vision application? The industry’s experience with FireWire and CameraLink helps tremendously here. The accessory connector on the standard GigE camera supports the local trigger and strobe control needed to synchronize image acquisition and to fire an illumination system for stopping fast-moving parts. In this way, the camera is triggered directly and fires an optional strobe before commencing acquisition.
Furthermore, the Automated Imaging Association (AIA) has rolled out a set of protocol standards called GigE Vision. These standards, along with a way of describing camera capabilities called GenICam, make it possible to mix and match different cameras from different vendors on the same GigE network. Camera setup and control functions are standardized across vendors for the first time in machine vision history. GigE Vision also reduces the complexity of changing camera hardware in an existing application, permitting much easier upgrades, for example, to higher-resolution or faster cameras when application requirements change.
Most vision system and camera suppliers now extensively support GigE and GigE Vision, leaving more time for the vision experts to worry about lighting, optics and what to do with the pixels after they arrive, instead of camera interfaces, frame grabbers and specialty cabling. This clears the path toward bigger, more complex and more reliable machine vision applications.
Ned Lecky is the president of Lecky Integration (Albany, NY). For more information, call (518) 258-5874, e-mail firstname.lastname@example.org or visit www.lecky.com.
Tech Tips
- GigE Vision standards make it possible to mix and match different cameras from different vendors on the same GigE network.
- GigE Vision also reduces the complexity of changing camera hardware in an existing application, permitting easier upgrades when application requirements change.
- Most vision system and camera suppliers now extensively support GigE and GigE Vision, which clears the path toward bigger, more complex and more reliable machine vision applications.