At the time of writing, six years have passed since the launch of GigE Vision 1.0. The standard has become one of the dominant interfaces in the machine vision market and continues to grow. This article traces the path blazed by GigE Vision, highlights the benefits and important features of the spec, and describes the steps the GigE Vision Committee is taking to ensure end-users get quality, GigE Vision-compliant devices.
IN THE BEGINNING, THERE WAS ANALOG
The decade preceding GigE Vision’s 2006 launch saw the birth of digital solutions for the imaging industry. The late 1990s brought us IEEE 1394, or FireWire, and then Camera Link in 2001, but these interfaces didn’t wipe analog off the map. FireWire cameras were inexpensive and used a standard NIC instead of a frame grabber, so their attractive price point got a lot of attention. Camera Link offered higher acquisition and frame rates, but was not a low-cost solution. Both interfaces, however, were limited to cable lengths of 5-10 m, which prevented either standard from completely taking over the market.
GigE Vision, then, was developed as a digital solution that would permit long cables for machine vision applications, essentially offering a digital alternative to analog cameras. GigE Vision is a pair of protocols that handle the transfer of data, in packets, across a network: the GigE Vision Control Protocol (GVCP) dictates camera control, and the GigE Vision Streaming Protocol (GVSP) handles image streaming. Because it is built on Ethernet technology, the IEEE 802.3 Gigabit Ethernet standard, setting up multiple cameras in a vision system is as simple as building a network of PCs.
According to the AIA’s 2012 Machine Vision Camera Study, GigE Vision and FireWire are about equal when you compare the number of units sold. When you look at the benefits of GigE Vision, it’s not hard to see why it’s become popular. GigE Vision is fast: it supports real-time transfer rates of 125 MB/s. Standard CAT5e or CAT6 cables can be up to 100 m long and use standard connectors. The Ethernet protocol’s native support for fiber optics means that GigE Vision applications have the potential to operate over a range exceeding 500 meters, and even operate wirelessly. GigE Vision cameras are inexpensive and don’t need a frame grabber, so equipment costs remain low; you do need a network interface card (NIC), but most PCs already have one on-board. What really makes GigE Vision different from other standards is how it’s changed the traditional vision system’s structure. Instead of a PC controlling a camera in a point-to-point configuration, GigE Vision allows a single remote PC to control a network of cameras, and image data can be broadcast throughout that network.
AVAILABLE FEATURES IN VERSION 2.0
IEEE 1588, the Precision Time Protocol, was included in GigE Vision 2.0. The protocol essentially synchronizes the on-board time stamps of the cameras. Particularly useful for parallel systems with many cameras distributed across different PCs, it’s a feature that is unique to GigE Vision when compared to the other camera standards. Synchronizing cameras in a traditional frame grabber system requires a separate cable that runs between all the frame grabbers—which must then be synchronized in the application’s code, a process that many vision engineers describe as cumbersome. The IEEE 1588 protocol essentially broadcasts a master clock to all devices on the network. Those devices measure the latency through the Ethernet switch or the NIC and can compensate for it. Cameras, then, can synchronize image acquisition within a few microseconds, which is acceptable latency for many applications.
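The compensation IEEE 1588 performs can be reduced to simple arithmetic. In a standard PTP exchange, the master sends a Sync message (stamped t1, received at t2 on the slave’s clock) and the slave replies with a Delay_Req (sent at t3, received by the master at t4); from those four timestamps the slave can separate its clock offset from the network’s path delay. The function name below is my own; the math is the standard PTP computation:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Compute slave clock offset and mean path delay from one
    IEEE 1588 Sync / Delay_Req exchange.

    t1: master sends Sync          (master clock)
    t2: slave receives Sync        (slave clock)
    t3: slave sends Delay_Req      (slave clock)
    t4: master receives Delay_Req  (master clock)
    """
    # The two one-way measurements each contain delay + offset
    # (with opposite offset signs), so they can be separated:
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay
```

For example, if the slave’s clock runs 3 µs ahead and the one-way path delay is 2 µs, the exchange (t1=100, t2=105, t3=110, t4=109, in µs) recovers exactly those two values. This separation assumes the path delay is symmetric, which is the usual PTP caveat.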
Data Reduction and Control
There is a lot of interest in data reduction across the industries adopting machine vision cameras. Sometimes even GigE Vision, with its 125 MB/s of bandwidth, isn’t enough. For applications in which many cameras converge on one PC, such as traffic surveillance, image compression is the only way to transfer all the image data. GigE Vision 2.0 defines a protocol for splitting compressed data (JPEG, JPEG 2000, and H.264) into packets that are easily transferred and reconstructed by the host PC.
The GigE Vision video streaming protocol states that data must be transferred in three blocks. A leader packet tells the PC what’s coming and a trailer packet closes the image. In between are the data packets, which contain the image. The number of transferred data packets depends on the size of the image. We tend to forget that the PC is at the end of the cable, and must reconstruct the packets into the image. It’s a necessary step that uses the CPU’s resources. GigE Vision can use all-in-one transmission to reduce leader and trailer packet processing overhead by grouping the leader, image data and trailer information in one packet. It is useful for transferring small images or single line image data from a line-scan camera.
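The leader/data/trailer structure can be sketched in miniature. This is an illustrative model, not the literal GVSP packet layout—the payload size and packet IDs are hypothetical—but it shows why the data-packet count grows with image size and why the host CPU must reorder packets that may arrive out of sequence:

```python
import math

def data_packet_count(image_bytes, payload_bytes):
    """Number of data packets sent between the leader and the
    trailer; it grows with the size of the image."""
    return math.ceil(image_bytes / payload_bytes)

def reassemble(data_packets):
    """Rebuild the image from (packet_id, payload) pairs.

    Packets may arrive out of order, so the host sorts them by
    packet ID before concatenating the payloads. This work runs
    on the receiving PC's CPU, which is the overhead the article
    describes.
    """
    return b"".join(payload for _, payload in sorted(data_packets))
```

With a roughly 1,500-byte Ethernet payload, a 10 KB image already needs seven data packets plus its leader and trailer; a multi-megabyte image needs thousands, which is why reducing per-packet processing matters.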
Suppose you have one of the new CMOS sensors, one that allows you to have multiple ROIs. If you have a big image and specify those smaller regions, you can use all-in-one transmission to reduce the data transfer overhead to the host. With all-in-one transmission mode you can put each ROI into its own packet, thereby reducing the overhead. You could even use jumbo packets (8000 bytes) to accommodate larger ROIs.
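The savings are easy to quantify. In the standard mode each ROI block carries its own leader and trailer packets; in all-in-one mode a small ROI travels as a single packet. The helper names below are hypothetical, and the count assumes each ROI fits in one payload:

```python
def packets_standard(n_rois, data_packets_per_roi):
    """Standard streaming: each ROI block is framed by its own
    leader and trailer packets."""
    return n_rois * (1 + data_packets_per_roi + 1)

def packets_all_in_one(n_rois):
    """All-in-one transmission: leader, image data, and trailer
    information travel together in one packet per small ROI."""
    return n_rois
```

For eight small ROIs, standard framing costs 24 packets where all-in-one costs 8—a two-thirds reduction in the per-packet processing the host must do, which is exactly the overhead the mode was designed to cut.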
10GigE and link aggregation are defined by the IEEE and were indirectly supported in GigE Vision 1.0, but the 2.0 spec clarifies their use. Link aggregation refers to bundling multiple cables into one large virtual cable to increase bandwidth: one cable gives you 125 MB/s, two cables double the speed, and the GigE Vision specification allows the system to see those two cables as one.
Flow control has also been improved in 2.0. The purpose of GigE Vision is to stream packets to the PC as quickly as possible. While networking equipment is sufficient for activities such as streaming an MP3 file over the Internet, much of it is not designed or tested to work at the high speeds that machine vision demands. The IEEE resolved this by defining a pause mechanism, a handshake between the transmitter and receiver to regulate the flow of the packet transmission. When the receiver sees too many packets or notices its buffers are full, it signals the transmitter to wait a certain number of micro- or nanoseconds. The GigE Vision Committee recommends camera manufacturers implement the pause mechanism to regulate the transmission flow.
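The pause handshake the spec recommends is IEEE 802.3x Ethernet flow control. As a sketch of what actually travels on the wire, the frame below follows the 802.3x layout (reserved multicast destination, MAC-control EtherType 0x8808, PAUSE opcode 0x0001, and a pause time measured in quanta of 512 bit times—512 ns per quantum on a gigabit link); the helper name is my own:

```python
import struct

PAUSE_DEST = bytes.fromhex("0180C2000001")  # reserved MAC-control multicast
ETHERTYPE_MAC_CONTROL = 0x8808
PAUSE_OPCODE = 0x0001

def build_pause_frame(src_mac: bytes, quanta: int) -> bytes:
    """Build an IEEE 802.3x PAUSE frame.

    quanta: requested pause time in units of 512 bit times
            (512 ns per quantum at 1 Gb/s).
    """
    header = PAUSE_DEST + src_mac + struct.pack("!H", ETHERTYPE_MAC_CONTROL)
    payload = struct.pack("!HH", PAUSE_OPCODE, quanta)
    # MAC control frames are padded to the 64-byte Ethernet minimum
    # (60 bytes here, before the 4-byte FCS the NIC appends)
    return (header + payload).ljust(60, b"\x00")
```

A receiver whose buffers are filling emits one of these toward the camera; a pause time of zero cancels an earlier pause and resumes transmission immediately.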
The GigE Vision specification has also supported non-streaming devices since version 1.2. Now the same GigE software that controls cameras can control devices such as an I/O controller or a light source.
The addition of metadata to the streamed image is a new but useful tool for machine vision applications. Currently, GigE Vision is the only standard that supports it, though the upcoming USB3 Vision will support it as well. It is becoming increasingly common for vision applications to acquire images with different camera settings; an application may even toggle between multiple settings. Metadata provides a method of including camera information with the image, for example, the length of the exposure, a histogram, or gain or offset.
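A common way to carry such metadata is to append self-describing “chunks” to the image payload, each tagged with an identifier and a length so the host can walk them without knowing in advance which camera settings were recorded. The layout below—a 4-byte chunk ID and 4-byte length trailing each chunk, parsed backwards from the end of the buffer—is a simplified assumption for illustration, not the exact GigE Vision chunk format:

```python
import struct

def parse_chunks(payload: bytes):
    """Walk appended metadata chunks from the end of a buffer.

    Assumed layout (for illustration only): each chunk's data is
    followed by a big-endian 4-byte chunk ID and 4-byte length,
    so the parser works backwards from the end. The buffer here
    is assumed to contain only chunk records.
    """
    chunks = {}
    pos = len(payload)
    while pos >= 8:
        chunk_id, length = struct.unpack("!II", payload[pos - 8:pos])
        start = pos - 8 - length
        if start < 0:
            break  # malformed trailer; stop rather than misread
        chunks[chunk_id] = payload[start:pos - 8]
        pos = start
    return chunks
```

Parsing from the tail end is a deliberate choice in trailer-based layouts: the camera can append chunks of varying size after the image without the host needing a table of contents up front.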
GIGE DEVICES ARE CREATED EQUAL
The GigE Vision Committee exerts a lot of effort to protect the integrity of the standard and ensure product compliance, quality, and interoperability. Manufacturers must register their products for compliance before they can bear the GigE Vision logo on their packaging. The Committee provides an automated test plan, a test suite that evaluates the requirements of the spec; the tests generate a tamper-proof report that must be emailed to the AIA by the device manufacturer.
Currently, the Committee is focused on providing more robust tools to help manufacturers develop GigE Vision cameras. Wireshark is an open-source packet dissector: a debugging tool that examines all the packets traveling the network and presents the data in a form a human being can readily understand. By providing a free plug-in, the Committee lets more tech-savvy customers develop and debug their own systems while it maintains a standardized testing process.
SIMPLE, PRACTICAL, AFFORDABLE
So why use GigE Vision? Certain types of applications are simply easier to deploy with GigE Vision than with other technology. Traffic surveillance is one example that benefits from Ethernet technology. It’s not practical to mount a PC to a traffic light, but with a long cable it’s easy to transfer all the data to a centralized control room, or even transmit it wirelessly. Likewise, for any application in a harsh environment, such as one where the camera is exposed to excessive heat, vibration, water, or dust, it is preferable to have a remote PC for processing, and Ethernet technology makes that easy to do.
The benefits are compounded for applications with multiple cameras. Not only is adding or removing cameras as simple as plugging a device into or removing it from a network, but GigE Vision’s clock synchronization mechanism lets you control multiple cameras much more easily than with a traditional frame grabber-PC system. And that is important whether you inspect a large object, such as a car, or are bound by legal reasons to simultaneously acquire images of a car’s front and rear license plates and the driver’s face.
Ethernet has been extremely popular for non-machine vision applications, and there has been a surge of GigE Vision cameras deployed in non-traditional areas such as outdoor inspection and intelligent traffic systems. But even for the traditional machine vision market, the high bandwidth, low costs, simple connectivity and long cable length make GigE Vision a very attractive option.