Over the last decade, imaging system designers have faced a new set of challenges. Ten years ago, the performance of standard camera lenses was sufficient to get optimum performance out of the available sensors. Since then, sensor resolution has improved twenty to fifty fold, and pixels in many cases have become much smaller, with currently available sensors providing up to ninety times more pixels in a given area. For machine vision systems to take advantage of these high-performance sensors, the optics must meet increasingly challenging requirements. Today's system designers must often coax maximum performance from their optics. But there is no "one size fits all" optical design that guarantees optimum performance, and designers must define their requirements wisely to ensure a given optic will do the job. Before defining their system requirements, designers need to recognize that the performance of even a "perfect" design is limited by the laws of physics.
A perfect lens would be one that transferred light from an infinitesimal point on a plane in object space to a unique corresponding infinitesimal point in image space. But no real lens focuses light down to an arbitrarily small point. The physical phenomenon of diffraction spreads the light out around the point in image space, typically into a circular pattern called the Airy disk. As two points in image space approach each other, their Airy disks overlap until, at some separation, the two spots become indistinguishable. This separation, the physical distance representing how close two points can be and still be resolved, is called the diffraction limit. The diffraction limit is set by the working f/# of the lens and the wavelength(s) of light passing through it, as in the following equation.
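One common form of this relation is the Rayleigh criterion (assumed here, since resolution criteria vary), which gives the minimum resolvable separation in the image plane as ξ = 1.22 · λ · (f/#), with the full Airy disk diameter equal to 2.44 · λ · (f/#). A minimal sketch of the calculation, using illustrative function names:

```python
def diffraction_limit_um(wavelength_um: float, f_number: float) -> float:
    """Minimum resolvable separation in the image plane (Rayleigh criterion), in micrometers."""
    return 1.22 * wavelength_um * f_number

def airy_disk_diameter_um(wavelength_um: float, f_number: float) -> float:
    """Diameter of the Airy disk out to the first dark ring, in micrometers."""
    return 2.44 * wavelength_um * f_number

# Example: green light (0.55 um) through a lens working at f/2.8.
print(round(diffraction_limit_um(0.55, 2.8), 2))   # ~1.88 um
print(round(airy_disk_diameter_um(0.55, 2.8), 2))  # ~3.76 um
```

Note how quickly diffraction becomes relevant: at f/2.8 in green light, no amount of design effort can resolve features closer than roughly 1.9 µm in the image plane, which is already comparable to the pixel pitch of many modern sensors.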