Learning with Lecky: The Return of Linux
March 31, 2009
I never thought I’d say it, but Linux is starting to look like an appealing solution for machine vision application development and deployment.
I started out as a UNIX C programmer back before, well, before Windows existed; heck, before MS-DOS existed, for that matter. And after a few weeks of working on a machine vision application server based on Ubuntu Linux, I must say it all seems more than a tad retro to be working in this world again.
But what a relief, too. My old company, Intelec Corp., invented a Windows-based machine vision software program called Sherlock back in the early 1990s. This product still exists, in vastly and beautifully improved form, as part of DALSA/iPD's software line today.
Back then, developing for Windows was difficult but exciting. It was emerging and growing in ways that had seemed impossible just months before. Windows 95 blew the top off of all of the application possibilities we had dreamed of and enabled real-time applications, sleek and easy-to-develop Visual Basic front-ends, and a truly new way of distributing, licensing, and extending applications.
But now, 15 years later, things have stagnated in the Windows world. The growth is not so much in the area of performance computing as in extreme object-orientedness and oddball languages that have more of a cult following than a general usefulness. As an old APL and FORTH programmer, I understand the appeal of cult languages, but they probably aren't adding much to the machine vision world except for giving a few programmers an excuse to play around with some shiny new objects.
What's different about Linux? Well, you can develop code in a very much stripped-down and clean environment. Straightforward C code that is easy to develop, maintain, debug, and repackage for deployment in other environments is very satisfying to work with. Days and days of often-unnoticed productivity lost to installation mechanisms, frameworks, specialized operating system and user interface builders, tuners, hooks, and callback functions that distract a programmer almost to madness are simply gone in this environment.
You should see how much more work I get done working in a command-line environment using vi… I know, I’m sounding like someone from the Jurassic period, but once you get the Internet browsers, mail apps, calendar bonging programs and music/media/paparazzi programs out of your face, you can almost feel the brain starting to activate again.
And there's another amazing advantage that I'm just rediscovering. Rapid deployment and integration today often push us in the direction of open systems, networking, web access, and database management. These are all technologies that were born and came to maturity in the UNIX environment, and they still work best and most reliably there. In an era when profit margins are so thin and development budgets so small, being able to rely on simple, clean, well-debugged, open-source (and free!) software is an amazing advantage. And through virtualization, using software from Parallels or VMware or one of many open-source alternatives, you can exploit the hardware capabilities of our powerful computers by simply running several copies of Linux concurrently, plus perhaps one copy of Windows for user-interface support if a customer requires it. That is a powerful architectural idea.
My Apple Mac OS X laptop is presently also running Windows XP (for an Altium circuit board design that I'm procrastinating on), along with Ubuntu Linux Server for web hosting and FTP services and a copy of Ubuntu Desktop Linux with its beautiful GUI. Reliability, robustness, and performance are all excellent, and this is just on a laptop. So the next time you're looking at a machine vision architecture or algorithm development project, keep Linux in mind and look long and hard at our old friends, UNIX/C, and the amazing Linux community that carries them forward today.