Consider one of the world’s prevalent issues: blindness. According to the World Health Organization (WHO), an estimated 39 million people were blind in 2014. With this statistic in mind, various organizations have worked tirelessly to educate and empower people who are blind to complete everyday tasks that would otherwise be tedious.

Enter the National Federation of the Blind (NFB), the largest organization of blind and low-vision people in the United States. Its main purpose is the complete integration of the blind into society on a basis of equality. One example of this integration is driving, a challenging feat indeed for someone who is blind.

The NFB, however, views that challenge as an opportunity. It developed the Blind Driver Challenge, which calls upon developers and innovators to create interface technologies that allow people who are blind to drive a car independently. The Challenge was held at Daytona International Speedway as a pre-race event for the Rolex 24, an annual 24-hour sports car endurance race organized by the International Motor Sports Association and one of the races of the Tudor United SportsCar Championship series. A blind driver was to independently drive the vehicle down the main straightaway and onto the road course, avoid obstacles such as cones and boxes thrown in front of the vehicle, and finally complete a moving pass of the lead car before crossing the finish line. This was an opportunity for TORC to live out its belief.

Drive-by-wire: A Way to Drive Electronically

TORC partnered with the Robotics and Mechanisms Laboratory (RoMeLa) at the Virginia Polytechnic Institute and State University (Virginia Tech) to develop the next generation of Challenge vehicles. Starting from a crossover SUV, TORC installed its ByWire drive-by-wire conversion modules, SafeStop wireless emergency stop system, and PowerHub power distribution modules. Drive-by-wire gives a driver electronic control of a vehicle. The premise comes from aviation’s fly-by-wire systems: an aircraft’s controls produce electronic signals that are read and processed by computing systems connected to actuators, which move the control surfaces on the wings and tail. Jesse Hurdus, TORC’s project manager for this event, stated, “Cars are much further behind in taking this step. In order to have an autonomous vehicle, you need to have it so a computer can control the throttle, transmission, and braking systems. This is drive-by-wire.”
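To make the idea concrete, a drive-by-wire computer ultimately sends throttle, brake, and steering values to actuators over a vehicle bus. The sketch below is purely illustrative: the field names, value ranges, and 16-bit framing are invented for this example and are not TORC’s ByWire protocol.

```python
from dataclasses import dataclass

# Hypothetical drive-by-wire command frame; fields and ranges are
# illustrative, not TORC's actual ByWire interface.
@dataclass
class DriveCommand:
    throttle: float  # 0.0 (released) to 1.0 (full)
    brake: float     # 0.0 (released) to 1.0 (full)
    steering: float  # -1.0 (full left) to 1.0 (full right)

    def clamped(self) -> "DriveCommand":
        clip = lambda v, lo, hi: max(lo, min(hi, v))
        return DriveCommand(
            throttle=clip(self.throttle, 0.0, 1.0),
            brake=clip(self.brake, 0.0, 1.0),
            steering=clip(self.steering, -1.0, 1.0),
        )

def encode(cmd: DriveCommand) -> bytes:
    """Pack a command into a fixed-width frame for the actuator bus."""
    c = cmd.clamped()
    # Scale each channel to an unsigned 16-bit integer, a common
    # pattern on CAN-style automotive buses.
    to_u16 = lambda v, lo, hi: int(round((v - lo) / (hi - lo) * 65535))
    vals = [to_u16(c.throttle, 0.0, 1.0),
            to_u16(c.brake, 0.0, 1.0),
            to_u16(c.steering, -1.0, 1.0)]
    return b"".join(v.to_bytes(2, "big") for v in vals)
```

Clamping before encoding means a software fault upstream can never command the actuators outside their physical range, one reason such systems validate every frame.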

However, accomplishing this challenge takes more than controlling various vehicle functions. A driver must be aware of the surroundings and react to any obstacle that may present itself. Light detection and ranging (LIDAR) helps perceive the environment: it measures distance by emitting a laser pulse and analyzing the reflected light. “It will give you a 3D point cloud of the obstacles around the vehicle. You can determine from this data what are these obstacles that the driver needs to drive around or what part of the ground can be driven over. It is great for obstacle detection or object segmentation,” explains Hurdus. Yet LIDAR does have its drawbacks. Hurdus continued, “One of the big places LIDAR becomes difficult is with obstacle classification. All you have is the shape of an obstacle and that’s it. There are many instances where this can be very difficult, such as off-road applications and differentiating objects such as vegetation from other solid objects.”
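The point-cloud processing Hurdus describes can be sketched in miniature. A real pipeline fits a ground plane (for example with RANSAC) and clusters returns in 3D; this toy version, with invented thresholds, assumes flat ground near height zero and groups obstacle points into coarse grid cells.

```python
# Toy ground/obstacle segmentation for a LIDAR-style 3D point cloud.
# Assumption: flat ground at z ~ 0; real systems estimate the ground
# plane rather than hard-coding it.

def segment_points(points, ground_height=0.0, tolerance=0.15):
    """Split (x, y, z) points into drivable-ground and obstacle returns."""
    ground, obstacles = [], []
    for p in points:
        (ground if abs(p[2] - ground_height) <= tolerance else obstacles).append(p)
    return ground, obstacles

def cluster_xy(points, cell=0.5):
    """Group obstacle points into coarse grid cells -- a crude stand-in
    for the object segmentation Hurdus mentions."""
    cells = {}
    for x, y, z in points:
        cells.setdefault((int(x // cell), int(y // cell)), []).append((x, y, z))
    return cells
```

Note what the sketch cannot do: every cluster is just a blob of points with a shape, which is exactly the classification gap (vegetation versus solid object) that Hurdus says motivates adding a camera.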

Allied Vision’s Prosilica GC1290C Camera Provides the Solution

TORC used Allied Vision’s Prosilica GC1290C camera to help overcome the challenges LIDAR presents. “Allied Vision’s camera was used to help on the perception piece: to take sensor data from the world and feed it into the software to provide an understanding on what’s around the vehicle,” Hurdus mentioned. The camera helped detect lane markings, which do not show up well on LIDAR since they are painted flat onto the road. This information is fed back to the autonomous system, which provides input to the blind driver so that he or she can keep the vehicle centered within the lane. In addition, Hurdus recalled, “the Prosilica GC1290C had a Gigabit Ethernet interface which made it nice for us to get all the data and process on the computer in the vehicle.”
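The lane-centering idea can be illustrated with a minimal sketch. Assume a grayscale scanline from the camera in which painted markings appear as bright pixels; the brightness threshold and the pixel-offset geometry below are invented for illustration, and TORC’s actual perception software is far more sophisticated.

```python
# Minimal lane-centering sketch on one grayscale image row.
# Assumption: painted lane markings are the brightest pixels in the row.

def find_lane_marks(scanline, threshold=200):
    """Return column indices bright enough to be painted markings."""
    return [i for i, v in enumerate(scanline) if v >= threshold]

def lateral_offset(scanline, threshold=200):
    """Signed offset of the lane center from the image center, in pixels.
    Negative means the lane center lies left of the camera axis."""
    marks = find_lane_marks(scanline, threshold)
    if len(marks) < 2:
        return None  # not enough markings detected on this row
    lane_center = (marks[0] + marks[-1]) / 2
    return lane_center - (len(scanline) - 1) / 2
```

An offset near zero means the vehicle is centered; a nonzero value is the kind of correction signal the autonomous system could translate into a steering cue for the driver.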

The blind driver is outfitted with special gloves, called DriveGrip, and sits on a special padded insert placed on the driver’s seat, called SpeedStrip. The gloves contain small vibrating motors on top of each finger that relay steering information from the autonomous system. The seat padding also contains vibrating motors stretching along the driver’s legs and back, which relay the vehicle’s speed information: the padding vibrates along the legs if the driver needs to accelerate, and along the back if the driver needs to apply the brakes. The gloves vibrate on the hand corresponding to the direction the car needs to turn, and the steering angle is indicated by which finger vibrates. For example, vibrations on the index finger signal a slight right turn, while vibrations on the pinky finger indicate a hard right turn. As the driver completes the turn, the vibrations either stop immediately, if felt on the index finger, or move across the fingers in the opposite direction (e.g., pinky to index finger) and then stop.
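The steering half of that mapping can be sketched as a simple lookup. The article only states that the index finger means a slight turn and the pinky a hard turn, so the four-band breakpoints and the dead-zone value below are assumptions, not the DriveGrip specification.

```python
# Sketch of a DriveGrip-style mapping from a steering command to a finger cue.
# Band boundaries and the straight-ahead dead zone are invented for this example.

FINGERS = ["index", "middle", "ring", "pinky"]  # slight -> hard turn

def finger_for_turn(steering):
    """Map steering in [-1, 1] to (hand, finger); None means drive straight."""
    magnitude = min(abs(steering), 1.0)
    if magnitude < 0.05:
        return None  # within the dead zone: no cue, keep straight
    hand = "right" if steering > 0 else "left"
    # Split the magnitude range into four bands, index (slight) to pinky (hard).
    band = min(int(magnitude * len(FINGERS)), len(FINGERS) - 1)
    return hand, FINGERS[band]
```

Running the cue generator in a loop against the autonomous system’s steering output would reproduce the behavior described above: as the commanded angle shrinks back toward zero, the active finger walks from pinky toward index before the vibration stops.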

While TORC’s systems were developed specifically for the Challenge, they can potentially be used in future solutions. Hurdus concluded, “This was an exploratory effort to see how we could use the cameras to achieve the goal. A person blind from birth was able to drive a vehicle outfitted with sensor technology to give him an understanding of the environment generated by a combination of Allied Vision’s cameras, LIDAR systems, and GPS localization systems. The fusion of all this data was able to give this person the ability to ‘see’ the environment as a person would be able to see through their own eyes.”