The Texas A&M Bipedal Experimental Robotics (AMBER) Lab, led by Dr. Aaron Ames, is devoted to theoretical methods for bipedal robotics that can successfully be transferred to practical implementation. The overarching goal of this research is to understand fundamental walking mechanisms in order to develop the next generation of robotic systems. Next-generation applications will range from prosthetic devices to legged robots for space exploration.
Control Methodology

We began by examining experimental human walking data and found that specific human outputs, such as functions of the joint angles, are all characterized by a single function of time, termed the canonical walking function. Using these human outputs, we designed a human-inspired controller that aligns the output of the robot to the output of the human (as represented by the walking function). We choose the controller parameters through a human-inspired optimization problem that finds the best fit of the human data while simultaneously guaranteeing bipedal robotic walking.
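As a rough illustration of the idea, the canonical walking function can be sketched as an exponentially damped sinusoid plus an offset. The parameter values below are purely hypothetical placeholders; in practice they would be obtained from the human-inspired optimization fit to human data.

```python
import math

def canonical_walking_function(t, a):
    """Sketch of a canonical walking function:
    y(t) = exp(-a4*t) * (a1*cos(a2*t) + a3*sin(a2*t)) + a5.
    The parameter tuple a = (a1, a2, a3, a4, a5) is hypothetical here;
    real values come from fitting experimental human walking data."""
    a1, a2, a3, a4, a5 = a
    return math.exp(-a4 * t) * (a1 * math.cos(a2 * t) + a3 * math.sin(a2 * t)) + a5

# Example evaluation with made-up parameters: at t = 0 the function
# reduces to a1 + a5, since exp(0) = 1, cos(0) = 1, and sin(0) = 0.
y0 = canonical_walking_function(0.0, (0.5, 10.0, 0.2, 3.0, 0.1))
```

Because a single parameterized function of time characterizes every output, fitting the walking behavior reduces to choosing one small parameter set per output rather than tracking full joint trajectories.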
To develop the controller for our underactuated bipedal robot, named AMBER (Figure 1), we began with a SolidWorks model (Figure 2). This helped us develop a highly accurate mathematical model of the hybrid (continuous and impact) dynamics for our system. We obtained the controller parameters by numerically solving the human-inspired optimization in MathWorks, Inc. MATLAB® software. To simplify implementation and reduce computational overhead, we constructed a proportional controller on the human-inspired outputs by considering the actual outputs of the robot versus the desired outputs as represented by the canonical walking functions. We used numerical simulation to verify that the proportional controller results in robotic walking.
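The output-level proportional law described above can be sketched as follows. The gain values and output names here are hypothetical, not the actual tuning used on AMBER; the point is only that each control command is proportional to the error between the robot's actual output and the desired output given by the canonical walking function.

```python
def proportional_control(y_actual, y_desired, kp):
    """Illustrative proportional controller on human-inspired outputs:
    for each output, command = -kp * (actual - desired).
    The gains kp are hypothetical placeholders."""
    return [-k * (ya - yd) for k, ya, yd in zip(kp, y_actual, y_desired)]

# Example: two outputs, one with tracking error and one already on target.
u = proportional_control(y_actual=[1.0, 0.5],
                         y_desired=[0.8, 0.5],
                         kp=[10.0, 10.0])
```

Because this law needs only a subtraction and a multiplication per output, it is cheap enough to evaluate at a high rate on embedded hardware, which is what makes the low computational overhead mentioned later possible.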
Implementation Using LabVIEW

To experimentally implement these formal methods for the human-inspired proportional controller, we used LabVIEW software with an NI cRIO-9024 embedded real-time controller and an NI cRIO-9114 chassis with five NI 9505 DC motor drive modules. Figure 3 shows a graphical overview of the implementation. We constructed the desired output functions, which resulted in walking in simulation, in LabVIEW Real-Time using exponential, sine, and cosine floating-point blocks. In a similar way, the actual robot outputs are computed from the joint angles as calculated using fixed-point arithmetic in the field-programmable gate array (FPGA), coupled with "foot strike" detection. The robot's configuration is then relayed to the real-time target, where a proportional controller is implemented based on the difference between the actual and desired outputs. The end result is a duty cycle equivalent of the voltage that needs to be applied. A pulse is then generated in the FPGA and passed to the NI 9505. Due to the low computational overhead, which is a direct result of the controller design, all of these operations are carried out in under 5 ms.
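The final step, converting the commanded voltage into a duty cycle for the PWM pulse generated on the FPGA, can be sketched as below. The bus-voltage normalization and the saturation limits are assumptions for illustration; the actual scaling depends on the motor drive configuration.

```python
def voltage_to_duty_cycle(v_command, v_bus):
    """Hypothetical mapping from a commanded motor voltage to a signed
    PWM duty cycle in [-1, 1]. v_bus is the assumed supply voltage;
    the resulting duty cycle would drive the pulse generation that is
    passed to the motor drive module."""
    d = v_command / v_bus
    # Saturate to the drive's valid duty-cycle range.
    return max(-1.0, min(1.0, d))

# Example: a 12 V command on an assumed 24 V bus yields a 50% duty cycle,
# while an over-limit command saturates at full duty.
d_half = voltage_to_duty_cycle(12.0, 24.0)
d_sat = voltage_to_duty_cycle(30.0, 24.0)
```

Keeping this mapping (and the pulse generation itself) on the FPGA leaves the real-time processor free for the floating-point output computations, which is consistent with the sub-5 ms loop time reported above.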
By implementing human-inspired control in LabVIEW, we experimentally obtained bipedal robotic walking on AMBER. Several features of LabVIEW played a crucial role in the experimental implementation.
The Impact of NI Products and Services on the AMBER Lab

To achieve robotic walking on AMBER, we needed a platform with flexible software that could interact with MATLAB and provide a real-time interface to translate the walking derived in simulation to experimental robotic walking. We initially considered microcontroller-based sensing and motor control solutions. Instead, we were drawn to LabVIEW and NI products because of the intuitive graphical programming, real-time OS capabilities, the ability to interact with third-party software, a wide variety of analog and digital I/O interfaces, and, most importantly, reconfigurable FPGA options through hardware abstraction. With NI products, we could rapidly implement control algorithms, reuse existing code, and increase the efficiency of time-critical tasks by dividing them between the real-time processor and the FPGA hardware. We also found unexpected benefits: we greatly enjoyed using the network shared variable technology for file I/O, as well as the flexibility of the chassis platform to accommodate a wide array of I/O modules.
In addition to the appeal of NI products, the support from NI was vital in experimentally realizing bipedal robotic walking in such a short timeframe. We had numerous interactions with NI engineer Andy Chang regarding the selection of software and hardware required for the project. We also sought his advice in various capacities to improve our LabVIEW algorithms. The NI Developer Zone and NI customer support for LabVIEW code development using LabVIEW FPGA logic were also instrumental in resolving technical issues and accelerating the completion of our project.
In the near future, we plan to use the LabVIEW Robotics Environment Simulator and the LabVIEW Control Design and Simulation Module for rigid body simulation of AMBER. We will further investigate a streamlined methodology for design, simulation, optimization, and implementation.