The average automobile relies on millions of lines of code to improve fuel economy, reduce emissions, provide infotainment features such as navigation, and respond to safety threats such as vehicle instability with corrective actions. Software has proven critical to innovation in mechatronic systems including automobiles, appliances, medical devices, and energy systems. However, the nature of embedded software development dictates that as many as 20 software defects are produced per 1000 lines of code, which presents a formidable challenge for the engineers tasked with ensuring quality in these systems.
Increased complexity has been an ongoing trend in transportation systems for decades, so much so that control software has expanded from a single electronic control unit (ECU) to a network of ECUs working in harmony to orchestrate multiple electromechanical subsystems in a single system such as a train, aircraft, or automobile. This evolution in complexity has forced validation engineers to evolve the way in which they find and correct the growing number of defects inherent in embedded software development.
Since embedded software arrived in mechanical systems, more sophisticated testing approaches such as hardware-in-the-loop (HIL) test systems have been adopted to allow software testing to begin earlier in the development cycle, ultimately reducing validation time and cost. In the same way, development projects involving systems of systems are employing system integration labs to identify system-level errors before the first system is even assembled.
System integration labs use multiple HIL simulators operating in unison to accurately represent the interaction of the entire system with the network of ECUs used to control it. These simulators are networked using deterministic data transfer mechanisms such as reflective memory, synchronization interfaces such as IRIG-B, and Ethernet for nondeterministic communication between each simulator and the host interface.
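The deterministic sharing layer can be pictured as a memory region at fixed offsets that every simulator reads and writes, with the reflective-memory hardware mirroring each write to all nodes. The sketch below is purely illustrative and not any vendor's API: a memory-mapped temporary file stands in for the card's memory window, and the signal names and offsets are invented for the example.

```python
# Illustrative sketch of fixed-offset data exchange through a shared
# memory region, the pattern used by reflective-memory cards (the card
# mirrors writes to every node in hardware). A plain temporary file
# stands in for the card's memory-mapped window; all names are invented.

import mmap
import os
import struct
import tempfile

REGION_SIZE = 4096
# Fixed layout agreed on by all simulators: signal name -> byte offset.
LAYOUT = {"engine_rpm": 0, "wheel_speed": 8}

def write_signal(region, name, value):
    """Write one float64 signal at its agreed offset."""
    region.seek(LAYOUT[name])
    region.write(struct.pack("<d", value))   # little-endian float64

def read_signal(region, name):
    """Read one float64 signal from its agreed offset."""
    region.seek(LAYOUT[name])
    return struct.unpack("<d", region.read(8))[0]

# Stand-in for the reflective-memory device's mapped window.
fd, path = tempfile.mkstemp()
os.ftruncate(fd, REGION_SIZE)
with mmap.mmap(fd, REGION_SIZE) as region:
    write_signal(region, "engine_rpm", 6200.0)
    rpm = read_signal(region, "engine_rpm")
os.close(fd)
os.remove(path)
print(rpm)   # 6200.0
```

In a real lab, the layout table is the shared contract: every simulator's model I/O is mapped to these offsets, which is one of the configuration artifacts worth generating automatically rather than maintaining by hand.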
Historically, system integration labs have been created by attempting to scale solutions intended for testing single ECU subsystems. The approach has been to “string” all of the subsystem simulators together to build a complete system integration lab. While in theory this approach is appropriate, in reality testing an entire network of ECUs poses many unique challenges in cost, complexity, and ease of use that are either not present or small enough to go unnoticed at the subsystem level. As system complexity continues to increase, these limitations cannot be tolerated; therefore, test engineers have again evolved their techniques to stay ahead of system complexity.
Time is the test engineer's most precious commodity, and errors in the creation of a test system can exact a punishing cost in time, with the severity depending on when the error is discovered and how long it takes to fix. The sheer scope of a system integration lab makes it particularly susceptible to these pitfalls.
Often involving thousands of hardware I/O channels, multiple data bus networks, deterministic model execution, hundreds of thousands of internal variables, and multisystem synchronization and communication, these test systems require significant effort to create, qualify, and reconfigure for evaluating multiple product configurations. Automation is the proven antidote, reducing development time and human error and limiting their significant impact on system integration lab development and operation.
While automation cannot solve the mechanical and electrical challenges of physically laying out and connecting these systems, it can significantly reduce configuration, deployment, operation, and evaluation time. With reasonable investment at the front end of a project, system configuration, operation, and reconfiguration can be achieved most efficiently through the automation of:
• Simulator configuration—I/O hardware, model execution, signal mappings
• System configuration—simulator-to-simulator communication and synchronization
• Host interface configuration—user-interface creation and connection to the system
• Stimulus generation—test case generation, mapping, and distribution to simulators
• Data logging and results analysis—logging system setup, analysis automation, report generation
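The first two items typically amount to expanding one declarative system description into per-simulator configurations and resolved signal mappings, so that a single source of truth drives the whole lab. The Python sketch below is a hypothetical illustration of that pattern only; the data structures and helper names are invented here and do not represent NI VeriStand's actual API.

```python
# Hypothetical sketch: generating per-simulator configurations from one
# declarative system description, rather than hand-editing each simulator.
# All names here are illustrative, not any vendor's actual API.

from dataclasses import dataclass, field

@dataclass
class SimulatorConfig:
    name: str
    model: str
    rate_hz: int
    io_channels: dict = field(default_factory=dict)   # signal name -> channel id
    mappings: list = field(default_factory=list)      # (model variable, signal)

def build_configs(system_description):
    """Expand a system-level description into one config per simulator."""
    configs = {}
    for sim in system_description["simulators"]:
        cfg = SimulatorConfig(sim["name"], sim["model"], sim["rate_hz"])
        for signal, channel in sim["io"].items():
            cfg.io_channels[signal] = channel
        configs[cfg.name] = cfg
    # Resolve signal mappings after all simulators are known, so a mapping
    # is rejected early if it references a signal no simulator provides --
    # catching the error at configuration time rather than on the bench.
    for var, (sim_name, signal) in system_description["mappings"].items():
        cfg = configs[sim_name]
        if signal not in cfg.io_channels:
            raise ValueError(f"{sim_name} has no signal {signal!r}")
        cfg.mappings.append((var, signal))
    return configs

# Example: two simulators and one mapping in a single description.
system = {
    "simulators": [
        {"name": "engine", "model": "engine.so", "rate_hz": 1000,
         "io": {"throttle_pos": "AI0", "fuel_cmd": "AO0"}},
        {"name": "chassis", "model": "chassis.so", "rate_hz": 500,
         "io": {"wheel_speed": "AI1"}},
    ],
    "mappings": {"EngineModel.throttle": ("engine", "throttle_pos")},
}

configs = build_configs(system)
print(configs["engine"].mappings)   # [('EngineModel.throttle', 'throttle_pos')]
```

The value of the pattern is that reconfiguring the lab for a new product variant means editing one description and regenerating, and mistakes such as a mapping to a nonexistent signal fail fast with a clear error instead of surfacing as puzzling behavior during a test run.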
This investment in automation can significantly decrease the manpower necessary to operate and maintain the lab as well as reduce the lab’s downtime between tests. This was certainly the case for Embraer on its Legacy 500 executive jet program. The Embraer Legacy 500 is the first aircraft in its class designed with full fly-by-wire controls. More than 50 embedded computers must be validated in multiple product configurations before the first test flight takes place.
This system consists of a model made up of more than 90,000 parameters executed across 21 real-time PXI-based simulators that are networked together using IRIG-B for synchronization, reflective memory for deterministic data sharing, and Ethernet interfaces for communication with the host interface.
By building the infrastructure to script every aspect of its system integration lab using the rich API libraries and system configuration features provided by NI VeriStand, Embraer reduced initial development time by 12 months compared with the previous program, with additional operational benefits realized in ongoing use of the lab.
Automation is the way to tame the unique challenges of system integration lab development and maintenance amid growing system complexity. Mastery of this technique will be seen as a differentiator between vendors, not in the quality of their systems but in the amount and level of innovation in their products. Customer demand will ensure that vendors deliver the expected level of quality; the effort required to achieve that goal will determine how much resource remains to innovate and, ultimately, differentiate.
Chris Washington, Senior Product Manager–Real-Time Test, and Ian Fountain, Director of HIL and Real-Time Test, National Instruments, wrote this article for Aerospace Engineering.