PXI-Based Testing with HIL-Based Simulation

Authors: Raffaele Fiengo & Samah Chazbeck, NI

17 June 2023

Figure 1: Illustration of cost commitments even at the earliest stages of system development

In the past, complex test set-ups called for purchasing numerous items of bulky equipment, each constituting a significant investment. As system complexity has risen, the shortfalls of this approach have become increasingly apparent, and major changes to test strategies are now underway.

One of the drawbacks of test systems built from discrete items of equipment is that such arrangements are inherently wasteful. Industry studies and customer feedback show that, on average, only 30-40% of purchased box test systems’ functionality is actually used. If systems then have to be expanded, additional boxes need to be stacked up on racks (with only a fraction of the functionality of each being utilised). On top of the heavy costs accrued, there is the issue of integrating everything together - with interoperability, high-throughput data transfer and synchronisation between the different items proving difficult to achieve.

The arrival of modular instrumentation has changed this. Now test teams need only buy the functionality they actually use, so money is no longer squandered on superfluous capabilities, and they retain the flexibility to add further functions as requirements dictate.

Embracing a modular paradigm
Analysts at Data Bridge Market Research recently published a report predicting that the modular instruments market will experience a 6.8% CAGR through to 2029, reaching an annual value of $3.67 billion by the end of that period.

Through use of high-performance modular instrumentation that leverages PXI technology, test engineers are presented with a highly effective platform on which their test/validation workflows can be built. They can swap different functional elements in and out when their test criteria change, as well as add capacity (such as more I/O channels or data processing resources) as the testing activities associated with a project ramp up. Since all the functional elements contained within the chassis are synchronised with one another, multiple low-latency, phase-coherent channels can be handled.

Subsystem and system level testing
Throughout many industry sectors, adoption of ‘concurrent engineering’ is now underway - making the argument for migrating to modular test arrangements even stronger. Here, OEMs have separate development teams all working on parts of the system in parallel. By being able to carry out testing on subsystems at the early stages of a project, engineers can avoid the risk of issues emerging further down the line. As a result, the financial outlay, engineering effort and time needed for redesign work are not needlessly expended, and projects are completed on budget and to deadline.

Figure 2: Schematic showing a radar system test flow - from components and subsystems right through to an entire system

Studies show that when a development project commences, in the region of 80% of the costs are already committed. Therefore, if prospective failures are not uncovered early on (prior to complete system assembly) then substantial amounts of money can be lost. Engineers need to broaden their test coverage - going beyond the functional parametric testing of components or modules, and accurately determining how they will perform at a system level.
   
By complementing model-based testing with hardware-in-the-loop (HIL) simulations, complex system designs can be validated (and issues identified) while keeping down engineering overheads. The following aspects are essential to such validation: 

• Wideband signal generation and data acquisition
• Multi-channel phase alignment (see the sketch below)
• Real-time signal processing
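
To make the phase-alignment requirement concrete, the following minimal Python sketch estimates the phase offset between two digitised channels from a shared reference tone. The sample rate, tone frequency and capture length are illustrative assumptions; in a real HIL set-up the channel data would come from the acquisition hardware rather than being synthesised.

# Minimal sketch: estimating the phase offset between two digitised channels
# using a common reference tone (illustrative values throughout).
import numpy as np

fs = 1e9          # sample rate in Hz (assumed)
f_tone = 10e6     # reference tone frequency in Hz (assumed)
n = 4096          # samples per capture

t = np.arange(n) / fs
true_offset = np.deg2rad(23.0)                 # emulate a 23-degree skew on channel B
ch_a = np.exp(1j * 2 * np.pi * f_tone * t)
ch_b = np.exp(1j * (2 * np.pi * f_tone * t + true_offset))
ch_b += 0.01 * (np.random.randn(n) + 1j * np.random.randn(n))   # a little noise

# The phase of the averaged conjugate product gives the inter-channel offset,
# which can then be applied as a correction before coherent processing.
est = np.angle(np.mean(ch_a * np.conj(ch_b)))
print(f"estimated channel skew: {np.rad2deg(-est):.2f} degrees")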

Phased array use cases 
Digitised phased arrays are now being employed in a plethora of RF applications and highlight the importance of HIL simulation. The following examples will look at the challenges involved. 

Radar - Active electronically scanned array (AESA) radar systems comprise huge numbers of small antenna elements, each with its own transmitter/receiver. Tens of thousands of them may be utilised in longer range systems. Via these phased array transmitter/receivers, beams can be steered electronically (rather than the antenna needing to be moved mechanically). The validation of such systems requires high-channel-count capabilities with phase alignment, so coherency is maintained between all the channels. An initial proof-of-concept, made up of a few hundred elements, can be tested in a HIL arrangement. This may then be ramped up to a full-scale system, where terabytes of data per second are being acquired. The ability to pre-process this data on an FPGA and apply compression helps to make it more manageable. Sub-nanosecond resolution is mandated to ensure synchronisation of the modules in the PXI chassis (and, in larger systems, across multiple chassis). The data must also be time-tagged correctly for effective data management after acquisition.
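
As an illustration of the electronic beam steering described above, the short Python sketch below computes the progressive phase shifts that steer the main lobe of a uniform linear array, then checks that the array factor peaks at the requested angle. The element count, spacing, carrier frequency and steering angle are illustrative assumptions, not parameters of any particular AESA system.

# Minimal sketch: per-element phase shifts that electronically steer the beam
# of a uniform linear array (all values assumed for illustration).
import numpy as np

c = 3e8                 # speed of light, m/s
f = 10e9                # X-band carrier, Hz (assumed)
wavelength = c / f
d = wavelength / 2      # half-wavelength element spacing (assumed)
n_elements = 64         # a small sub-array, proof-of-concept scale
steer_deg = 30.0        # desired beam direction from boresight

k = 2 * np.pi / wavelength
element_idx = np.arange(n_elements)
# A progressive phase shift across the array steers the main lobe to steer_deg.
phase_shifts = -k * d * element_idx * np.sin(np.deg2rad(steer_deg))
weights = np.exp(1j * phase_shifts)          # complex weight applied per element

# Quick check: the array factor should peak at the steering angle.
angles = np.deg2rad(np.linspace(-90, 90, 721))
af = np.abs(weights @ np.exp(1j * k * d * np.outer(element_idx, np.sin(angles))))
print(f"array factor peaks at {np.degrees(angles[np.argmax(af)]):.1f} degrees")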

Satellite communications (SATCOM) - SATCOM links are an enabling technology for meeting current and future communication needs in both the civilian and defence markets. New voice, video and data services requiring very high data rates are being developed according to an anytime-anywhere paradigm, thereby increasing the capacity of terrestrial communication networks operating in very dense electromagnetic environments. Growing demands on quality, security and cost management mandate a HIL testing approach, in which systems are tested under realistic conditions by emulating the digital/analogue systems to which they are connected and with which they co-operate. One example of this HIL-based system validation uses a satellite link emulator (SLE) to measure the impact of real-world electromagnetic channels.
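
As a rough illustration of what such an emulator does, the Python sketch below applies a handful of coarse link impairments (propagation delay, Doppler shift, path loss and additive noise) to a complex baseband signal. The emulate_link helper and all of its figures are illustrative assumptions and do not represent NI's SLE implementation.

# Minimal sketch: a highly simplified baseband model of satellite-link
# impairments -- delay, Doppler shift, path loss and thermal noise.
import numpy as np

def emulate_link(tx, fs, delay_s=2.4e-3, doppler_hz=35e3, loss_db=20.0, snr_db=15.0):
    """Apply coarse, LEO-style channel impairments to a complex baseband signal."""
    # Integer-sample propagation delay (fractional delay omitted for brevity).
    delay_samples = int(round(delay_s * fs))
    rx = np.concatenate([np.zeros(delay_samples, dtype=complex), tx])

    # Doppler shift modelled as a frequency offset on the delayed signal.
    t = np.arange(rx.size) / fs
    rx = rx * np.exp(1j * 2 * np.pi * doppler_hz * t)

    # Path loss, then additive white Gaussian noise at the requested SNR.
    rx *= 10 ** (-loss_db / 20)
    noise_power = np.mean(np.abs(rx) ** 2) / (10 ** (snr_db / 10))
    noise = np.sqrt(noise_power / 2) * (np.random.randn(rx.size) + 1j * np.random.randn(rx.size))
    return rx + noise

fs = 1e6                                                     # 1 MS/s baseband (assumed)
tx = np.exp(1j * 2 * np.pi * 50e3 * np.arange(10000) / fs)   # test tone
rx = emulate_link(tx, fs)
print(f"{rx.size} received samples after link emulation")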

Figure 3: NI’s VST solution

Electronic warfare (EW) - Modern electromagnetic spectrum operation (EMSO) scenarios span from low-observable, low-probability-of-interception, smart, covert and camouflaged ‘threats’ to high-power, high-density, interfering, agile emissions. Inter-dependent, time-varying, reactive and adaptive emissions are to be found in every sub-domain of EMSO, which encompasses the radar, communication and navigation frequency bands, as well as the infrared, ultraviolet and visible spectrum. Such scenarios require the rapid evolution of test and evaluation (T&E) principles and architectures in order to cope with the emerging engineering and operational challenges throughout the system life cycle (SLC). These include the challenges associated with software-based capabilities, which enable agility, adaptivity, cognition and artificial intelligence (AI), as well as federated and multi-domain sensor suites and systems-of-systems (SoS), plus cyberspace.

These challenges are tackled by the digital transformation of T&E capabilities, which empowers verification and validation engineers with a highly programmable infrastructure on which to model and generate the external stimuli for the system under test (SUT). However, this transformation requires an integrated and cross-functional approach to improving the utilisation and efficiency of T&E resources throughout the SLC. This keeps down the total cost of ownership (TCO), while also accelerating time to market by introducing HIL and software-in-the-loop (SIL) testing into all stages of the SLC and numerous test environments.
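
As a simple illustration of modelling an external stimulus for the SUT, the Python sketch below generates a frequency-agile pulsed emitter at baseband, the kind of waveform model that might later be hosted on a real-time platform. The pulse width, PRI and hop set are illustrative assumptions, not a representation of any specific threat.

# Minimal sketch: a frequency-agile pulsed emitter as a baseband stimulus for
# the system under test (all parameters assumed for illustration).
import numpy as np

fs = 200e6                                  # sample rate, Hz (assumed)
pri_s = 100e-6                              # pulse repetition interval
pulse_width_s = 5e-6
hop_set_hz = np.array([-40e6, -15e6, 10e6, 35e6])   # agile carrier offsets

n_pulses = 32
pri_samples = int(round(pri_s * fs))
pw_samples = int(round(pulse_width_s * fs))
stimulus = np.zeros(n_pulses * pri_samples, dtype=complex)

rng = np.random.default_rng(seed=0)
t_pulse = np.arange(pw_samples) / fs
for p in range(n_pulses):
    f_hop = rng.choice(hop_set_hz)          # pseudo-random frequency agility
    start = p * pri_samples
    stimulus[start:start + pw_samples] = np.exp(1j * 2 * np.pi * f_hop * t_pulse)

print(f"{n_pulses} pulses, {stimulus.size} samples, {stimulus.size / fs * 1e3:.1f} ms of stimulus")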

The prerequisite for modelling, simulation and HIL testing is a flexible hardware platform. Modularity is essential, as functions need to be added or removed in line with changing requirements. There also needs to be scalability in the number of channels, computational resources and frequency bands supported. Likewise, there should be flexibility in the functions that can be hosted, the possibility of re-programming, and connectivity to other systems and/or tools for data sharing and control.
 
The NI Vector Signal Transceiver (VST) (shown in Figure 3) is a modular, software-defined instrument that fits well with the requirements of modern T&E systems, such as low I/O latency and wide instantaneous bandwidth (up to 2 GHz). The onboard availability of an open FPGA makes the instrument flexible enough to host modelling, simulation, processing and control applications in real time, enabling T&E throughout the entire SLC.
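
As an example of the kind of real-time processing that might eventually be hosted on such an FPGA, the Python sketch below models a digital downconversion stage (mix, low-pass filter, decimate) on the host. It is a conceptual prototype under assumed frequencies and rates, not NI FPGA code.

# Minimal sketch: a host-side model of a digital downconverter -- mix,
# low-pass filter, decimate -- to reduce the data rate before streaming.
import numpy as np
from scipy.signal import firwin, lfilter

fs = 1e9                  # ADC sample rate, Hz (assumed)
f_centre = 100e6          # band of interest to extract (assumed)
decimation = 16

# Stand-in for a wideband capture: a tone of interest plus an out-of-band interferer.
t = np.arange(65536) / fs
capture = np.exp(1j * 2 * np.pi * f_centre * t) + 0.5 * np.exp(1j * 2 * np.pi * 300e6 * t)

# 1) Mix the band of interest down to baseband.
baseband = capture * np.exp(-1j * 2 * np.pi * f_centre * t)
# 2) Low-pass filter to the decimated bandwidth.
taps = firwin(129, cutoff=0.8 / decimation)     # cutoff normalised to Nyquist
filtered = lfilter(taps, 1.0, baseband)
# 3) Decimate to cut the data rate before it leaves the instrument.
decimated = filtered[::decimation]

print(f"data rate reduced from {fs / 1e6:.0f} MS/s to {fs / decimation / 1e6:.0f} MS/s")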

Conclusion
The challenges engineering teams now face when developing new systems are daunting. What they clearly need is lifecycle support, through which they can go all the way from validation of their initial concept right through to actual production while remaining on the same fundamental test platform. Software-defined modular instrumentation has shown itself to be the best way of achieving this. It delivers a streamlined, compact, adaptable, cost-optimised and scalable method for testing complex, data-intensive systems involving many channels. The PXI-based VST platform described here, accompanied by advanced FPGA technology, allows detailed investigation of system and subsystem behaviour early in the development process, so engineers can be assured that development projects are on the right track from the start.

