Metal-oxide (MOX) gas sensor testing using a PXI test system

Author : Dale Johnson, Customer Technical Services Manager at Marvin Test Solutions

04 May 2018

MOX gas sensors are MEMS (micro-electromechanical systems) devices fabricated as multi-chip modules (MCM). The basic components of the MCM are the microcontroller ASIC, pre-tested on the wafer, and the sensor itself. These components are sited on a common substrate and a lid is placed over the components, with a small hole or mesh that allows gas into the sensor. This piece outlines how PXI test systems can be used to test and validate such devices. 


The adoption of MEMS technology for MOX gas sensors has resulted in greatly reduced manufacturing costs. However, each of these sensors must also be tested, which presents a unique set of challenges when compared to the manufacturing and testing of typical semiconductor devices.

A PXI-based test system provided a focused test solution that offered the required accuracy, accommodated very large site counts, and matched the overall throughput performance of high-performance semiconductor test systems – at much lower cost.

Test system requirements

To test these devices, the test system needed to have the following capabilities and attributes:

• The time to test/calibrate a MOX sensor can be several tens of minutes. This long ‘dwell’ time is required due to the need to ‘soak’ the sensor(s) with clean air and the target gas.

Clearly, using a large, high performance semiconductor tester, which is effectively sitting idle for many minutes during the gas soak time, is not a productive use of that expensive resource. The solution had to have a low initial capital cost.

• With long dwell times, it was essential that the system could support very large parallel test capability, so the soak time could be amortised across many devices.

• The DUT load board had to reside in an enclosed environment where concentrations of gas could be precisely controlled.

This precluded the typical practice of using handlers to load and unload devices, resulting in the requirement for manual insertion and removal of the devices.

Consequently, a means for an operator to visually identify passing and failing components for manual binning was required.

• Once the desired soak time had been reached, all measurements needed to be completed in less than one second to avoid skewing of the results due to disparate soak times.

• As the component was a new design, the system needed to be expandable, so test capacity could grow as production volumes increased.

Figure 1. System block diagram

• Test capabilities needed to include support for multiple I2C buses (one per device), as well as performing a contact test on each device.

System configuration

A PXI-based system was selected which provided most of the necessary functions for testing multiple sensor devices. The specific PXI instrumentation and test components included the following:

• A 32-channel digital I/O card, which supported all digital control interfacing via the I2C bus, as well as contact testing via the card’s per-pin PMU capabilities. By incorporating an 8:1 multiplexing scheme, a single card could support a 64-site load board.

• A static digital I/O card supporting mass flow controller and exhaust valve actuation.

• System power supplies, controlled via a USB interface.

• A device load board supporting 64 devices, with 64 tri-colour LEDs for visually indicating the pass/fail status of each DUT. The LED power includes a battery backup so the visual pass/fail state is retained after the load board is removed from the chamber.

• A high power PXI chassis with a custom interface adapter. The chassis features 60 watts per slot and supports up to 20 PXI peripherals, providing adequate space and power to expand the configuration – to support the testing of up to 512 devices.

See Figure 1 (above) for a block diagram of the overall system configuration.
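As a rough illustration of how an 8:1 multiplexing scheme lets a 32-channel digital I/O card address a 64-site load board, the sketch below maps a DUT site to a mux-select position and an SDA/SCL channel pair. The actual load-board wiring is not documented in the article; this mapping is an assumption for illustration only.

```python
CHANNELS = 32        # physical DIO channels on the card
LINES_PER_SITE = 2   # one I2C bus per device: SDA + SCL

def site_to_route(site: int):
    """Return (mux_select, sda_channel, scl_channel) for a DUT site.

    With 32 channels grouped into 16 SDA/SCL pairs, an 8:1 mux
    needs only 4 of its 8 positions to reach all 64 sites.
    """
    if not 0 <= site < 64:
        raise ValueError("site out of range")
    pairs = CHANNELS // LINES_PER_SITE   # 16 channel pairs
    mux_select = site // pairs           # mux position 0..3
    pair = site % pairs                  # channel pair within bank
    return mux_select, pair * 2, pair * 2 + 1
```

With this assumed wiring, sites 0–15 share mux position 0, sites 16–31 share position 1, and so on, so each measurement burst only switches the mux four times.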

To facilitate precise control of gas exposure to the DUT, an enclosed environment (chamber) was designed (see Figure 2 below) to house the load board and restrict the test gas to a confined space. The gas is introduced to the chamber via mass flow controllers (MFCs) that meter ‘clean-dry air’, ‘humidified air’ and the ‘test gas’, allowing them to be mixed in precise quantities to achieve the desired gas densities (measured in parts per million).
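The MFC setpoints for a target concentration follow from a simple dilution calculation. The sketch below assumes a hypothetical test-gas cylinder concentration of 1000 ppm and a relative-humidity split expressed as a fraction of the balance air; neither figure appears in the article.

```python
def mfc_setpoints(target_ppm: float, total_sccm: float,
                  cylinder_ppm: float = 1000.0,
                  rh_fraction: float = 0.0):
    """Compute (test gas, clean-dry air, humidified air) flow
    setpoints in sccm to hit target_ppm at the given total flow."""
    gas = total_sccm * target_ppm / cylinder_ppm
    balance = total_sccm - gas
    humid = balance * rh_fraction
    dry = balance - humid
    return gas, dry, humid
```

For example, 50 ppm at 1000 sccm total from a 1000 ppm cylinder requires 50 sccm of test gas and 950 sccm of balance air, split between the dry and humidified MFCs as required.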

Overall control of the instrumentation, communication with the devices and control of the MFCs was provided by ATEasy: a test executive and test development software environment that supports multi-threaded applications.

For each load board/chamber, ATEasy instantiates a separate thread for processing measurements and generating data logging reports for all load boards and devices.
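The one-thread-per-load-board pattern can be sketched in Python as a stand-in for ATEasy's multithreading (the soak wait and instrument I/O are elided; `board_worker` and `run_boards` are illustrative names, not ATEasy APIs):

```python
import queue
import threading

def board_worker(board_id: int, results: queue.Queue) -> None:
    """Placeholder cycle for one load board: wait out the soak,
    take the sub-second measurement burst, report for datalogging."""
    # (soak wait and measurement I/O elided)
    results.put((board_id, "measured"))

def run_boards(n_boards: int):
    """Spawn one worker thread per load board/chamber."""
    results: queue.Queue = queue.Queue()
    threads = [threading.Thread(target=board_worker, args=(b, results))
               for b in range(n_boards)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(results.queue)
```

Because each chamber's soak clock runs independently, per-board threads let one board measure while another is still soaking.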

Figure 2. Device gas manifold with DUT load board installed

Test process

The overall sequence for testing devices included the following tests:

Continuity: Using the per-pin PMU function of the digital instrument, it was possible to quickly conduct a contact verification test by testing for the presence of the DUT’s ESD diodes.

Sensor Initialisation: Initialising the sensor by reading relevant NVM (non-volatile memory) data and storing it for future data correlation.

Gas Measurement: With the devices exposed to a controlled mixture of ‘clean/dry air’ (CDA) and the ‘test gas’, and after a defined ‘soak time’, the MOX resistance is measured. Each result is compared against the previously calculated MOX minimum/maximum limits and recorded, and devices are then binned accordingly.
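The continuity and binning steps above can be sketched as two small checks. The voltage window and limit values are illustrative assumptions, not figures from the article:

```python
def contact_ok(v_pin: float, v_lo: float = 0.3, v_hi: float = 1.0) -> bool:
    """Continuity check: forcing a small current into a pin with the
    per-pin PMU should develop roughly one ESD-diode drop. An open
    contact clamps at the forced-voltage limit; a short reads near 0 V."""
    return v_lo <= v_pin <= v_hi

def bin_device(mox_ohms: float, mox_min: float, mox_max: float) -> str:
    """Gas measurement binning: compare the measured MOX resistance
    against the precomputed min/max limits and assign a bin."""
    return "pass" if mox_min <= mox_ohms <= mox_max else "fail"
```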


Summary

The specialised requirements associated with the testing of MOX gas sensors presented unique challenges. In particular, the large test ‘dwell’ time required by these devices demanded a different kind of test solution: one that could be easily scaled to support very large parallel test capabilities, along with test assets that were significantly lower in cost than the traditional ‘big iron’ ATE solution.

The PXI-based system detailed above, based on off-the-shelf hardware and software, provided the optimal solution – achieving both high throughput and very moderate cost. In particular, the solution offered the following benefits:

• Low initial capital cost, with the initial investment being one fifth of the cost of a conventional semiconductor test system.

• Support for very large parallel test counts, with the site count expandable from 64 to 512 devices in 64-site increments. With a 512-site configuration and a 35-minute test cycle, a throughput of just over 4 seconds per device is achievable.

• Guaranteed test consistency: all measurements on the 64 devices must be completed in less than one second, and the time to measure the MOX resistance for 64 sites was measured at less than 800 ms.

• Simple integration of the load board with the gas chamber, allowing the operator to easily identify pass/fail parts via the battery-backed LED indicators.
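The per-device throughput figure quoted above follows from simple arithmetic:

```python
sites = 512            # fully expanded configuration
cycle_s = 35 * 60      # 35-minute test cycle, in seconds
per_device_s = cycle_s / sites
print(f"{per_device_s:.2f} s per device")  # just over 4 seconds
```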
