Automated instruments simplify test processes
Modular design helps accelerate design, verification and production.
Next year, the VXI standard celebrates its 30th birthday. The board format marked the first adaptation of a communications standard to the needs of test, leading to a gradual spread of formats ending in -XI to denote their eXtensions for Instrumentation. PCI, PCI Express and Ethernet have all been co-opted in the same way, leading to standards such as PXI, AXIe and LXI.
Developed by National Instruments in the 1990s, PXI put the PCI bus onto a backplane to make it easier to insert and extract boards, providing additional functions for test. PCI Express similarly gave rise to PXIe and AXIe, the latter inheriting the larger form factors of VXI. LXI in turn uses the Ethernet network as its basis. The modular board-level formats first found applications in production test, with conventional benchtop instruments favoured for development.
Darcy Dement, NI’s marketing manager for automated test and RF in EMEIA, says: “When PXI came out it was for the production-test use-case. It was welcomed for that. A lot of times people were going through the pain of their own back-end stuff and were putting their own systems together.
“More people are now using PXI in verification and validation. A hot topic is communications prototyping. People are prototyping those standards as they get evolved.”
Dement points to the experience of Qualcomm, which moved from using conventional self-contained benchtop instruments to test designs for advanced forms of Wi-Fi. The company moved to a PXI chassis that contained NI’s vector signal transceiver (VST). As well as supporting computer control, the field-programmable gate array (FPGA) on the VST board allowed test functions to be downloaded into hardware for faster processing. “They saw a big increase in the speed of their verification and validation runs,” she adds.
Being able to mix and match data-acquisition boards provides a high degree of flexibility, but computer control is the key to driving automated test further into both development and production.
“It allows you to do more runs and discover more about the product you are developing; that is a concrete benefit for them. In general, if you are able to identify bugs in design, you save a lot of pain and heartache in the market,” says Dement. The more tests that can be run during design to probe a wider range of possibilities, the lower the chance of those bugs making it to production.
“And you have a lot of soft benefits,” she adds. “The same architecture can scale from verification and validation to production.”
As well as drilling down into the details of a new protocol as in Qualcomm’s case, communications equipment manufacturers are using automated test to examine system-level behaviour. Power consumption has become a target in devices such as smartphones because a reputation for poor battery life can hurt sales. Joerg Koepp, market segment manager at Rohde & Schwarz, says software control readily supports end-to-end testing, with benchtop instruments being as amenable as modular board-level standards.
A common set-up, Koepp says, is a radio tester coupled to both an Internet-protocol analyser and an instrument to check power levels. “As you set up a TCP/IP connection, you can see the device-under-test’s current consumption go higher and lower as conditions change. There are also situations where you combine with location-based services.”
Because some applications are triggered by a change in location, Koepp says, a test instrument that emulates the GPS network can probe the system to see whether power demands in a smartphone peak excessively as the system moves around in a virtual space. An oscilloscope probing part of the board can help pinpoint where the largest changes in power consumption occur.
“You have in this case a set of instruments with software running on a PC controlling the test sequences to provide automated tests and correlating the results,” Koepp says.
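The correlation step Koepp describes can be sketched in a few lines of Python. The sketch below stands in for the PC-side software: instrument I/O is replaced by plain timestamped data, and the event labels, window length and current values are invented for illustration.

```python
"""Sketch of PC-side test software correlating a device's current-draw
trace with protocol events, as in the end-to-end set-up described above.
Timestamps are in seconds; currents in amps. All values are illustrative."""

def peak_current_after_events(events, samples, window=0.5):
    """For each (timestamp, label) protocol event, report the peak current
    observed in `samples` (a list of (timestamp, amps) pairs) within
    `window` seconds after the event."""
    peaks = {}
    for t_event, label in events:
        in_window = [amps for t, amps in samples
                     if t_event <= t <= t_event + window]
        peaks[label] = max(in_window) if in_window else None
    return peaks

# Example: a TCP connection set-up followed by a data burst.
events = [(1.0, "tcp_connect"), (2.0, "data_burst")]
samples = [(0.9, 0.10), (1.1, 0.35), (1.4, 0.20), (2.1, 0.80), (2.4, 0.30)]
print(peak_current_after_events(events, samples))
# → {'tcp_connect': 0.35, 'data_burst': 0.8}
```

In a real rig the events would come from the protocol analyser's log and the samples from the power instrument; the correlation logic itself is no more complicated than this.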
Bob Stasonis, sales and marketing director at Pickering Interfaces, says another area where software-controlled modular test hardware has become important is in testing the high-speed serial buses now being used to pass large amounts of data around aircraft and, increasingly, car chassis, as well as data-centre computers: “High-speed serial buses are huge. A lot of markets are doing it. In automotive we have lots more data moving around, with data going through the transmission systems and the engine and even the connections to the phone. All that is moving over to Ethernet.”
To test the multiple serial buses that are now being implemented in each system, switching cards such as those made by Pickering have become important features of the modular hardware so that instruments can quickly be moved from one port to another as tests progress. “You want very low crosstalk across those channels, which means putting a lot of effort into the density,” says Stasonis.
Mentor Graphics is using modularity at the network level to increase the throughput of the testing regimes that establish the reliability of power transistors used in rail traction, road vehicles and the growing number of electronics-based power systems needed for the grid. The results feed into thermal simulations that determine how well the system will respond to changes in demand. Those changes cause large temperature swings that, if not catered for, can lead to unexpectedly high failure rates in the field. “If the thermal simulations are not calibrated, you get a 20 per cent temperature difference between simulation and reality,” explains Roland Feldhinkel, general manager of Mentor’s mechanical analysis division.
Car-makers have had to add to their list of reasons for recalls early failures in the power transistors they use in hybrid and electric vehicles. László Tolnay, product line director at Mentor, says there are several failure mechanisms power transistors suffer from when exposed to frequent temperature cycling, such as breaks in the wire bonds that link the die to its package and degradation of the solder.
“In many cases the thermal failure is not due just to the absolute temperature but the amount of temperature change. You can’t test for this in a static way. The devices need to be tested over the conditions encountered in a typical drive cycle. The tester technology we use does the power cycling with the end goal of developing a failure-lifetime estimate,” Tolnay says.
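Tolnay's point that lifetime depends on the size of the temperature swing rather than absolute temperature is commonly captured with a Coffin-Manson-style power law. The article does not name the model Mentor fits; the coefficients below are purely illustrative.

```python
"""Coffin-Manson-style cycles-to-failure estimate: N_f = A * dT**(-n).
A and n are fitted from power-cycling test data; these values are
illustrative, not from any real device."""

def cycles_to_failure(delta_t, a=1.0e12, n=4.0):
    """Estimated cycles to failure for a temperature swing delta_t (K)."""
    return a * delta_t ** (-n)

# Halving the swing extends estimated life by 2**n (16x for n = 4).
print(cycles_to_failure(80.0))   # swing of 80 K
print(cycles_to_failure(40.0))   # swing of 40 K: 16x more cycles
```

Power-cycling testers of the kind Tolnay describes exist to pin down the constants in such a model with enough statistical confidence to make a lifetime claim.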
The T3ster system developed for the automotive industry has up to 16 channels, with Ethernet used to expand that number out to more than 120 to cater for the need for extensive statistical information on reliability.
“The automotive industry is setting up standards to measure reliability. One requires tests of 77 devices of one kind or they can’t make a reliability statement,” Feldhinkel says.
Dement says teams need to make a trade-off when considering their level of automation: “When you don’t have to have an operator physically pushing buttons and turning knobs you have to make the time investment to configure the sequence. That’s time you spend whether you are doing it physically with knobs and buttons or time invested in software. But when you’ve made that investment once you can run more tests, such as cycling through different temperatures. That way you get a lot more understanding of the device under test. And it’s a lot more repeatable. You can take all of that data. But you are also taking some of the human error out of the equation.”
Tolnay adds: “It’s an industrial way of testing, rather than a smart test with limited repeatability.”
Whether to rely on network connections or closer interactions between boards on a rack depends on both timing and how much space the equipment needs.
Stasonis explains: “Board-level modular makes a whole lot of sense because of compactness, lower cost. Timing can also be more accurate because the cabling between instruments gets shorter. But something like a 1kW power supply might more likely be LXI controlled. LXI fits well into areas where the instrumentation doesn’t fit cleanly into PXI.”
A key difference that VXI provided when it emerged based on the VME backplane standard was its support for a dedicated trigger and synchronisation bus. The links let instruments coordinate when to start and finish acquisition runs and tell each other whether they have received the commands. PXI and similar standards have adopted similar dedicated trigger buses and connections, with PXI supporting a fibre-optic extension for multi-rack triggering.
For synchronisation LXI currently uses the IEEE 1588 standard, which estimates the round-trip time of packets bounced backwards and forwards between nodes to align them to a common time base. “In most LXI applications there are latency issues that you don’t see in a PXI backplane,” Stasonis says, but he notes the LXI members are working on protocols to tighten up the timing.
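The arithmetic at the heart of IEEE 1588 takes four timestamps from a sync/delay-request exchange. The sketch below shows that calculation alone; it assumes a symmetric network path, and path asymmetry is exactly where the latency problems Stasonis mentions appear as offset error.

```python
"""IEEE 1588 (PTP) offset/delay arithmetic from the four timestamps of a
sync / delay-request exchange. Assumes a symmetric path; asymmetry shows
up directly as error in the computed offset."""

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: master sends Sync; t2: slave receives it;
    t3: slave sends Delay_Req; t4: master receives it.
    Returns (slave clock offset from master, one-way path delay)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Slave clock running 5 us ahead over a 10 us path:
print(ptp_offset_and_delay(0.0, 15e-6, 30e-6, 35e-6))
```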
Dement says the fine tolerance of timing available in the backplane standards is essential to a number of test strategies. “In semiconductor design there are good use-cases, such as RF power amplifiers,” she says, pointing to the emergence of envelope tracking. This fine-tunes the voltage applied to a power amplifier according to the data being sent.
“It’s a multi-instrument type of measurement that’s required. You need to synchronise RF generation and its measurement by a signal analyser with digital I/O. It all has to be totally synchronised. You just can’t get that level of synchronisation with a rack-and-stack approach.”
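The waveform arithmetic behind envelope tracking is simple even if the synchronisation is not: the supply rail follows the instantaneous envelope of the I/Q signal. The sketch below shows one plausible mapping; the gain, headroom and minimum-rail figures are invented for illustration and are not any amplifier's real parameters.

```python
"""Envelope-tracking sketch: derive a supply-voltage waveform from the
instantaneous envelope of complex baseband (I/Q) samples. Gain, headroom
and floor voltage are illustrative values."""
import math

def supply_waveform(iq, gain=5.0, headroom=0.5, v_min=1.0):
    """Map each (I, Q) sample to a power-amplifier supply voltage."""
    out = []
    for i, q in iq:
        envelope = math.hypot(i, q)             # |I + jQ|
        out.append(max(v_min, gain * envelope + headroom))
    return out

samples = [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)]
print(supply_waveform(samples))   # → [1.0, 3.0, 5.5]
```

Generating this rail in step with the RF signal is why the RF source, signal analyser and digital I/O all have to share one tightly synchronised timebase.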
Dement adds: “It’s also important to recognise what’s really modular and what’s not. It really has to do with what’s open and what’s not. If modularity is only for the benefit of the vendor and the ability to scale and ability to adapt to needs is not passed on to the user that is less useful. In some cases, even if it’s a PXI card cage under the hood, if the users don’t have access to the software they are still stuck. Some standards called modular are attempting to make a play at that.
“I insist on that point. The software is vital. Many of the systems we see today on the market are based heavily on software and receive regular updates. The only way to keep up with that is a software architecture that you have control over.”