Test at the edge of the quantum world
A new wave of computing research is placing novel demands on test instrumentation.
Computing is changing. As the pace of improvement in the silicon technology used to implement digital computers begins to slow, novel computing concepts based on analogue and quantum techniques are emerging that go beyond the Turing-machine ideas that underpin digital processing. This is placing new demands on the test equipment needed to explore their operation and potentially point the way to more efficient computers.
These novel computers rely on the manipulation of single atoms or photons that work on the edge of the quantum regime or harness quantum-mechanical effects directly. Quantum computers use processes such as entanglement between the individual atoms or ions that make up the quantum bits (qubits) of the machine. The entanglement allows direct interaction between the qubits that, for some types of problem, dramatically increases the speed of processing. The problem that faces scientists working on these machines is how to probe and measure the changes in state as the experimental computer operates.
The qubits are generally controlled using an array of lasers that use sequences of pulses to either hold each qubit in a defined state or attempt to push it from one quantum state to another. Careful control is vital. One false move can easily cause the carefully formed quantum states to lose their coherence – and, with it, the rich caches of information that entangled states can hold.
Commonly employed in electronic-warfare jammer and countermeasures development, the arbitrary waveform generator (AWG) has become a frequently encountered feature in papers from condensed-matter and quantum physicists. Often the AWG is used to supply the components of complex RF signals. A vector signal generator then mixes the in-phase and quadrature signals to produce the final complex RF stimulus. The advantage of the AWG is that it has the fast hardware control to generate rapidly changing terms that create the highly intricate modulations needed to keep qubits stable and nudge them into useful states.
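As a rough sketch of that mixing step – with sample rates and frequencies that are purely illustrative, not tied to any particular instrument – the I and Q components combine onto a carrier like this:

```python
import numpy as np

# Sketch of I/Q (quadrature) upconversion, as performed when a vector
# signal generator mixes AWG-supplied baseband components onto a carrier.
# All parameter values here are illustrative.

fs = 1e6           # sample rate, Hz
f_carrier = 100e3  # carrier frequency, Hz
f_mod = 5e3        # baseband modulation frequency, Hz
t = np.arange(0, 1e-3, 1 / fs)

# Baseband I and Q components (here a simple single-sideband tone)
i_t = np.cos(2 * np.pi * f_mod * t)
q_t = np.sin(2 * np.pi * f_mod * t)

# The mixer multiplies I by the in-phase carrier and Q by the quadrature
# carrier, then sums them: s(t) = I(t)cos(wt) - Q(t)sin(wt)
rf = i_t * np.cos(2 * np.pi * f_carrier * t) - q_t * np.sin(2 * np.pi * f_carrier * t)

# For this choice of I and Q the identity cos(a)cos(b) - sin(a)sin(b)
# = cos(a + b) means the result is a pure tone at f_carrier + f_mod
expected = np.cos(2 * np.pi * (f_carrier + f_mod) * t)
assert np.allclose(rf, expected)
```

Arbitrary I and Q envelopes generated the same way let one waveform carry the intricate amplitude and phase modulation described above.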
The problem that has faced users of older AWG designs is that they need to combine multiple instruments to generate the complex waveforms. Although calibration helps iron out the differences, each unit has its own peculiarities, which makes mixing the I and Q components for each waveform more difficult. Quantum computing makes life even more difficult because every qubit needs its own control signals. Dean Miles of test-equipment supplier Tektronix says: “They need many, many channels of stimulus.”
The result is a need for multiple instruments linked together to manipulate the various qubits inside the experimental computer. “There are custom systems that people have built but they can look like rats’ nests as they scale up,” Miles says. “The ability to have an off-the-shelf eight-channel AWG means support and cost is reduced.”
For its AWG5200, Tektronix adopted a digital approach based on high-speed digital-to-analogue converters (DACs) that allows the generation of signals with complex modulation profiles up to 5GHz without further processing. Higher frequencies are possible using an external mixer; to preserve synchronisation across channels, the same local oscillator feeds each channel’s mixer. The instrument can use interpolation between digital samples to reduce the memory needed to store waveforms and to make it easier to generate waveforms on the fly.
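The memory saving from interpolation can be illustrated with a toy example – linear interpolation of a made-up Gaussian envelope; real AWGs use dedicated interpolation filters in hardware, but the principle is the same:

```python
import numpy as np

# Sketch of sample interpolation to cut waveform memory: store a sparse
# version of a slowly varying envelope and interpolate it back up to the
# DAC rate on the fly. All sizes here are illustrative.

dac_rate = 1000       # "full-rate" points per waveform
stored_points = 100   # points actually kept in waveform memory (10x less)

t_full = np.linspace(0, 1, dac_rate)
t_sparse = np.linspace(0, 1, stored_points)

def envelope(t):
    # A made-up Gaussian pulse envelope
    return np.exp(-((t - 0.5) ** 2) / 0.02)

stored = envelope(t_sparse)                   # what sits in memory
played = np.interp(t_full, t_sparse, stored)  # reconstructed at DAC rate

# The reconstruction error stays small because the envelope varies slowly
err = np.max(np.abs(played - envelope(t_full)))
assert err < 0.01
```

The trade-off is that fast-varying waveforms need more stored points, since interpolation can only fill in what changes smoothly between samples.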
Fine-tuned control over the states is becoming increasingly important, which calls for responsive processing to provide the stimulus and interpret results as quickly as possible. Jacob Sherson, researcher at Aarhus University in Denmark, compares the problem of maintaining quantum coherence while instruments coax the elements into the right state to carrying a cup of water: the liquid sloshes around, and the key is to prevent it spilling over. In the quantum experiments carried out at Aarhus, lasers under the control of algorithms written in National Instruments’ LabVIEW software attempt to coax individual, supercooled atoms into tunnelling between locations without disrupting their quantum state. If the control algorithm is too forceful, the atom simply sloshes over the top of the quantum well instead of tunnelling through.
Algorithms to do the manipulation are incredibly difficult to create, but the researchers thought people might be able to work out successful motion profiles. To test their hunch, they developed a computer game played by people around the world and then selected the winning profiles. The aim, according to Sherson, is to analyse those profiles and find common elements that can be fed into a model running on the LabVIEW-based control system for the experimental quantum computer.
Software running even on a fast PC is often not responsive enough to react to changes in the system. NI’s CompactRIO systems can use field-programmable gate arrays (FPGAs) to process data using hardware circuits that are much faster. Some groups employ FPGAs in semi-custom rigs to provide the control they need.
At Stanford University, Yoshihisa Yamamoto’s group is working on a form of analogue machine that may provide an alternative to the quantum computer for some problems. Photons racing through a long loop of fibre are progressively tweaked by an electronic system that tries to find the optimum solution for conundrums such as the famous Travelling Salesman problem, which has proved difficult for traditional digital computers to solve.
Helmut Katzgraber, an associate professor at Texas A&M who is pursuing other lines of research into these systems, says: “There is a deep synergy between classical optimisation, statistical physics, high-performance computing, and quantum computing. Those things really go hand in hand. Nature is the best optimiser. Lightning typically chooses the path of least resistance. A soap bubble will always give you the minimal surface.”
The Ising model
The Stanford system models the way in which a hot lump of magnetic material cools into a minimum-energy state. In the Ising model, put forward by Ernst Ising in the 1920s, the spins of the atoms flip one way and then another until they reach that final minimum. The Ising model covers a wide range of computing devices that reach into the world of quantum computing. D-Wave’s approach to computing uses a process known as quantum annealing for its form of computer.
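A minimal simulated-annealing sketch shows the Ising picture in code. The one-dimensional ring of coupled spins below is purely illustrative and bears no relation to any of the hardware described here:

```python
import math
import random

# Simulated annealing of a tiny 1D Ising ring: spins flip one way and
# then another, settling toward a minimum-energy configuration as the
# "temperature" falls. All parameters are illustrative.

random.seed(1)
n = 20
spins = [random.choice([-1, 1]) for _ in range(n)]

def energy(s):
    # Ferromagnetic coupling J = +1: aligned neighbours lower the energy
    return -sum(s[i] * s[(i + 1) % n] for i in range(n))

temperature = 2.0
while temperature > 0.01:
    i = random.randrange(n)
    # Energy change from flipping spin i (only its two neighbours matter)
    dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
    # Metropolis rule: always accept downhill moves, sometimes uphill ones
    if dE <= 0 or random.random() < math.exp(-dE / temperature):
        spins[i] = -spins[i]
    temperature *= 0.999  # slow cooling

final = energy(spins)
# The ground state (all spins aligned) has energy -n; the annealed
# configuration should end well below the zero average of random spins.
```

Quantum annealers pursue the same minimum, but exploit quantum effects rather than thermal fluctuations to escape poor local configurations.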
“Quantum annealers are types of Ising machine,” says Stanford University researcher Peter McMahon.
Whereas the D-Wave design appears to use quantum-mechanical tunnelling, the virtual spin changes in Stanford’s Ising machine are mediated with the help of an algorithm mapped into the FPGA. This overcomes problems caused by noise that makes it hard to use direct interactions between photons. Fibre-optic receivers and transmitters designed for laboratory work such as the Thorlabs PDB480C-AC photodetector handle the conversion between the optical and electrical domains.
Rather than work with quantum-mechanical spins directly, the processing involves changes in phase and intensity between the individual photons. “We introduce coupling by tapping out a little bit of the light at each stage. We measure the phase of the light that we tap out and use memory on the FPGA to store the phase of the light of the last 100 pulses. As each pulse comes back around, the FPGA looks at its memory of phases of each of 100 pulses and it computes a sum to work out whether it should be flipped,” McMahon explains.
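That feedback step can be caricatured in ordinary software. The coupling matrix, pulse count and binary update rule below are illustrative stand-ins, not the Stanford implementation:

```python
import random

# Schematic emulation of the FPGA feedback loop: each circulating pulse
# encodes a spin, and on every round trip the stored measurements of the
# other pulses are summed, weighted by the problem's coupling matrix,
# to decide whether each pulse should be flipped. Values are made up.

random.seed(0)
n = 100  # number of circulating pulses (one per spin), as in the quote

# Illustrative coupling matrix: a ferromagnetic ring of neighbours
J = [[0] * n for _ in range(n)]
for i in range(n):
    J[i][(i + 1) % n] = J[i][(i - 1) % n] = 1

# Measured "phases", reduced to binary spins +1/-1 for this sketch
spins = [random.choice([-1, 1]) for _ in range(n)]

for round_trip in range(200):
    for i in range(n):
        # The stored sum over the other pulses' measured values
        feedback = sum(J[i][j] * spins[j] for j in range(n))
        # Flip the pulse if doing so lowers the Ising energy
        if spins[i] * feedback < 0:
            spins[i] = -spins[i]

energy = -sum(J[i][j] * spins[i] * spins[j]
              for i in range(n) for j in range(i + 1, n))
```

In the real machine the couplings and measurements are analogue optical quantities, but the decision logic mapped into the FPGA plays this role.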
After multiple passes around the fibre, the tuned pulses converge on an answer that should be the optimum for whatever cost function was programmed into the FPGA. Even with the higher speed of the FPGA, the system introduces 60 additional dummy pulses to provide enough time to compute the transitions on each pass.
Improvements in FPGA speed and capacity and in digitising speeds will lead to more advanced control and measurement systems – with both custom and off-the-shelf hardware being combined to help make quantum and other novel computing technologies practical options for the future.
To help measure the changes in state inside the experimental quantum computer, the mainstay of research has been the sampling oscilloscope. Able to record events at gigahertz rates, the unit provides a convenient way to record rapidly changing events in the machine.
Dedicated digitiser modules have been developed by companies such as Guzik, now part of Keysight Technologies, to push sample rates higher. Some of them include programmable hardware based on FPGAs to compress the data or filter it for the algorithms that are used to program and stabilise the system.
Work into quantum effects may well lead to new forms of oscilloscope design itself. Earlier this year, researchers at the Tokyo Institute of Technology and NTT developed an oscilloscope that can directly measure an electron’s spin as well as its charge, rather than inferring those properties with the help of an instrument such as a vector network analyser.
The charge signal picked up by a traditional instrument is the total charge of the spin-up and spin-down electrons. The spin signal is the difference between the densities of spin-up and spin-down electrons.
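Stated as code, with illustrative density values, the two relations – and the fact that together they recover the individual spin populations – look like this:

```python
# The relations from the text: with n_up and n_down the densities of
# spin-up and spin-down electrons (values here are illustrative), the
# charge signal is their sum and the spin signal their difference.

def charge_signal(n_up, n_down):
    return n_up + n_down

def spin_signal(n_up, n_down):
    return n_up - n_down

# Equal populations carry charge but no net spin
assert charge_signal(3.0, 3.0) == 6.0
assert spin_signal(3.0, 3.0) == 0.0

# Measuring both signals lets the two densities be recovered
c, s = charge_signal(5.0, 2.0), spin_signal(5.0, 2.0)
assert ((c + s) / 2, (c - s) / 2) == (5.0, 2.0)
```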
The experimental machine contains a spin filter and nanometre-scale charge detectors. The spin filter separates the electrons by spin; the time-resolved charge detector measures waveforms of the charge-density waves.
Using this spin-resolved oscilloscope, the team demonstrated waveform measurements of charge- and spin-density wavepackets in a semiconductor device. They observed the spin-charge-separation process in a circuit of quantum Hall edge channels, a system for investigating the behaviour of electrons confined to one-dimensional channels.
Researchers at the Technical University of Munich and the University of Notre Dame, Indiana, built a receiver in 65nm CMOS to analyse the frequency, amplitude and phase of electron spin waves.