The quantum computer testbed developed by Professor David Lucas and colleagues traps ions at its core for use as qubits

The trouble with quantum computing

Quantum computer breakthrough announcements are coming thick and fast. But for real-world use, is it really a matter of if rather than when?

It seems as if a month cannot go by without a ‘crucial hurdle’ being overcome in quantum computing, bringing us ‘a step closer’ to realising a practical device. If you believe these pronouncements, we are just a few years away from achieving this ambition: one that would render useless many of the encryption systems used today to secure data.

“Quantum computers are based on a fundamentally different model of computation that uses the bizarre features of quantum mechanics - such as superposition and entanglement - to do things that are impossible for standard computers,” says University of Bristol researcher Ashley Montanaro.

Superposition and entanglement make it possible to rework computer algorithms to perform jobs such as factoring the large semiprimes that underpin some of the most popular tools for protecting data from prying eyes. Whereas a digital computer’s bits can only switch between the states of zero and one, superposition allows their quantum equivalents - qubits - to hold a probability of being in either state, right up until the moment of quantum collapse when a measurement is taken. The quantum computer can use changes in these probabilities to push the calculation towards the desired outcome and, in the process, perform the equivalent of vastly more calculations than a digital computer could manage.
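
As a rough illustration of the idea (a sketch of our own in Python with numpy, not how a real quantum machine is programmed), a qubit can be modelled as a pair of amplitudes whose squared magnitudes give the measurement probabilities:

import numpy as np

# A single qubit modelled as two amplitudes: one for state 0, one for state 1.
# Here the qubit is in an equal superposition of both states.
qubit = np.array([1, 1]) / np.sqrt(2)

# The probability of each measurement outcome is the squared amplitude.
probs = np.abs(qubit) ** 2
print(probs)                           # [0.5 0.5]

# Taking a measurement collapses the superposition to one classical value.
print(np.random.choice([0, 1], p=probs))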

Another advantage qubits have is their ability to become entangled. Quantum entanglement is a phenomenon in which two quantum particles interact in such a way that they become deeply linked, essentially ‘sharing’ their existence. If, at the point of quantum collapse, one particle takes on a particular state, its partner instantly takes on the complementary state. The correlations between entangled qubits can act like invisible wires between them, letting a quantum computer manipulate them as a single system.
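
Again as a toy sketch (our own numpy illustration), a maximally entangled pair can be written as a single joint state whose only possible measurement outcomes are perfectly correlated:

import numpy as np

# Amplitudes for the joint states 00, 01, 10 and 11 of two qubits.
# This 'Bell state' is an equal superposition of 00 and 11 only.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2              # [0.5, 0, 0, 0.5]

# Sampling joint measurements: the two qubits always agree.
print(np.random.choice(['00', '01', '10', '11'], size=5, p=probs))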

These capabilities are powerful. In principle. Getting to the stage where a quantum computer outperforms a classical computer on tough tasks - known as the ‘era of quantum supremacy’ - is fraught with difficulties and roadblocks.

The algorithms developed so far are complex and demand a careful balancing of classical and quantum techniques tuned for each calculation. There is no high-level language akin to C or Pascal that can ease the job of programming (see box: How do you program a quantum computer?). But, even before thinking about such practicalities, scientists face huge challenges. The biggest is noise - one of the reasons why scientists conduct quantum-computing experiments at close to the temperature of liquid helium. Noise leads to decoherence and the inevitable collapse into a classical state that, if it happens too quickly, destroys the results of the calculation.

A quantum state encoding a billion possibilities of a calculation may shrink to a classical state encoding just one of them if it interacts with a photon or any other particle or simply because of inherent quantum noise. This noise is a complicated phenomenon that results from “neglecting some ‘degrees of freedom’ when we describe or control a quantum system”, explains Professor Gil Kalai of the Hebrew University of Jerusalem.

Noise that does not cause decoherence can introduce other errors. The most basic quantum circuit - known as a quantum gate - cannot be made with perfect accuracy, and in a larger device these small imperfections build up until, in combination, the quantum computer fails to operate. Such quantum noise was once thought to be an insurmountable roadblock until, in the mid-1990s, professors Peter Shor and Andrew Steane independently described the first quantum error-correcting codes.

“Prior to that, it was thought that quantum computers were an interesting theoretical idea, but one that could never be implemented in practice,” confirms Professor David Lucas from the University of Oxford.
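
The scale of the problem is easy to see with some back-of-envelope arithmetic (a sketch of our own; the error rate chosen is purely illustrative):

# If each gate succeeds with probability 1 - p, a run of n gates succeeds
# with roughly (1 - p) ** n - decaying towards zero as circuits grow.
p = 0.001                              # one error per thousand gates
for n in (100, 1000, 10000, 100000):
    print(n, round((1 - p) ** n, 4))   # 0.9048, 0.3677, 0.0, 0.0

Without error correction, any computation long enough to be useful would be swamped by accumulated faults.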

Shor’s breakthrough made quantum computation look possible at a practical level. Researchers are now hard at work trying to implement types of qubit that are not only easier to handle but can also take advantage of error correction.

New approaches to building qubits include using the spin states of quantum dots, the polarisation of photons or the charge of electrons - anything that can exist in a quantum superposition of physically distinguishable quantum states.

The two most advanced and promising quantum computer technologies today rely on trapped ions or superconducting circuits. Released this year, IBM’s five-qubit Quantum Experience (QX) device uses superconducting qubits. These qubits are electromagnetic waves oscillating back and forth in an electronic device. IBM’s QX is not only programmable, it is also available for anyone to use over the internet. “We see the IBM QX as the birth of a quantum community, letting users gain a better intuition about what it means to program and use a quantum computer,” says Jerry Chow, manager of IBM’s experimental quantum computing team. The superconducting circuitry has the advantage of being manufacturable using the same lithography and fabrication technologies as those used for silicon chips. It can be integrated on a solid-state platform, with the hardware needed to maintain qubits simply printed on a chip.

In the IBM five-qubit system, it is possible to run a primitive error-detection experiment. Four of the qubits hold data; they are coupled to the remaining qubit, called the syndrome qubit, which functions in a similar way to the error-checking ‘parity bit’ in conventional silicon memories. The state of the syndrome qubit depends on the parity of all the other qubits. “Basically, we can measure one ‘syndrome qubit’ to detect the joint parity of a surrounding set of four ‘code qubits’,” explains Chow. “Many rounds of this type of parity check need to be performed to keep track of errors and correct for them.”
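
In classical terms the check works like this (our own simplified sketch; a real syndrome measurement is a quantum circuit, not a Python loop):

import numpy as np

# Four 'code' values and their joint parity, as checked by a syndrome qubit.
code = np.array([1, 0, 1, 0])
print(code.sum() % 2)                  # 0 - even parity, no error detected

code[2] ^= 1                           # a stray bit-flip error creeps in
print(code.sum() % 2)                  # 1 - the parity check flags the error

In the quantum version, the crucial trick is that the syndrome qubit reveals only the parity, not the data values themselves, so the superposition of the code qubits survives the check.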

IBM’s five-qubit machine has far too few qubits to offer solutions to problems that normal computers cannot solve. “However, these small-scale experiments are invaluable test beds, both for scaling up the technology, and for checking that the algorithms we have actually work,” adds Montanaro.

How do you program a quantum computer?

Beyond the physical construction, if a quantum computer with the potential to deliver quantum supremacy is ever built, a more down-to-earth difficulty will be programming it. This is why IBM has opened up the QX machine for anyone to use. But even so, the extreme difficulty of designing quantum algorithms - combined with the fact that a machine capable of running anything more than the most basic algorithms does not yet exist - means the field is in its infancy.

“The small-scale machines that we currently have, such as IBM’s five-qubit quantum computer, can easily be simulated on a standard classical computer,” says University of Bristol researcher Ashley Montanaro. “So algorithms running on these machines won’t solve any problems that we couldn’t solve already.”

Despite the lack of useful hardware and the relatively small community working on them, Montanaro says the quantum algorithms that have already been devised address a number of important problems: Shor’s algorithm has the potential to break the RSA cryptosystem underlying internet security, while Grover’s algorithm could solve a plethora of practically important search and optimisation problems faster than naive classical algorithms.
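
Grover’s algorithm is simple enough to simulate classically at toy sizes. The sketch below (our own illustration; the search-space size and marked index are arbitrary choices) finds one marked item among 16 in about sqrt(16) = 4 amplitude-amplification steps, rather than the average of eight or so classical guesses:

import numpy as np

N, marked = 16, 11                     # search space and the 'hidden' item
state = np.full(N, 1 / np.sqrt(N))     # equal superposition over all items

for _ in range(int(np.pi / 4 * np.sqrt(N))):   # ~sqrt(N) iterations
    state[marked] *= -1                # oracle flips the marked amplitude
    state = 2 * state.mean() - state   # 'inversion about the mean'

print(np.argmax(np.abs(state) ** 2))   # 11 - the marked item dominates

Run on a classical machine this simulation gains nothing, of course; the point of the real algorithm is that a quantum computer holds all 16 amplitudes in just four qubits.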

One option may be to use quantum computers in a similar way to analogue computers: as simulators of quantum systems that are incredibly slow to compute digitally.

Such machines could be used to predict the outcome of chemical processes and the behaviour of materials more quickly - and would not suffer from the same issues as those developed to compete directly with digital computers.

Also promising as a source of qubits is the spin of trapped atomic ions. Atomic ions can be confined by electromagnetic fields supplied by nearby electrodes and laser-cooled to form a linear crystal of qubits. Although collisions with gas molecules can temporarily break up the crystal and scramble the qubits, the ions are not easily flung out of the trap, and such interruptions occur only roughly once an hour for each ion - far less often if the trap is placed in a cryogenic vacuum chamber.

“This is why some of the best atomic clocks are made with trapped ions,” explains Lucas. “These ions are very well isolated from the environment, giving their qubit states the longest decoherence time of any qubits around - that is, any qubits with which it has also been possible to implement two-qubit quantum-logic gates.”

Lucas’ ion-trap quantum computing group has measured a 50-second coherence time for calcium-43 qubits, which he believes “is still the world record for a system where the qubits can be individually controlled”. However, this lifetime has so far only been achieved with one or two qubits.

Elsewhere, a team at the University of Maryland recently succeeded in making a programmable five-qubit computer from ytterbium ions held in place by electric fields and lasers. Whereas the superconducting qubits in IBM’s machine can only swap data with their nearest neighbour, arbitrary pairs of trapped ions in the Maryland team’s experiment can be entangled with each other - a feature crucial to unleashing the full capability of quantum computers.

So what is stopping these five-qubit quantum computers from being scaled up? Noise may be a more fundamental problem than it seems from experiments that operate on the level of several qubits.

Kalai is part of a small community of researchers who do not sign up to the idea that quantum supremacy will ever be achievable. In essence, Kalai believes that for highly entangled states, effective error correction remains out of reach.

In his latest paper, which appeared in the Notices of the AMS, Kalai points to two fundamental problems with quantum states: “We cannot accurately control them and we cannot accurately describe them.”

He adds: “When you attempt to implement highly entangled states, without quantum fault-tolerance, the error rate in terms of ‘qubit-errors’ scales up linearly.”

A linear scaling does not sound so bad. Unfortunately, as you add qubits, the effort required to achieve sufficient quantum fault tolerance grows far more steeply.

Kalai’s punchline is simple: “We cannot possibly engineer (and neither can nature) primitive computational devices to perform superior computational tasks.”

He elaborates: “Noisy quantum systems with a small number of qubits are very primitive computational devices and we will not be able to engineer such devices to directly demonstrate quantum supremacy or to realise quantum fault-tolerance.”   

Performance: quantum errors and entanglement

Unlike digital computing, where individual bits are switched between two states independently, the qubits in a quantum computer can be entangled: manipulating one is reflected in the others.
A qubit represents the probability of a particle occupying a particular quantum state and, unlike the digital bit, is a continuous property. The quantum computer manipulates the qubit around the sphere that describes this probability (the ‘Bloch sphere’) as it nudges the calculation towards a solution.
Entanglement complicates error correction. Digital bits only ever flip unexpectedly between their two states; qubits can also suffer errors in the phase relationship between their states. Addressing such a phase-flip error requires knowing how the qubits drift over time and correcting for that without directly measuring them, since measurement would destroy the quantum superposition. This is extremely difficult.
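
A small numpy sketch (our own illustration) shows why the phase-flip error is so slippery: it leaves the zero/one measurement statistics completely unchanged, even though the state itself has been corrupted:

import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)   # an equal superposition state
Z = np.array([[1, 0], [0, -1]])        # a phase-flip error

flipped = Z @ plus                     # amplitudes become [0.707, -0.707]
print(np.abs(plus) ** 2)               # [0.5 0.5]
print(np.abs(flipped) ** 2)            # [0.5 0.5] - identical statistics

# Yet the flipped state is as different as it could be from the original:
print(np.dot(plus, flipped))           # ~0.0 - the two states are orthogonal

Catching such errors means comparing quantum states without fully measuring them, which quantum error-correcting codes achieve by enlisting extra helper qubits.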

The pessimists’ view is hotly debated, inciting hundreds of blog posts from esteemed scientists in an online discussion and even prompting theoretical computer scientist Scott Aaronson to offer a US$100,000 award “for a demonstration, convincing to me, that scalable quantum computing is impossible in the physical world”.

Aaronson regards a definitive demonstration of quantum supremacy as just a few years away, while Kalai expects all such attempts to hit a wall well before quantum supremacy is demonstrated. Neither theory nor experiment can currently settle the question one way or the other, so a complete resolution of the debate may not come until a working large-scale quantum computer is built. Researchers in both camps agree that efforts planned for the next few years to build quantum computers with a few tens of qubits could be decisive.

Even if Kalai is wrong and quantum computers can be scaled up, huge leaps in science and engineering will still be needed to reach the ultimate aim of a useful large-scale universal quantum computer, because practical cipher-breaking machines will need to deploy several thousand qubits in parallel.

“Currently, putting more qubits together makes the systems more complex and susceptible to decoherence and crosstalk,” Chow says. “That is a challenge in all physical qubit systems that will have to be addressed.”

Lucas agrees: “A big challenge is to make systems that perform well with more qubits, because eventually a quantum computer will require hundreds of thousands of ‘physical’ qubits if it is to have enough workspace to correct errors due to imperfect logic operations.”

This is the issue of ‘fidelity’, a measure of the quality or ‘precision’ of the physical system - how close the real system comes to the ideal one. The precision required of a quantum computer’s elementary logic operations is extremely high.

“According to theory, a quantum computer cannot function if the precision is below about 99 per cent,” explains Lucas. “We have managed to get all the basic operations up to 99.9 per cent precision, which on paper is good enough, but at this precision it would still be very difficult and expensive to build a functioning device.”

Indeed, Lucas says that the general consensus is that 99.99 per cent precision will be necessary to make a practical quantum computer.
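
Some rough arithmetic (our own sketch) shows what each extra ‘nine’ of precision buys: the number of operations that can be run before half of all attempts contain at least one error.

import numpy as np

for precision in (0.99, 0.999, 0.9999):
    n = np.log(0.5) / np.log(precision)
    print(precision, round(n))         # 0.99 -> 69, 0.999 -> 693, 0.9999 -> 6931

# Each additional 'nine' allows a roughly tenfold-longer computation before
# error correction has to take over.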

Much like outcomes in the quantum world itself, the future of quantum computing involves a number of unknowns. When and whether quantum supremacy and universal quantum computers can be achieved is currently up in the air, but the work of Chow, Lucas, Montanaro and even Kalai is helping to tilt the odds in its favour. And each ‘step closer’ is only that. The road to quantum computing stretches off into the distance and may have no end.

What about D-Wave?

D-Wave claims to be ‘the world's first quantum computing company’ and has sold its systems to the likes of Lockheed Martin, Google and NASA.
In 2015, Google asserted that its D-Wave quantum computer could solve certain problems 100 million times faster than an ordinary computer. However, many in the scientific community question the validity of such claims, and even the fundamental quantum mechanism underpinning the machine - quantum annealing.
“Quantum annealing does not bring us closer to universal quantum computing,” says Jerry Chow, manager of IBM’s Experimental Quantum Computing team. “It is unclear whether there are any speed-ups that can be gotten from a quantum annealing system over classical algorithms.”
Professor Gil Kalai of the Hebrew University of Jerusalem agrees: “The qubits and evolutions in this machine are quite noisy and it is not understood theoretically how these machines might lead to quantum supremacy.”
