Systems based on quantum processing techniques could represent a new era in high-performance computing - but they will not necessarily be suitable for every application. E&T reports.
Quantum computing is a potentially disruptive technology that works in theory, but faces huge engineering challenges before it can be applied in practice on a large scale.
In the event, it is getting there first in one of its most promising applications: cryptography. Indeed, 2008 could go down in history as the year when quantum computing finally proved that it will deliver on its early promise - even if it will remain at a pre-competitive stage for several years yet as far as the main players such as IBM, Hewlett-Packard and Toshiba are concerned.
"Information processing systems are physically embodied. The underlying physics is, ultimately, quantum-mechanical," according to Samson Abramsky of Oxford University's Computing Laboratory, speaking at the Computer Science 2008 conference last December. "This forces us to re-examine many of the basic assumptions of computer science. It has already led to some exciting developments, remarkable new algorithms, cryptographic schemes, and basic questions in computational complexity."
It is important though to distinguish between the two quite different applications of quantum mechanics to IT - computation and cryptography. While the former has created the greatest excitement and popular interest, with the promise of extraordinary increases in speed for certain problems, only for cryptography has quantum mechanics been shown to work beyond doubt.
Meanwhile, small-scale quantum computing machines have been built, but they have been unable to compete with classical machines even for problems that exploit quantum parallelism (see page 67).
Furthermore, there has even been doubt that the machines are really quantum computers at all, for it is not straightforward to prove that quantum effects really have been invoked in a calculation.
The first such controversy came in 2001, when IBM claimed to have factorised the number 15 into its prime components, three and five, with a seven-qubit quantum computer (a qubit being a unit of quantum information). The feat was trivial, but was supposed to prove the concept that quantum computation could be applied to such problems. As it happened, a number of experts in the field questioned whether IBM really had performed quantum computation, given that the task was, of course, simple even for the arithmetically challenged.
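What IBM's machine actually ran was Shor's algorithm, in which the quantum hardware's only job is to find the period of a number's powers modulo 15; the factors then fall out classically. The sketch below, a simplification with the period found by brute force rather than by a quantum processor, shows why period-finding is all that is needed:

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Find the order r of a modulo n (the smallest r with a**r % n == 1).
    This is the step a quantum computer performs using superposition;
    here it is done by classical brute force for illustration."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_period(n: int, a: int) -> tuple[int, int]:
    """Recover the factors of n from the period of a mod n."""
    r = find_period(a, n)
    assert r % 2 == 0, "need an even period; try another base a"
    p = gcd(a ** (r // 2) - 1, n)
    q = gcd(a ** (r // 2) + 1, n)
    return p, q

# Factorising 15 with base a = 7: the period of 7 mod 15 is 4,
# so gcd(7**2 - 1, 15) = 3 and gcd(7**2 + 1, 15) = 5.
print(factor_from_period(15, 7))  # → (3, 5)
```

The classical loop takes time proportional to the period, which grows exponentially with the size of the number; the quantum version finds the period in polynomial time, which is the entire source of the speed-up.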
The same doubts have been dogging D-Wave, a company some believe will be credited as the pioneer of commercial quantum computing. D-Wave has now built a 28 qubit machine, codenamed 'Orion', and has been planning to make it available online for applications involving pattern matching and searching, for which quantum computing would be well-suited.
Aware of the need to establish credibility among the quantum computing research community, D-Wave has been trying hard to gain independent verification that Orion really is generating qubits and performing parallel calculations. "We have been working with leading scientists on defining sets of tests that can conclusively show that the machines we are building are in fact running the algorithms they are designed to run," says Geordie Rose, D-Wave's CTO (and former CEO). "Some of these tests are complete, others have yet to begin. So far the data is consistent with the machines being adiabatic quantum computers."
Adiabatic quantum computation is one design option, in which the processor is immersed in a bath cooled to close to absolute zero in order to hold the quantum system at a very low energy state, so that it maintains its coherence rather than losing it through contact with the higher-energy 'outside world' beyond the bath. Indeed the tests, performed by a group including self-styled 'quantum mechanic' Seth Lloyd, professor of mechanical engineering at the Massachusetts Institute of Technology (MIT), have confirmed that the machine appears to operate as a quantum machine. This still falls short of absolute proof, which will only really be obtained when there is a quantum processor big enough to start executing calculations at a speed that cannot be achieved by a non-quantum machine.
Only that will silence the sceptics for good, and so, not surprisingly, D-Wave is straining to build such a machine. Even then, significant problems remain in building a machine that can be guaranteed to produce its output at a reasonable speed, even when it is exploiting quantum parallelism.
Program for change
There is also the issue of generalising and programming quantum machines, for current prototypes such as D-Wave's Orion are designed just for one specific problem, and it is not possible yet to make one programmable in any sense.
"For the foreseeable future all quantum computers will be special purpose machines, designed specifically to run only one quantum algorithm," admits D-Wave's Rose; but he insists that quantum computing would eventually pick up the mantle left by classical processors, which will soon be unable to keep up with Moore's Law. "Processor innovation has gone to the dogs," Rose asserts.
Not surprisingly, this sentiment was not expressed in quite this way by TN Theis, director of physical sciences at IBM's Thomas J Watson Research Centre in New York; but he too saw investment in quantum computation as essential for maintaining performance, and possibly storage, growth curves in the distant future. "It is a part of our strategic research looking beyond the CMOS roadmap to entirely new devices and new ways to store, process, and communicate information," Theis says.
Rose, meanwhile, anticipates a split occurring in computation between tasks such as word processing or email, which will continue to be done on classical machines, and those requiring high performance, which will be executed remotely, perhaps on a quantum machine. Such tasks could include searching, particularly where image or video sequences are involved.
Recognising faces or searching video libraries for sequences containing a specified image are extremely challenging tasks computationally and well suited to quantum parallelism. "It's all about building special purpose machines for running algorithms that can't run on classical machines," Rose says.
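The best-known quantum speed-up for unstructured search is Grover's algorithm, which finds a marked item among N in roughly the square root of N steps rather than N. The toy simulation below, which tracks the quantum amplitudes on a classical computer purely for illustration, shows the mechanism: each iteration flips the sign of the marked item and then reflects all amplitudes about their mean, steadily concentrating probability on the answer:

```python
import math

def grover_search(n_items: int, marked: int, iterations: int) -> list[float]:
    """Classically simulate Grover's algorithm over n_items states.
    Amplitudes start uniform; each iteration applies the oracle
    (phase flip on the marked item) then 'inversion about the mean'."""
    amp = [1.0 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        amp[marked] = -amp[marked]            # oracle: phase flip
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]     # inversion about the mean
    return [a * a for a in amp]               # measurement probabilities

# Searching 8 items: about (pi/4) * sqrt(8) ≈ 2 iterations suffice.
probs = grover_search(8, marked=3, iterations=2)
print(round(probs[3], 3))  # → 0.945
```

A classical search over 8 items needs on average 4 probes; Grover reaches a ~95 per cent hit probability after 2. The advantage scales as the square root of the search space, which is why tasks such as matching images against large libraries are cited as natural fits.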
While commercial quantum computation is still for the future, quantum cryptography is much closer, with no doubt that it is actually taking place. A major step has just been taken with the first demonstration of quantum key distribution (QKD) over an existing metropolitan telecommunications network in Vienna using different vendors' equipment. Earlier demonstrators ran over point-to-point links using a single vendor's equipment, making implementation much more straightforward.
"It was a significant milestone since it demonstrated the working of a QKD network backbone mixing different QKD devices with different basic technologies," notes Vicente Martin, head of the Quantum Computing and Information Research Group at Universidad Politecnica de Madrid, one of the partners in the Vienna project called Development of a Global Network for Secure Communication Based on Quantum Cryptography, funded by the European Union.
In fact, the network, comprising six nodes and eight intermediate links between 6km and 82km long, featured six different quantum cryptographic technologies for key distribution, integrated via standard interfaces. "It demonstrated that a metro area QKD network is technologically feasible. It also showed the areas in which an extra effort is needed to achieve a commercial quality service," Martin adds.
QKD exploits either entanglement or Heisenberg's Uncertainty Principle, which are related fundamental properties of quantum mechanics. Essentially, they enable any attempt to eavesdrop, which inevitably involves measurement, to be detected by its effect elsewhere in the system. QKD is similar in its higher-level operation to public key cryptography, in that it is used solely to establish a secret key, not to encrypt the message data itself.
The key is then used to encrypt the message data securely. The strength of the method lies in being absolutely certain whether or not the key has been intercepted. If it has, another is sent, and if not, it is used for encrypting the message. The point is that decrypting encrypted message data is virtually impossible when the key is unknown.
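The best-known QKD scheme is BB84, and the detection mechanism can be sketched in a few lines. In this simplified classical simulation (an illustration, not a model of any vendor's equipment), Alice encodes random bits in randomly chosen bases and Bob measures in his own random bases; an eavesdropper who measures in the wrong basis re-randomises the bit, so her presence shows up as an error rate of roughly 25 per cent in the sifted key:

```python
import random

def bb84_error_rate(n_bits: int, eavesdrop: bool, rng: random.Random) -> float:
    """Minimal BB84 sketch: Alice sends bits in random bases, Bob
    measures in random bases, and only matching-basis bits are kept.
    Returns the error rate in the sifted key (the 'QBER')."""
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]

    channel = []
    for bit, basis in zip(alice_bits, alice_bases):
        if eavesdrop:                       # Eve measures and resends
            eve_basis = rng.randint(0, 1)
            if eve_basis != basis:          # wrong basis: state disturbed,
                bit = rng.randint(0, 1)     # the bit is re-randomised
            basis = eve_basis
        channel.append((bit, basis))

    errors = sifted = 0
    for a_bit, a_basis, b_basis, (bit, basis) in zip(
            alice_bits, alice_bases, bob_bases, channel):
        measured = bit if basis == b_basis else rng.randint(0, 1)
        if a_basis == b_basis:              # sift: keep matching bases only
            sifted += 1
            errors += (measured != a_bit)
    return errors / sifted

rng = random.Random(42)
print(bb84_error_rate(10_000, eavesdrop=False, rng=rng))  # → 0.0
```

With `eavesdrop=True` the same function returns an error rate clustered around 0.25, so Alice and Bob simply compare a sample of their sifted bits: a clean channel means the key is safe to use, a raised error rate means they discard it and start again.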
Ironically, take-up of QKD may be encouraged by developments in quantum computing that threaten to crack currently used public key systems based on the RSA algorithm. The point is that RSA relies on the difficulty of factoring a very large number into its two prime components, which would take a conventional supercomputer many years, but which could be done almost instantly by a large quantum computer.
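A toy example makes the dependence concrete. The textbook RSA below uses deliberately tiny primes (real keys use primes of over a thousand bits); the last few lines play the attacker, and show that once the modulus is factored - the step a large quantum computer would make easy - the private key falls out immediately:

```python
# Toy textbook RSA with tiny primes, purely for illustration.
p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: 2753 (Python 3.8+)

msg = 65
cipher = pow(msg, e, n)        # encrypt with the public key (e, n)
assert pow(cipher, d, n) == msg

# An attacker who can factor n recovers p and q, hence phi and d:
p2 = next(k for k in range(2, n) if n % k == 0)
q2 = n // p2
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d2, n) == msg   # the private key is fully recovered
```

Here the brute-force factoring loop is instant because n is tiny; for a 2048-bit modulus it is hopeless classically, but Shor's algorithm on a sufficiently large quantum computer would restore exactly this attack.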
Even before such quantum computers are built, QKD will hold great appeal by virtue of its proven immunity against eavesdropping when commercial deployments become feasible in a few years, with a number of promising applications. It could be used as a secure source of keys for local electronic payment applications: for example, using a mobile phone as an electronic wallet to load money and then make small payments wirelessly for vending services.
Most of the work on QKD so far has been focused on backbone network applications, with potential applications including Internet banking and online commerce, both with consumers and between businesses. But Hewlett-Packard (HP) has been exploring the potential of QKD for short-range applications involving electronic cash, where it believes the ability to distribute secure keys will stimulate take-up of such services.
"Our aim is to try and do QKD in the short-range between a mobile and a fixed-base station," reports Bill Munro, principal research scientist in HP's Quantum Information Processing group. "Then we can start moving electronic cash, and things like this."
This would avoid having to rely on fixed four-digit PINs to access electronic banking services for downloading cash to, say, a mobile phone or other portable device for making small payments to a vending or ticket machine. Instead, one-time keys would be downloaded securely to the phone in response to a variety of verification procedures, which could include the SIM identity of the phone as well as personal information.
Then the user would enter, say, the first four digits of the one-time code, but this would only provide access at a given time, reducing the risk of fraud through eavesdropping of the PIN. "This gives you much more security," says Munro, who concedes, though, that it still remains to be seen whether banks will embrace quantum technology.
In any case, there are remaining engineering challenges here too, in particular the need for quantum repeaters to extend the range of QKD. Munro also identified an issue common to the whole quantum computation field - and, as we saw, of particular interest to D-Wave - namely how to verify what is actually happening inside the machine, a process he referred to as quantum tomography. "Tomography is a way of trying to characterise what is happening inside this black box rather than just looking at inputs and outputs," says Munro.
In fact, such challenges could be said to go beyond mere engineering, for there is still a lack of understanding of what is going on at the level of fundamental physics, even if the basic laws of quantum mechanics are taken as given, as IBM's Theis points out.
"Quantum computing exploits the known principles of quantum mechanics," Munro adds. "However, much fundamental invention and new physical insight is required to understand and design the physical systems - the devices - which can make it practical."
So, as IBM's Theis said, quantum computing development will continue to be led by physicists, rather than computer scientists, for some years to come.