Quantum approach to machine learning allows particle physicists to see through noise
US researchers have harnessed simple quantum computing technologies to solve problems in physics for the first time.
The existence of the Higgs boson was first confirmed in 2012 by CERN scientists using data collected from the Large Hadron Collider’s detectors.
Spotting the signal that indicated the existence of the Higgs - and other sought-after particles - was a challenge. In part, this is because data collected in high-energy physics experiments normally contain a lot of noise: meaningless extra data which drowns out the subtle but significant signals the physicists are looking for.
Many physicists use machine-learning approaches to separate meaningful from meaningless data, although these approaches have drawbacks. The patterns that neural networks identify in the data can be difficult to interpret, because the means by which they are found is not understood. Moreover, the accuracy with which a machine-learning program can detect patterns depends on having a large training dataset, which must be manually sorted.
This is highly frustrating for high-energy physicists, who are searching for very rare events buried under a huge amount of noise.
“Some people in high-energy physics are getting ahead of themselves about neural nets, but neural nets aren’t easily interpretable to a physicist,” said Joshua Job, a postgraduate student at the University of Southern California and co-author of the study.
“[The new quantum approach is] a simple machine-learning model that achieves a result comparable to more complicated models without losing robustness or interpretability.”
Job and his colleagues at the California Institute of Technology (Caltech) and the University of Southern California were able to develop a quantum-compatible machine-learning approach to extract a Higgs boson signal from noisy data. The researchers streamlined their approach by only including data associated with excited states.
The team modelled the problem such that it could be tackled using a quantum annealer: a type of basic quantum computer that performs optimisation tasks using quantum fluctuations. The device was used to sort through the masses of data collected during particle-physics experiments.
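To give a flavour of how a classification task can be posed as something an annealer can optimise, here is a toy sketch. It is not the researchers' code, and the dataset and classifier set are invented for illustration: a handful of noisy "weak classifiers" vote on whether each event is signal or background, and choosing the best subset of them becomes a quadratic binary optimisation (QUBO) problem. A brute-force search over bit strings stands in for the quantum annealer.

```python
# Toy illustration (not the authors' method): cast "which weak classifiers
# should vote?" as a quadratic binary optimisation, the kind of problem a
# quantum annealer can minimise. Brute force stands in for the annealer.
from itertools import product
import numpy as np

rng = np.random.default_rng(0)

# Invented toy dataset: each event is labelled +1 (signal) or -1 (background).
n_events, n_weak = 200, 6
labels = rng.choice([-1, 1], size=n_events)

# Weak classifiers: noisy copies of the true label, of varying quality.
accuracies = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5]
weak = np.array([np.where(rng.random(n_events) < acc, labels, -labels)
                 for acc in accuracies])

# Minimising sum_e (sum_i s_i * weak_i[e] - label[e])^2 over binary s_i
# expands (up to a constant) into quadratic couplings Q and linear terms h.
Q = weak @ weak.T        # couplings between classifier pairs
h = -2 * weak @ labels   # each classifier's agreement with the labels

def energy(s):
    """QUBO energy of a 0/1 selection vector s."""
    s = np.asarray(s)
    return s @ Q @ s + h @ s

# Exhaustive search over all 2^n_weak selections (the annealer stand-in).
best = min(product([0, 1], repeat=n_weak), key=energy)

# Combined prediction: sign of the selected classifiers' summed votes.
pred = np.where(np.array(best) @ weak >= 0, 1, -1)
```

The low-energy selections favour the more accurate classifiers, so the combined vote beats any attempt to use all of them indiscriminately; on real hardware, the annealer searches this energy landscape with quantum fluctuations instead of enumerating every bit string.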
“Surprisingly, it was actually advantageous to use the excited states, the sub-optimal solutions,” said Daniel Lidar, a professor of engineering at the University of Southern California.
“Why exactly that’s the case, we can only speculate. But one reason might be that the real problem we have to solve is not precisely representable on the quantum annealer. Because of that, sub-optimal solutions might be closer to the truth.”
This approach has performed effectively, even when using much smaller datasets than machine-learning techniques typically require.
The researchers say that quantum annealers are not necessarily superior at this point, particularly as the comparison pits a system with a thousand qubits against conventional computers with billions of transistors. Future advances in quantum computing could make devices such as these a far more attractive option for researchers.
Already, the researchers are looking into other applications and have demonstrated that this methodology is effective in solving a problem in computational biology.
“The result of this work is a physics-based approach to machine learning that could benefit a broad spectrum of science and other applications,” said Maria Spiropulu, the Caltech physicist who conceived the project.
“There is a lot of exciting work and discoveries to be made in this emergent cross-disciplinary arena of science and technology.”