Novel theorem demonstrates scalability for quantum AI
Researchers from Los Alamos National Laboratory have published a paper describing the use of a novel theorem to prove that quantum convolutional neural networks can always be trained on quantum computers, overcoming the obstacle of 'barren plateaus'.
There is considerable interest in running convolutional neural networks on quantum computers, thanks to their potential to run quantum simulations far more effectively than classical computers can. However, the fundamental trainability problem of 'barren plateaus' has so far limited the application of these neural nets to large data sets.
“The way you construct a quantum neural network can lead to a barren plateau, or not,” explained Dr Marco Cerezo, quantum computing expert at Los Alamos and co-author of the study. “We proved the absence of barren plateaus for a special type of quantum neural network. Our work provides trainability guarantees for this architecture, meaning that one can generically train its parameters.”
Quantum convolutional neural networks are inspired by the visual cortex. They involve a series of 'filters', or convolutional layers, interleaved with pooling layers that reduce the dimension of the data while retaining the important features of the dataset. These neural nets can be applied to a range of problems, from image recognition to materials discovery.
Overcoming barren plateaus is considered critical to extracting the full potential of quantum computers: “All hope of quantum speedup or advantage is lost if you have a barren plateau,” said Cerezo.
The crux of the obstacle is a 'vanishing gradient' in the optimisation landscape. Ordinarily this landscape is composed of hills and valleys, with the lowest point representing the optimal solution; in a flat landscape, the parameters cannot be trained because the correct direction to take cannot be determined. The problem worsens as the number of data features increases: the landscape becomes exponentially flatter with the feature size. Hence, in the presence of a barren plateau, the quantum neural network cannot be scaled up.
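The exponential flattening can be seen in a toy classical cost function (an illustrative stand-in, not the paper's quantum model): for C(θ) = Π cos(θᵢ), the variance of any gradient component over random parameters shrinks roughly as (1/2)ⁿ, so for large n a random starting point almost surely sees a flat landscape.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_component(theta):
    # toy cost C(theta) = prod_i cos(theta_i); return dC/dtheta_0
    return -np.sin(theta[0]) * np.prod(np.cos(theta[1:]))

for n in [2, 4, 8, 16]:
    samples = [grad_component(rng.uniform(0, 2 * np.pi, n))
               for _ in range(20_000)]
    # variance of the gradient shrinks roughly as (1/2)^n
    print(n, np.var(samples))
```

With n = 16 the gradient variance is already thousands of times smaller than at n = 2, which is why gradient-based training stalls as such a model is scaled up.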
Cerezo explained that researchers had previously gone to great lengths to mitigate the effects of barren plateaus, but lacked a theoretical basis for avoiding them altogether; the work he and his colleagues have carried out shows that this type of quantum neural net is immune to barren plateaus.
They developed a novel graphical approach for analysing the scaling within a quantum neural net and proving its trainability.
The type of quantum convolutional neural network the researchers focused on is expected to have applications in analysing data from quantum simulations.
For example, much research is focused on ceramic materials as high-temperature superconductors, but analysing data regarding a material’s many phases in order to classify them goes beyond the capabilities of classical computers. Using a scalable quantum neural network, a quantum computer could sift through a vast data set about the various states of a given material and correlate those states with phases to identify the optimal state for high-temperature superconducting.
“With this guarantee in hand, researchers will now be able to sift through quantum-computer data about quantum systems and use that information for studying material properties or discovering new materials, among other applications,” said Dr Patrick Coles, a quantum physicist at Los Alamos and a co-author of the paper. Coles said that many more applications will emerge as quantum computers are more widely used and more training data is generated.