Computers that process light instead of electrons could provide the boost supercomputers need to get to the exascale generation.
From next September, the UK Met Office will be using 480,000 microprocessors to improve the precision of its local weather forecasts down to a resolution of 300 metres. Part of a 140-tonne, £97m Cray XC40 supercomputer, the software running across those thousands of digital processors will help the UK better predict disruptive weather events such as flooding, strong winds, fog and heavy snowfall.
The Met Office, like all weather forecasters, is struggling with data sets so large and complex they can only be processed in parallel using thousands of processors that, in total, soak up megawatts of power. Other large-scale physics simulations and genomics have similar problems, created by ever higher-resolution imaging and sensing technologies that can measure and capture huge amounts of detail. As Big Data gets even bigger, there are concerns that trying to process it with conventional computing methods is becoming unsustainable in terms of power consumption alone.
There is another way: a return to analogue computing. Once used to simulate nuclear reactions and to control naval guns, electronic analogue computers were swept aside as soon as digital machines caught up in performance, because analogue circuitry was inherently inflexible. A shift to photonics, however, could provide both the performance and the flexibility that Big Data applications need.
By 'imprinting' data onto light waves, the idea is that certain mathematical operations can be performed simultaneously on thousands or even millions of data points as the waves travel through suitable lenses or filters. Because of the inherent parallelism, energy consumption would be so low that systems could be powered from a standard domestic electricity socket.
UK start-up Optalysys is among the pioneers of this new direction in information processing. The company has built a system using low-power lasers and tiny liquid-crystal displays (LCDs), using weather forecasting as an application in its R&D work with the European Centre for Medium-Range Weather Forecasts (ECMWF). Another research project is with a UK genome centre that is interested in using the technique for finding DNA patterns within genomes.
In academia, Nader Engheta of the University of Pennsylvania, working with colleagues at the University of Texas at Austin and the University of Sannio in Italy, has been examining the feasibility of optical analogue processing using metamaterials: composite materials whose engineered physical properties let them manipulate electromagnetic waves in novel ways. Earlier this year, the researchers published a paper in the journal Science showing how a theoretical metamaterial could perform photonic calculus, finding the first or second derivative of a light wave's profile as the light passes through.
The metamaterial research is only just moving out of the theoretical stage. Optalysys, however, is planning to launch its first prototype in January 2015.
Optalysys' first proof-of-concept is due to deliver performance equivalent to a 340Gflops digital machine, and the company says the technology can scale up to exaflop-level operation – nearly seven orders of magnitude more powerful – while still fitting on a desktop. When the Met Office's room-sized computer reaches its full capacity in 2017 it will process at 16 petaflops, or about 47,000 times more operations per second than the Optalysys proof of concept.
The technology employed by Optalysys relies on diffraction and Fourier optics to perform matrix operations in parallel. The idea behind the processing is that whenever a portion of a wavefront is obstructed, the parts propagating beyond the obstacle interfere to create a diffraction pattern. Focused by an appropriate lens, the result is the Fourier transform of the input wavefront's amplitude profile, providing the component frequencies that add up to create the source waveform.
Breaking a waveform into its constituent frequencies provides the ability to perform an array of operations useful in big-science applications, such as calculating rates of change. Such derivatives form a key part of the Navier-Stokes equations used in weather modelling. The big advantage of the system, says the company, is that you can scale the Fourier transform without increasing the processing time or running into the data management problems you have with multi-core processing architectures.
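The link between Fourier transforms and rates of change can be sketched in a few lines of Python: differentiation in real space becomes a simple multiplication by i·k in frequency space, which is the kind of operation the optical system would perform in a single pass. The grid size and test waveform here are illustrative, not taken from any Optalysys or Met Office workload.

```python
import numpy as np

# Spectral differentiation: transform, multiply by i*k, transform back.
N = 256                                    # illustrative grid size
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
f = np.sin(3 * x)                          # sample waveform

k = 2 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])   # angular wavenumbers
df = np.fft.ifft(1j * k * np.fft.fft(f)).real      # first derivative

# For a band-limited signal the result matches the analytic derivative
assert np.allclose(df, 3 * np.cos(3 * x), atol=1e-8)
```

The same trick extends to the spatial-derivative terms in the Navier-Stokes equations, which is why a fast Fourier transform sits at the heart of many weather codes.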
Whereas a Fourier transform on a 256x256 array of elements can take more than 100,000 individual operations to perform on a digital processor, the optical transform is almost instantaneous. The limiting factor is the resolution of the optics used to present that array of data to the diffractive optics.
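A rough back-of-envelope check of the digital cost, using the standard row-column method for a 2D FFT (each length-N transform takes about (N/2)·log2 N complex butterflies, and the 2D version needs 2N of them), is consistent with the figure quoted above. The estimate is a simplification; real FFT libraries vary in their exact operation counts.

```python
import numpy as np

# Approximate butterfly count for a digital 2D FFT on a 256x256 array
N = 256
butterflies = 2 * N * (N // 2) * int(np.log2(N))
assert butterflies > 100_000       # consistent with the article's figure

# The result the optical system would produce in a single optical pass
data = np.random.default_rng(1).random((N, N))
spectrum = np.fft.fft2(data)
assert spectrum.shape == (N, N)
```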
Early demonstrator systems contained traditional optical components but the latest design replaces most of these with the micro-LCDs. Two-dimensional matrices of numbers are programmed into the input micro-LCD's grid such that the intensity level of each pixel represents a number. When a laser is shone through or is reflected off this input data pattern, the pattern is effectively 'stamped onto the beam', turning the data matrix into a waveform. After processing, the results are converted back into digital form with a camera.
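The 'stamping' step amounts to quantising a matrix of numbers into grey levels and later reading it back from a camera. A minimal sketch of that round trip, assuming an 8-bit intensity scale (the article does not state the actual bit depth):

```python
import numpy as np

# Hypothetical encode/decode: write a data matrix onto an LCD as pixel
# intensities, then recover it from the camera. 8-bit depth is an assumption.
rng = np.random.default_rng(0)
data = rng.uniform(0.0, 1.0, size=(4, 4))          # values to encode

levels = 255                                       # grey levels per pixel
pixels = np.round(data * levels).astype(np.uint8)  # 'programmed' LCD grid
readback = pixels.astype(float) / levels           # camera-side decode

# Quantisation limits precision to about half a grey level either way
assert np.max(np.abs(readback - data)) < 0.51 / levels
```

The quantisation error in the last line is one reason analogue optical systems trade precision for parallelism.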
"The resolution of each LCD is around 4k pixels at the moment and will move to 8k in the next few months. Each pixel is around 5µm in size, so they're very small devices," says Dr Nick New, CEO and co-founder of Optalysys.
"With our proprietary alignment methods, which do not use conventional optics, we can produce multiple optical stages to extend the mathematical functionality by allowing many LCD devices to be mounted with high accuracy, thus removing the previous limitations of free-space diffractive optical systems," Dr New claims.
The micro-LCDs do not just provide the input patterns. Others act as filters, which are useful for correlation tasks, and focusing elements. In DNA matching for example, the data from a genome can be encoded using four different light intensities on the input modulator with the pattern to be matched represented by its pre-processed Fourier transform on a filtering micro-LCD.
The output would show the location and strength of the matches where the string appears in the input grid. "The potential of the optical processing is to reduce the power consumption and boost the processing power to the point where producing personalised medicines or local virus analysis becomes viable," Dr New argues.
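The matched-filter search described here can be mimicked digitally: correlate the intensity-encoded sequence with the pattern by multiplying one spectrum by the conjugate of the other. The four encoding values and the toy genome below are illustrative choices, not Optalysys' actual scheme.

```python
import numpy as np

# Matched-filter DNA search via Fourier-domain correlation (illustrative)
encode = {'A': 0.25, 'C': 0.5, 'G': 0.75, 'T': 1.0}   # assumed intensities

genome = 'ACGACGCATTACGACGACGA'                        # toy sequence
pattern = 'CATTA'

g = np.array([encode[b] for b in genome])
p = np.zeros_like(g)
p[:len(pattern)] = [encode[b] for b in pattern]        # zero-padded filter

# Cross-correlation theorem: correlate by multiplying spectra
corr = np.fft.ifft(np.fft.fft(g) * np.conj(np.fft.fft(p))).real

best = int(np.argmax(corr))                            # strongest match
assert genome[best:best + len(pattern)] == pattern
```

In the optical version the filtering micro-LCD holds the pre-computed transform of the pattern, so the multiplication and inverse transform happen at the speed of light rather than in a loop of digital operations.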
By reconfiguring the micro-LCDs on the fly with different patterns and filters many millions of data points could be put through the system at many thousands of times per second, says Dr New. Today's ferroelectric LCDs can be controlled at kilohertz rates.
The work performed by Engheta and his academic colleagues takes a different approach, using metamaterials to manipulate waveforms as they propagate through them. Using metamaterials instead of conventional optics could provide more compact processing that could be integrated into solid-state devices, the team claimed in their Science paper. The materials can have thicknesses of less than the wavelength of light they process.
"My initial question for this work was could we design a block of material such that when a wave goes through it, the block would do some operation?" Engheta recalls. "My next question is: would it be possible to design a material that would solve an equation? So, you would have a tiny block of material in a cavity or in an optical fibre such that light goes through and the profile of the light would be the solution to the equation. I don't know if you can do this, but it would be very exciting."
If that were possible, one would be able to predict outcomes of some physical phenomena that are governed by such equations, Engheta suggests.
"It does not have to be optical; our system could be microwaves, infrared or so on. For different frequency regimes, one would design suitable metamaterials to 'do the math' as the wave propagates through the structures," he adds.
The US-Italian team simulated two possible concepts: one was a metasurface approach, in which thin metamaterial blocks perform mathematical operations in the spatial Fourier domain; the other used Green's functions, which perform differentiations on an input waveform. Multiple layers of metamaterials yield the desired version of the Green's function at the output. By choosing the thickness of each of the layers and the material properties in the multi-layered approach, they found that it was possible to design a Green's function that would perform calculations such as double differentiation and convolution.
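The effect of such a designed Green's function can be illustrated numerically: a structure whose transfer function in the spatial-frequency domain is (i·k)² acts as a double differentiator on any profile passing through it. The Gaussian beam profile and grid below are illustrative stand-ins for the simulated metamaterial stack.

```python
import numpy as np

# A transfer function of (i*k)^2 applied in the Fourier domain performs
# double differentiation, mimicking the designed Green's function.
N = 512
x = np.linspace(-10, 10, N, endpoint=False)
profile = np.exp(-x**2)                         # incoming beam profile

k = 2 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])
transfer = (1j * k) ** 2                        # the 'designed' response
out = np.fft.ifft(transfer * np.fft.fft(profile)).real

# Analytic second derivative of exp(-x^2) is (4x^2 - 2) exp(-x^2)
assert np.allclose(out, (4 * x**2 - 2) * np.exp(-x**2), atol=1e-6)
```

Convolution works the same way: any filter applied as a multiplication in the frequency domain is a convolution in real space, which is what the layered structure implements physically.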
Proof of concept
At this stage, the metamaterial research is still in the lab, although the team has moved on from simulations to experiments to design and test the concept. Over the next few months, Engheta's team hopes to develop a physical proof of concept that will work using reflections rather than transmissions through the materials, based on the multi-layer Green's function approach the group has simulated.
"Depending on the nano-fabrication facility we use and the wavelength of interest, there is a list of materials we can use," he says. For the simulations the team looked at materials such as silicon and aluminium-doped zinc oxide.
"The big picture is that you manipulate light with material. Ideally you would want to do as much as possible in the optical domain before turning it into electrical signals, and you would want to change the function of these materials on demand. For instance the permittivity could be changed in individual layers. You could do this electronically or by using temperature or by magnetics, for example. This is how we think in the future this could be a possibility. Imagine magnetically active layered structures," says Engheta.
Both Optalysys' and Engheta's teams focus on doing as much computing as possible in the analogue domain before turning it into electrical signals for further processing, thus keeping all the data management within a highly parallel processing environment. Being able to change the function of the system components or materials on demand is also key.
The perennial problem of analogue systems is noise. Engheta's team does not know the effect of local quantum fluctuations or camera pixel noise at this stage but plans to study it later. Dr New points out that Optalysys' approach minimises noise. "We believe that the relatively large pixel sizes in our system will lead to less noise overall as small fluctuations will have less of an effect," he says.
Dr New believes that the two approaches to optical information processing may be complementary in the long term: "The metamaterial technology, from what I have seen, could act as the filters in our system."
If its initial research trials on genomics and weather forecasting simulations are successful, Optalysys hopes to extend optical processing to other linear algebra tasks, such as matrix inversion, that are used in computational physics, science and engineering. Dr Andy Lowe, Optalysys' applications developer, says: "If successful, optical processing will give us a boost in processing power as well as the ability to monitor very complex simulations way beyond our present abilities, as they proceed, monitoring hundreds of millions of data points in real time."
Mainstream uses are possible. Optalysys has proposed an online web interface to an optical processor, in effect a kind of 'mini' cloud for 'speed of light' data-mining. If the metamaterial idea takes shape, direct image processing inside cameras could have mass-market potential. "Second differentiation is interesting because it can pick up the edges in images. Imagine, you could put this slab into the focal plane array of your camera," suggests Engheta.
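Why second differentiation picks out edges can be shown with a discrete Laplacian, the standard second-derivative operator in image processing, applied to a synthetic image: the response vanishes in flat regions and lights up along intensity boundaries. The image here is a made-up test pattern.

```python
import numpy as np

# A discrete Laplacian (second-derivative operator) responds only at edges
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0                       # bright square on a dark field

lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
       np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)

edges = np.abs(lap) > 0                     # non-zero only along the border
assert not edges[16, 16] and not edges[0, 0]   # flat interior and background
assert edges[8, 16] and edges[16, 8]           # the square's edges light up
```

A metamaterial slab performing this operation in the focal plane would deliver the edge map directly, before any pixel is ever digitised.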
Even though they are at the early stages of development, both approaches are light years ahead of the electronic analogue processors of the past.