Woman tapping a fitness band

Could skyrmions change the future of computing?

Image credit: Getty Images

Researchers believe devices based on skyrmions – quasi-particles formed from twists in the magnetic fields surrounding groups of atoms – have the potential to change the future of computing.

The chances are high that most people you know have never heard of skyrmions. This is natural, because they are rather murky products of quantum field theory that act like particles without actually being particles. So far so obscure. Skyrmions, however, may soon be as well known as silicon chips. That’s because they have the potential to change the face of computing, revolutionising the fields of data storage, information processing and artificial intelligence.

Skyrmions are quasi-particles – little twists in energy fields that look and behave like particles. They can be moved around, interact with other skyrmions and be created and destroyed. Like particles, they can be extremely stable, lasting years in the right conditions.

Based on their inherent stability, the most promising application for skyrmions is in data storage, where each skyrmion acts as a bit. However, there are challenges to overcome – researchers have only been able to find magnetic skyrmions that are too slow and too large to compete with existing memory devices. Until now.


What are skyrmions?

Skyrmions were first proposed in the 1960s by theoretical physicist Tony Skyrme, who thought they might be a way of describing protons and neutrons within atoms. They turned out not to be particularly useful in this regard, although they did find applications elsewhere such as in liquid crystal displays.

Since then skyrmions seemed doomed to linger in the shadows. But a decade ago these swirls of spin caused a sudden storm in the field of magnetism, when researchers realised that their inherent stability, size and manipulability could lead to breakthroughs in data storage and information processing.

“People realised that you could create these structures inside magnetic materials,” says Professor Christopher Marrows of the University of Leeds. “They can be very small, and you can put them together very densely and you could use them inside various nanoscale devices.”

Skyrmions’ stability comes from their topology, or shape. “A sort of analogy would be a Möbius strip,” explains Marrows. “That’s a loop with a twist in it. You can move this twist around the loop, but you can never get rid of it unless you do something quite violent like tear it open, untwist it and stick it back together again.”

In December 2018, a team from MIT published a paper showing that they could achieve the right sizes and, to some degree, speeds to be competitive. They did it by focusing on a new set of materials. “The field in this past decade has focused on ferromagnets, which are things like cobalt, nickel and iron,” says MIT researcher Lucas Caretta. “We discovered in the paper that ferromagnets have a fundamental limit for the size of the skyrmions you can have and how fast you can drive them.”

Instead, the team looked at materials called ferrimagnets. Ferrimagnets differ in that, rather than having all their magnetic moments pointing neatly in one direction – which produces an external ‘stray field’ – they contain two sets of moments aligned anti-parallel, largely cancelling each other out. Caretta’s team used a material called gadolinium-cobalt, in which the spins of the electrons in the cobalt point in one direction and the spins of the gadolinium electrons in the other.

“If you get rid of stray fields,” explains Caretta, “then your skyrmions, or bits, are no longer interacting with each other over long ranges. Instead they are stabilised by local exchange interactions and these will dictate the size of the skyrmion.”

‘If you replaced all the different types of memory in your computer with a device like this, there would be no such thing as booting up your computer. It would always remember where it last was, and only consume power when performing a function.’

Lucas Caretta, MIT

Using this new material, Caretta’s team were able to get the size of the skyrmions down to 10 nanometres, the benchmark to be competitive.

The size, speed and stability of skyrmions could mean smaller and faster memory devices are around the corner. But skyrmions’ potential goes far beyond just improving existing models. Memory architectures involving skyrmions could replace the different forms of memory that currently make up your computer, such as RAM, ROM and cache, and merge them into one structure. What’s more, because of skyrmions’ inherent stability, they are still there when you turn the power off. This means that something called normally-off computing could be achieved.

“If you replaced all the different types of memory in your computer with a device like this,” explains Caretta, “there would be no such thing as booting up your computer. It would only consume power when performing a function. So, you’d go up to your computer and start typing and that’s the only time it would consume power. It can always remember where it last was, essentially, without power.”

Skyrmion-based devices could go even further, not only integrating different types of memory but combining memory with processing – one of the holy grails of modern computing.

One way of envisaging it is as a track of skyrmions representing bits. These are there all the time even when the power is off, preserving the data. However, two of these tracks could be merged, forcing the skyrmions to interact with each other to provide the information processing usually supplied by transistors.
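As a toy illustration of this scheme – not a physical simulation, and with function names that are purely hypothetical – each track can be modelled as a row of bits, and merging two tracks as a bitwise interaction. In published skyrmion-logic proposals, the junction geometry determines whether the surviving skyrmions behave like an OR gate (either input skyrmion passes) or an AND gate (only paired skyrmions pass):

```python
# Toy sketch of skyrmion 'racetrack' logic (illustrative only, not physics).
# Each track is a list of bits: 1 = skyrmion present, 0 = absent.

def merge_tracks(track_a, track_b, mode="or"):
    """Combine two skyrmion tracks bit-by-bit at a junction.

    The 'mode' stands in for the junction geometry, which in proposed
    devices selects whether the merged output acts as OR or AND.
    """
    if mode == "or":
        return [a | b for a, b in zip(track_a, track_b)]
    elif mode == "and":
        return [a & b for a, b in zip(track_a, track_b)]
    raise ValueError("unknown junction mode")

data_a = [1, 0, 1, 1]   # skyrmions driven along track A
data_b = [1, 1, 0, 1]   # skyrmions driven along track B

print(merge_tracks(data_a, data_b, "or"))   # [1, 1, 1, 1]
print(merge_tracks(data_a, data_b, "and"))  # [1, 0, 0, 1]
```

Because the bits persist without power, the same tracks serve as non-volatile storage between logic operations – the property that makes merging memory and processing plausible.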

“It removes the need for you to be constantly shuttling back and forth between the CPU and the memory and constantly having to refresh it all the time,” explains Marrows. “So if you could make that work it would have huge benefits in terms of energy.”

Skyrmion illustration

Image credit: Prof R Wiesendanger, University of Hamburg

Skyrmions’ potential extends even beyond classical computing. Neuromorphic computing seeks to use neural networks to mimic the way the brain processes information and is behind some of the most powerful machine-learning algorithms. Current machine-learning programmes such as Facebook’s facial-recognition software, DeepFace, and Google’s DeepMind are neural networks, but they are run on traditional hardware. This leads to huge energy inefficiencies, with supercomputers using megawatts of power to do the same task that a cat’s or mouse’s brain can achieve with just a few watts.

“You’re trying to do one type of computing on a piece of hardware that’s designed to do something very different,” explains Marrows, “like add up a list of numbers on an Excel spreadsheet, not to do this kind of fuzzy pattern recognition.”

What is needed is hardware to match the software being run. This is where skyrmions come in because, for one thing, they can move more freely. “What you have is a particle that can move in two dimensions,” says Dr Karin Everschor-Sitte of the Johannes Gutenberg University of Mainz, who is researching how skyrmions can be applied to neuromorphic devices. “But you’re restricting it to one dimension. So somehow you’re not optimally using its functionality.”

One area Everschor-Sitte explored was using skyrmions in a neural-network model called reservoir computing. Reservoir computing works by feeding an input signal into a dynamic system of random, non-linear elements (the reservoir), which maps the input into a higher-dimensional space from which it can be read out by a simple linear layer. Everschor-Sitte and her team used skyrmions as the reservoir.

“We took a magnetic configuration,” explains Everschor-Sitte. “We applied some different voltage signals, and because of the non-linear answer of the system, it allows you to say which one was which signal. The optimal version would be, I let you speak, and my reservoir can tell me what you said.”
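The reservoir idea can be sketched in a few dozen lines of code. This is purely illustrative: a fixed random numerical reservoir stands in for the magnetic skyrmion texture, the two ‘voltage signals’ are a hypothetical sine and square wave, and the read-out is a deliberately simple nearest-reference comparison rather than a trained linear layer:

```python
import math
import random

random.seed(0)
N = 20  # reservoir size (number of internal nodes)

# Fixed random weights: the reservoir itself is never trained.
W_in = [random.uniform(-1, 1) for _ in range(N)]
W_res = [[random.uniform(-0.2, 0.2) for _ in range(N)] for _ in range(N)]

def run_reservoir(signal):
    """Drive the fixed random reservoir with a 1-D signal; return final state.

    The non-linear tanh dynamics map the input history into an
    N-dimensional state, mimicking the role of the magnetic texture.
    """
    x = [0.0] * N
    for u in signal:
        x = [math.tanh(W_in[i] * u + sum(W_res[i][j] * x[j] for j in range(N)))
             for i in range(N)]
    return x

def distance(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# Two input 'signals' to distinguish: a sine wave and a square wave.
sine = [math.sin(0.3 * t) for t in range(50)]
square = [1.0 if (t // 10) % 2 == 0 else -1.0 for t in range(50)]

# Trivial read-out: compare the reservoir state to stored reference states.
refs = {"sine": run_reservoir(sine), "square": run_reservoir(square)}

noisy_sine = [s + random.gauss(0, 0.05) for s in sine]
state = run_reservoir(noisy_sine)
label = min(refs, key=lambda k: distance(state, refs[k]))
print(label)  # classifies the noisy input against the stored references
```

The key design point, mirroring the quote above: the system’s non-linear response leaves a different fingerprint for each input, so a very simple read-out suffices to tell “which one was which signal”.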

‘It removes the need to be constantly shuttling back and forth between the CPU and the memory... if you could make that work it would have huge energy benefits.’

Christopher Marrows, University of Leeds

Neuromorphic computers like this could find applications where pattern recognition is key, such as face- and voice-recognition software. They could also benefit any application where information is spatially and temporally correlated. Driverless cars are an example, where multiple sensors need to instantly process information and communicate with each other to navigate obstacles.

The ability to process lots of information quickly means neuromorphic devices could be applied to wearable medical devices and the Internet of Things. With billions of connected devices all generating their own data, it is more helpful to have them process it themselves than send it to a central computer perhaps hundreds of miles away. In this way the technology could be extremely useful for analysing the torrent of data from security and CCTV cameras, which currently produce far more footage than it is feasible to analyse.

“One way would be to have machines that analyse the data on the spot,” says Everschor-Sitte. “So, to analyse situations that are unnatural, and which are super-hard tasks for normal computers, like how do you differentiate between people who are just naturally talking or being angry and about to start a fight?”

There are still many hurdles to cross before any applications, either in neuromorphic or classical computing, can be realised. Firstly, researchers haven’t quite got a handle on how to manipulate skyrmions to their full potential. “They sometimes get stuck, they sometimes crash into each other. They don’t move in a smooth way,” says Marrows.

Although the MIT team managed to crack the size problem, the speed issue was solved with a bit of a fudge involving moving other parts, called domain walls, relative to the skyrmions rather than the skyrmions themselves. Also, although skyrmions can now be created at will – essentially the writing process of data storage – there are still technical issues with the reading process.

All these issues mean the huge potential of skyrmions for the moment remains just that – potential. Professor Geoffrey Beach, head of the MIT team, thinks it could be 10 years before a skyrmion-based device is on the market. Everschor-Sitte thinks it will take longer for skyrmion-based neuromorphic devices. Another leading skyrmion researcher, Professor Jiadong Zang of the University of New Hampshire, believes it will take five years just to answer the question of whether magnetic skyrmions will be applicable at all.

One thing they all agree on is the huge effect skyrmions could have if their potential is realised. “Things like your phone or computer could potentially be extremely small in size,” says Caretta. “I’m talking maybe two, three, 10 times denser memory. And the speeds could be particularly fast. We could be talking an order of magnitude faster memory. And if we can achieve normally-off computing the power efficiency of these devices could be absurd. Instead of charging your phone once a day, you’d charge it once a month.”

In fact, the scale of the change could be so large that we can’t predict what the consequences will be. As Marrows observes, “there have been very unexpected consequences of the social media revolution, that were built on cheap information storage, cheap computing and touchscreen technologies that have affected society quite profoundly in ways that were probably quite impossible to predict. So too with skyrmions, there will be things that will be impossible to foresee.”


What is neuromorphic computing?

Since computing began, computers have been based on the classic von Neumann architecture. Information processing happens in a central processing unit (CPU) before results are sent to the memory unit for storage, and vice versa. In modern-day computers, information processing occurs via transistors on silicon chips and long-term data is stored magnetically on hard drives.

Neuromorphic devices would do away with the bottleneck of piping information back and forth between the CPU and the hard drive by combining the two processes in one neural network, just as the brain does.

In the brain, memory and logic are integrated closely in neurons and the connections between neurons known as synapses. A single neuron can combine the outputs of thousands of other neurons via addition and subtraction before producing its own output. The strength and relationships between synapse connections constitute memories which may change over time due to how information is processed and in turn may influence the way that data is used.

Thus memory and logic have a close working relationship which enables the brain to perform many operations simultaneously, or in parallel, and carry out complex actions like speech and facial recognition using relatively little power.
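The neuron-as-adder picture above can be captured in a few lines of code. This is a deliberately simplified sketch – real neurons, and neuromorphic hardware, are far richer – in which the weights play the role of synapse strengths (the memory) and the threshold provides the logic:

```python
# Toy neuron: weighted sum of inputs from other neurons, then a threshold.
# Weights model synapse strengths; positive weights are excitatory
# (addition), negative weights inhibitory (subtraction).

def neuron(inputs, weights, threshold=1.0):
    """Fire (return 1) if the weighted input sum reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

weights = [0.8, -0.5, 0.9]            # illustrative synapse strengths

print(neuron([1, 0, 1], weights))     # 0.8 + 0.9 = 1.7 >= 1.0 -> fires: 1
print(neuron([1, 1, 0], weights))     # 0.8 - 0.5 = 0.3 <  1.0 -> silent: 0
```

Learning, in this picture, is simply adjusting the weights over time – which is why memory and logic are inseparable in the brain, and why neuromorphic hardware tries to keep them in one place.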

Several ways of mimicking the brain’s hardware are being explored, of which skyrmions are just one. IBM, for example, has created the TrueNorth chip, which combines information processing and storage on the same microchip via one million individually programmable neurons and 256 million individually programmable synapses, while consuming just 70 mW of power.
