New form of computer memory is 1,000 times faster
A new type of computer memory could speed up certain kinds of data operation by 1,000 times while also being easier to code for than traditional memory.
The amount of data generated each year is growing exponentially. Blueshift Memory, the start-up behind the new form of memory, believes its technology will cope better with these increasing demands.
The start-up designed the memory to address the gap between rapidly developing central processing units (CPUs) and the slower progress of computer memory chips. The disparity creates a “tailback” when high-performance computers perform large-scale operations, such as database searches with millions of possible outcomes. Troves of data effectively get stuck in a slow-moving queue between the CPU and the less efficient memory, which reduces the speed at which computers can deliver results.
Blueshift claims its new memory design will allow certain complex operations that take hours today to complete in a matter of minutes.
Peter Marosan, Chief Technology Officer at Blueshift Memory, said: “Imagine if you are a taxi driver but the town where you work is always changing, people are constantly swapping houses, and the shops and services are forever disappearing and reappearing in different places. That’s similar to the way in which data is organised in existing chips.”
“Our design is the equivalent of replacing that with a stable, structured town where you already know where everything is and can find it much more quickly. It makes everything faster, easier and more effective.”
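Blueshift has not published the details of its design, so the analogy above can only be illustrated generically. As a rough sketch, the difference between a "changing town" and a "stable, structured town" resembles the difference between scanning scattered records for every lookup and computing a record's location directly from a known, fixed layout (all names and data below are hypothetical):

```python
# Illustrative sketch only -- not Blueshift's actual design.
# "Changing town": records sit in no predictable order, so every lookup
# must scan until it finds a match (O(n) per lookup).
scattered = [("bakery", 7), ("school", 2), ("clinic", 9), ("library", 4)]

def find_scattered(name):
    for key, value in scattered:
        if key == name:
            return value
    return None

# "Stable town": the layout is fixed and known in advance, so each
# record's slot can be computed directly (O(1) average per lookup).
layout = {"bakery": 0, "school": 1, "clinic": 2, "library": 3}
values = [7, 2, 9, 4]

def find_structured(name):
    slot = layout.get(name)
    return values[slot] if slot is not None else None

print(find_scattered("clinic"))   # found after scanning three records
print(find_structured("clinic"))  # found via a single computed access
```

Both lookups return the same answer; the point of the analogy is that the second one always knows where to look.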
Computationally expensive operations such as drug discovery, DNA research, artificial intelligence design and the management of future smart cities could be made much faster on the new memory, Blueshift said.
However, it is not expected to have an impact on simpler operations such as word processing. The chip’s designers stress that this is only part of a solution that will require greater collaboration between the various companies working on the “data tailback” challenge.
They have built a working model to emulate the chip’s effects, ahead of the more expensive task of creating the first chip.
In testing, the emulator showed that the algorithms used in weather forecasting and climate change modelling could run 100 times faster on the chip. It could also improve the speed of search engines and the processing speeds of virtual reality headsets by as much as 1,000 times.
Blueshift is now seeking funding to create a full first iteration of the chip.
The company said that changing the way computer memory works could also help the artificial intelligence in autonomous vehicles such as driverless cars, which need to process huge quantities of data quickly to make decisions.
They added that fast, real-time data processing on a large scale will be essential in a future in which objects and people are likely to be closely connected in smart cities, with technology used to manage traffic flows, utility supplies, and even evacuation procedures in times of danger.