- Hewlett Packard Enterprise on Monday debuted a working proof of memory-driven computing, an architecture that puts memory, rather than the processor, at the center of the system.
- The proof-of-concept research, part of "The Machine" project, could upend the current server market by changing how computers use and access data. Rather than shuttling data from processor to processor, memory-driven computing lets the data stay in place while processors share access to a common pool of memory.
- HPE is working to "rapidly" commercialize the technology into new and existing products across four areas: non-volatile memory, fabric, ecosystem enablement, and security. Some of the capabilities will be available in 2018/2019, according to the company.
By placing memory at the core of the computing architecture, HPE could drastically increase the performance and efficiency of computing platforms. More than that, it is challenging a basic computer design that has remained largely unchanged for the last 60 years.
The Machine is "one of the largest and most complex research projects" in HPE history, according to the company, and the potential payoff could be huge. The amount of data created every day is exploding, and the market needs technology that can keep up.
As HPE notes, Gartner predicts that by 2020 there will be 20.8 billion connected devices, each generating large quantities of data every day. By revolutionizing storage and servers, HPE hopes to become one of the key providers of next-generation technology. HPE CEO Meg Whitman has pointed to the vast potential of "The Machine" project: if it pays off, the new servers would be "hundreds of thousands of times" more powerful than products currently on the market, she told Fortune.
The storage market is also changing as more companies adopt cloud-based storage. Server revenue has continued to fall, even though HPE remains a leader in the space. As the market shifts, HPE will stay competitive only if it remains at the forefront of computing.