It's big. It's powerful. And we probably won't see it for another five or so years.
So why contemplate what Hewlett Packard Enterprise's The Machine — the company's name for its memory-driven supercomputer — could mean for business? After all, the prototype HPE recently demoed is aimed at applications where performance trumps budgetary concerns — meaning government-funded initiatives such as a trip to Mars.
A 'Computer Built for the Big Data Era'
Businesses should start paying attention for a few reasons. First, HPE promises The Machine will make certain business applications possible. Indeed, HPE needs some kind of edge to remain on top of the supercomputing market, and business-facing applications could fill that role. The company owns 28.6 percent of the market, according to the Top500 list. Its closest competitor is Lenovo, with 17 percent.
Also, the supercomputer space in general is undergoing a shift, as a report commissioned by RIKEN, Japan's national science agency, makes clear.
Steve Conway, report co-author and senior vice president of research at Hyperion, told IEEE Spectrum that "unlike more specialized supercomputer applications from years past, the workloads of tomorrow’s supercomputers will likely be mainstream and even consumer-facing applications."
Suddenly those five years seem to be speeding by a lot faster than they were a few minutes ago, don't they?
"The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day," Meg Whitman, CEO of Hewlett Packard Enterprise said when she unveiled The Machine prototype on May 16. "To realize this promise, we can’t rely on the technologies of the past, we need a computer built for the big data era."
And that is precisely what HPE claims to have done.
160 Terabytes of Memory
It's no secret HPE has been working on memory-driven computing: its focus on memory performance and data speed, as opposed to processing power, has been a hobby horse for the company for years, even as it dominated the supercomputer market.
Memory-driven computing, Charles King, principal of Pund-IT, told CMSWire, "is an interesting approach to applying the value/benefits of pooled memory technologies to high-end supercomputing and technical computing scenarios." Then, close to a year ago HPE announced a breakthrough and unveiled what it said was the world’s first Memory-Driven Computing architecture. In November 2016 it introduced its first commercialization roadmap.
This May came the actual product, or rather prototype: a 160-terabyte in-memory system based on that architecture. To state the obvious, this is an enormous amount of memory. One comparison HPE gives is that The Machine could simultaneously work with the data from 80,000 human genomes. (Fittingly, HPE's first collaboration using the Memory-Driven Computing architecture is with the German Center for Neurodegenerative Diseases.)
The Machine OS is familiar, at least to techies — HPE based it on a modified form of Linux. The system can accept all forms of memory technology, from the conventional (such as DRAM or flash) to emerging storage-class memory (like spin-torque, 3D XPoint or Memristor).
It is a product built for the big data era.
How Unique Is It?
Unsurprisingly, given The Machine is still at the prototype stage, there are doubts about HPE's vision for its latest baby.
King, for instance, said it’s difficult to sort out the "uniqueness" of HPE's solution.
"The company's 'X1' silicon photonics (optical) interconnect is central to the Machine's robust memory capabilities but several of the top 20 systems in the current Top 500 list of leading supercomputers leverage remote direct memory access (RDMA) capabilities that seem analogous to HPE's Machine," he says. It's also unclear whether HPE is utilizing any of the NUMAlink interconnect technologies acquired when it purchased SGI last August, King added. (HPE acquired supercomputer maker SGI for roughly $275 million in August 2016.)
Rob Enderle of The Enderle Group also had doubts about that five-year timeline.
"A new architecture like this has a path to market that is measured in decades," he told CMSWire. Furthermore "HPE's ability to launch a product is extremely doubtful both because it is very difficult and because HPE hasn’t even been able to showcase executive stability of late," Enderle said. "The resources needed to bring this to market would be massive and the initial available market very limited."
And as for the business applications that are promised to wow the commercial world, Enderle reminded us of an inconvenient fact: "These will need to be written from scratch, creating a cart-and-horse problem, because without the applications there is little utility to the box, but without some sales there is no money in the applications."
A Future-Focused Product
But then there is this consideration: If the Machine works as HPE says it will, its ability to search and parse 160TB of DDR4 memory is three to four times larger than that of existing commercial in-memory analytics systems, King said.
The Machine could make a number of applications possible with that kind of capacity and performance. One obvious area is data management: managing 160 terabytes means handling the equivalent of the data in five Libraries of Congress, or roughly 160 million books.
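The book and genome comparisons can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, assuming decimal terabytes (1 TB = 10^12 bytes); the per-book and per-genome sizes are derived from HPE's comparisons, not figures HPE itself published:

```python
# Back-of-envelope check of the capacity comparisons cited for The Machine.
# Assumption: 1 TB = 10**12 bytes (decimal, as marketing figures typically use).

TOTAL_BYTES = 160 * 10**12  # 160 terabytes of memory

# "160 million books" implies roughly 1 MB per digitized book.
bytes_per_book = TOTAL_BYTES / 160_000_000

# "80,000 human genomes" implies roughly 2 GB per stored genome,
# in the right range for a compressed whole-genome data set.
bytes_per_genome = TOTAL_BYTES / 80_000

print(f"{bytes_per_book / 10**6:.0f} MB per book")      # 1 MB per book
print(f"{bytes_per_genome / 10**9:.0f} GB per genome")  # 2 GB per genome
```

Both implied sizes are plausible, which is all these comparisons are meant to convey: the prototype's memory pool is large enough to hold an entire national-library-scale corpus at once.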
Also, because the architecture can scale, per HPE's claims, to a near-unlimited pool of memory, it has the potential to enable real-time insights not possible with today’s conventional systems.
Indeed, the possible applications range from the grandiose — a mission to Mars — to the more mundane but practical, such as creating a highly personalized retail experience in real time for one specific consumer out of tens of thousands.
The list of what we could do with this kind of capacity and performance is bound to expand over time. "In that sense, the Machine is really future-focused, and highlights directions and achievements that HPE hopes to pursue in the years ahead," King said.