HPE shows off The Machine prototype without memristors
16 May 2017
In 2004, Hewlett-Packard's Kirk Bresniker set out to make radical changes to computer architecture with The Machine, sketching the first concept design on a whiteboard.
At the time Bresniker, now chief architect at HP Labs, wanted to build a system that could drive computing into the future. The goal was to build a computer that used cutting-edge technologies like memristors and photonics.
It’s been an arduous journey, but HPE today finally showed a prototype of The Machine at a lab in Fort Collins, Colorado.
The prototype is not close to what the company envisioned when The Machine was first announced in 2014, but it follows the same principle: pushing computing into memory subsystems. The system breaks the limitations of conventional PC and server architecture, in which memory is a bottleneck.
The standout feature of the mega server is its 160TB of memory – more capacity than any single server available today, and more than three times that of HPE's Superdome X.
The Machine runs 1,280 Cavium ARM CPU cores: 40 32-core ARM chips, housed in four Apollo 6000 enclosures, are linked to the memory via a high-speed fabric interconnect. The interconnect is like a data superhighway into which multiple co-processors can be plugged.
The connections are designed in a mesh network so memory and processor nodes can easily communicate with each other. FPGAs provide the controller logic for the interconnect fabric.
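The fabric's central idea – many compute nodes issuing loads and stores against one shared, byte-addressable memory pool – has a loose single-machine analogy in OS-level shared memory. The sketch below is illustrative only (the names and sizes are invented, and ordinary shared memory is not a fabric), using Python's standard `multiprocessing.shared_memory` module:

```python
from multiprocessing import shared_memory

# Create a named, byte-addressable region that other processes on the
# machine can attach to by name -- a loose stand-in for fabric-attached
# memory that many compute nodes address directly.
region = shared_memory.SharedMemory(create=True, size=64)
try:
    region.buf[0:5] = b"hello"  # one "node" stores bytes at an address

    # A second handle attaches to the same physical pages by name,
    # playing the role of another node on the fabric.
    peer = shared_memory.SharedMemory(name=region.name)
    print(bytes(peer.buf[0:5]))  # the load sees the store: b'hello'
    peer.close()
finally:
    region.close()
    region.unlink()  # free the region once no node needs it
```

The point of the analogy is that data moves between "nodes" with plain memory reads and writes, not by copying through an I/O stack – the same shift The Machine makes at rack scale.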
Computers will deal with huge amounts of information in the future and The Machine will be prepared for that influx, Bresniker said.
In a way, The Machine prepares computers for when Moore’s Law runs out of steam, he said. It’s becoming tougher to cram more transistors and features into chips, and The Machine is a distributed system that breaks up processing among multiple resources.
The Machine is also ready for futuristic technologies. Slots in The Machine allow the addition of photonics connectors, which will connect to the new fabric linking up storage, memory, and processors. The interconnect itself is an early implementation of the Gen-Z interconnect, which is backed by major hardware, chip, storage, and memory makers.
HPE is improving the memory and storage subsystems in PCs and servers, which gives computing a boost. As data is processed faster inside memory and storage, there is less need to push up instructions per clock in CPUs.
In-memory computing has sped up applications like databases and ERP systems, and HPE is blowing up the design of such systems. There’s also a move to decoupling memory and storage from main servers. That helps speed up computing and makes more efficient use of data centre resources like cooling.
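The speed-up from in-memory computing can be seen in miniature with Python's built-in `sqlite3` module, which supports a purely RAM-resident database via the special `":memory:"` name. The table and values below are invented for illustration:

```python
import sqlite3

# ":memory:" is SQLite's in-memory mode: the whole database lives in
# RAM, so queries never touch disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("temp", 21.5), ("temp", 22.1), ("humidity", 48.0)],
)

# The aggregate runs entirely against RAM-resident pages.
(avg_temp,) = conn.execute(
    "SELECT AVG(value) FROM readings WHERE sensor = 'temp'"
).fetchone()
print(round(avg_temp, 2))  # 21.8
```

In-memory database engines like SAP HANA apply the same idea at scale; The Machine's design aims to make the entire memory pool large enough that working sets never need to spill to slower storage at all.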
There have been some glitches, though. The initial model of The Machine was supposed to have memristors, a type of memory and storage that could help computers make decisions based on the data they retain. HP announced the memristor in 2008, but the technology has been delayed multiple times. The company is now developing it with Western Digital, Bresniker said.
Bresniker is taking an open-source approach to the development of The Machine, with the ethos of cooperation among partners to build such systems in the future. This system is a prototype that will drive the development and implementation of Gen-Z and of circuits that can be used as co-processors.
While HPE is trying to build a new system, Intel is coming from another angle with its 3D XPoint storage and memory. System makers will try to build faster computers around Intel's 3D XPoint-based Optane storage, which the chipmaker says will eventually replace DRAM and SSDs.
The Machine is a future computer architecture that is also practical, said Patrick Moorhead, principal analyst at Moor Insights &amp; Strategy.
“The fact that they can do this and run programs on it, it’s absolutely amazing,” Moorhead said. The Machine runs a version of Linux.
The Machine stands somewhere between the computers of today and future systems like quantum computers. But it’s still three to five years away from being ready for practical implementation in data centres, Moorhead said.
IDG News Service