New machines: beyond x86 and the PC

(Image: Stockfresh)

10 November 2016

With quantum computing, photonic processors and memristor technologies all being developed at a furious pace, it seems clear that the era of the x86-based PC is at an end. More and more computing is done in the cloud, and while the PC has enjoyed a resurgence of sorts as a thin client that gives access to the power of distant processors, things are changing.


“We tend to think of a processor and then maybe co-processors, FPGAs and other things that sit around that one processor. In The Machine, memory is at the heart of what we do and then there are many, many processors that all sit around that memory,” Martin Sadler, HP Labs

An ever-growing number of devices are being connected to the Internet, generating huge quantities of data and requiring ever more complex systems to manage them. It is clear that the computers of tomorrow will have to handle data processing tasks several orders of magnitude greater than those the average computer of today was designed for.

But just what kind of machines will replace the current generation of servers and PCs and take us into the next era of business computing? The answer is not immediately clear, but several prototype projects around the world offer fascinating possibilities.

Smarter and connected
“If you look at where the world is going, we’re getting a lot smarter and we’re getting a lot more connected. If you look at the underlying data structures that we need in our computing architectures to facilitate that, they tend to be quite memory hungry,” said Martin Sadler, vice president and director of Hewlett Packard Labs’ security and manageability lab.

“So if you take that as the context, look at current architectures and realise that memory is largely the bottleneck. We spend all our time moving stuff back and forth between memory and storage. So where The Machine started was with the idea of flipping this on its head — we thought let’s build machines around memory rather than around processors.”

The Machine
The Machine, as HP Labs calls it, is a prototype computer system that turns the conventional von Neumann architecture on its head, eschewing a processor-centric view of the world. It uses petabytes of fast memory and storage collapsed into one vast pool called universal memory.
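A small-scale analogue of that collapse already exists in conventional systems: memory-mapped files, where data on storage is mapped into a program’s address space and manipulated with plain byte addressing rather than explicit read/write calls. The sketch below is ordinary Python rather than anything from HPE’s stack, and the file name and sizes are purely illustrative, but it shows the programming shape that universal memory generalises: one pool, addressed directly.

```python
# A toy analogue of a single byte-addressable pool: mmap maps a file on
# storage into the process's address space, so "storage" is read and
# written like memory. File name and sizes are illustrative only.
import mmap
import os

PATH = "pool.bin"   # hypothetical backing file standing in for the pool
SIZE = 1 << 20      # 1 MiB here; The Machine's pool is measured in petabytes

# Create the backing file once.
with open(PATH, "wb") as f:
    f.truncate(SIZE)

with open(PATH, "r+b") as f:
    pool = mmap.mmap(f.fileno(), SIZE)
    pool[0:5] = b"hello"   # plain byte addressing, no explicit I/O calls
    print(pool[0:5])       # b'hello'
    pool.close()

os.remove(PATH)
```

The difference in The Machine is that the pool itself is fast and persistent, so there is no slower storage tier hiding behind the mapping and no copying back and forth between layers.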

To connect memory and processing power, The Machine uses an advanced photonic fabric, essentially using light instead of electricity to allow it to rapidly access any part of its massive memory pool while using much less energy.

“We tend to think of a processor and then maybe co-processors, field programmable gate arrays (FPGAs) and other things that sit around that one processor. In The Machine, memory is at the heart of what we do and then there are many, many processors that all sit around that memory,” said Sadler.


“Watson analyses high volumes of data and processes information more like a human than a computer — by understanding natural language, generating hypotheses based on evidence and learning as it goes,” Robert McCarthy, IBM

“The reason we can do that is because a lot of memory and storage technologies are coming together — the distinction is collapsing. So basically, we can get memory cheap enough to put enough of it into a box so that you can behave in this different kind of way.”

“We’re talking about hundreds of petabytes of main memory in our initial prototypes. That allows you to hold a huge amount of information and then of course process it very quickly. If you’re going to process across a data set that size, you don’t want one processor, you want many, many processors all operating on that same data,” he said.
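The shape Sadler describes can be sketched with ordinary tools: one data set held in shared memory, with several worker processes all operating on it in place rather than each receiving a copy. The example below is plain Python with hypothetical names and sizes, not The Machine’s software stack, and four local processes stand in for its many processors.

```python
# A toy sketch of memory-centric parallelism: one shared array, several
# workers summing disjoint slices of it in place. Names and sizes are
# illustrative; this is ordinary Python, not The Machine's stack.
from multiprocessing import Pool, shared_memory
import numpy as np

N = 10_000_000  # ten million floats, a stand-in for a far larger pool

def chunk_sum(args):
    shm_name, start, stop = args
    shm = shared_memory.SharedMemory(name=shm_name)  # attach, don't copy
    data = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
    total = float(data[start:stop].sum())
    shm.close()
    return total

if __name__ == "__main__":
    shm = shared_memory.SharedMemory(create=True, size=N * 8)
    data = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
    data[:] = 1.0

    # Four processes, one shared data set: the data never moves, only
    # the small per-worker results travel back.
    bounds = [(shm.name, i * N // 4, (i + 1) * N // 4) for i in range(4)]
    with Pool(4) as p:
        print(sum(p.map(chunk_sum, bounds)))  # 10000000.0

    shm.close()
    shm.unlink()
```

The inversion is in what travels: the data set stays put and only small results move, rather than the data being shuttled to wherever a single processor happens to be.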

Advantages
The advantages of a system with vast amounts of memory and the ability to operate at extreme speeds are obvious. What is perhaps most interesting is the question of how much of the technology found in something like The Machine will end up filtering down to the enterprise and, eventually, to the desktop.

“General purpose computing isn’t going anywhere — there will remain a place for it for some time but to get the speed that we really need, we think memory-centric computing with specialised processors is likely to be a game changer,” said Sadler.

 


TechCentral.ie