The future of computing


11 May 2015

One of the continued sources of enjoyment for me in my current role is the need to keep up with trends and developments in the world of information and communications technology. This often means reading up on first-principles and blue-sky research, where mad new ideas get a good shake to see if they can stand up on their own, and, like freshly laundered pyjamas, many cannot.

However, one thing that seems abundantly clear as I read through a slew of materials around the fiftieth anniversary of Moore’s Law is that silicon, having persisted longer than initially anticipated, will not continue to be the basis of computing into the future. Alternatives will have to be found.

Now, 20 years ago, the imminent demise of silicon as the base of commercial computing was being predicted much more vociferously than it is now, but the advent of multi-core technologies, and the means to produce them efficiently, has seen the humble x86, in particular, enjoy a far greater run than was ever anticipated. Then again, other common cries of the end at the time included peak oil, an ozone layer hole that now looks all but fixed and, of course, civilisation as we know it. But that last one comes along fairly regularly and so can largely be ignored.


In the field of new, alternative computing models, two approaches currently seem to be leading the charge to replace the architectures that have served us so well. These are, firstly, quantum computers, and secondly a project by HP called The Machine.

Uncertainty abounds
Anyone who has ever read about quantum physics will know that at the heart of the theories that underpin our understanding of the quantum world is uncertainty, and this key principle has extended into the world of quantum computing insofar as many experts and commentators have doubts as to whether anyone has actually built a quantum computer yet.

Without going into detail, a quantum computer is one that relies on the quantum states of particles such as electrons, photons or certain ions to represent information. So unlike the ones (ons) and zeros (offs) of classic computing, a quantum bit, or qubit, can exist in a superposition of zero and one at the same time. The idea is that this property can be exploited for massive parallel computing capability, where many outcomes can be explored at once, yielding machines that are incomparably faster than current ones on certain problems.
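For the curious, the parallelism described above can be illustrated with a toy, purely classical simulation of a qubit register. This is a sketch only (the `uniform_superposition` helper is illustrative, not any real quantum API), and real quantum hardware works nothing like a Python list, but it shows why the number of simultaneously represented values grows exponentially with the number of qubits:

```python
import math

def uniform_superposition(n):
    """Toy statevector: put n qubits into an equal superposition.

    A register of n qubits is described by 2**n amplitudes, one per
    classical bit pattern. Applying a Hadamard gate to each qubit of
    |00...0> gives every pattern the amplitude 1/sqrt(2**n), so all
    2**n values are "present" in the single state at once.
    """
    dim = 2 ** n
    return [1 / math.sqrt(dim)] * dim

state = uniform_superposition(3)            # 3 qubits
print(len(state))                           # 8 basis states tracked at once
print(round(sum(a * a for a in state), 6))  # probabilities sum to 1.0
```

Note that the classical simulation pays for this directly: tracking n qubits costs 2**n numbers of memory, which is exactly the resource wall a genuine quantum machine is hoped to sidestep.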

How do you do that then? Well, the answer is no one is really sure yet. Traditional computing based on the silicon transistor does not yet have a quantum equivalent — there is, as yet, no standard base component of a quantum computer. This is because the particles on which quantum computers would rely vary, and are currently being manipulated in various ways to set and then read quantum states. Think of it as that bit in the Alan Turing film where he is still making his logic gates by hand — he knows what they are, he knows what they have to do, but had he to make another one, he would probably modify the design. Quantum computers, it appears, are still very much theoretical, and anything close to a working prototype is still very far from any kind of standardisation that would see it become a commercial reality any time soon.

That said, one company, D-Wave, claims to have a 128-qubit quantum computer, but while this machine certainly uses quantum effects, many people who know far more than me have questioned whether it is a true quantum computer.

Error checking
IBM recently announced that it had solved an error checking problem in quantum computing systems, which is regarded as a key breakthrough and will likely pave the way for future software development in the field.

The overall impression to this lay person is very much of an exciting field that is not only challenging the way we think about computing, but is pushing the very understanding of computation in new directions. But this is still firmly a white coat domain.

The other major candidate is HP’s massively ambitious The Machine.

At the core of this approach is the idea that a new form of combined storage and memory, called a memristor, can collapse the core architecture of a computer, with significant benefits for data paths and speed. By having memory more or less integrated with storage in a non-volatile form, there is no need to shuttle data between separate storage and working memory before processing. Also, with light-based interconnects, as opposed to copper or silicon, speed is further enhanced but, critically, the power required is reduced by orders of magnitude, perhaps to as little as 1.25% of that of current machines, while providing six times the processing capacity.

But again, practicality is a huge problem here. Memristors are tough to manufacture, and initial agreements for commercialisation appear to have run into problems, but now other manufacturers are coming into the fray.

A very interesting development comes from closer to home, with the AMBER centre and Trinity College developing a multi-state, memristor-style component that could pave the way for base-10 computing. This would constitute a major leap forward. For example, a massive 64-bit binary integer could be stored in 20 decimal digits instead of 64 binary ones, paving the way for huge increases in computing power and speed.
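The arithmetic behind that figure is straightforward: each base-10 digit carries log2(10), roughly 3.32, bits of information, so a quick back-of-the-envelope check gives the digit count:

```python
import math

# Bits of information carried by one base-10 digit
bits_per_digit = math.log2(10)             # ~3.32

# Decimal digits needed to hold any 64-bit value
digits = math.ceil(64 / bits_per_digit)
print(digits)                              # 20

# Sanity check: the largest 64-bit unsigned integer really has 20 digits
print(len(str(2 ** 64 - 1)))               # 20
```

The same calculation generalises to any word size: an n-bit value needs ceil(n / log2(10)) decimal cells.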

New departures
With all of these new departures, new software and operating systems would be necessary, and HP is already working on Linux++ as a new OS for its Machine project. Quantum computing would likely call for an entirely new way of writing software, not to mention the new interfaces that would be required between base-10 and current computing models.

Add to all of this carbon nanotubes and the potential wonder that is graphene, and the number of interesting new permutations increases so much that you might need a new style of computer just to calculate them all. Oh, wait a minute…

Joking aside, the HP project is set to have a prototype OS sometime this year, while the actual computer prototype is expected next year. Alas, quantum computers are a much more distant prospect. While we can expect to see multi-core designs pushing current architecture for a while yet, enough new work is underway to ensure that the pace of computing development is unlikely to slow at all in the coming years.
