New machines: beyond x86 and the PC

10 November 2016

Cognitive computing
One of the ways in which cognitive computing may come to have a broader influence in the ICT world is in bridging the gap between what the human brain can do and what technology can do.

“We know that the human brain can consume and process only a limited amount of information. People are also prone to physical and mental fatigue, as well as mistakes. We are seeing from our industry insights and research that cognitive computing allows organisations to bridge this performance gap, helping to overcome limitations of both humans and systems,” said McCarthy.

“It is poised to enable a broader range of innovation that promises to redefine industries and business. Watson won Jeopardy!, but what if its power could be used for the greater good, to help make better cancer care choices?”

Collaboration
A collaboration between IBM and Memorial Sloan Kettering (MSK) Cancer Center has led to ‘Watson for Oncology,’ a system that draws on MSK expertise to evaluate the specific details of each patient against clinical evidence. The goal is to present the physician with several options, each with a degree of confidence, along with supporting evidence from guidelines, published research and MSK’s breadth of knowledge.

“By combining world-renowned cancer expertise with the capabilities of Watson, this system can offer oncologists and people with cancer individualised treatment options that are informed by medical evidence and highly specialised experience,” said McCarthy.

“Since Watson Oncology is a learning system, we have a unique opportunity to continually improve it based on users’ experiences. Oncologists anywhere will be able to make more specific and nuanced treatment decisions more quickly, based on the latest data,” he said.

Quantum leap
Based in Canada, D-Wave Systems was the first company to offer quantum computers for sale, and its machines are currently being used to develop algorithms and tools aimed at solving what it describes as ‘very challenging applications in the areas of optimisation, machine learning and sampling’.
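The optimisation problems machines of this kind target are commonly written as QUBO (quadratic unconstrained binary optimisation) instances: minimise a quadratic cost over binary variables. The toy instance below is made up for illustration and is solved by exhaustive classical search, not on quantum hardware, purely to show the form of the problem:

```python
import itertools

# QUBO: minimise sum of Q[i, j] * x[i] * x[j] over binary vectors x.
# Toy coefficients (illustrative, not from any real workload):
# each variable is rewarded for being on, but turning both on is penalised.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

def qubo_energy(x, Q):
    """Cost of a candidate binary assignment x under coefficient dict Q."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force over all 2^n assignments -- feasible only for tiny n,
# which is exactly why specialised hardware is interesting at scale.
best = min(itertools.product([0, 1], repeat=2), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))  # (0, 1) with energy -1.0
```

A quantum annealer attacks the same cost function by mapping it onto qubit interactions, rather than by enumerating assignments.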

Where standard computers use electrical circuits that are switched on or off, representing the numbers one and zero and known as bits, quantum computers use quantum bits, or qubits, which can be zero, one, or both zero and one at the same time. Amongst the enterprises using D-Wave’s machines are Lockheed Martin, Google and NASA.
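The ‘both zero and one at the same time’ idea can be made concrete with a little linear algebra: a qubit’s state is a unit vector over the basis states, and measurement probabilities come from the squared amplitudes. A minimal sketch using NumPy (a simulation on paper, nothing to do with D-Wave’s hardware):

```python
import numpy as np

# Basis states: |0> and |1> as column vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = H @ ket0  # state is (|0> + |1>) / sqrt(2)

# Measurement yields 0 or 1 with probability given by the squared amplitude.
p0, p1 = abs(psi[0]) ** 2, abs(psi[1]) ** 2
print(p0, p1)  # both approximately 0.5: equally likely to read 0 or 1
```

A classical bit would have one amplitude equal to 1 and the other 0; the superposition is what lets a register of n qubits carry amplitudes for all 2^n bit patterns at once.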

“As we continue to scale and innovate our system — and ours is the only scalable and commercial system available — we will deliver quantum computing systems that can solve complex, real-world problems that are accessible more broadly through the cloud,” said Jeremy Hilton, senior vice president of systems for D-Wave.

“Quantum computing works best as a complement to classical computers. It will most likely be used for specific challenges a classical computer can’t solve rather than entirely replacing the computers we use today.”

The beginning
D-Wave was founded in 1999 and claims its machines are already matching, and in some cases surpassing, the performance of state-of-the-art classical computers, which have benefited from more than 60 years of development and nearly a trillion dollars in investment.

“We’re only beginning to see what quantum computing can do, but this technology could lead to the exploration of cleaner energy sources, superior image recognition, more accurate financial forecasting and more precise genome mapping and analysis,” said Hilton.

Meanwhile, technology is advancing in other areas to help facilitate the next generation of computing starting to appear in the enterprise. Donnacha O’Riordan is executive director of Microelectronic Circuits Centre Ireland, where research is being done into the potential of low-power circuits to help deal with the power drain created by high-end computing.

Ultra-low power
“A big part of what we’re doing is looking into low power and ultra-low power circuits and using or researching various techniques to solve the power problem. This creates the potential for thermally intelligent circuits which really means adding intelligence into some of the circuits that we’re working on to enable lower power at a system level,” he said.

The power problem will be familiar to anyone who’s ever seen the electricity bill for a data centre or helped manage any sizable server farm — computing systems require enormous amounts of power to run and to keep cool.

“I was recently told that Google pays something like 3% of the global electric bill and that by 2040 its data centres are going to need more power than is generated today, so clearly it’s a problem,” said O’Riordan.

“It’s one of the grand challenges but it’s easy to understand — after all, a data centre is a big thing and it’s easy to see where the power is going. However, at the other end of the spectrum, there is also huge power consumption at the micro level.”

O’Riordan is talking about all the devices that connect to the internet, use cloud technology and go into making up the so-called Internet of Things.

Small savings
“We’re looking at all of the connected things essentially and if you take it right back to a micro scale, there are enormous savings to be made by cutting tiny amounts of power from billions and billions of devices,” he said.
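The scale of those savings is easy to check on the back of an envelope. The figures below are illustrative assumptions, not O’Riordan’s numbers:

```python
# Tiny per-device savings scale to grid-level figures.
devices = 10_000_000_000   # ten billion connected devices (assumed)
milliwatts_saved = 1       # shave just 1 mW from each device's draw (assumed)

# Integer milliwatts across all devices, converted to watts.
total_watts = devices * milliwatts_saved / 1000
print(total_watts / 1e6)   # 10.0 -- megawatts, roughly a small power station
```

One milliwatt is far below the resolution of any electricity bill, yet multiplied across the installed base it adds up to a power plant’s worth of demand.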

By way of analogy, he points out that when the world largely converted from analogue to digital, sampling at the Nyquist rate was used to make sure that the signal being converted was preserved in its entirety.

While this was necessary for things like audio and video signals, O’Riordan said that this is not necessarily the case for all the different kinds of signals that will make up the next generation of computing and communications.

“Converting at the Nyquist rate to fully capture signals is what, as an industry, we’ve been doing — fully converting all of the information in a given signal. You need to do that for things like audio or video, but in an awful lot of cases you don’t need to convert the entire signal bandwidth,” he said.

“So we’re looking at really moving away from the traditional Nyquist and Shannon sampling theory, where you convert at twice the highest frequency component, to looking at analogue-to-information converters and extracting only the important information from a signal.”
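The sampling theory O’Riordan describes can be sketched in a few lines of NumPy. A pure tone sampled above its Nyquist rate (twice the tone’s frequency) is captured exactly, and its frequency falls out of the spectrum; an analogue-to-information converter would aim to deliver just that one number without digitising every sample. The figures here are toy values chosen for the illustration:

```python
import numpy as np

f_tone = 50.0   # Hz: the only content in this toy signal
fs = 200.0      # sampling rate, comfortably above the Nyquist rate of 100 Hz
n = 400         # two seconds of samples

# Full Nyquist-rate conversion: digitise every sample of the signal.
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * f_tone * t)

# Inspect the spectrum of the full record to find the dominant frequency.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(n, d=1 / fs)
dominant = freqs[np.argmax(spectrum)]
print(dominant)  # 50.0 -- the full-rate record preserves the tone exactly
```

Here 400 samples were converted to recover a single number. The analogue-to-information approach asks whether the front-end circuit could have extracted `dominant` directly, spending far less power on conversion.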


TechCentral.ie