Within a quantum
Theoretical for decades, quantum computers are now a reality, but are they ready to escape the lab?
Technology companies including IBM and Fujitsu think so, not to mention insurgents like D-Wave. Others, however, argue that useful quantum computing is still some distance away, or that the technology is not the way forward for general-purpose computing.
The reason for the drive toward quantum computing is not hard to understand. First, there is the perennial demand for ever more computational power. As we reach the limits of Moore’s Law, that demand has not gone away; if anything, the continuing explosion in data processing will require more and more computational power.
Speaking of Moore’s Law, there is a second, even more pressing, reason for the flurry of interest in quantum computing: processors are reaching the absolute limits of the technology, with chip components now approaching the atomic scale, creating hard physical limits to how much further they can be shrunk.
With the transistors inside semiconductors now down to around 14nm – a span of only a few dozen atoms – electrons cease to behave according to the laws of classical physics and instead shift into the confusing domain of quantum physics.
Quantum computing replaces bits – the basic unit of binary data, represented by on/off, power/no power – with qubits, which can be implemented by any two-level quantum system and likewise encode the information states of zero and one.
Of course, quantum mechanics demonstrates that these two positions are not the only possible ones: in its unobserved state, a qubit can occupy any of an infinite number of ‘superpositions’ between the two, forced to take on a definite value of zero or one only when it is measured. The act of observation itself causes the qubit to collapse into one of its two definite states – and no cats are harmed in the process.
By way of analogy, think of wave-particle duality: light behaves as both a wave and a particle, resolving into one or the other only when observed and tested. Related thinking led to Heisenberg’s Uncertainty Principle, which states that there is a hard limit to the precision with which certain pairs of a particle’s properties can be simultaneously known.
In computing, superposition is what promises to unlock greater power: because qubits can be in multiple states at the same time, they can represent more information than binary bits, and the capacity grows exponentially with each qubit added. In addition, a quantum gate, rather than passing on a tested state of zero or one as a traditional logic gate does, takes a superposition and produces another superposition, which is then measured to produce a string of zeros and ones. Though this is an error-prone process that requires double-checking – noise can produce the wrong results – it is, in theory, far more efficient than traditional binary computing, as all possible calculations are performed simultaneously even if only one result can be read out at a time.
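The state-vector arithmetic behind this can be sketched in a few lines of plain NumPy (a classical simulation, not a real quantum SDK): an n-qubit register holds 2^n amplitudes, a gate is a matrix acting on them, and measurement probabilities are the squared amplitudes.

```python
import numpy as np

# A single qubit starts in the definite state |0>: amplitudes for (|0>, |1>).
qubit = np.array([1.0, 0.0])

# The Hadamard gate turns a definite state into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
superposed = H @ qubit            # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement probabilities are the squared amplitudes: a 50/50 split here.
probs = superposed ** 2
print(probs)                      # [0.5 0.5]

# Two qubits: the joint state is the tensor product, so the register holds
# 2**2 = 4 amplitudes -- capacity doubles with every qubit added.
two_qubits = np.kron(superposed, superposed)
print(len(two_qubits))            # 4
```

This is also why classical simulation runs out of road so quickly: fifty qubits would already need 2^50 amplitudes, which is the regime where real quantum hardware becomes interesting.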
That is all very interesting, but what are the real-world applications? Mostly, it turns out, specialist ones: no one is predicting that quantum computers will replace traditional machines on the desktop. However, they could, in theory, augment them for specific applications.
With databases, for example, the benefits are clear: classical search requires testing every record in turn, a laborious linear process, while quantum search promises a quadratic speed-up, greatly reducing the time required. Another promising application is IT security, another resource-intensive set of computations – though scaling up encryption also means scaling up the ability to crack it. Other touted applications are the simulation of quantum physics using actual quantum physics, the simulation of complex systems such as the Earth in climate-change modelling, and drug discovery: basically, anything that requires enormous amounts of calculation.
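The search speed-up comes from Grover’s algorithm, which finds a marked item among N candidates in roughly √N steps rather than N. The state-vector maths can be simulated classically; the sketch below (the 8-entry ‘database’ and marked index are purely illustrative) finds the target in 2 Grover iterations instead of checking up to 8 records.

```python
import numpy as np

N = 8                      # toy database of 8 items -> 3 qubits
marked = 5                 # index we are searching for (illustrative)

state = np.ones(N) / np.sqrt(N)        # uniform superposition over all items

oracle = np.eye(N)
oracle[marked, marked] = -1            # oracle flips the sign of the marked item

s = np.ones(N) / np.sqrt(N)
diffusion = 2 * np.outer(s, s) - np.eye(N)   # 'inversion about the mean'

# The optimal iteration count is about (pi/4) * sqrt(N) -- here, 2 iterations.
for _ in range(int(np.pi / 4 * np.sqrt(N))):
    state = diffusion @ (oracle @ state)

probs = state ** 2
print(probs.argmax())      # 5 -- the marked item dominates the measurement
```

After those two iterations the marked item is measured with roughly 94% probability; on a real device the run is repeated a few times to wash out that residual error.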
Robin Wootton, a researcher with IBM in Zurich, explains that quantum computing is, in terms of the desired outcome, in line with traditional computing. “The purpose of quantum computing is the same as any computing: to solve problems we need to solve,” he said. The difference is that the problems it is intended to solve are ones that traditional computers struggle with.
“The reason we need quantum computing is that there are some problems that are too complex for normal computers to perform and would need an unthinkable number of instructions and a computer the size of the planet – like Deep Thought or the Earth in The Hitch-Hiker’s Guide to the Galaxy.”
A question has lingered, though: does quantum computing actually exist in any meaningful sense, or is it purely a lab-based phenomenon?
At least 18 companies are investing in quantum computing right now, though not all are manufacturers.
Certain claims by one manufacturer, D-Wave, have raised eyebrows. Its systems use a process called ‘quantum annealing’ that is suited only to certain classes of problem, meaning they are not true general-purpose or ‘universal’ quantum computers. On the other hand, D-Wave does not claim its systems are universal computers.
Nonetheless, D-Wave has delivered real hardware, including to customers such as Google, NASA and Lockheed-Martin.
Fujitsu uses similar annealing techniques in its own offering which, as the name suggests, is not strictly a quantum computer at all: the Digital Annealer is a classical, quantum-inspired system.
“In financial services, one of the UK’s leading financial institutions is exploring how the Fujitsu Digital Annealer can optimise its investment portfolios in real time. Additional use cases include maximising return on investment for utilities companies and, in the pharmaceutical sector, the discovery of new substances and development of new drugs,” a Fujitsu spokesman told TechPro.
Fujitsu’s quantum computing technology is also being used by “a premium European automotive manufacturer” in its robotic welding systems.
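Annealers – D-Wave’s quantum ones and Fujitsu’s digital one alike – consume problems expressed as a QUBO (quadratic unconstrained binary optimisation): minimise x^T Q x over binary vectors x. The toy instance below (a hypothetical max-cut problem on a triangle graph) is brute-forced classically for illustration; an annealer searches the same energy landscape in hardware.

```python
import itertools
import numpy as np

# Toy QUBO matrix encoding max-cut on a triangle graph: the minimum-energy
# binary assignment corresponds to the best cut. Purely illustrative.
Q = np.array([
    [-2,  1,  1],
    [ 1, -2,  1],
    [ 1,  1, -2],
])

def energy(x):
    """QUBO objective: x^T Q x for a binary vector x."""
    x = np.array(x)
    return int(x @ Q @ x)

# Only 3 binary variables, so we can check all 2**3 assignments directly.
best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))   # a minimum-energy assignment; energy is -2
```

With three variables brute force is trivial, but the search space doubles with every added variable, which is exactly why hardware that explores the landscape in parallel is attractive for this class of problem.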
Others, however, including IBM, are pursuing the development of universal quantum computers.
Wootton says that for those who want to experiment with universal
quantum systems there are already options, but that major
developments should be expected within a decade.
“There’s a five-qubit one, and fourteen-qubit ones, that are available to the public,” he said.
“If you want the full dream of quantum computing – for everything to be error corrected, that’s still about a decade away [and] there’s going to be an era between then and now where we learn to work with things and you’ll see a progression.
“For now, it’s accessible from your desk in the sense that you’ll have cloud access,” he said.
The issue of error correction is fundamental: unlike traditional computers, everything at the quantum level is a little bit off, resulting in accumulating errors.
“Everything they do is a little bit wrong. A bit is zero or one; a NOT gate flips it, but a noisy NOT gate is one with a probability of getting it wrong. Once you’ve done a lot of things, the probability of getting something wrong somewhere down the line is 100%.”
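Wootton’s point compounds quickly: if each gate fails independently with probability p, a sequence of n gates runs entirely error-free with probability (1 − p)^n, which collapses toward zero. A quick sketch, using an assumed, illustrative 1% per-gate error rate:

```python
def p_any_error(p: float, n: int) -> float:
    """Probability of at least one error after n noisy gates,
    each failing independently with probability p."""
    return 1 - (1 - p) ** n

p = 0.01  # illustrative 1% per-gate error rate
for n in (10, 100, 1000):
    print(n, round(p_any_error(p, n), 3))
```

At 10 gates the failure chance is under 10%, at 100 gates it is already about 63%, and by 1,000 gates an error is all but certain – which is why error correction, not raw qubit count, is the gating factor for the “full dream”.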
Wootton was working at a university in Basel when the opportunity to put his quantum research into practice with IBM came up.
“I had an experiment I proposed for five qubits and suddenly here was a five-qubit machine – so now a theorist can do an experiment in the same way we used to do a simulation.
“One thing that I personally work on is trying to get people engaged in quantum computing by getting them to look at games on these machines – an example of quantum code doing simple and understandable things, like a game of Battleships.”
As the field moves on, the applications can become more complex. “There’s procedurally generated content like terrain maps, which is using the devices to do something unique and useful. Also, we’re at the point where it’s useful to think about drugs and materials at the molecular level. [Then] optimisation problems are more efficient on a quantum computer and optimisation is in everything: finance and AI, [and] there’s speed-up in search too.”
Kannan, lead of novel technologies activity at the Irish Centre for High-End Computing (ICHEC), says that wherever one comes down on that question, quantum devices are no mere simulators or hypotheses.
“Quantum computing devices are a reality now, irrespective of whether they are D-Wave systems or universal devices. The universal devices can solve any problem, but at a lower scale right now – but the researchers will be scaling up,” he said.
“There are two categories of quantum computers at the moment,” he said.
“There are those from D-Wave or Fujitsu [that are] well equipped at solving a particular kind of problem – an optimisation or search problem – but they are not capable of solving a generic problem.
“But what we have for the moment from IBM, Intel and Google are universal machines.” These universal machines lag the specialist ones in terms of the number of qubits they can work with simultaneously.
“The challenge there is having large-scale processes that are useable in the real world,” said Kannan.
“The universal devices can solve any problem but at a lower scale right now – but the researchers will be scaling up. At ICHEC and Intel, we have projects ongoing where we solve natural language problems using quantum computing. [At the moment] we don’t process millions of words, we process thousands,” he said.
He also notes that the EU has committed up to a billion euros towards the development of quantum technologies.
ICHEC itself has five people working in quantum computing, targeting software and application development for programming the devices through industry- and state-funded collaborative projects.
JC Desplat, Kannan’s colleague and the director of ICHEC, says that concentrating on development for quantum computers now will ease the development of software later, as well as guiding the future development of the machines themselves.
“The development of software is going to be extremely onerous. It’s going to take a long time, so we really must start now, and we can because we have access to reliable quantum simulators. That allows early feedback that will guide the development of quantum processors, so in some ways it’s going to guide developers and give feedback to the people who are developing the hardware,” he said.
The road to quantum has not been a smooth one, however. HPE, for instance, abandoned quantum computing research just this year in order to focus on near-term technologies more likely to deliver a return.
Back in 2018, speaking at HPE Discover, Ray Beausoleil, senior fellow at HPE Labs and head of HPE’s Large Scale Integrated Photonics research group, said that people should keep their feet on the ground when it comes to quantum computing.
Quantum computers should be viewed “as an accelerator to solve a certain type of problem,” he said.
HPE is not saying that quantum computing is a dead end; rather that its future is in specific applications in the supercomputing domain – and alongside traditional computers at that – not in general purpose enterprise computing.
Beausoleil said that he was a ‘big booster’ of quantum computing
but that he did not think that the “enterprise is going to be one
of those places where those applications are found, unless you’re a
pharmaceutical or materials company.”
The view coming from Beausoleil and HPE does seem to be building into something of a consensus, though.
Forrester vice president and principal analyst Brian Hopkins notes that the current state of universal quantum computing is that we have a way to think systematically about all of the pieces required to do quantum computing. He described IBM’s 2019 announcement of what it called the first integrated commercial quantum computing system as “one significant step in a journey of a thousand miles”. “We really have a long way to go in quantum computing,” he said in January of this year on his podcast.
ICHEC director JC Desplat is, nonetheless, optimistic about the future of the technology. “I would say there is genuine disruption here: we’re not quite sure where it will go. It seems quantum computing will not [soon] replace traditional computers but will work with them. It’s like an accelerator, if you want,” he told TechPro.
But will they replace traditional machines entirely in time? Desplat thinks so. “In due course, though, we expect computers to be purely quantum,” he said.