Industrial virtualisation

19 October 2017

As virtualisation has become the norm, the technologies behind it have expanded in use, bringing the best of automation, artificial intelligence and machine learning to the enterprise and becoming the flexible fabric on which so much relies. But having moved beyond hypervisors running operating system (OS) instances, what is next?

Traditional virtualisation, primarily running virtual OS instances on servers to squeeze maximum performance from the hardware, has become the norm, said Francis O’Haire, director of technology and strategy at Data Solutions.

Commodity stage
“It’s a commodity at this stage,” he said, “and the conversation has moved on. The big boys like Microsoft and VMware don’t [really] make their money from hypervisors, they tend to do more higher up.”

This does not mean that virtualisation’s work is done, nor that it has solved every IT resource problem out there. In fact, traditional hypervisor virtualisation can prompt a push for further virtualisation of services.

“In itself it has brought a lot of benefits, like ease of use, but I think it has also increased a lot of complexity at other levels.

“So, it’s still consolidating multiple workloads onto fewer pieces of hardware, but that puts more pressure on other components like networking, security, and storage. There’s a lot of talk now about network virtualisation and storage virtualisation, rather than just hypervisors,” said O’Haire.

Today, regardless of where it is used, virtualisation tends toward software-driven, software-defined services, with the goal of taking the complexity of the hardware out of the picture.

“On the network side, virtualisation at the computing layer means being mobile and dynamic. This creates issues with the underlying network. Nowadays workloads can move not just to machines on-premises but even to the cloud, so, as well as the networking complexities there, we need to simplify that and automate it.

“Automation is your friend, as it is what takes away the human error,” said O’Haire.

ML role
“There’s a role for machine learning there, and there’s one vendor we’ve just taken on, Densify, and that’s what they do. They have an analytics engine in the cloud. You point that at [both] your on-prem systems and your Azure or AWS systems, and it will either give you more performance through intelligent workload distribution, or lower your costs,” he said.
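
Densify’s engine itself is proprietary, but the underlying idea of cost- and capacity-aware placement can be sketched in a few lines of Python. Everything below, from the environment names to the prices and workload figures, is invented purely for illustration; a real engine would also weigh live utilisation telemetry, not just a static price list.

```python
# Hypothetical sketch of cost-aware workload placement. This is NOT
# Densify's engine; it only illustrates scoring candidate environments
# (on-prem, Azure, AWS) against a workload's resource needs.

from dataclasses import dataclass

@dataclass
class Environment:
    name: str
    free_vcpus: int
    free_mem_gb: int
    cost_per_hour: float  # illustrative price per workload-hour

@dataclass
class Workload:
    name: str
    vcpus: int
    mem_gb: int

def cheapest_fit(workload: Workload, envs: list[Environment]) -> Environment | None:
    """Return the lowest-cost environment with room for the workload."""
    candidates = [e for e in envs
                  if e.free_vcpus >= workload.vcpus and e.free_mem_gb >= workload.mem_gb]
    return min(candidates, key=lambda e: e.cost_per_hour, default=None)

envs = [
    Environment("on-prem", free_vcpus=8,  free_mem_gb=32,  cost_per_hour=0.00),
    Environment("azure",   free_vcpus=64, free_mem_gb=256, cost_per_hour=0.19),
    Environment("aws",     free_vcpus=64, free_mem_gb=256, cost_per_hour=0.17),
]
target = cheapest_fit(Workload("analytics-batch", vcpus=16, mem_gb=64), envs)
print(target.name if target else "no environment fits")  # -> "aws"
```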

Security is another key area for virtualisation, he said, driven by the fact that virtualisation itself creates a moveable feast.

“It can be hard to lock down a security policy with all of this stuff moving, including into and out of the cloud, [so] you can now define policy not on IP addresses but on roles. As you spin up more servers and instances, the policies follow the workloads,” he said.
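
What role-based policy means in practice can be shown with a toy sketch, not any particular vendor’s product: rules are written against role labels, so they keep applying however many instances spin up and whatever addresses they land on. The roles, addresses and rules below are made up.

```python
# Minimal illustration of role-based policy: rules reference workload
# roles, so they keep applying as instances move and IP addresses change.

ALLOWED = {("web", "app"), ("app", "db")}  # role pairs permitted to talk

def is_allowed(src_role: str, dst_role: str) -> bool:
    """Decide on roles, not addresses; new instances inherit the decision."""
    return (src_role, dst_role) in ALLOWED

# A freshly spun-up server just needs the right role label:
instances = {"10.0.0.7": "web", "10.0.1.4": "app", "192.168.9.2": "db"}

def check_flow(src_ip: str, dst_ip: str) -> bool:
    return is_allowed(instances[src_ip], instances[dst_ip])

print(check_flow("10.0.0.7", "10.0.1.4"))    # web -> app: True
print(check_flow("10.0.0.7", "192.168.9.2")) # web -> db: False
```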

Virtual clouds
O’Haire also said that for Data Solutions’ clients virtualisation is a stepping stone to cloud.

“They’re now thinking how they run their infrastructure as a cloud, in terms of what we call enterprise cloud, which mixes the virtues of on-premise with those of cloud. It’s about having that flexibility,” he said.

Art Coughlan, product marketing lead for cloud infrastructure at Microsoft Western Europe, said virtualisation and the cloud are, today, inseparable.

“An interesting thing happened in the last decade,” he said.

“People were primarily interested in private cloud, so that was virtualisation in the data centre [but] now, they want a mixture of public and private.”

This, he said, means developments from the public cloud have fed back into the solutions used for on-premises and data centre virtualisation.

“What we have learned running millions of virtual machines for millions of customers, we are able to feed that back into the solutions that the customers want in their data centres.

“A lot of the technology that is found in Windows Server 2016 was born in the cloud. Other public cloud providers don’t produce operating systems and other virtualisation people don’t run hyper-scale cloud solutions.”

In practical terms, he said, customers can take advantage of advanced virtual services that, in theory, are not so different from traditional virtualisation.

“If you think of virtualisation as a single piece of hardware having multiple server instances on it, there is hardware, there is the OS layer and, within that OS, you have a hypervisor. On top of the hypervisor you run your VM, and each of these has a guest operating system and an application on top of that. The binary files and libraries are [replicated in each VM],” said Coughlan.

Today the same principle is at work, but at a deeper level, via containers.

Container work
“A container is a much lighter system. You run a system such as Docker on top of Windows Server and it creates a common set of libraries and binaries, and every time you create an application it becomes an independent instance.”

The benefits are flexibility, lightness and portability.

“I’m not having to create another copy of the operating system. It will boot in seconds rather than minutes, and will probably be no more than a couple of hundred megabytes. It’s completely self-encapsulated, so it makes the applications more reliable and portable. You get absolute parity between dev and the running environment, and it’s a lot more secure,” he said.
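
For a concrete taste of that lightness, the sketch below uses the Docker SDK for Python (installed with pip install docker) to launch and discard an isolated instance. It assumes a local Docker daemon is running, and the alpine image is just an example of a small base image.

```python
# A small demonstration of container lightness using the Docker SDK for
# Python (pip install docker). Assumes a local Docker daemon is running.

import docker

client = docker.from_env()

# Starts, runs and removes a fully isolated instance in a few seconds;
# there is no guest operating system to boot, unlike a traditional VM.
output = client.containers.run("alpine",
                               ["echo", "hello from a container"],
                               remove=True)
print(output.decode().strip())  # -> "hello from a container"
```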

Kevin Bland, head of partners in EMEA for Red Hat, said that the containers model allows for ‘data centre v2.0’ through virtualisation.

“The answer is found in containers, and making the applications sensitive to the environments they are being deployed in.

“If we can deploy those services or applets on-the-fly then, all of a sudden, you have a very optimised environment in the data centre, a ‘v2.0’, and an agile environment,” he said.

Saturation point
John Long, director of business development at Sabeo, a major Red Hat reseller, said this move follows traditional virtualisation having reached saturation point.

“The question is, where do you go when you’ve fully virtualised?” he said.

The answer, he said, is to virtualise individual applications and services—or, at least, the right ones.

“It still tends to be traditional and quite monolithic applications [that remain on-premises]; middleware and web apps will technically be more distributed but the backend has not always been suitable; everything that falls outside that is good for virtualisation.

“You don’t do it by just taking traditional workloads out of the virtualisation environment and dropping them into the cloud. They have to be engineered,” he said.

Why you would do it, however, is straightforward, according to Red Hat’s Bland.

“The increase in optimisation, density and the ability to migrate in real time from one environment to another,” he said.

“People are looking at agility in the cloud: you take something you’d think of as a lot of code and it could be [re-engineered as] microservices that can be moved and updated on the fly. You won’t write a pile of code in one monolithic block that is being aimed at a single operating system,” he said.

“Containers are a key part of that. What they allow you to do is create, change and deploy on-the-fly and know that the impact of that change isn’t going to bring everything else down with it.”
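
To make that concrete, a microservice can be as small as the standard-library sketch below: one narrow responsibility that can be changed and redeployed without touching anything else. The service name, endpoint and version string are invented for illustration.

```python
# A toy microservice illustrating Bland's point: one small service with
# one job, packaged and replaced independently of everything else.
# Uses only the Python standard library.

from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class PricingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # One narrow responsibility; redeploying this service cannot
        # bring down the rest of the application.
        body = json.dumps({"service": "pricing", "version": "1.0.1"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), PricingHandler).serve_forever()
```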

Virtual intelligence
But what of entirely new application areas?

Artificial intelligence (AI) and machine learning (ML) are very much part of virtualisation, said Aidan McEvoy, sales director of Zinopy.

“It’s quite interesting, where it’s going,” he said.

“When you look at a lot of this stuff and where it’s emanating from, a lot of it is geared around automating manual tasks that people were required to do, many of them repetitive and monotonous.

“That was the first iteration of automation, particularly in support services where there are clear benefits.

“What’s changed since then is the introduction of machine learning and AI, where all of a sudden it wasn’t necessary for a human to intervene,” he said.

One of the first areas where this came to the fore was security, he said, where a mesh of threats requires a sophisticated approach.

Security benefit
“In order to combat that, the likes of IBM Watson came out, able to track and trace unusual behaviour and flag it, so it could be decided whether it was a false positive or a real threat,” he said.

This methodology is all about the data, said McEvoy.

“One of our main partners, Citrix, has embraced machine learning and AI, [as well as] edge computing. The drivers we are seeing out there are very real: digital transformation has become a cliché, but what it really means is managing the security threat and managing GDPR on the one hand, as well as considering the productivity metrics.

“When you factor in those business demands, virtualisation helps to connect a lot of this together,” he said.

Citrix has invested in an Internet of Things (IoT) platform called Octoblu, with which it intends to drive a level of automation not seen before.

“It’s an open source platform, so anyone can use it,” said McEvoy.

Steve Wilson, Citrix’s vice-president of products for cloud and IoT, said that alongside its role in developing virtualised desktop solutions and deploying them over thin networks, Citrix is increasingly working in artificial intelligence.

“There are a few different places where we’re looking at AI. One of them is [part of] our value propositions around security,” he said.

“We’ve pushed into an area called ‘user behaviour analytics’. We’re now using AI and ML to build user profiles, similar to what the credit card industry does for fraud purposes, and develop internal risk scores,” he said.
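
Citrix’s implementation is its own, but the shape of user behaviour analytics can be illustrated with an off-the-shelf anomaly detector such as scikit-learn’s IsolationForest. The session features and figures below are made up for the sake of the example.

```python
# A minimal sketch of user behaviour analytics in the spirit Wilson
# describes (not Citrix's actual system). Each row is one session:
# [login_hour, MB_downloaded, failed_logins]. An IsolationForest learns
# a per-user baseline and scores new sessions; lower raw scores mean
# more anomalous, so we negate them to get a rising risk score.

import numpy as np
from sklearn.ensemble import IsolationForest

history = np.array([
    [9, 120, 0], [10, 95, 0], [9, 150, 1], [11, 80, 0], [10, 110, 0],
])  # the user's normal working pattern

model = IsolationForest(random_state=0).fit(history)

sessions = np.array([
    [10, 100, 0],   # looks like a normal day
    [3, 9000, 6],   # 3am, huge download, repeated login failures
])
for row, score in zip(sessions, model.score_samples(sessions)):
    print(row, "risk score:", round(-score, 3))  # higher = riskier
```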

However, in the Citrix vision, AI also operates closer to the end-user.

“The other thing that we’re looking at is using AI for productivity enhancement. We’re finding a tremendous amount of interest in contextual access, building up profiles on what people are doing and proactively suggesting applications or files based on what they might be working on.

“We also have built intelligent conference rooms based on IoT technology that recognise who you are and connect the right people, doing all that in a matter of seconds. We can save $50 or $100 in productivity in one of these meetings just by introducing relatively straightforward AI technologies,” he said.

Life on the edge
Rory Choudhuri, product marketing director at VMware, who specialises in virtualisation and software-defined data centres, said that the move toward ‘edge computing’ is an exciting new area for virtualisation.

Edge computing, put simply, is a method for optimising cloud use by performing more analysis close to the data source rather than at a centralised location.

“The edge bit is the most interesting,” he said.

“There is more and more data being collected: the average Airbus [aircraft] has sensors collecting petabytes of data per flight. If you take the aerospace industry, they simply couldn’t process the kind of data they need to process without a virtualised environment, which can [now] be brought close to the plane.”
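
The edge pattern itself is simple to illustrate: summarise high-rate readings beside the source and forward only a compact digest upstream. The Python sketch below invents a vibration sensor and alarm threshold purely for illustration.

```python
# Illustrative sketch of edge processing: reduce a high-rate sensor
# stream to a few numbers worth sending over the network. The readings
# and threshold here are invented.

import statistics

def summarise_window(readings: list[float], alarm_threshold: float) -> dict:
    """Reduce thousands of raw samples to a compact digest."""
    return {
        "n": len(readings),
        "mean": statistics.fmean(readings),
        "max": max(readings),
        "alarms": sum(r > alarm_threshold for r in readings),
    }

# e.g. one second of vibration data from a single sensor
window = [0.41, 0.39, 0.44, 0.40, 1.92, 0.43]
digest = summarise_window(window, alarm_threshold=1.5)
print(digest)  # only this digest crosses the network, not the raw stream
```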

The key, said Choudhuri, is that being able to encapsulate your own data as code brings a new level of agility, speed and responsiveness.

“You can spin-up multiple copies of an application or refactor your application in containers that can spin off thousands of versions of themselves.

“At the other end of the scale, there’s a lot of talk about self-driving cars. Well, take a Tesla for example: it has Tegra processors from Nvidia. What that gives you is a level of processing ability within the car [but] today there isn’t [yet] a play for virtualisation in the car,” he said.

Once the data steps outside the car, whether connecting to other cars, street furniture, public transport systems, or traffic lights—and, in all likelihood, with AI—virtualisation will likely come into play. “This is moving so fast that it has to be repeatedly re-evaluated,” he said.

Choudhuri said another area for edge computing is manufacturing, where every stage of every production line has more and more sensors collecting more and more data that, increasingly, has to be processed ever closer to real time.

This is the play for virtualisation today, then: not just efficient servers, but a connected world consisting of autonomous devices, the internet of things and the cloud.

“There is no way to get there without virtualisation,” said Choudhuri.

 
