Virtualisation: the march of containers
16 March 2018
Containers have become the talk of the IT world in the last few years, offering a lightweight, portable environment ideally suited to modern application development. But is the progress as real as the hype suggests? And if it is, will containerisation replace virtualisation, and perhaps even lead to an internecine war of standards among container platforms themselves?
Analysts have suggested that the march of containers will kill off traditional virtualisation: they are better suited to the Web and mobile, we are told, and they are lighter and, in the end, easier to live with. After all, which of us actually wants to run multiple instances of honking great operating systems?
Indeed, in 2016 Jeremy Eder, Red Hat’s principal software engineer, raised the spectre of a war between traditional virtualisation and containerisation, though, in the end, he rejected the idea that the technologies were in competition. They are, in fact, he argued, complementary.
Industry opinion appears to agree. VMware, arguably the leading virtualisation provider, does not reject containers; rather, it embraces them, supporting Docker via the open source vSphere Integrated Containers (VIC).
“What’s happening is interesting. Consolidation is nothing new—nothing really is in IT,” said Ed Hoppitt, VMware EMEA lead for applications transformation and cloud native.
According to Hoppitt, containerisation’s current role is in the development arena.
“Developers are being pushed to develop faster as IT is being pushed toward agility. Are containers going to become the new hypervisors? What’s developed from that is that the devil is in the detail. Redeploying a containerised app is very easy, but running it means dealing with operational problems.
“There is a happy co-existence now, with containers solving the dev problem but the operations [side] stepping-up with virtualisation,” he said.
His point is that real enterprise computing always shies away from the absolute bleeding-edge, and for good reason: enterprise IT is there for business reasons; it is not IT for IT’s sake.
“When you want to take something shiny and new you do still have to deal with a real operational environment,” he said.
The strongest case for containerisation remains on the development side, where the key benefit is abstraction away from fixed IT infrastructure. This, in turn, means developers gain control of the libraries that they wrap up into the container, and they can run that container locally or anywhere else they want.
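That packaging step can be sketched with a short Dockerfile. This is an illustrative example only: the base image, dependency file and application name are assumptions, not drawn from any project mentioned in the article.

```dockerfile
# Illustrative sketch: the developer, not the host, decides which
# runtime and libraries the application ships with.
FROM python:3.6-slim

WORKDIR /app

# Dependencies are declared and baked into the image, rather than
# taken from whatever happens to be installed on the server.
COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . .

# The same image then runs on a laptop, in CI, or in production.
CMD ["python", "app.py"]
```

Because everything the application needs travels inside the image, the container behaves the same wherever it is run — which is precisely the portability benefit described above.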
Which is all very well, but wouldn’t a container war, then, defeat the entire purpose of the move?
What is contained within?
Kubernetes, along with the related Cloud Native Computing Foundation, is already winning the war—at least according to US tech journalist Derrick Harris. Of course, Kubernetes’ backing from Google certainly does not hurt.
And yet, Docker is as close to a container standard as exists right now, not to mention the proliferation of both competing and complementary technologies being deployed by and under the likes of Amazon Web Services, Microsoft Azure, Apache Mesos and others.
If a container war is looming on the horizon—or at least being spoken of—then it is essential to understand, in a broad non-technical sense, what the role of containers is, as it is here that the end decision on whether or not to consolidate around a single ecosystem or push for standardisation will be made.
It is a crude way of looking at things, but the history of operational computing can be broken into two eras: the quest for more, and the quest for efficiency.
The first era covers the bulk of the client-server age: computing problems could be solved, but only with more: more computational power, more networking and more storage. Today, we have more of all of these than we know what to do with, so the question has become ‘how can we use the resources at our disposal in an efficient manner?’
The trend started with blade servers in the early 2000s, quickly followed by virtualisation and an explosion of distribution via cloud computing. Today, containerisation takes things a step further.
As reported previously in TechPro (see October 2017), key benefits driving containerisation are flexibility, lightness and portability. Fundamentally, the model allows for ‘data centre v.2.0’ through virtualisation, but what does this mean in practice?
In addition, traditional virtualisation has reached saturation point. Almost everything that can benefit from virtualisation has, at this point, been virtualised, and so the question containerisation has answered is ‘where do you go when you have already fully virtualised your computing environment?’ The answer is to virtualise individual applications and services—or, at least, the right ones.
Edwin Lewzey, technology lead at Ammeon, says that, as things stand, containerisation, while a vital technology, has yet to move into full time operations in many environments.
“We’re certainly seeing a big pick-up in containerisation. We do a lot of systems integration and professional work around DevOps—and certainly we’re seeing quite a big lift-off in containers there.
“It has certainly moved from ‘should we use containers?’ to ‘how do we use them?’. For enterprises that change started last year, and there’s a really strong take-up in financial services,” he said.
The move toward containers in operations is coming, he says, because the benefits proven in development environments, and mixed DevOps, are overcoming reticence about change. From Ammeon’s perspective as a service provider, one with a focus on DevOps in particular, the future for containers is a bright one.
“Virtualisation was a really big enabler, and containers do much the same, accelerating it further. We can create the pipelines and push it all the way to production,” he said.
Lewzey has no fears of any looming war because standardisation is already on display as the technology matures.
“It’s quite an exciting area. There’s so much work around it, and we’re beginning to see consolidation, such as Red Hat’s acquisition of CoreOS. Kubernetes [meanwhile] seems to have won out on the orchestration level.
“The other interesting thing is the standardisation of containers, which really helps avoid vendor lock-in,” he said.
All for one
This, of course, is key: if Kubernetes has, in fact, won out in orchestration, where does this leave Docker Swarm or Apache Mesos?
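What ‘winning out in orchestration’ means in practice can be sketched with a minimal Kubernetes Deployment. The application and image names here are hypothetical; the point is that the operator declares a desired state and the orchestrator, not a human, keeps the containers running.

```yaml
# Illustrative sketch of a Kubernetes Deployment: the operator asks
# for three replicas, and Kubernetes restarts or reschedules
# containers as needed to maintain that count.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app              # hypothetical application name
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
      - name: web-app
        image: example/web-app:1.0   # hypothetical image
        ports:
        - containerPort: 8080
```

This declarative model is the operational layer Docker Swarm and Apache Mesos also compete to provide, which is why consolidation at this level matters so much.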
The most common opinion is that it may not matter at all: standards will develop and interoperability will become the norm.
“It’s the story of any market. At the early stages there is always an explosion,” said Tom Long, head of technical strategy for Cisco Ireland.
“Will we see an explosion of vendors? We will. We’re seeing lots of different companies and lots of different tools supporting it [containerisation]. This competition will drive innovation and drive the industry to deliver on the promise,” he said.
For Long, this competitive environment is a pure positive as it will drive value for enterprise users—and, domestically at least, there is a long way to go.
“The impact that we’ve seen in Ireland is a varied adoption of and interest in containers. In Ireland we have a small number of very, very large US multinationals and they benefit from the use of containers in terms of scale and agility. They also have the IT teams and resources [necessary] to take advantage of them and make them work for their business.
“Then we have another camp: commercial enterprise, [effectively] indigenous enterprise, plus government and semi-states. They’re very interested in what the promise to deliver is, but haven’t wholeheartedly embraced them,” he said.
Cisco produced a report with analysts IDC predicting that, by 2020, 79.5 percent of customers will invest in containers.
“Some organisations are still not fully aware of containers. A lot of customers are fairly busy [simply] maintaining their enterprise IT. That 20.5% would be small- to medium-sized businesses, but also customers who aren’t aware that this new technology is available,” he said.
Matthew Sherian, a VMware software architect with implementors Asystec, told TechPro that adoption of containers differs according to market conditions.
Speaking shortly after returning from the US where he was working on a project, Sherian, who has worked on projects including Docker, Kubernetes and other tools, said that Ireland is among the territories that are lagging somewhat.
“It’s definitely more widespread in the States than it is in the UK and Ireland, and a little bit more widespread in continental Europe than it is in the UK and Ireland,” he said.
“I find that in Ireland, in Scotland and the north of England, people are more reticent to change.”
This is the result of both a certain conservatism and the nature of the IT projects in these locations, as well as the regulatory environment.
“Regulatory awareness was a big thing with cloud and it is with containers, too. Even virtualisation was a stretch for them at first,” he said.
“It’s computer security: no one says it’s ever fixed, but it seems to [now] be more secure.”
Slow adoption is not always a negative. At the very least it can mean avoiding pitfalls that seem obvious in hindsight.
“I’ve seen people deploy containers where they don’t understand Docker networking.”
Sherian says that, as with all technologies, containers are a case of ‘horses for courses’.
“The buzzy, cloud native app set: people who have a disparate workforce or IoT devices; they can throw things out into Azure or AWS. But when you’re going back to multi-layer applications containerisation can just add a level of complexity. VMs make a bit more sense still there, particularly with persistent data. It’s not a one size fits all situation and that’s perfectly fine, but it [containerisation] does solve a problem in the DevOps space,” he said.
Containers in the port
One name has become so synonymous with containers that it is close to impossible to discuss the technology without mentioning it, and that name is Docker. With so much at stake, however, will Docker be part of a container war? The company says no.
Speaking to TechPro from San Francisco, Docker’s chief developer advocate, Patrick Chanezon, says that containers are following the typical path all technologies move through.
“This is all normal: there’s the Precambrian explosion, the strongest companies evolve and survive and [then] things settle down.”
Chanezon says that, from Docker’s point of view, containers are a natural next step in the evolution of virtualisation.
Beyond this, however, he says containers are driving new methodologies that themselves drive change on the business side.
“I can tell you, from an industry perspective there are three of them that are very significant. The first one is server-less. There’s an evolution and you can create your application. Lots of enterprise customers are trying this out. It’s all on a cloud-native computing foundation, of course, [so] when you’re running services on top of a container that is portable. The value proposition there is that it is server-less anywhere. You’re not locked into any one provider,” he said.
“The second trend is AI [artificial intelligence] and the third is edge computing. In the last 10 years we focussed, during the cloud revolution, on infrastructure, but now with [the] IoT [internet of things] we are seeing there is a need for lots of processing [going on] at the edge.”
According to Chanezon, containers add value not only for new developments, but also for the modernisation of existing .NET and Java applications that today run on bare metal or inside a VM.
“They can add micro-services around these apps to create business value out of them,” he said.
He stressed this was a way for enterprises to break the old infrastructure and platform monopolies, adding that this in itself was why there would be no container war ‘in any meaningful sense’.
“Docker didn’t invent the technology; containers have been around for fifteen years in various forms, but Docker has really made them super simple to use… we don’t keep the technology to ourselves; it’s all built on an open source platform,” he said.