The answer to enterprise infrastructure is HCI
TechFire hears that unless there is a clear reason otherwise, hyperconverged infrastructure is the way to go
7 June 2019
A good set of criteria for assessing enterprise IT infrastructure is “any app, any cloud, any scale”. This was applied in the specific context of hyperconverged infrastructure (HCI) at the recent TechFire.
Cisco Ireland’s head of innovation and industry solutions, Brian Jordan, said that as IT departments are being asked to handle ever greater volumes of traffic and to scale applications efficiently and reliably, with less complexity, these were “three great characteristics to evaluate any technology”.
“The default answer these days, is HCI, if you are looking at a new application — unless you can rule it out,” Brian Jordan, Cisco
As the major benefits of HCI were explored — reduced complexity, lower management overheads and a lower total cost of ownership (TCO) compared with traditional architectures — an implication emerged: because less layer-specific expertise is needed for management, existing specialists’ time can be redeployed on developments that bring value to the business.
At the panel session, both Jordan and Gary Coburn, CEO, Island Networks, emphasised that this did not necessarily mean retraining highly skilled people, such as database administrators, for other disciplines; rather, it meant that people could focus more on how to drive value from their area of speciality.
Databases aren’t going away, said Jordan — in fact, there may be more of them with HCI — but less time needs to be spent maintaining environments and servers, and simplified licensing would free time for other things.
Having recounted the experience of the Exmar Group in implementing HCI, David de Roock, ICT infrastructure project manager, advised any organisation considering a platform or infrastructure refresh to push partners on what is possible.
De Roock emphasised the importance of picking the right partner: when Exmar was considering such a move, he said, existing solutions partners and vendors were just offering more of the same. One partner, however, asked probing questions and helped the group understand how HCI could increase capability while reducing complexity and management overheads. This led to what De Roock admits was something of a gamble, but its adoption of an early version of Cisco’s HyperFlex HCI platform proved to be the right move for the group, delivering on all aspects of performance and reliability.
In the panel session, the question arose of what organisations need to consider first when looking at an HCI implementation.
Cisco’s Jordan said that, as with any such move, organisations need to have a clear goal, to establish clearly what they are trying to achieve for the business, as well as the architectural piece.
Then, he said, it was important to evaluate the capability of the different HCI platforms on offer. Not all HCI infrastructures, for example, would be suited to running artificial intelligence and machine learning workloads, as they may not be capable of integrating technologies such as graphics processing units (GPUs), which are seen as vital for the performance required.
Coburn agreed, and said as a Cisco partner, Island Networks works with organisations to help them understand and properly express their goals to allow them to select the right solutions.
HCI is rarely implemented in isolation, said Shane Dunne, system engineer, data centre and multicloud, Cisco. It is generally integrated into an existing infrastructure — it is not another silo. It can integrate, he argued, with older storage types, be they block-based, iSCSI or otherwise, and it can and should be seamlessly integrated to achieve the best of both worlds.
The panel were asked about the common drivers of HCI adoption, and Jordan said that a change in mindset was apparent.
Something fundamental has changed, he said, from when we used to build out infrastructure and then tweak it for what the application needed. Now we are automating infrastructure for applications not tweaking it after the fact.
“That is what’s caused all of those operations costs,” said Jordan, “we are changing firewall rules, adding in LUNs on SANs – that dynamic has all changed.”
“The default answer these days, is HCI, if you are looking at a new application – unless you can rule it out,” he said.
Coburn agreed, and added, “it is not a question of looking at what you are already doing and then just doing it better, it is how can you do things completely differently.”
With regard to the upskilling and cross-skilling of IT people, Jordan said the new world of microservices and APIs held a lot of opportunity.
The new world is all about the programmatic, said Jordan, the API layer that makes automation possible. And that means a new set of skills around scripting, Python and how to use APIs, but within the context of existing expertise.
“It is more of an evolution of those skills to cope with the new model,” he said.
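Jordan’s point about programmatic, API-driven automation can be illustrated with a short sketch. The endpoint, payload fields and function below are hypothetical assumptions for illustration only — they are not a real HyperFlex or Cisco API — but they show the kind of Python scripting skill he describes: building a request for an infrastructure REST API instead of configuring storage by hand.

```python
import json

# Hypothetical base URL for an infrastructure management API (illustrative only).
API_BASE = "https://hx-cluster.example.com/api/v1"


def build_datastore_request(name: str, size_gb: int, replication_factor: int = 3) -> dict:
    """Build the JSON body for a hypothetical 'create datastore' API call.

    In a declarative, API-driven workflow the administrator describes the
    desired state (name, size, replication) and the platform provisions it,
    rather than carving out LUNs on a SAN manually.
    """
    if size_gb <= 0:
        raise ValueError("size_gb must be positive")
    return {
        "name": name,
        "sizeBytes": size_gb * 1024**3,
        "replicationFactor": replication_factor,
    }


# A real script would POST this payload to f"{API_BASE}/datastores" with an
# HTTP client; here we only construct and display the request body.
payload = build_datastore_request("vdi-datastore", 512)
print(json.dumps(payload, indent=2))
```

The skill involved is exactly the “evolution” the panel describes: the storage expertise (sizing, replication) is unchanged, but it is now expressed through scripts and APIs rather than manual device configuration.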
It is nothing to fear, said Cisco’s Dunne, your DBA in the new world is still a DBA, but the databases they look after are virtualised.
“Frequently, it is just the ability to do more with less,” he said.
The audience were asked how many were currently running converged technologies. About 10% responded affirmatively. A further show of hands found that around 15% were in the evaluation stage.
A question from the audience concerned the viability of these solutions for small and medium enterprises, and whether service providers were able to sufficiently leverage the benefits to provide them as a service.
Dunne said that two- to three-node clusters were perfectly viable, with cost the main determining factor. It was not so much the size of the organisation that mattered, he said, but rather the nature of the workloads and what would suit the scale.
He added that the remote management and installation capability of HCI would be a key factor to allow management as a service.
Another audience question asked about a capability comparison between the various vendors. De Roock had highlighted in the customer interview that, for Exmar, Cisco HyperFlex was by some margin the leading solution, something supported by Coburn’s citing of Gartner Magic Quadrant reports.
However, Dunne acknowledged the work done by other vendors, and the close ties that Cisco maintained with cloud providers, network infrastructure providers and other players such as VMware, but added, “We weren’t the first to release hyperconvergence, but what we are trying to achieve is as close to perfection as possible.”
Difficulties and exceptions
Another audience question asked De Roock about the difficulties encountered and the items that did not suit the HCI infrastructure. He had already highlighted an old application running on a nearly 10-year-old platform, but gave further detail on communications and collaboration applications, such as Jabber, that did not lend themselves to the initial implementation. However, technological developments during the adoption period meant that even these instances could now be virtualised to enjoy the benefits of the HCI infrastructure.
De Roock also detailed a recovery incident in which a hard drive was swapped while a server was powered down, causing a failure that was exacerbated by a bug discovered in the management software. He said that Exmar worked closely with the Springpath team within Cisco to rapidly identify the bug and develop a fix. This fix was later developed into a general update for all users before being superseded by a point release. The issue, he said, still resulted in less downtime than would have been the case in a traditional backup recovery scenario, due to the capabilities of the HCI infrastructure.