Inside Track: Future proofed




12 December 2016

Cisco built a solution called Tetration that aims to give complete visibility of all data and activity across the network for this purpose.

“Simplify is probably the easiest stage to understand — it means virtualising everything,” said Jordan.

While the industry is quite familiar with virtualising compute power, it is less familiar with software-defined storage, software-defined networking and even virtualised security functionality. These are relatively new, but virtualising all of them gives companies huge flexibility, allowing resources to be treated as a pool and allocated to applications according to their needs.



“We have to move towards policy-driven infrastructure and we can only do that when we simplify our network by virtualising compute, storage, networking and our security,” said Jordan.

“This gives us the flexibility to apply the resource pools we have in a much more flexible manner to the application and when the application moves location we can redeploy those resources in a different location whether that’s in your own datacentre or even out in the cloud.”
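As a rough illustration of the quote above, virtualised compute, storage, networking and security can be modelled as per-location pools, and moving an application simply moves its whole resource bundle with it. The pool names and amounts below are invented for this sketch and do not reflect any vendor's actual API:

```python
# Illustrative only: virtualised resources treated as per-location
# pools that follow an application when it moves.

RESOURCES = ("compute", "storage", "network", "security")

pools = {
    "on_prem": {r: 100 for r in RESOURCES},
    "cloud": {r: 100 for r in RESOURCES},
}

# What one hypothetical application needs from the pools.
app_demand = {"compute": 10, "storage": 20, "network": 5, "security": 2}


def deploy(location: str, demand: dict) -> None:
    # Draw every resource the application needs from one location's pool.
    for resource, amount in demand.items():
        pools[location][resource] -= amount


def release(location: str, demand: dict) -> None:
    # Hand the resources back so they can be reused elsewhere.
    for resource, amount in demand.items():
        pools[location][resource] += amount


def move(src: str, dst: str, demand: dict) -> None:
    # Moving the application moves compute, storage, networking and
    # security together, as the quote describes.
    release(src, demand)
    deploy(dst, demand)


deploy("on_prem", app_demand)
move("on_prem", "cloud", app_demand)
```

After the move, the on-premises pool is fully restored and the cloud pool carries the application's entire footprint, security included.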

To move to the next phase of the process and automate as much as possible, it’s necessary to develop policies.

“An application needs a policy that defines the infrastructure it needs. The only way we can automate is when that infrastructure exposes application programming interfaces, or APIs. If it’s not programmable, we can’t easily provision what’s needed for the application, so this is a really important point,” said Jordan.

“I’d go so far as to say that if your infrastructure doesn’t expose APIs, whether that’s hardware or software, it probably doesn’t belong in the next generation of datacentres. We have to be able to program them so that we can automate the provisioning of the resources to support the applications.”
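Jordan's point, that automation only works when infrastructure is programmable, can be sketched in a few lines. Everything below is a hypothetical illustration: the class, method names and policy fields are invented, not any real product's API:

```python
# Hypothetical sketch of policy-driven provisioning against a
# programmable infrastructure layer. Class, method and field names
# are invented for illustration; no real vendor API is implied.

class InfrastructureAPI:
    """Stand-in for infrastructure that exposes a provisioning API."""

    def __init__(self, cpu_pool: int, storage_gb: int):
        self.cpu_pool = cpu_pool
        self.storage_gb = storage_gb

    def allocate(self, cpus: int, storage_gb: int) -> dict:
        # Refuse the request rather than over-commit the pool.
        if cpus > self.cpu_pool or storage_gb > self.storage_gb:
            raise RuntimeError("insufficient resources in pool")
        self.cpu_pool -= cpus
        self.storage_gb -= storage_gb
        return {"cpus": cpus, "storage_gb": storage_gb}


def provision(api: InfrastructureAPI, policy: dict) -> dict:
    """Ask the API for exactly what the application's policy defines."""
    return api.allocate(policy["cpus"], policy["storage_gb"])


pool = InfrastructureAPI(cpu_pool=64, storage_gb=2048)
web_app_policy = {"cpus": 8, "storage_gb": 256}
grant = provision(pool, web_app_policy)
```

Because the allocation is a function call rather than a manual task, the same policy can be replayed against any pool that exposes the same API, which is precisely why non-programmable infrastructure blocks automation.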

The final part of the ASAP model is protect, and it is here that virtualised security becomes important.

“From a security point of view, one of the reasons we need to virtualise a lot is so that we can move. When the application moves to a different location we can move the compute, storage, networking and also the security with it.”

Means to an end
According to Matt Foley, EMEA hybrid IT presales for Hewlett Packard Enterprise, one of the keys to futureproofing an IT estate is to view it the same way a consumer views the collection of applications on their smart phone — as a means to an end.

“It makes sense to think in terms of the applications you want to use and then build out the infrastructure to support that. To futureproof that, there needs to be ongoing maintenance of the catalogue to determine at regular intervals whether that application is still appropriate,” he said.

“If it is still appropriate, then you should ask is this the best way to deliver it? And if it’s not the best way then you can make changes behind that application in order to source it from an external provider or change other features or characteristics, so that this becomes a case of application maintenance.”

Foley pointed out that the IT industry tends to adopt new technologies in waves. But just because the rest of the industry is adopting a technology doesn’t necessarily mean that’s a good choice for a specific company. It all depends on fit.

“Just because you can migrate or move an application to the cloud, change or replace it, doesn’t mean you should. Part of futureproofing IT is constructing your IT environment so that you can reasonably look at each application independently and regularly and keep them all current on their own particular schedules in their own particular ways,” he said.
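Foley's idea of reviewing each application independently, on its own schedule, can be sketched as a simple catalogue check. The application names, dates and review intervals below are made up purely for illustration:

```python
# Illustrative sketch of a per-application review schedule: each
# catalogue entry is checked on its own interval, not on a single
# estate-wide refresh cycle. All data here is invented.
from datetime import date, timedelta

catalogue = [
    {"name": "crm", "last_review": date(2016, 1, 10), "review_every_days": 180},
    {"name": "payroll", "last_review": date(2016, 11, 1), "review_every_days": 365},
]


def due_for_review(app: dict, today: date) -> bool:
    # An application is due when its own interval has elapsed;
    # schedules are per-application, not tied to a global cycle.
    return today - app["last_review"] >= timedelta(days=app["review_every_days"])


today = date(2016, 12, 12)
due = [app["name"] for app in catalogue if due_for_review(app, today)]
```

Running the check surfaces only the applications whose individual review windows have lapsed, leaving the rest untouched until their own schedules come round.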

Refresh cycles
Rather than keeping to three- or four-year refresh cycles, with dramatic changes happening as new generations of infrastructure are introduced, it is better to make changes incrementally.

“The viewpoint is from the service catalogue on down. For example, I don’t know how many smart phones I’ve had over the years, but the applications that I use have lasted much longer than the handsets and I’ve been able to make those changes more gracefully as a result of that architecture,” said Foley.

Other methodologies for futureproofing IT investment focus on fine-tuning attributes. According to Dell EMC, five in particular need attention: future-ready enterprises should be workload-ready, virtual-ready, software-defined, cloud-ready and optimised for big data.

“We are at the intersection of what we call traditional IT and new IT. Organisations need a solution provider that focuses on the business’s needs, has the technology to solve its issues, and the expertise to help navigate through these complexities. Our goal is to avoid the traditional/new split and to focus on the compute-centric and software driven model,” said Dermot O’Connell, vice president and general manager for OEM and IOT solutions for Dell EMC.

“We believe that a singular IT organisational design can both support and provide this at a much lower total cost of ownership overall. Through leveraging the emerging software-defined data centre technologies and combining that with our new infrastructure hardware designs, we deliver a superior future-ready approach that preserves the benefits of our virtualisation and consolidation efforts of the past half-decade while simultaneously supporting new IT with hyperscale-inspired efficiency.”

Preserving value
O’Connell is particularly clear on the need for enterprises to optimise for big data if they are to preserve the value of their IT investment.

“I mentioned the importance of the enterprise being big data-optimised. Experts predict that by 2020, 25 billion devices will be connected to the Internet, generating a massive amount of data which creates an opportunity for organisations to capitalise on the insights this data can provide,” he said.

“Getting started with IoT requires an approach that is grounded in experience and pragmatism because it is not easy to capture, analyse and leverage the data in an optimised way that drives business growth.”

Dell EMC believes it is best to start small with IoT projects, using devices and data that are already in place, and then to build on real-world successes.

“Move past the hype and identify realistic use cases. From there, create strategic plans based on return-on-investment analysis. A practical approach leverages your organisation-wide technical and domain expertise and builds on your current technology investments,” said O’Connell.
