Cloud versus on-premises
20 June 2016
Applications and systems run on computers. Even quantum computing, though well down the line, will need boxes, circuits and switches. So the cloud remains a metaphor rather than a technical term in any real sense. It is as much a description of our leading current approach to computing as anything else. Alternatively, it is an IT architecture that uses potentially vast arrays of servers around the globe to deliver services with power and scalability — and flexibility, if the organisation, its expert advisers and the service providers can deliver it.
But the cloud does not run in some ethereal skyscape. It runs on millions of very tangible servers in data centres around the globe. Some are owned by the big players like Amazon Web Services, Microsoft and Google, some are data centre facilities for hire and some are owned and run by large corporations and governments. What user organisations mostly want is reliability, trust and performance. Consumers, for the most part, are not fussy and simply want their cloud services to be always on and reasonably speedy; glitches tend to get ascribed to the ISP or carrier rather than the cloud service.
Organisations, on the other hand, have to be choosy. The questions are the same ones that purchase decisions and partner or supplier choices have posed through the ages. What do we want to achieve? What choices are there? Are there advantages (or indeed negatives) in this or that choice? Barry Lowry is just a few months into the role of Government CIO and is utterly clear about the decision-making drivers for cloud services versus on-premises solutions: “The first question is always ‘Is there a choice, and what is it?’ After 30-odd years in government IT, in Northern Ireland and in Britain, that is, I think, the key question. Then you can start to look at the details and alternatives and so on.
“Right now we have a data centre, and we have made good investment in kit and also in skills. So we have the opportunity to refine that into a private cloud model if we want to, which is what has happened in Northern Ireland and has been successful. It is also working in Scotland, whereas in England a couple of decades ago direct investment in IT resources was not seen as a priority. So they got into a spiral of outsourcing and currently have less choice than we have, especially given market conditions and costs in London.”
Ireland’s government departments are at a stage, Lowry says, where the next question is whether we can create a model, using the economies of scale available, that is as compelling as a private sector offering. “The Northern Ireland model is now as mature as anything comparable on these islands. It built a private cloud that opened up things like orchestration, service and capacity planning, and the ability to burst up and down to meet the needs of specific applications. That’s a strong position, because when you then look at public cloud it’s no longer about the economics.”
There are other issues, he points out, like the types of data and documents involved and in turn the sensitivity or longevity of those items. “Further education is a good example of where hybrid can be a good solution. Students have co-ownership of their data; it is not essential to the business of the institution, and from year to year they may delete it or simply ignore it. On the other hand, their academic and administrative records are the permanent responsibility of the institution, and you may argue that it would be sensible to keep such data on-premises.”
“Equally or even more important, in my view, is the need for control. When it’s our own data centre and team we have that to whatever degree is required, and in an answerable and auditable way. It’s not just security and various levels of confidentiality. Do you foresee the need to change the data or how it’s hosted in the future? Will it need to be re-used in different ways? What is the volatility of the business? For example, we have just restructured the responsibilities of various government departments, although not perhaps in a very major way. That can pose serious issues of change control, which in turn has to be part of the decision-making about cloud or on-premises.”
Understandably, Google’s Matt McNeill comes at these issues from a different perspective. As head of the Cloud Platform business for the UK and Ireland, he starts with the concept of cloud and Google services as a hyperscale platform in a world where hybrid is just today’s IT. “What’s more, I think we are just at the start of a journey rather than at the end, and at the fulfilment of something like 40 years of IT architectural decision-making. Most people now have an understanding of cloud as industrialised infrastructure-as-a-service.”
Which is by and large what it delivers. But what McNeill refers to as the ‘architectural primitives’ — CPU, data storage and network — have in many respects been superseded by Google’s cloud architecture. “The unit of computing in Google is not the server but the data centre itself, with developments like Borg [Google’s large-scale cluster manager] and containerisation enabling major forward strides in efficiency and utilisation. Everything in Google runs in a container, and we spin up something like two to three billion containers every week.”
Everything in Google architecture is designed to be as friction-free as possible, he says. “There are certainly challenges with certain workloads. In a large organisation that is decades old there will be a very mixed bag, a wide spectrum of tasks, and some of them will just not fit this next generation of computing. In fact, that was one of the factors that led us to extend Google Cloud Platform and IaaS so that it offers a comprehensive set of services.”