Driverless, serverless, IT-less future?
It could be called consumerisation, it could be called simplicity, it could be called a lot of things, but the relentless drive to make enterprise applications and architectures more manageable is expected to come of age this year.
Simplicity is the new complexity – hopefully.
Looking back, we can see how more and more of the configuration and plumbing is being hidden behind intuitive, intelligent configurators that allow users to requisition what they want, be it an application, a server, a container or an entire infrastructure, without needing to know how to build that platform.
Virtualisation took us from the application/server model to multiple virtual machines (VMs) per server, then point-and-click interfaces allowed users to provision VMs without IT being there at all. Then containers took it to another level in terms of architecture, and now AWS has added Kubernetes support to its own container management service. At every turn, the specifics of configuration and assembly, for even the most sophisticated of architectures, are being placed behind the curtain, as it were.
More and more, complexity is being removed, without detracting from the sophistication or capability of solutions, allowing those who know what they need, but not necessarily how to construct it, to specify their requirements and have intelligent services simply build the blocks of their solution.
Perhaps one of the best expressions of this trend is serverless computing.
Serverless computing, also known as functions as a service, is a bit of a misnomer, because it still runs on servers. But what it means is that once those servers have been aggregated and orchestrated within a platform, developers can ignore the usual requirements for application development that pertain to the server or VM on which the application runs. Instead, the application can receive event and client calls directly from the platform, without having to go through other layers or APIs. It simplifies architectures and improves efficiency, as well as potentially reducing development times.
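To make this concrete, here is a minimal sketch of what a serverless function looks like in practice. The handler name, event shape and return format below are illustrative assumptions, modelled loosely on the style of AWS Lambda's Python handlers: the point is that the code contains no server, VM or network configuration at all, only the business logic that responds to an event delivered by the platform.

```python
import json


def handler(event, context=None):
    # The platform passes the event payload straight to the function;
    # the developer never touches the underlying server or VM.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }


# Invoked locally here for illustration; on a real platform the cloud
# provider calls handler() itself in response to an HTTP request or event.
if __name__ == "__main__":
    print(handler({"name": "serverless"}))
```

Everything about where and how this runs, scaling, patching and routing, is the platform's concern, which is exactly the abstraction the trend describes.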
It has also been argued that serverless increases the likelihood of lock-in, as applications are written for a specific environment. But that, too, should sort itself out, as cloud vendors are already hard at work on seamless migration tools to allow you to go from one environment or ecosystem to another, without having to worry about workloads being stuck in one cloud or another.
However, the trend is the same: abstraction from the plumbing that allows users to focus on the value-creating bit, without having to worry about the infrastructure.
So, does this mean that there is, in the near future, the prospect of an IT-less enterprise?
Well, it would appear to be the classic situation of yes with an if and no with a but.
The ‘yes’ bit certainly points to the fact that a reduced requirement for infrastructure management means more time for IT professionals to look at big-picture stuff, facilitating and driving innovation, and working with business units to identify and solve their issues, thus providing greater value to the business. This has long been the goal of IT as a service, which has not really come to fruition as standard, and is still a relative rarity.
The ‘no’ bit may point to the fact that enterprises are also moving applications back from public cloud to either colocation or on-premises facilities. A survey by 451 Research in 2017 found that 41% of enterprises globally had taken an application back from public cloud to colocation or on-premises. Some reported having tested and developed an application completely in the cloud, but deployed it in a colocation facility.
If that trend is to continue, then traditional IT has a long life ahead of it.
However, all of these developments would seem to indicate that the bar for this kind of intelligent self-provisioning is only going to rise higher.
Therefore, those IT people who understand how to build solutions can continue to develop that knowledge and experience while having less to do in practical terms, allowing them to act more as consultants to the business, helping it understand how better to leverage technology to achieve its goals.
Who knows where this will lead? When software as a service began to deliver the likes of CRM, few thought that it would soon be moving on to those stalwarts of monolithic apps, the ERP system and the enterprise database. Yet now, those two things are easily deployed, backed up and delivered from the cloud.
Soon, we may see configurators where an entire enterprise is specified from drop-down menus, drag-and-drop interfaces and icon-driven tools. Or better still, AR/VR visualisations will allow us to use pure voice control to do the same, while a 3D representation allows us to walk around it, manipulate it and see how it works.
Indeed, visualisation technologies may well be the next leap in this broad trend. At VMworld Europe this year, Pat Gelsinger demonstrated a VR control for VM orchestration. He plucked a VM running a workload from one server cluster and ‘threw’ it into a VMware cloud. But what will happen when people are able to don a VR or AR device and see their entire application estate, IT infrastructure or enterprise depicted visually with all its inherent detail? What new insights will be available to those who can literally see the numbers?
Like some savant of old with a synergy of synaesthesia, will a new generation of data immersed professionals be able to imagine greater due to glorious visualisations of data made possible by complexity disguised as simplicity?
It has been said that data is the new oil, currency, plutonium and (my favourite) bacon, but is it possible that a new generation of data scientists will turn it into the new art, giving us visualisations that are both beautiful and insightful?
We can only hope.