
Data pressures call time for isolation and data lakes

User demands drive vendors to open up tools and repository technologies

21 November 2018

Old ways of organising data, in terms of storage and classification, are yielding to the demands of the hybrid world.

Just as tiered storage allowed regularly accessed data to be kept on faster, more readily accessible media, so now cloud infrastructures are allowing data hierarchies to produce the same result.

This trend is being observed alongside the idea that data lakes, so favoured by certain sections of enterprise, can now represent barriers to accessing data where it is needed.

Writing for Dataversity, Andrew Brust, senior director, Market Strategy and Intelligence, Datameer, said “the more of your data that you land and maintain in a given Cloud platform’s storage layer, the more work you will do on that data on that same cloud platform: data preparation, analytics, predictive modelling, and model training (on high-end GPU-accelerated virtual machines). The cloud battle is the data storage battle, and the winner may have you quite locked in.”

However, analysts and commentators have noted that vendors are now beginning to act on lock-in and allow users to store, classify and access data as they wish, to ensure the best use can be made of it.

Earlier this year, MicroStrategy, the business intelligence tools and services provider, made available a raft of connectors for what many had seen as competitor tools, opening a range of new possibilities for users.

These efforts across the market are seen as a direct response to disillusionment with data lakes in recent years, and a perceived lack of return on investment, prompting many to say the practice, if continued as is, was doomed.

Speaking to CIO Magazine, Monte Zweben, CEO, Splice Machine, said, “The Hadoop era of disillusionment hits full stride [in 2018], with many companies drowning in their data lakes, unable to get a RoI because of the complexity of duct-taping Hadoop-based compute engines.”

With the pressures of a potential data deluge from coming Internet of Things adoption, as well as the pressure to adopt edge computing architectures, data sources and repositories in the world of hybrid cloud and on-premises are coming under scrutiny to ensure they can enable the data-driven enterprise, not hobble it.

TechFire, in association with Logicalis and NetApp, will examine how an integrated data fabric can enable your business to draw together data from all sources, wherever it lies, to allow it to be transformed into intelligence.

Developing the vision for data-driven enterprise, the event will provide insights on how your business can utilise all of its data, unifying it into a singular resource to support strategic decision making.

Expert speakers will describe how to not only manage and unify data from multiple sources, across on-premises, clouds and services, but how to harness those sources to drive digital transformation.

Business leaders are under tremendous pressure to harness the wealth of data available and apply it to create new value across the entire organisation, within a limited time, and often while hard pressed for skills and budget. With that in mind, the event will hear from Gerard Grant of Pramerica, who has tamed the data beast and gained advantage through insights.

This free event on Tuesday 27 November at the Aviva Stadium, Dublin, will help organisations understand their data sources and how to harness their value, wherever they are. To register, see techfire.ie


TechCentral Reporters
