
Tech developments driving storage as foundation of enterprise data strategies

(Source: HP)

1 October 2014

At its recent storage-focused event, HP announced the availability of a free StoreVirtual virtual storage appliance (VSA) licence for HP and other manufacturers' server hardware. The company also outlined how the software-defined trend was impacting the world of storage, changing what can be expected of, and delivered by, storage infrastructure.

A key part of this is how underlying storage technology is enabling better integration of services that rely on the storage infrastructure, such as data back-up, protection and, critically, analytics.

Within HP, this manifests itself in closer cooperation between the storage division and the now well-bedded-in Autonomy acquisition.

Patrick Osborne, director of Product Management and Marketing, HP Storage, and Stephen Spellicy, senior director, Product Management, HP Autonomy, spoke exclusively to TechPro about developments in the technologies, usage trends and what it means for the end user.

When asked about increases in processing power, developments in in-memory technologies and the widespread use of flash storage, Spellicy said such advances had allowed HP to bring together elements of performance and capability that were not previously possible.

Search and analytics for the information management tier are enabled through the marriage of the software, middleware and application connectors with the scalable hardware, said Spellicy, “so you can’t have that whole situation without both elements [Autonomy and HP storage] working together”.

“It’s about using the Autonomy technology with highly efficient, high performing storage,” said Osborne. “It gives customers who are looking at improving their back-up and recovery process to make it more efficient, improve their SLAs, decrease their recovery time objectives (RTO), the power to deploy it selectively in their network, whether it be in software or hardware or a combination of both.”

“There is also an intelligence layer that provides visibility into the back-up infrastructure using operational analytics which we call Adaptive Back-up and Recovery, and that looks into the infrastructure to see how to retool to make the best of what is there, for the jobs, the targets, the highest levels of utilisation, and availability, and if not, to make decisions to achieve that. It iteratively looks at the environment using analytics at a machine level to improve back-up and recovery runs on a hardware and software basis.”

Having that smart level of software working in conjunction with the storage layer is really powerful for customers, said Spellicy.
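To make the idea concrete, the loop Osborne describes can be pictured as a scheduler that inspects the telemetry of past backup runs and moves struggling jobs to less loaded targets. The short Python sketch below is purely illustrative; the class, function and field names are invented for this article and do not correspond to HP's Adaptive Back-up and Recovery interfaces.

# Illustrative-only sketch of an iterative backup-analytics loop, in the spirit
# of what Osborne describes; none of these names are an actual HP API.
from dataclasses import dataclass

@dataclass
class BackupJob:
    name: str
    target: str            # backup target the job writes to (hypothetical field)
    duration_hours: float   # observed run time of the last backup
    success: bool

def rebalance(jobs, targets, rto_hours):
    """Inspect the last run of every job and decide which ones need retooling."""
    load = {t: 0.0 for t in targets}
    for job in jobs:
        load[job.target] = load.get(job.target, 0.0) + job.duration_hours

    actions = []
    for job in jobs:
        if not job.success or job.duration_hours > rto_hours:
            # Move the job to the least-loaded target so the next run has a
            # better chance of meeting the recovery time objective (RTO).
            best = min(targets, key=lambda t: load[t])
            if best != job.target:
                actions.append((job.name, job.target, best))
                load[job.target] -= job.duration_hours
                load[best] += job.duration_hours
    return actions

# Example: two targets, one job blowing past a 4-hour RTO on target "A".
jobs = [BackupJob("finance-db", "A", 6.5, True), BackupJob("mail", "B", 1.0, True)]
print(rebalance(jobs, ["A", "B"], rto_hours=4.0))  # -> [('finance-db', 'A', 'B')]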

Flash specifically has had a major impact too, not just in terms of pure speed, but also changing the way data processing is structured.

The introduction of flash as a standard component in storage arrays is a great enabler for improving performance, said Spellicy, reducing seek times for finding information, searching and indexing. The technologies available on the hardware platform make better use of the storage itself, allowing a more scalable back end that can grow transparently. Those two aspects help the upper-layer management of information.

Osborne added that it has also enabled optimisation between the two platforms: it is now possible to do some of the pre-processing activity at the storage level and feed that into the analytics framework.

“For example in StoreAll,” said Osborne, “we can selectively take portions of that data and feed that into the analytics engine. You can do a pre-processing pipeline that is enabled in the storage so you are really only doing work on the part of the storage that you need it on.

“Cutting that time to processing makes the overall analytics pipeline quite a bit shorter,” said Osborne.
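The shape of that pre-processing pipeline can be sketched in a few lines: a selection and summarisation step runs next to the data, and only the reduced records are handed to the analytics engine. The example below is a hypothetical illustration, not the StoreAll or Autonomy interface, and all names in it are invented.

# Minimal, hypothetical sketch of storage-side pre-processing: only objects
# that match a predicate are read and summarised before analytics sees them.

def preprocess_at_storage(objects, predicate, summarise):
    """Yield compact records for just the objects the analytics job needs."""
    for obj in objects:
        if predicate(obj):            # selection happens next to the data...
            yield summarise(obj)      # ...so only a reduced form leaves storage

def analytics_engine(records):
    """Stand-in for the downstream engine; here it just counts keyword hits."""
    return sum(r["matches"] for r in records)

# Example corpus: documents stored as dicts with metadata and text.
corpus = [
    {"type": "log",   "text": "error error warning"},
    {"type": "image", "text": ""},
    {"type": "log",   "text": "error"},
]

records = preprocess_at_storage(
    corpus,
    predicate=lambda o: o["type"] == "log",               # skip data we don't need
    summarise=lambda o: {"matches": o["text"].count("error")},
)
print(analytics_engine(records))  # -> 3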

The cumulative effect is that end users are able to do more, in real time, without negatively impacting the basic functionality of storage, back-up and retention.

Added to this is the fact that use cases are now driving adoption, as organisations of all sizes are seeing benefits.

Spellicy cites car rental giant Avis, which has implemented a geo-location-based pricing system for its services that allows it to tailor deals for consumers based on the most up-to-date location information. Similarly, streaming entertainment box maker Roku is using an analytics platform to determine the best content suggestions and offers for its users.

“I think the ones that are more interesting use cases are the commercial, because they drag other products behind to make a solution,” said Spellicy.

“Now we are beginning to see some of these use cases become more robust, in terms of how they are classified and what they can do,” added Osborne.

But while these examples are from the upper end of the market, Spellicy said that services such as IDOL onDemand, which offers a set of application programming interfaces (APIs) in a SaaS model, have allowed even the smallest organisations to bring analytics tools to bear on unstructured data and derive intelligence from it.
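In practice, consuming such a service amounts to little more than an authenticated HTTPS request carrying the unstructured text. The fragment below shows the general shape of that kind of SaaS analytics call; the endpoint URL, parameter names and response fields are placeholders rather than the documented IDOL onDemand API, which the article does not detail.

# Generic shape of a SaaS text-analytics call; the URL, parameters and response
# fields below are placeholders, NOT the documented IDOL onDemand API.
import requests

API_KEY = "your-api-key"                         # issued by the service provider
ENDPOINT = "https://api.example.com/v1/analyze"  # placeholder endpoint

def analyse_text(text):
    """Send a piece of unstructured text to a hosted analytics API and return its JSON result."""
    response = requests.post(
        ENDPOINT,
        data={"apikey": API_KEY, "text": text},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()   # e.g. extracted entities, sentiment, concepts

if __name__ == "__main__":
    result = analyse_text("Customer reports the new storage array halved backup times.")
    print(result)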

Spellicy said demand for such services is accelerating, showing that in the mid and lower ends of the market there is not only interest in these capabilities but a genuine requirement for them.


TechCentral Reporters


