Enterprise storage dip can be combated


16 July 2014

Investment protection
“A core advantage of software defined storage is investment protection. Unlike a traditional array, it does not require a forklift replacement after a few years”
PFH Technology Group : Declan Van Esbeck
Enterprise storage needs tend to vary by organisation. There are several inter-related storage challenges facing organisations at the moment, and the absolute priority depends on factors such as user response time demands, the extent to which an organisation is using virtualisation, and business continuity requirements.

Traditional hardware-assisted storage is still required for high performance demands. At the very top end of the scale, SSDs are needed. Hardware-assisted storage also does a better job of data reduction (thin provisioning, deduplication, compression) than a software defined storage solution.

Software defined storage really comes into its own for data that does not require a high level of throughput, and while that again varies by organisation, it is not unusual to see more than 75% of an organisation's data classified in that manner. So we see a lot of organisations keeping 50%, 75% or even more of their data on an expensive tier one storage array when a software defined solution would facilitate centralised management of the environment with a more “horses for courses” approach.

In that scenario, tier one demands can be satisfied with a traditional high performance array from one of the major manufacturers while the high percentage of storage that is required to be online, but with lower level access, can be deployed on commodity hardware and centrally managed using a software defined storage technology.
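The tiering decision described above can be sketched in code. This is a minimal illustration only, assuming last-access age as the sole tiering signal and an invented 90-day threshold; real classification would weigh many more factors (and access times are unreliable on volumes mounted with noatime).

```python
import os
import time

# Hypothetical threshold: data untouched for 90+ days becomes a
# candidate for the cheaper, software defined commodity tier.
TIER_TWO_AGE_DAYS = 90

def classify_by_access_age(paths, now=None):
    """Split file paths into 'tier1' (hot) and 'tier2' (cold) buckets
    based on each file's last-access time."""
    now = now if now is not None else time.time()
    tiers = {"tier1": [], "tier2": []}
    for path in paths:
        age_days = (now - os.stat(path).st_atime) / 86400
        bucket = "tier2" if age_days >= TIER_TWO_AGE_DAYS else "tier1"
        tiers[bucket].append(path)
    return tiers
```

Run periodically, the `tier2` bucket lists the data that could be migrated off the expensive array onto commodity hardware.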

In addition to providing centralised management and a lower cost per GB, a core advantage of software defined storage is investment protection. Unlike a traditional array, it does not require a forklift replacement after a few years, so the investment in both the software and in skills, expertise and training is protected.

One last consideration: as an industry we talk a lot about investment protection, but technology is currently developing at breakneck speed. We are replacing IT equipment more frequently than we did 10, 15 or 20 years ago. It is not unusual to find that supporting and maintaining equipment that is only four or five years old is considerably more expensive than replacing it with the latest technology once all facets of running costs, including power and cooling, are factored in.

 

Decision criteria
“To navigate this ever broadening sea of products and technologies requires a good understanding of your existing workloads and a strategy for how you want to deliver IT services”
Novosco : Eddie O’Rourke
The decision criteria that determine where to make the next storage purchase are not always clear. The storage market landscape is constantly broadening, and at a quickening pace. With the advent of solid state drives (SSDs) came hybrid storage arrays, where SSDs were used to bolster performance by decoupling IOPS from spindle count.

More recently came the march of the ‘all flash’ arrays, based entirely on solid state technologies. More recently still, the emergence of the server SAN, or ‘software defined’ storage, has heralded the era of the ‘hyper-converged’ platform.

To navigate this ever broadening sea of products and technologies requires a good understanding of your existing workloads and a strategy for how you want to deliver IT services in the short to medium term. There can be a tendency to equate ‘new’ with ‘better’, but the real answer is based on requirements. There is little point in opting for a hyper-converged platform if you have physical assets that also require storage to be provisioned. A hybrid array that can provide NAS and block storage, deliver sufficient IOPS and capacity, and support iSCSI in addition to Fibre Channel will still be the best solution for a lot of organisations.

There are a number of criteria that should be considered before the purchase of new enterprise storage, namely:
• Performance & Capacity
• Workloads
• Management
• Existing Infrastructure
• Existing and Planned IT Services
• Backup
• Business Continuity
• Cost

Considering each of the above criteria will help towards making the right decision.
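One common way to apply a criteria list like the one above is a weighted decision matrix. The sketch below is purely illustrative: the weights and the candidate ratings are invented for the example, not taken from the article, and any real evaluation would set them from the organisation's own priorities.

```python
# Illustrative weights for the criteria listed above (higher = more
# important to this hypothetical organisation).
CRITERIA_WEIGHTS = {
    "performance_capacity": 3,
    "workloads": 3,
    "management": 2,
    "existing_infrastructure": 2,
    "existing_planned_services": 2,
    "backup": 1,
    "business_continuity": 2,
    "cost": 3,
}

def score_platform(ratings, weights=CRITERIA_WEIGHTS):
    """Weighted sum of per-criterion ratings (e.g. on a 1-5 scale)."""
    return sum(weights[c] * r for c, r in ratings.items())

def rank_platforms(candidates, weights=CRITERIA_WEIGHTS):
    """Order candidate platform names best-first by weighted score."""
    return sorted(
        candidates,
        key=lambda name: score_platform(candidates[name], weights),
        reverse=True,
    )
```

The value of the exercise is less the final number than being forced to rate every candidate, hybrid, all-flash or hyper-converged, against the same explicit criteria.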

 

Identifying ROT
“Through the use of advanced data analytical technology, we can assist the organisation to defensibly delete data that is no longer required. This results in significant ROI for any storage migration project”
Ergo : Jimmy Sheahan, technical director
An organisation needs to understand its storage requirements. When Ergo assists a customer in gaining this insight, we have the capability and experience to address the following factors in the decision-making process:
1. Classification of the data: This involves classifying the data that sits on existing storage to allow the organisation to decide what data to retain prior to any potential migration to a new storage platform. The early win is to identify Redundant, Obsolete and Trivial (ROT) data: data which is duplicated, irrelevant or non-business related. Ergo, through the use of advanced data analytical technology, can assist the organisation to defensibly (from a legal perspective) delete data that is no longer required. This results in significant ROI for any storage migration project, as at the end of the data clean-up process you are only sizing for qualified data that the organisation actually needs.

2. Best location for the data: This involves identifying the best location to store data. Factors influencing this include proximity to applications/users, regulatory considerations, elasticity, scalability and cost. For example, an organisation may need to locate non-sensitive image and text data close to web applications and users in North America with an elastic cloud model; the best location in this instance would be Azure Blob storage.

3. Performance requirements: This involves identifying the performance requirements that applications and users impose on the storage and data. This is where Ergo, through the use of advanced utilities, can identify the storage platform requirements and map these needs to features such as autonomic management, dynamic optimisation and thin persistence technology.

4. Storage platforms: This involves taking the findings from “best location” and “performance requirements” and mapping those findings to the most appropriate storage platforms to meet the customer's requirements. Ergo has expertise across a wide range of storage platforms to meet the needs of our customers. One option available is to leverage existing storage for data workloads deemed appropriate following the assessments detailed above.

Ergo can modernise a legacy SAN by deploying virtual appliances or we can derive the best IOPS performance possible (and deduplication) from legacy storage with the deployment of a Microsoft Windows Server 2012 R2 Scale Out File Server (SOFS) cluster.
So, in summary: we limit the footprint requirement by migrating qualified data only; we then identify the best location for the data to be stored; identify the performance requirements for the data; and finally map the location and performance requirements to best-fit storage platforms.
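One flavour of ROT identification, exact-duplicate detection, can be illustrated with content hashing. This is only a minimal sketch of that single signal; the "advanced data analytical technology" mentioned above would also look at relevance, age and ownership, which hashing alone cannot capture.

```python
import hashlib
from collections import defaultdict

def find_duplicates(paths, chunk_size=1 << 20):
    """Group file paths by SHA-256 content hash. Groups with more than
    one member are byte-identical duplicates - one kind of ROT data."""
    by_hash = defaultdict(list)
    for path in paths:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Hash in chunks so large files never load fully into memory.
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        by_hash[h.hexdigest()].append(path)
    return {digest: ps for digest, ps in by_hash.items() if len(ps) > 1}
```

Each surviving group lists copies of the same content, so all but one member of each group is a candidate for defensible deletion before sizing the target platform.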

 
