Cost optimisation key for data centre future


1 March 2009

If one thing seems certain over the coming 18 months, it is that spend on data centres throughout the continent will increase, as evidenced by a recent study from Digital Realty Trust. Widely recognised as the world’s largest wholesale data centre provider, the company found that four out of five companies surveyed are planning data centre expansions within the next two years. That is just one finding from the European data centre market; a comparison with a similar survey from last year also shows a 22% increase in projected average data centre space requirements, from 1,300 square metres to 1,600 square metres.

If there is more space, and indeed more spend, then just what types of technology and methodology will be seen in the near future? One particular idea is termed an “evolution of data centres”. Peter Hannaford, VP of business development, enterprise and management systems in the EMEA/LAM arm of APC, part of Schneider Electric, is an advocate of the high density/high availability data centre concept, which may well be a short, sharp shock for data centre operators and developers who have been sitting on their hands of late. He said the concept is as simple as realising that as racks become smaller “people are cramming more and more in”.

Those in the know, he claimed, are now concentrating on “building up the density within the rack”, with a Schneider Electric spokesman explaining that the combination of high density and high availability will mean maximising data centre energy efficiency while allowing the physical infrastructure to be measured and monitored to “as granular a level as possible”.


POSSIBLE BOTTLENECKS
BT’s Mark Fagan, head of data centre services with the company, noted however that high density may have its issues. “Data centres have bottlenecks, whether that’s at the main supply coming in, the generating capability, or the cooling systems; somewhere you’re going to hit a limit. The problem with power dense solutions is that it’s a great concept and yes, you will have 20 times the processing power in a fifth of the space, but the reality is you’re using up 10 times the power. You would have very empty data centres with a corner taken up by a very dense solution, which has in fact sterilised your footprint.”
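Fagan’s arithmetic is worth making concrete. The following back-of-the-envelope Python sketch uses purely hypothetical figures for a site’s power budget and floor capacity to show how a tenfold jump in per-rack density exhausts the power supply with most of the floor still standing empty.

```python
# A minimal sketch of Fagan's point, using hypothetical figures: under a
# fixed site power budget, denser racks hit the power limit long before
# the floor space is used up, "sterilising" the remaining footprint.

SITE_POWER_KW = 800         # assumed total power available to the IT floor
FLOOR_CAPACITY_RACKS = 400  # assumed number of rack positions on the floor

def utilisation(rack_kw: float) -> tuple[int, float]:
    """Racks deployable within the power budget, and the share of
    floor positions they occupy."""
    racks = min(FLOOR_CAPACITY_RACKS, int(SITE_POWER_KW / rack_kw))
    return racks, racks / FLOOR_CAPACITY_RACKS

for density in (2, 20):  # kW per rack: legacy versus high-density
    racks, floor_used = utilisation(density)
    print(f"{density:>2} kW racks: {racks:>3} deployable, "
          f"{floor_used:.0%} of the floor in use")
```

With these assumed numbers, 2kW racks fill the floor, while 20kW racks hit the power ceiling at just 10% of the rack positions.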

Fagan continued: “Moving over the next year, what we’re seeing is some of the network manufacturers – Cisco, VMware and EMC – coming closer together, and from that what we’re seeing is solutions whereby you’re removing some of the appliances and virtualising them within the software. That will be hugely interesting in terms of being able to scale correctly, scaling upwards and downwards as the needs may arise.”

Hannaford added: “Anyone that hasn’t built a data centre in a couple of years is going to have a big surprise when data centre racks that a few years ago were one or two kilowatts are, hey presto, now 20 kilowatts. Of course, they’re not all 20 kilowatts and it’s not all high density. Although myself and others may bang on about high density, the real issue today is variable density. Moving onwards, it’s much more a case of: how do I deal with storage devices that are fairly low to medium power consumers, and therefore not putting out a great deal of heat, alongside someone who is putting the latest blades into the rack, using a lot of power and generating a lot of heat? You may have a data centre with some low, medium and high density inside. The solution is something that can deal with the whole range, almost a hybrid solution.”

THIN PROVISIONING
The rest of 2009, and indeed into 2010, may well see many adopt technologies such as storage virtualisation, thin provisioning and tamper-proof archive appliances. That’s according to Karl Jordan, sales manager for enterprise storage in the Technology Solutions Group of HP Ireland. He believes that these technologies will become common among organisations managing in-house storage infrastructures, but will also find their way into the data centre, where cost optimisation, storage capacity optimisation, and storage management optimisation play “a significant role in raising service level agreements and reducing costs”.

Thin provisioning is a technique that allows administrators to allocate an amount of ‘virtual’ storage to an application “based on the capacity requirements of the application for, say, three years – say 500GB,” he added. The application sees this 500GB as real storage, even though the administrator has initially allocated only 100GB of actual physical capacity to the application.

“As the application consumes this capacity, the administrator can add physical storage to the virtual pool, transparent to the application. Just like the bankers’ algorithm in the finance industry, this allows administrators to provision large virtual capacities to many applications while operating a just-in-time provision of real capacity automatically in the background. This reduces the number of disks that need to be purchased over time, reduces management overhead and significantly reduces power, cooling and space costs.”
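To make the mechanics concrete, here is a minimal Python sketch of the idea; the ThinPool class and its methods are purely illustrative, not any vendor’s API. It promises a large virtual volume up front while growing the physical pool just-in-time as data is actually written.

```python
# A minimal sketch of thin provisioning as Jordan describes it: the
# application is allocated a large virtual volume, while physical disk
# is added to the shared pool just-in-time as data is actually written.
# All names here are illustrative, not any vendor's implementation.

class ThinPool:
    def __init__(self, physical_gb: int):
        self.physical_gb = physical_gb   # disk actually purchased
        self.used_gb = 0                 # capacity actually written
        self.virtual_gb = 0              # capacity promised to applications

    def provision(self, virtual_gb: int) -> None:
        """Promise capacity without buying disk up front (e.g. 500GB
        for three years, backed initially by only 100GB)."""
        self.virtual_gb += virtual_gb

    def write(self, gb: int) -> None:
        """Consume real capacity; grow the pool just-in-time if needed."""
        if self.used_gb + gb > self.physical_gb:
            shortfall = self.used_gb + gb - self.physical_gb
            self.physical_gb += shortfall   # admin adds disk, app unaware
            print(f"pool grown by {shortfall}GB -> {self.physical_gb}GB physical")
        self.used_gb += gb

pool = ThinPool(physical_gb=100)
pool.provision(virtual_gb=500)   # the application "sees" 500GB
pool.write(60)                   # fits within the initial 100GB
pool.write(80)                   # triggers a just-in-time expansion
```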

DATA ARCHIVING
Data archiving technologies, too, have advanced significantly under the influence of regulatory requirements and the pressure on organisations to manage data growth. Companies are looking to separate operational and reference data, which up to now have largely been treated in the same way. Jordan noted that data classification can identify data that is essentially unchanging, or “reference data” as he put it.

“This can now be taken off production tier-one storage, taken out of the back-up cycle, and located on lower cost, online, tamper-proof, archive appliances. Content addressing allows this data to be stored and indexed based on the actual data content and stored using WORM and encryption capabilities to ensure it cannot be altered outside agreed policy.”
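A short Python sketch can illustrate the content-addressing principle Jordan describes; the ContentAddressedArchive class is a hypothetical illustration, not any appliance’s API. Because the storage address is a digest of the data itself, altered content lands at a different address and the original record is never overwritten.

```python
# A minimal sketch of content addressing: reference data is stored and
# indexed by a digest of its own content, so any alteration produces a
# different address and the original record cannot be silently
# overwritten (WORM behaviour). Illustrative only; real archive
# appliances layer encryption and retention policy on top.

import hashlib

class ContentAddressedArchive:
    def __init__(self):
        self._store: dict[str, bytes] = {}

    def put(self, data: bytes) -> str:
        """Store immutable reference data; the address *is* the content hash."""
        address = hashlib.sha256(data).hexdigest()
        # Write-once: an existing address already holds identical content,
        # so duplicates are deduplicated for free and never rewritten.
        self._store.setdefault(address, data)
        return address

    def get(self, address: str) -> bytes:
        return self._store[address]

archive = ContentAddressedArchive()
addr = archive.put(b"Q3 2008 audited accounts")
assert archive.get(addr) == b"Q3 2008 audited accounts"
```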

According to Jordan, this reduces the back-up load, the cost of primary storage, and power and cooling, while at the same time providing fully protected, active archive data online to users. “The availability of active archive appliances will drive the implementation of archiving projects across a wide variety of organisations in both regulated and unregulated environments in the coming year.”

KNOCK-ON EFFECT
As with any major business development in the coming year and beyond, the banking sector may have a huge knock-on effect, and data centres are no different, said Darren Thomson, senior technical director with the EMEA arm of Symantec Security Response. “I would expect to see banking regulated even further in the next two years,” he said; in turn, further adoption of IT service management standards such as ITIL will have an impact as businesses in both the private and public sectors protect their data assets more comprehensively.

More to the point, he said vendors and manufacturers will spend their money and energy on full support for virtualised environments. “Also, many vendors will start to look further at providing their services in ‘the cloud’ via software as a service (SaaS) and so on,” he added. Datapac’s John Casey, the company’s sales manager, agreed that spend on virtualisation will begin in earnest among manufacturers and vendors.

“Yes, it will make its mark even though it may have had rocky beginnings,” he said. “People have now got their heads around it. It was an ‘out there’ technology, for want of a better phrase, for quite a while, but people have been introduced to it now, see the benefits, and have seen it deployed across all of their network infrastructure. I suppose server virtualisation is only the start of it. Cisco is working very hard with VMware, for one thing.”

Tanya Duncan, general manager of Interxion Ireland, commented that among Interxion’s customers “there’s a real mix as we have well over 1,000 customers around Europe and each one of them is different in their knowledge and requirements of virtualisation”. She continued: “People are beginning to implement, but we hear more about it than see it being done, if you like. Those technologies, plus cloud computing, may be the big things going forward. If the data centre doesn’t offer it directly, a client or partner of theirs would.”

Jordan also pointed out that virtualisation in the IT environment is currently focused on the individual elements (storage arrays, servers, et cetera). “As we go forward we will see the development of integrated virtualisation – bringing together these elements and optimising them, and automating availability to meet business SLAs. The ultimate vision is the complete IT utility, where all heterogeneous IT infrastructure resources act as a utility, automatically ensuring supply meets business demand in real time,” he added.

Ed Byrne, general manager of Hosting 365, pointed out that VMware in particular already has a massive marketing organisation and channel partner community, and many deployments coming into the data centre environment now have been consolidated into a virtualised infrastructure. “Secondly,” he added, “some data centre providers are actually offering virtualisation solutions as well as standard data centre space. Virtualisation offers so many benefits in terms of management, reporting, and economic efficiencies, that I think it will continue to gain traction for the next few years.”
