DC cooling efficiency improved with higher water temps
18 May 2018
The cost of chilling water for data centre cooling systems can account for as much as 80% of overall cooling costs, making it a prime target for improvement in data centre design.
A study by Schneider Electric of two data centres found that utilising higher water temperatures resulted in energy savings of between 41% and 64%, whilst driving significant improvements in Power Usage Effectiveness (PUE).
Achieving these efficiencies required some changes to the cooling systems, but the study estimates that the additional costs would be offset by energy savings in as little as three years.
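The payback reasoning is simple arithmetic. A minimal sketch, using purely illustrative cost figures that are not taken from the study:

```python
def payback_years(extra_capex: float, annual_saving: float) -> float:
    """Simple payback period: extra capital cost divided by annual saving."""
    return extra_capex / annual_saving

# Hypothetical figures for illustration only (not from the white paper):
extra_capex = 300_000.0    # cost of extra air handlers, redesigned coils, etc.
annual_saving = 100_000.0  # yearly reduction in chiller energy spend

payback = payback_years(extra_capex, annual_saving)  # 3.0 years
```

At that ratio of capital cost to annual saving, the upgrade pays for itself in three years, in line with the study's estimate.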
In a free white paper entitled “How Higher Chilled Water Temperature Can Improve Data Center Cooling System Efficiency”, Schneider Electric says that water chillers account for between 60% and 85% of overall cooling-system energy consumption. Consequently, data centres are designed, where possible, to keep usage of chillers to a minimum and to maximise the amount of available “free cooling”, in which less power-hungry systems such as air coolers and cooling towers can keep the temperature of the IT space at a satisfactory level.
One approach to reducing water chiller energy consumption, says the white paper, is to design the cooling system so that a higher chilled water (CHW) outlet temperature from the chillers can be tolerated while maintaining a sufficient cooling effect. In this way, chillers consume less energy by not having to work as hard, and the number of free cooling hours can be increased.
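The link between the CHW setpoint and free cooling hours can be sketched as follows. This is a simplified illustration, not the white paper's method: it assumes free cooling is possible whenever the outdoor wet-bulb temperature sits below the CHW setpoint minus an approach margin, and it uses synthetic weather data:

```python
import random

def free_cooling_hours(hourly_wet_bulb_c, chw_setpoint_c, approach_c=4.0):
    """Count hours where the outdoor wet-bulb temperature is low enough
    that the cooling tower alone can meet the CHW setpoint (simplified)."""
    return sum(1 for t in hourly_wet_bulb_c if t < chw_setpoint_c - approach_c)

# Synthetic year of hourly wet-bulb temperatures for a temperate climate
random.seed(0)
year = [random.gauss(12.0, 6.0) for _ in range(8760)]

hours_at_10 = free_cooling_hours(year, chw_setpoint_c=10.0)
hours_at_20 = free_cooling_hours(year, chw_setpoint_c=20.0)
# Raising the setpoint increases the number of hours the chillers can stay off.
assert hours_at_20 > hours_at_10
```

The higher the setpoint, the larger the slice of the year in which outdoor conditions suffice, which is why warmer water translates directly into more chiller-off hours.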
As with any complex system, attention needs to be paid to all parts of the infrastructure, as changes in one area can have direct implications for another. The white paper examines the effect of operating at higher chilled water temperatures on overall cooling system efficiency, and outlines the various strategies and techniques that can be deployed to permit satisfactory cooling at higher temperatures. It also discusses the trade-offs that must be considered at each stage, comparing the overall effect of such strategies on two data centres operating in vastly different climates.
Among the trade-offs addressed were the need to install more air-handling units inside the IT space to offset the higher coolant water temperatures, and the need for redesigned equipment, such as coils, to provide adequate cooling capacity when the CHW temperature exceeds 20°C. The paper also advises adding adiabatic, or evaporative, cooling to further improve heat-rejection efficiency. Each approach requires additional capital investment, but results in lower long-term operating expenses thanks to the improved energy efficiency.
The white paper details two examples in differing climates: the first in a temperate region (Frankfurt) and the second in a tropical monsoon climate (Miami). In each case, Schneider says, data was collected to assess the energy savings accrued by deploying higher CHW temperatures at various increments, while comparing the effect of deploying additional adiabatic cooling.
The study found that an increased capital expenditure of 13% in both cases resulted in energy savings of between 41% and 64%, with improvements in total cost of ownership (TCO) of between 12% and 16% over a three-year period.
Another inherent benefit of reducing the amount of energy expended on cooling is the improvement in a data centre's PUE rating. As PUE is calculated by dividing the total amount of power consumed by a data centre by the power consumed by its IT equipment alone, any reduction in energy expended on cooling will naturally reduce the PUE figure.
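That definition can be made concrete with a short sketch. The load figures below are made up for illustration and do not come from the study:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_load_kw

# Hypothetical figures for illustration only (not from the white paper):
it_load = 1000.0        # kW consumed by IT equipment
cooling_before = 600.0  # kW spent on cooling before raising CHW temperature
other_overhead = 100.0  # kW for lighting, UPS losses, etc.

before = pue(it_load + cooling_before + other_overhead, it_load)        # 1.70
# Halving chiller energy shrinks only the cooling term of the numerator:
after = pue(it_load + cooling_before * 0.5 + other_overhead, it_load)   # 1.40
```

Because the IT load in the denominator is unchanged, every kilowatt saved on cooling comes straight off the numerator, which is why cooling-side efficiency gains show up so directly in PUE.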
The study found that PUE for the two data centres, Miami and Frankfurt, was reduced by 14% and 16% respectively.
The findings are broadly in line with other recent studies, which have found that increased cooling-media temperatures are possible without impacting infrastructure performance or longevity; Microsoft and Intel have run similar experiments.