
Why indirect adiabatic cooling is key to data centre resiliency

Wed 18 Jan 2017 | Mark Collins

Mark Collins, director at Excool, discusses the inherent resilience of indirect adiabatic cooling systems…

The dizzying rate of growth in demand for data centre space in recent years shows no signs of slowing anytime soon. The global data centre construction market was estimated at $15bn in 2015 and is set to reach $23bn by 2020. Driven by a worldwide move towards cloud adoption, and by factors such as data sovereignty, this vigorous expansion has no foreseeable end in sight.

Buoyant markets encourage healthy competition and drive costs down to levels unimaginable just five years ago. Data centre developers are increasingly challenged to lower capital expenditure to deliver what is rapidly becoming a commodity at competitive costs, while still managing to turn a profit. All the while the client’s expectations of low PUE and robust resilience remain the same.

To square this circle, many of these developers now view indirect adiabatic cooling as their first design choice for cooling new-build data centres. The dramatic reduction in capital and operating expenditure that these systems offer over traditional DX and chilled water systems is now widely recognised. What is less well understood are the markedly improved levels of resilience that indirect adiabatic coolers bring to the table.

Why indirect adiabatic cooling?

Traditional chilled water systems normally rely on two or three chillers to provide the total cooling capacity for a single data hall, with one of these machines being the standby or ‘plus one’ unit. Large centrifugal or screw-type compressors, along with twin circuit evaporators, are typical on these units. Poor oil management, control failures and inadequate maintenance are a few of the issues that could lead to compressor failure and, in extreme cases, catastrophic evaporator failure through freezing.

However rare these events may be, the heightened risk exposure for the operator is clear. With the ‘plus one’ unit out of action for a considerable period, the ability to control the operating environment in the data centre depends on the full availability of the remaining units. There is also the pipework connecting the system to the indoor CRAC units to consider: unless there is a redundant chilled water distribution system, this pipework represents a single point of failure.

Indirect adiabatic coolers reduce this risk significantly. For a 1MW IT load we would typically install six 200kW machines, providing N+1 unit-level resilience. The system relies on fans to move air and pumps to move water, with at most around 50kW of supplementary compressor-based cooling on extreme days.
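As a minimal sketch of that sizing logic, the Python below reproduces the unit count from the figures quoted above; the 1MW load and 200kW unit size come from this example, and the simple N+1 arithmetic is an illustrative assumption rather than a design method.

import math

# Illustrative N+1 sizing check using the figures quoted above (a sketch, not a design tool).
IT_LOAD_KW = 1000          # 1MW IT load
UNIT_CAPACITY_KW = 200     # nominal capacity of one indirect adiabatic cooler

units_for_load = math.ceil(IT_LOAD_KW / UNIT_CAPACITY_KW)   # N: units needed to carry the full load
units_installed = units_for_load + 1                        # N+1: one extra unit for redundancy

print(units_for_load, units_installed)   # 5 6 -> the six 200kW machines described above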

Each 200kW unit would have a total of eight fans, two pumps and two compressors. The failure of any single item is therefore not a ‘show-stopper’, as it does not represent a single point of failure on the individual unit. For instance, a fan failure results in a small increase in the speed of all the other fans and raises an alarm. Replacing the failed component is also rapid, ranging from around one hour for a fan to approximately four hours for a compressor.
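To show roughly why a single fan failure is benign, here is a hedged sketch of the airflow redistribution within one unit; the eight-fan count is from the paragraph above, while the assumption that fans share airflow evenly is illustrative.

# Rough illustration of fan redundancy within a single unit (assumes fans share airflow evenly).
FANS_PER_UNIT = 8

share_healthy = 1 / FANS_PER_UNIT           # fraction of the unit's airflow per fan, all fans running
share_degraded = 1 / (FANS_PER_UNIT - 1)    # fraction per fan with one fan out of service

uplift_pct = (share_degraded / share_healthy - 1) * 100
print(f"Each remaining fan carries roughly {uplift_pct:.0f}% more airflow")   # ~14%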

Failure of an electrical component such as an automatic transfer switch will result in the outage of the complete unit, whether it is an indirect adiabatic cooler or a chiller. In this example, however, the IT load is spread over six units rather than three chillers. Typically, IT utilisation rarely exceeds 80% of the installed load. The failure of a single indirect adiabatic cooler therefore reduces the remaining capacity to 5 x 200kW, which represents 100% coverage with N+1 resilience for 80% of the load. In the case of a chiller failure, although 100% coverage remains, the resilience is reduced to 50%.
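A worked version of that comparison, under stated assumptions: the 1MW load, six 200kW coolers and 80% utilisation are from the text above, while the 500kW per-chiller size is an assumed figure for an N+1 three-chiller plant.

# Worked comparison of remaining capacity after unit failures (illustrative assumptions, not a design study).
IT_LOAD_KW = 1000
UTILISED_KW = 0.8 * IT_LOAD_KW            # "IT utilisation rarely exceeds 80% of the installed load"

# Six 200kW indirect adiabatic coolers: capacity left after one, then two, failures.
adiabatic_after_one = 5 * 200             # 1000kW: full coverage, with a spare unit for the 800kW load
adiabatic_after_two = 4 * 200             # 800kW: still covers the utilised load

# Three chillers, assumed 500kW each (N+1 for 1MW): capacity left after one, then two, failures.
chiller_after_one = 2 * 500               # 1000kW: full coverage, but no standby machine left
chiller_after_two = 1 * 500               # 500kW: only half the installed IT load

print(adiabatic_after_two >= UTILISED_KW, chiller_after_two >= UTILISED_KW)   # True False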

The system decentralisation that can now be achieved with indirect adiabatic coolers extends to the water storage system, further increasing resilience and the case for adoption of the technology.
