The Stack Archive

Why data centre cooling is set to become a hot topic

Mon 24 Oct 2016

Timothy Arnold, Colocation Technology Director at Six Degrees Group, discusses the options for addressing data centre cooling, and the growth in data storage that makes the right strategy essential…

It is no secret that data centres can be both energy and emissions intensive; they are estimated to use approximately three per cent of the global electricity supply and create about two per cent of total greenhouse gas emissions, the same carbon footprint as the airline industry. As data centre growth shows no sign of slowing down, experts have warned that globally, data centre energy consumption will treble in the next decade.

The demand for data centres is being driven by the huge growth in data. Billions of constantly connected users, running millions of apps, are creating new data around the clock. This growth is consuming huge amounts of storage; indeed, IDC predicts that almost 2 zettabytes of data will exist by 2020 (a zettabyte is equivalent to around 1 billion terabytes).

With this kind of growth, it is essential that data centres become more energy efficient, so that energy usage doesn't rise too drastically, cost customers more, or increase the burden on the national grid and, ultimately, the planet.

So, in the spirit of Energy Saving Week, starting on the 31st October, what can companies that use or provide data centre space do to improve energy efficiency?

Keeping cool

Historically, the main focus of making data centres more "green" has been on altering the cooling systems (air conditioning) in legacy facilities, as cooling has represented 30-60% of the facility's non-IT energy usage (that is, energy not consumed by the server equipment itself).

Legacy systems have also used hydrofluorocarbon (HFC) greenhouse gases for cooling data centres. HFCs were introduced in the 1980s as a replacement for ozone-depleting gases but are now considered dangerous to the climate in their own right. Indeed, this month a global deal was reached to limit the use of HFCs in the battle to combat climate change.

Now, the latest technology being installed in the newest data centres uses indirect fresh-air cooling, which is essentially a fan in a box with a heat exchanger that uses air at ambient outdoor temperature to cool the server equipment. If the ambient temperature is too high for the servers, it then uses adiabatic (evaporative water) cooling to reduce it. This method of cooling is more easily installed across existing facilities and uses less energy than legacy systems.
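The switching behaviour described above can be sketched as simple control logic. This is an illustrative model only: the temperature thresholds, mode names and the mechanical-assist fallback are assumptions for the sketch, not figures from any particular cooling unit.

```python
# Hypothetical sketch of the control logic behind an indirect fresh-air
# cooling unit: use outdoor air via the heat exchanger when it is cool
# enough, and fall back to adiabatic (evaporative) cooling when it isn't.
# All thresholds below are illustrative assumptions, not vendor specs.

SUPPLY_TARGET_C = 24.0   # desired supply-air temperature to the servers (assumed)
ADIABATIC_DROP_C = 8.0   # temperature drop evaporative cooling can add (assumed)

def cooling_mode(outdoor_temp_c: float) -> str:
    """Choose a cooling mode from the ambient outdoor temperature."""
    if outdoor_temp_c <= SUPPLY_TARGET_C:
        # Outdoor air alone, passed through the heat exchanger, is enough.
        return "free-cooling"
    elif outdoor_temp_c - ADIABATIC_DROP_C <= SUPPLY_TARGET_C:
        # Spray water over the heat exchanger to shed the excess heat.
        return "adiabatic"
    else:
        # Very hot days: supplement with mechanical (compressor) cooling.
        return "mechanical-assist"

for t in (12.0, 28.0, 35.0):
    print(t, cooling_mode(t))
```

The attraction of this design is that the energy-hungry modes only engage for the minority of hours when ambient air cannot do the job on its own.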

Northern exposure

For those who are slightly braver, in the right (cold) climate and chasing maximum efficiency (such as Facebook), there are direct fresh-air systems where it is possible to simply pull the outdoor air into the data centre, negating the need for a heat exchanger and saving further energy. For this, however, you need a good level of filtration. Traditionally, customers tend not to be keen on this technology, preferring either indirect cooling or a chilled-water system instead. However, if energy prices rise, this means of cooling could provide a realistic solution.


Cooler climates can certainly help create green data centres; for example, facilities in Iceland and Norway can stave off the requirement for adiabatic water cooling altogether. However, accessibility is also key: most customers don't want to travel to Iceland to maintain their server equipment. The UK offers a good balance of temperatures and, perhaps more importantly, prime network connectivity, which is part of the reason why we chose Birmingham for our latest data centre facility.

To get technical for a moment, the key to all variants of cooling technology is the server equipment inlet temperature. Modern IT hardware can now deal with higher inlet temperatures, going from a traditional 18 degrees centigrade to 27 or above, which makes any cooling system more efficient, assuming, of course, that you've used cold or hot aisle containment.
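A toy calculation shows why raising the acceptable inlet temperature matters: it extends the fraction of the year in which ambient air alone can cool the servers. The monthly temperatures below are invented for illustration; they are not real climate data for any location.

```python
# Why a higher server inlet temperature makes cooling cheaper: the warmer
# the air the servers will accept, the more of the year plain outdoor air
# can do the job. These monthly averages are invented for illustration.

monthly_avg_c = [5, 6, 8, 11, 14, 18, 20, 20, 17, 13, 8, 5]

def free_cooling_months(inlet_limit_c: float) -> int:
    """Months whose average outdoor temperature stays below the inlet limit."""
    return sum(1 for t in monthly_avg_c if t < inlet_limit_c)

print("18C limit:", free_cooling_months(18.0), "months")
print("27C limit:", free_cooling_months(27.0), "months")
```

With this made-up climate, the 27-degree limit allows free cooling all year round, whereas the 18-degree limit leaves the summer months needing extra help.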

There are other smaller, more surprising technology advancements, such as 'eco-mode' UPS systems and lighting, that data centre operators should take advantage of; we saved 5kW alone by changing the light bulbs in one of our facilities! Because data centres have such a long lifespan and their power draw is constant, making even small energy-efficiency changes can have a large and sustainable impact.
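The lighting figure above compounds precisely because a data centre runs around the clock, so the saving accrues every hour of the year. A quick back-of-envelope sketch (the electricity tariff is an assumed figure, not one from the article):

```python
# Back-of-envelope arithmetic for the 5kW lighting saving. A data centre
# runs 24/7, so a constant power saving accrues for all 8,760 hours of
# the year. The tariff below is an illustrative assumption.

saved_kw = 5.0              # constant power saving from the new bulbs
hours_per_year = 24 * 365   # 8,760 hours
price_per_kwh = 0.10        # assumed tariff, GBP per kWh

annual_kwh = saved_kw * hours_per_year
annual_cost = annual_kwh * price_per_kwh

print(f"{annual_kwh:,.0f} kWh/year saved, roughly GBP {annual_cost:,.0f}/year")
```

Even at a modest tariff, a seemingly trivial 5kW turns into tens of thousands of kilowatt-hours a year, which is why "small" fixes pay off over a facility's long lifespan.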

Data efficiency is energy efficiency

There is no doubt that data centres and IT hardware have become more efficient; however, the difference between an average provider and one that is really first-class is how they use the hardware to reduce their energy footprint.

There is no point having a really efficient data centre if the hardware is only being used to 10% of its capability. A server's power draw does vary with its utilisation rate, but not as dramatically as some might think: a lightly loaded server still draws a large share of its peak power, although this is getting better. This is where virtualisation and 'the cloud' come in. By using virtualisation technologies, the utilisation rates for servers can be increased, making far more efficient use of the energy consumed.
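One way to picture the consolidation argument is a simple linear power model, in which a server draws an idle floor plus a load-proportional component. The idle and peak wattages here are illustrative assumptions, not measurements of any particular hardware:

```python
# Illustrative model of why low utilisation wastes energy. A server's
# power draw is roughly linear between an idle floor and its peak, so an
# idle machine still burns much of its peak power. The idle and peak
# figures are assumptions for illustration only.

IDLE_W, PEAK_W = 200.0, 400.0   # assumed idle and full-load power draw

def server_power(utilisation: float) -> float:
    """Approximate power draw in watts at a given utilisation (0.0 to 1.0)."""
    return IDLE_W + (PEAK_W - IDLE_W) * utilisation

# Ten physical servers each 10% busy...
before = 10 * server_power(0.10)
# ...versus the same workload virtualised onto two servers at 50% each.
after = 2 * server_power(0.50)

print(f"before consolidation: {before:.0f} W, after: {after:.0f} W")
```

Under these assumptions, the same workload consumes well under a third of the power once consolidated, because eight idle floors of wasted wattage disappear.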

On a wider scale, individuals also have a duty when it comes to 'digital efficiency'. Do we really need to watch that YouTube video of a cat dancing, or send that 10 megabyte attachment to fifty people? Creating, transmitting and storing all of this digital data requires energy. Consumer education is key, and every little helps.

