Energy efficiency: saving CapEx and OpEx in high-performance, industry-compliant data centres
Mon 19 Nov 2018 | Michael Akinla
With energy costs rising and market pressure mounting to reduce energy consumption, for environmental as well as return on investment (ROI) reasons, competition in the data centre market continues to intensify. The momentum to improve energy efficiency has been helped by developments driven by hyperscale operators, customer requirements (reducing PUE and cost per kW utilised) and manufacturers' equipment warranty parameters (certifying higher operating temperatures).
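For reference, PUE (Power Usage Effectiveness) is simply the ratio of total facility energy to the energy delivered to the IT equipment. The sketch below shows the calculation; the kW figures are illustrative assumptions, not drawn from any specific site.

```python
# Illustrative PUE calculation (Power Usage Effectiveness).
# PUE = total facility energy / IT equipment energy; 1.0 is the theoretical ideal.
# The kW figures below are hypothetical, for illustration only.

total_facility_kw = 1500.0  # IT load plus cooling, lighting and distribution losses
it_equipment_kw = 1200.0    # energy delivered to servers, storage and network gear

pue = total_facility_kw / it_equipment_kw
print(f"PUE = {pue:.2f}")  # PUE = 1.25: every watt of IT draws 0.25 W of overhead
```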
The long-accepted policy of over-specifying cooling systems was based on the principle that equipment should be operated at a temperature staff were comfortable working in. Today, operators and most customers understand that with the energy cost of cooling systems outpacing the energy used in the technology suites themselves, it is time to design for performance and efficiency.
A well-designed white space with a monitored, controllable cooling and environmental system can run on far less energy. In many cases, the latest developments in thermal planning, monitoring and cooling optimisation are saving hundreds of thousands of pounds in energy costs, as well as pre-empting problems and delivering a more resilient, reliable data centre.
Data has become an increasingly valuable corporate asset, and the requirement to develop systems that guarantee data availability and delivery has steered more board-level IT decisions toward standards-based solutions.
International standards such as ASHRAE TC 9.9, ETSI EN 300 and EN 50600-2-3 are driving acceptance of best practice in technology suites and data centre environments. ASHRAE TC 9.9 provides a framework for compliance and for determining suitable Information Technology Environments (ITE). These industry guidelines provide detailed technical information that allows data centre operators to implement cooling strategies in which equipment operation is optimised under carefully monitored and controlled airflow, temperature, humidity and other significant variables. Monitoring is typically applied at four levels:
1. Cabinet/rack level – measure inlet temperature and relative humidity (RH) at the bottom, middle and top of each cabinet, maintaining the recommended (18–27°C) as well as the allowable (15–32°C) thermal ranges (a minimal check of these ranges is sketched after this list).
2. Containment level – in addition to 1, with a cold aisle containment system the hot aisle temperature can be in the range of 50°C, so instrument and monitor the outlet temperature at the top of each rack and cabinet. With a hot aisle containment system, monitor temperatures across the room.
3. Data hall level – in addition to 1 and/or 2, humidity and temperature need to be monitored at the supply and return of each CRAC/CRAH. Relative humidity is recommended at 60% RH, with an allowable range of 20–80% RH.
4. Airflow management and cooling system control – implement an airflow management and cooling system control strategy. With good airflow management the temperature rise across a server can be up to 15°C; with an inlet temperature of 27°C the hot aisle can reach 55°C.
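As a minimal illustration of the rack-level checks above, the sketch below classifies a single inlet reading against the recommended (18–27°C) and allowable (15–32°C, 20–80% RH) figures quoted in this list. The sensor positions and readings are placeholders, not output from any particular monitoring product.

```python
# Minimal sketch: classify a rack inlet reading against the ASHRAE-derived
# ranges quoted above. The readings themselves are illustrative.

RECOMMENDED_TEMP = (18.0, 27.0)   # degC, recommended envelope
ALLOWABLE_TEMP = (15.0, 32.0)     # degC, allowable envelope
ALLOWABLE_RH = (20.0, 80.0)       # % relative humidity, allowable envelope

def classify_inlet(temp_c: float, rh_pct: float) -> str:
    """Return 'recommended', 'allowable' or 'out-of-range' for one reading."""
    if not (ALLOWABLE_RH[0] <= rh_pct <= ALLOWABLE_RH[1]):
        return "out-of-range"
    if RECOMMENDED_TEMP[0] <= temp_c <= RECOMMENDED_TEMP[1]:
        return "recommended"
    if ALLOWABLE_TEMP[0] <= temp_c <= ALLOWABLE_TEMP[1]:
        return "allowable"
    return "out-of-range"

# Example: readings taken at the bottom, middle and top of one cabinet.
readings = {"bottom": (19.5, 45.0), "middle": (24.0, 50.0), "top": (29.5, 55.0)}
for position, (temp, rh) in readings.items():
    print(position, classify_inlet(temp, rh))  # top lands in 'allowable' only
```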
A current Panduit data centre client has designed mechanical refrigeration out of its technology suites, utilising instead an N+1 Indirect Evaporative Cooling (IEC) system, which provides highly efficient climate control while offering resilient back-up capability in the unlikely scenario of a unit failure. The design also incorporates free-cooling technologies, resulting in increased reliability, higher energy efficiency, greater sustainability and lower operating costs. The site is compliant with the ASHRAE Thermal Guidelines (2011 and 2015) and is one of the first European data centres working with the Open Compute Project's data centre programme to standardise data centre designs. The cooling system is designed for a temperate climate; regional variations will require modification to ensure maximum efficiency.
Each data centre technology suite has a capacity of up to 2.2MW, whether for shared or individual customer use. The suites offer a single-span open hall and are designed to utilise hot aisle containment enclosures, such as Panduit's Net-Contain system, allowing higher-density racks, up to 40kW, to be optimised using industry-leading cooling and monitoring technologies.
Energy-efficient data centre cabinet systems allow higher temperature set points and can reduce cooling system energy consumption by up to 40%. Controlling small air leaks in the cabinets and enclosures maintains separation between hot and cold air streams, contributing to large savings in cooling system energy costs.
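To put the "up to 40%" figure in context, a back-of-envelope estimate follows; the cooling load, tariff and saving fraction are assumptions chosen for illustration, not measured site values.

```python
# Back-of-envelope estimate of annual cooling energy savings.
# All inputs are illustrative assumptions, not measured site data.

cooling_load_kw = 400.0   # assumed average cooling system draw
hours_per_year = 8760     # continuous operation
price_per_kwh = 0.12      # assumed tariff, GBP per kWh
saving_fraction = 0.40    # the article's "up to 40%" upper bound

annual_kwh = cooling_load_kw * hours_per_year
annual_saving_gbp = annual_kwh * saving_fraction * price_per_kwh
print(f"Potential saving: ~£{annual_saving_gbp:,.0f} per year")  # ~£168,192
```

Even at half that saving fraction, a load of this size recovers a six-figure sum annually, which is consistent with the scale of savings described earlier in this article.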
In this design, the regulated cool air from the IEC system is diffused into the technology suite. Hot aisle containment allows the operator to manage airflow across devices, such as server racks, effectively: cool air is drawn into the front of the enclosures and cabinets, passes through the hot equipment, and the hot exhaust air is directed up and away through exhaust ducts into the ceiling space, where it is recycled to the IEC system for heat transfer.
New high-performance equipment and greater compute densities have increased the heat generated in each rack, requiring the installed cooling and containment system to be even more effective at airflow management. Higher-performance equipment may require specific airflow inlet ducts to direct air into the server or switch intake, maximising the front-to-back airflow pattern and reducing energy use by allowing lower equipment fan speeds.
Today's white space processing equipment is designed to tolerate higher operating temperatures, permitting warmer white space operation and meaning less energy is needed to condition the air inlet temperature. Device inlet temperatures between 18–27°C and 20–80% relative humidity (RH) will usually meet manufacturers' operational criteria. What becomes increasingly important is the capability to monitor and control the recommended environmental range, including temperature and RH, and to maintain the allowable environmental envelope within which the systems operate at optimum performance.
Effective Environmental Management
Monitoring systems, such as SynapSense, provide various levels of data, so it is important to understand the level of granularity your needs require. Once airflow management is optimised, the system should offer active control to mitigate temperature risks associated with fan failures, maintenance schedules, relocations, changes in IT load, and software patches and failures. The chosen solution should offer an advanced wireless sensor mesh network, in which sensing devices, gateways, routers, server platforms and a comprehensive software platform provide connection and monitoring across the entire technology suite. The system needs to integrate data sets from every key piece of equipment to give management a comprehensive, versatile tool for analysis and intelligent trend gathering.
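As a simplified illustration of the kind of trend gathering described above (not SynapSense's actual API), the sketch below keeps a rolling window of inlet temperatures per sensor and flags a sustained upward drift; the sensor name, window size and threshold are placeholders.

```python
# Simplified trend-watching sketch: keep a rolling window of readings per
# sensor and flag a sustained rise. Illustrative only; it does not reflect
# any real monitoring product's API.

from collections import defaultdict, deque

WINDOW = 12          # readings per sensor (e.g. one hour at 5-minute intervals)
DRIFT_ALERT_C = 3.0  # alert if the window rises by more than this many degC

history = defaultdict(lambda: deque(maxlen=WINDOW))

def record(sensor_id: str, temp_c: float) -> None:
    """Store a reading and print an alert on sustained upward drift."""
    window = history[sensor_id]
    window.append(temp_c)
    if len(window) == WINDOW and window[-1] - window[0] > DRIFT_ALERT_C:
        print(f"ALERT {sensor_id}: inlet rose {window[-1] - window[0]:.1f} degC "
              f"over the last {WINDOW} readings")

# Example: a slow rise on one hypothetical rack-top sensor trips the alert.
for i in range(20):
    record("rack42-top", 24.0 + 0.3 * i)
```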
Data centres are an increasingly important hub within the digital economy. Many older sites, with legacy technology, expensive cooling equipment and minimal monitoring and analysis capabilities, are becoming so inefficient that the choice is change or lose clients to higher-performance, more efficient sites. All data centres are different, whether in construction, region, or the availability and price of energy, so each requires an individual solution to achieve the most effective position in the market. Today, the market is evolving faster than ever, but the constant remains: the data centre must be more efficient, customer-focussed and offer 100% uptime.