What 2017 has in store for the data centre
Tue 13 Dec 2016
From a growing focus on green initiatives to a shift to the Edge, The Stack compiles a selection of comments from industry-leading experts on what the future holds for the data centre in 2017 and beyond…
The industry has made significant progress in recent years on improving its energy efficiency but this is only one element of data centre sustainability. Life cycle assessment (LCA) studies have found that source energy and embodied impacts also have an important bearing on a data centre’s environmental impact.
Some operators purchase renewable energy and report on their carbon footprint. However, environmental impacts include other pollutants and phenomena which may be categorised into four areas of protection: Human Health, Ecosystem Quality, Resource Depletion and Climate Change.
Analysis has shown that electricity generated by fossil fuels has an environmental impact many thousands of times greater than that from renewables. Embodied impact relates to that associated with the lifecycle phases before and after operation, including the impact of extracting minerals to create components, manufacturing processes, transport and disposal.
In data centres, the IT equipment is responsible for a large embodied impact due to the materials and processes associated with IT hardware, which may be refreshed several times in a facility’s lifetime. M&E plant is also relevant. As we move into 2017, work in this area will continue to raise stakeholder awareness of wider environmental impacts and encourage positive change.
We are living in a time of change, and one new way to improve data centre reliability is to control the clean air environment inside the building. Air monitoring combined with Eurovent A+ rated low-energy air filters is a sound strategy.
However, preventing downtime remains the priority, so reduced plant running costs can only be achieved with the correct clean air solution. Filter products tested to ISO 16890:2016 can give greater peace of mind.
The new filter test standard classifies particle filters as ePM1, ePM2.5 or ePM10 efficient, with Coarse as the lowest grade. The intention of the new standard is to provide ‘closer to real life’ performance data.
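As an illustration only, the grouping logic can be sketched as below. The thresholds are a simplified reading of the standard (each group requires at least 50% efficiency for its particle size class) and are no substitute for the standard itself:

```python
def iso16890_group(epm1_min, epm25_min, epm10):
    """Simplified ISO 16890 grouping (illustrative, not normative).

    Efficiencies are fractions in 0..1. ePM1 and ePM2.5 use the minimum
    (post-conditioning) efficiency; ePM10 uses the average efficiency.
    """
    if epm1_min >= 0.50:
        return "ISO ePM1"
    if epm25_min >= 0.50:
        return "ISO ePM2.5"
    if epm10 >= 0.50:
        return "ISO ePM10"
    return "ISO Coarse"  # lowest grade under the new scheme

print(iso16890_group(0.62, 0.75, 0.90))  # a fine filter: ISO ePM1
print(iso16890_group(0.10, 0.20, 0.35))  # a coarse filter: ISO Coarse
```

A filter that misses the 50% bar for the finer classes cascades down to the next group, which is why the standard reads as a ladder rather than a single rating.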
The increasing trend to use direct free cooling means energy running costs for facility temperature control can be reduced dramatically but supply air may need cleaning.
Where corrosive gases pose a data centre reliability problem, molecular gas filtration products tested to the new ISO 10121:2016 standard can provide a ready solution cleaning both supply air and recirculation air. People also work more efficiently in a healthy clean air environment.
2016 saw a massive increase in global data traffic, breaking through the one-zettabyte milestone on the way to an estimated 2.3 zettabytes per year by 2020. Given the growth of streaming driven by TV, video and online gaming through global phenomena such as Pokémon Go, these numbers may well prove to be underestimates!
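A quick back-of-the-envelope check on those figures: growing from roughly 1 ZB per year in 2016 to 2.3 ZB per year in 2020 implies an annual growth rate of around 23%.

```python
# Implied compound annual growth rate, assuming traffic grows from
# ~1 ZB/year in 2016 to ~2.3 ZB/year in 2020 (four years of growth).
start_zb, end_zb, years = 1.0, 2.3, 4
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # roughly 23% per year
```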
With the prospect of massive volumes of data, the capability of current data centre infrastructure to handle this inundation must be scrutinised.
Whether on-premise or outsourced, a fundamental imperative for data centre service providers is assured availability and uptime. As data becomes more valuable, colocation, web-scale and financial data centres must also provide low-latency network infrastructure for near real-time transactions. In this context, fibre cabling solutions are rapidly gaining ground.
However, not all fibre cabling systems are created equal, and many low-cost solutions are inadequate for the requirements of an increasingly complex and unclear standards landscape.
This could not only limit the technical capability and value of the data centres installing them but also significantly shorten their useful operating life without substantial further investment. Only by investing in proven, standards-compliant fibre infrastructure can we ensure that sites will cope with the data tsunami heading our way.
Data centres everywhere are moving quickly to address bandwidth demands. In fact, switch-to-switch connections of 40 and 100 gigabit Ethernet are expected to make up 26 percent of data centre infrastructure by the end of this year, according to research organisation BSRIA.
This rapid transition will require fibre systems that are flexible and scalable. For example, a 24-fibre MTP®-based cabling infrastructure can give data centre managers the flexibility to meet current needs, and then use the existing fibre backbone when it comes time to upgrade to higher speeds.
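To make the flexibility argument concrete, here is a small illustrative calculation. The fibres-per-link values are the usual lane counts for these Ethernet media types, but the trunk layout itself is a hypothetical example, not a specific vendor design:

```python
# How many links of each type a single 24-fibre MTP trunk could carry.
TRUNK_FIBRES = 24
fibres_per_link = {
    "10GBASE-SR (duplex)": 2,  # one transmit + one receive fibre
    "40GBASE-SR4": 8,          # 4 x 10G lanes each way
    "100GBASE-SR4": 8,         # 4 x 25G lanes each way
}
for media, fibres in fibres_per_link.items():
    print(f"{media}: up to {TRUNK_FIBRES // fibres} links per trunk")
```

The point is that the same 24-fibre backbone serves twelve duplex 10G links today and three parallel-optic 40G/100G links after an upgrade, without pulling new cable.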
In addition, new 25GBASE-T and 40GBASE-T standards give data centres an affordable and flexible copper option at the access layer. These networks will be supported by Category 8 cable and connectivity, which brings the familiarity of an RJ45-based infrastructure. We can expect to see new 25/40 GbE equipment and Category 8 solutions available in 2017.
Wilkie of Brand Rex continues…
Another interesting trend is the proliferation of small mobile and micro data centres at the network edge. Deployed remotely from a main data centre, micro data centres act as a satellite location closer to the end user. This is important for processing data in real-time and reducing latency, as we continue to add billions of data-generating devices through the Internet of Things, from vending machines to fitness trackers.
Micro data centres require efficient ultra-high-density fibre solutions, and in some cases high-density copper cabling solutions. For example, there are systems available today that can patch up to 144 fibres in an enclosure or patch panel that takes up only one rack-unit of space.
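The density arithmetic behind a panel like that is straightforward; the rack height below is an assumption for illustration:

```python
# Illustrative density arithmetic for the 1U panel mentioned above.
fibres_per_ru = 144
duplex_ports_per_ru = fibres_per_ru // 2   # one duplex link = Tx + Rx fibre
rack_units = 42                            # assumed full-height rack
print(duplex_ports_per_ru, "duplex ports per RU")
print(fibres_per_ru * rack_units, "fibres in a fully loaded rack")
```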
The growth in importance of Edge computing will continue apace as latency, bandwidth limitations, security concerns and regulatory requirements encourage many applications to be hosted away from large centralised data centres. This in turn creates the need to address, in Edge infrastructure, many of the same issues that first encouraged the growth of cloud computing around centralised data centres.
The industry will see the migration downstream to smaller data centres of capabilities such as redundant power supplies and cooling, dual network connectivity, remote monitoring enabled by software, and a greater concentration on physical security. This will be reflected both in management practice and in the development of products aimed directly at the small data centre market, with features such as redundancy and security built into standardised micro data centre solutions.
Modern digital working practices are key to business success. Our recent research found that IT decision makers felt that digital success equated to both increased customer satisfaction and revenue growth. But with the IT landscape becoming more complex, and legacy IT potentially holding back innovation, the road to digital transformation isn’t straightforward. Without a data centre strategy that can support both old and new IT functions, organisations will fail to meet their digital ambitions.
Modern data centre capabilities, able to support hybrid IT, are now essential. But beyond this, they must also have inbuilt resilience to maintain robust security and constant availability of services, and so meet the demands of the business. End users – whether internal or external to the business – will settle for nothing less.
Juggling data centre modernisation while maintaining constant uptime and appropriate security is no mean feat. With the vast array of aspects involved in providing the right infrastructure to deliver business outcomes, organisations increasingly need expert help to achieve these data centre environments.
The coming years will see a rise of businesses recognising the benefits of partnering with an expert provider who understands and caters to the progressively complex needs of their organisation.
In 2017 we will see even greater adoption of microservices, as companies look to accelerate application delivery by breaking down bigger applications into smaller building blocks, which can be advanced and deployed independently. To support this, companies will turn towards a whole new underpinning software-defined data centre environment, to enable improved automation, easier deployment and greater scalability of workloads.
As with any IT revolution, there are hurdles to overcome and pitfalls to avoid when making this transition. The speed with which new services can be launched, and the constantly changing nature of dynamic, self-optimising data centre environments, make it extremely difficult for IT teams to keep track of the impact that IT infrastructure has on the performance of digital services.
The application transaction path is constantly changing in a software-defined data centre, which means that it is no longer possible to monitor digital performance in the conventional way. There are simply far too many moving parts for human operators to manage manually, so CIOs need to be looking at AI and advanced machine learning to detect degradations before they affect end-users. In effect, it’s a bit like using a giant magnet to instantly draw the needle out of the data centre haystack, regardless of how often it moves.
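As a toy illustration of the kind of automated detection meant here – a simple rolling-baseline check, far simpler than the AI and machine-learning techniques the author has in mind, with all names hypothetical:

```python
from collections import deque
from statistics import mean, stdev

def make_detector(window=60, threshold=3.0, warmup=30):
    """Flag a response-time sample sitting more than `threshold` standard
    deviations above a rolling baseline of recent samples."""
    history = deque(maxlen=window)

    def check(sample_ms):
        degraded = False
        if len(history) >= warmup:  # need enough baseline before judging
            mu, sigma = mean(history), stdev(history)
            degraded = sigma > 0 and (sample_ms - mu) / sigma > threshold
        history.append(sample_ms)
        return degraded

    return check

detect = make_detector()
normal = [detect(20 + (i % 5)) for i in range(60)]  # steady ~20-24 ms traffic
spike = detect(200)                                 # sudden degradation
print(any(normal), spike)  # False True
```

A static-threshold version of this breaks the moment the transaction path changes; the appeal of ML-driven approaches is that the baseline itself adapts as the software-defined environment reshapes itself.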