The Stack Archive

Microsoft’s Paul Slater on the future of the data centre

Wed 4 Mar 2015

As the lead architect of Microsoft’s Modern Data Centre Initiative, Paul Slater has a comprehensive view of current issues and trends in the data centre environment. Slater is also an accomplished IT journalist and the author of over 30 white papers – in addition to a formidable output of books on subjects including IT security, cloud strategy and IT portfolio management. Paul will be speaking at Data Centre World Expo at the ExCel Centre in London (11th-12th March). Click here to find out more about the most important data centre event of the year.

If there is a failure in the modern data center, it is in missing some of the opportunities for improvement that we have found by running cloud services at the highest levels of efficiency. Some of the lessons we have learned will only apply in the kind of huge-scale, huge-growth environments that we own, but others apply to any organization, and many of them are not yet understood or acted upon today.

We are in an industry that is increasingly influenced by the economics of the public cloud, and even organizations that do not use public cloud services are benefiting from this change. However, despite the downward pressure on cost, reducing costs is no longer enough to compete effectively. We have to demonstrate that our value proposition is tied to real, verifiable business outcomes.

We have made the strategic decision to be at the heart of the public cloud revolution, and, as such, efficiency is at the core of everything we do. This means adjusting our thinking to start from the service and work down: how do we provide our services in the most efficient, cost-effective way possible? The reality is that our customers don’t measure us by the price, availability and performance of a data center – they measure us by the price, availability and performance of services. New service design paradigms allow you to retain high levels of service availability and performance while spending less on the underlying data center design. Most data centre designers have not fully taken advantage of this yet.


Our transformation has taught us the importance of extreme standardization and ruthless automation – and that any initiative that stresses standardization and automation will pay off more than we previously thought.

Virtual transformation and the challenges of Big Data

Virtualization as a whole, particularly server virtualization, has been a powerful force in IT for some years now. As compute power increases we continue to see increased financial benefits from server virtualization. However, there is still much work to be done, particularly for organizations that have simply replaced physical servers with their virtual counterparts. Underutilized virtual machines, coupled with virtual machine sprawl, can lead to almost as much inefficiency as the all-physical environments they replaced. We encourage organizations to focus on how to achieve high levels of efficiency by using virtualized data centers, rather than assuming that technology alone will be the source of much greater efficiency.
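
As a rough illustration of the kind of right-sizing exercise described above, the sketch below flags virtual machines whose average CPU and memory utilization both sit below a threshold. The VM records and thresholds are hypothetical assumptions for the example, not Microsoft's tooling or any vendor's API.

```python
# Minimal sketch: flag underutilized VMs from utilization samples.
# The VM records and thresholds are hypothetical illustrations only.

from dataclasses import dataclass

@dataclass
class VmUsage:
    name: str
    avg_cpu_percent: float   # average CPU utilization over the sample window
    avg_mem_percent: float   # average memory utilization over the sample window

def find_underutilized(vms, cpu_threshold=15.0, mem_threshold=25.0):
    """Return VMs whose average CPU and memory use both fall below the thresholds."""
    return [vm for vm in vms
            if vm.avg_cpu_percent < cpu_threshold and vm.avg_mem_percent < mem_threshold]

if __name__ == "__main__":
    fleet = [
        VmUsage("web-01", avg_cpu_percent=62.0, avg_mem_percent=71.0),
        VmUsage("batch-07", avg_cpu_percent=4.5, avg_mem_percent=18.0),
        VmUsage("legacy-app", avg_cpu_percent=9.0, avg_mem_percent=22.0),
    ]
    for vm in find_underutilized(fleet):
        print(f"Candidate for consolidation or downsizing: {vm.name}")
```

In practice the thresholds and sample windows would come from an organization's own capacity policy; the point is that consolidation decisions need measured utilization, not just a count of virtualized hosts.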

Software-defined solutions will also prove transformative, and many organizations are not yet fully realizing the benefits they can provide. Software-defined solutions are inherently adaptive and can improve over time. Data centers are highly complex environments that change with technology and business disruption, so adaptive designs prove very beneficial over time.

Trends in IoT and Big Data point to a data explosion unlike anything we have ever seen before, and new storage models in the public cloud are making it very cheap to store data. For many organizations it’s a complex equation involving cost, risk and business benefit. Increasingly, organizations are recognizing that there is real value in retaining data, but this has to be balanced against the risks and costs associated with holding certain types of data.

Dealing with this challenge starts with building an effective information management strategy, and supporting it with effective classification of data. If current economic trends continue, the ultimate goal will be to store everything that you can in the public cloud, and everything that you must in the private cloud.
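
As a loose sketch of how classification can drive that "public where you can, private where you must" placement, the rules and labels below are illustrative assumptions rather than a prescribed information management scheme.

```python
# Minimal sketch: route data sets to a storage tier based on classification.
# The classification labels and placement rules are illustrative assumptions.

PLACEMENT_RULES = {
    "public": "public-cloud-object-storage",      # low-risk, published material
    "internal": "public-cloud-object-storage",    # acceptable risk, cheaper at scale
    "confidential": "private-cloud-storage",      # must stay on controlled infrastructure
    "regulated": "private-cloud-storage",         # legal or compliance constraints
}

def placement_for(classification: str) -> str:
    """Return the storage target for a classification, defaulting to private."""
    return PLACEMENT_RULES.get(classification, "private-cloud-storage")

if __name__ == "__main__":
    for label in ("public", "regulated", "unknown"):
        print(f"{label}: store in {placement_for(label)}")
```

The defensive default to private storage reflects the cost/risk balance described above: unclassified data carries unknown risk, so it stays on the more controlled tier until it has been classified.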

The challenge of business migration to the modern data centre

Data centers in many organizations are containers for equipment that varies from brand new to a decade old, or more. Managing this complex, heterogeneous environment is challenging enough, let alone transitioning to something that is much more efficient. To achieve change, organizations need a top-down mandate, and change across people, process and technology – which is far from straightforward.

They need to shift thinking from the facility to the service, and they need to build the right blended model, most likely involving elements of homegrown data centre, colocation, managed hosting and cloud services. They also need to focus on building environments that have flexibility at the core, as any projections about the future of technology, the future of their business and future demand for their services will almost certainly be wrong.

Data centers are large, complex, constantly changing ecosystems that today and in the future will be ripe for optimization. The first step is detailed insight into the behavior of a data center, particularly around the environmentals, but over time this will evolve into predictive analytics about the data center, used to manage it and to determine how, or if, workloads should migrate to the public cloud.
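
A very simple illustration of that progression from raw environmental telemetry to prediction: the linear trend fit below, on hypothetical inlet-temperature readings, is only a sketch of the idea, not the analytics described here, which would draw on far richer environmental and workload data.

```python
# Minimal sketch: fit a linear trend to hourly inlet-temperature readings and
# project it forward. The readings are hypothetical example values.

def linear_trend(samples):
    """Least-squares slope and intercept for (hour, value) samples."""
    n = len(samples)
    sum_x = sum(x for x, _ in samples)
    sum_y = sum(y for _, y in samples)
    sum_xy = sum(x * y for x, y in samples)
    sum_xx = sum(x * x for x, _ in samples)
    slope = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
    intercept = (sum_y - slope * sum_x) / n
    return slope, intercept

if __name__ == "__main__":
    readings = [(0, 22.1), (1, 22.4), (2, 22.9), (3, 23.3), (4, 23.8)]  # hour, °C
    slope, intercept = linear_trend(readings)
    hours_ahead = 8
    print(f"Projected inlet temperature in {hours_ahead}h: "
          f"{slope * hours_ahead + intercept:.1f} °C")
```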

The data center of the future will likely combine IoT, Big Data, software-defined data centers and robotics to provide the most efficient, self-healing data centers over time.

The energy issue

From a global perspective the energy efficiency picture has become somewhat obscured, with reduced emphasis from some national governments on the issue, many enterprises moving capacity into the public cloud and some energy sources becoming cheaper. Ironically, the shift to public cloud means that the real costs associated with providing data center capacity are better known than ever, and pressure to provide greater efficiencies will only increase over time, both in the form of commercial economics and government mandate.

Against this backdrop, demand for core data center capabilities is exploding and will continue to do so, meaning that energy efficiency will only become more relevant over time. One way for an enterprise to outsource this problem is to build new capabilities in the public cloud and/or migrate existing ones to it.

The way in which we power data centers is very inefficient today, effectively using three power sources – power station, UPS and power generator – to try to ensure continuous power to the data center. As more workloads move to the public cloud, providing capacity will likely become less of an issue, but it may be replaced by the challenge of creating efficiencies for the capacity that remains, and organizations may be required to measure their energy demand – regardless of whether they fulfil it themselves or not.
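
One common way of measuring the efficiency of the capacity that remains is Power Usage Effectiveness (PUE), which also appears in this article's tags. The sketch below computes PUE from metered facility and IT loads; the figures are illustrative, not measurements from any particular facility.

```python
# Minimal sketch: compute Power Usage Effectiveness (PUE) from metered loads.
# PUE = total facility energy / IT equipment energy; the readings below are
# illustrative numbers only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return PUE; values approaching 1.0 mean less overhead (cooling, UPS losses, etc.)."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

if __name__ == "__main__":
    print(f"PUE: {pue(total_facility_kw=1450.0, it_equipment_kw=1000.0):.2f}")  # 1.45
```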

A lot of innovation is happening in this area – particularly in the field of increasing efficiency from power generation to power consumption, but also in finding ways to provide service redundancy more reliably, circumventing the need for redundant power sources. The data center of tomorrow will most likely not be powered in the same way as today.


You can hear Paul Slater speak at Data Centre World 2015 (11th-12th March, ExCel Centre, London). Click here to register for the data centre event of the year.

Tags:

Data Centre feature IoT Microsoft PUE