The heart of the cloud doesn’t beat, it whirrs
Tue 17 Jun 2014
Who’d have thought a data centre could be so full of surprises?
I am grateful to CenturyLink Technology Solutions for accepting my request to visit their site in Slough – after all, if I write about them and their kind every day, I should at least have witnessed this beating – or whirring – heart of the stack.
The data centre is the modern powerhouse driving business – shovelling data around, storing it, recovering it, processing it – and yet the only audible noise is the fans, big and small, driving air around, hot and cold, out of the wrong places and into the right ones.
But the surprises started earlier than that. This data centre is on an anonymous industrial estate, and carries neither corporate signage nor the signature of an architect stamping modernism on such an iconic building. I drove past it twice before finally identifying it.
Apparently that anonymity is part of its security. When you’re dealing with expensive kit and even more valuable customer data, nothing is left to chance. And we had been warned: we had to identify who was coming on the visit well in advance and bring photo ID on the day.
Once past the security guard who verified our credentials, we passed through ‘airlock’ double doors before we got into the inner box – according to the video in reception, the glass and walls of the outer box are bomb resistant – and then every door needed a security pass. Even our host was barred from a couple of areas, where entry is on a strictly “need to access” basis. Oh, and by the way, don’t build data centres on a floodplain or under an airport flight path.
Then came my next surprise – our host, Andy Huxtable, says it’s not the equipment in this temple of technology that makes the difference. “It’s the quality of the people you have running it,” he says. It’s a Tier 3 facility, which means it is probably at the highest standard of a commercial centre, and offers 100% uptime service level agreements. So where can it go in terms of performance from there? How can its people make a difference?
“Put a bad crew in charge of the best boat in the world and it’ll be a disaster” says Huxtable. “But the best crews could get even a poor boat home”. It is about the dedication and passion they have for the centre to ensure its smooth running and about the level of support, help and advice they give their customers.
“These guys love their work. They are the best; they want to work with the best and once here they rarely leave”. He adds, again surprising me, that the average tenure of the staff here is 16 years.
So, beyond keeping it working, what are the challenges for the team? Efficiency is a word that comes up a lot – mainly in the context of power management and driving down the power usage effectiveness (PUE) ratio to save money and the world’s resources. They use a hot-aisle/cold-aisle system, where cold air is pushed up from the floor in one aisle, drawn through the server racks into a hot aisle, then into the ceiling to be recycled. So the efficiency solutions can be simple sheets of plastic to keep the air where it’s supposed to be, as well as expensive high-tech processing kit.
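For readers unfamiliar with the metric, PUE is simply the ratio of the power the whole facility draws to the power that actually reaches the IT equipment – the closer to 1.0, the less is being spent on cooling and overhead. A minimal sketch, with made-up figures (these are not CenturyLink’s numbers):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt drawn by the site reaches the
    servers; anything above that is cooling, lighting and other overhead.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical example: the site draws 1,500 kW in total,
# of which 1,000 kW reaches the IT equipment.
print(round(pue(1500, 1000), 2))  # 1.5 – half a watt of overhead per IT watt
```

Keeping hot and cold air separated (even with those simple plastic sheets) drives the numerator down without touching the IT load, which is why containment is such a cheap win.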
Efficiency is also a feature of the service they provide – which included installing and commissioning a new customer in 10 days over last Christmas – but it can also mean advice on the design of the kit their colocation customers put in, or even planning and maintaining the volume of business installed. “The most effective data centre is a full data centre,” says Huxtable.
CenturyLink Technology Solutions offer two main types of service: colocation where the customer takes space and power but fits out their own equipment, and managed services such as hosting and cloud where CenturyLink Technology Solutions runs everything for them. And the 50+ centres around the world are all run on the same principles.
Walking around the heart of the operation I couldn’t help but be reminded of the semiconductor clean rooms I used to tour in a previous life. The whirring noise was similar – there the need to move air from floor to ceiling was more to do with dust, but the effect is the same – as were the security, the cleanliness (a data centre is spotless), the airlock doors to get in, and the minimal human presence. Thankfully, though, data centre engineers aren’t required to wear the “bunny suits,” masks and hair nets.
Another surprise was that keeping the four halls of this 36,000 sq ft, 33MW facility operating all day, all year needs a minimum of 10 people – although the teams are larger and work on a rolling round-the-clock programme.
On the tour the surprises kept coming. Now, I was always taught you should never fight electrical fires with water, but this data centre does – would, I should say. A fire in a data centre is most likely to be an electrical one, and this one uses localised water sprinklers activated by the heat. Yes, it would damage the affected system – which is probably already damaged anyway by the fire – but using other systems like halon or compressed carbon dioxide can cause massive disruption to the whole data centre, possibly causing a shutdown and particulate damage to the equipment, made worse by floor and ceiling tiles being blown out by the suppression systems. Huxtable added: “We also consider this a green approach.”
And the final, pleasant, surprise was that it had taken close to three hours to cover the ground, and not once did it flag or fail to be fascinating. Don’t let anyone tell you data centres are dull.
* My thanks to CenturyLink Technology Solutions for their hospitality and for being so generous with their time: director Andy Huxtable; Paul Scorr, the data centre support manager; and Nick Scott, the facilities manager.
For an alternative look at another data centre as a piece of art, please click here: The cloud is very real and very weird. Here’s a peek inside