Analysis Cooling a datacentre the wrong way is like cooling a hot kitchen by opening the fridge door: it makes more sense to open a window and pump fresh air in.
That's the view of Bladeroom boss Paul Rogers, whose CV includes designing industrial kitchens. He and engineering partner Red Engineering Design ended up building what's claimed to be the UK's only modular air-cooled datacentre. It saves energy and delivers what Bladeroom calls the lowest power usage effectiveness (PUE) rating of any design in the world. If you want them to build you a datacentre, you order however many modules you need, and they arrive on the back of a truck.
A low PUE means that more of the power going into the facility is used for computing rather than on the ancillary services that keep the machines running, especially cooling. The low figure is the product of painstaking attention to detail: sensors track air temperatures and humidity levels at every stage. As a result, the design has achieved a PUE of 1.13, where most facilities hover around 2.0 - meaning the datacentre uses twice as much energy as it needs to run its servers.
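PUE is simply total facility power divided by the power that actually reaches the IT kit. The sketch below uses an illustrative 1,000kW IT load, not Bladeroom's measured figures:

```python
# PUE = total facility power / IT equipment power.
# Illustrative numbers only: assume a 1,000 kW IT load.
it_load_kw = 1000.0

# A typical facility at PUE 2.0 burns as much again on cooling and losses
# as it spends on compute.
typical_overhead_kw = 1000.0
typical_pue = (it_load_kw + typical_overhead_kw) / it_load_kw  # 2.0

# A PUE of 1.13 means just 130 kW of overhead per 1,000 kW of compute.
bladeroom_overhead_kw = 130.0
bladeroom_pue = (it_load_kw + bladeroom_overhead_kw) / it_load_kw  # 1.13

print(typical_pue, bladeroom_pue)
```

At the same IT load, the conventional facility in this example draws 2,000kW from the grid against the air-cooled design's 1,130kW.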
A key reason for the low PUE is Bladeroom's decision to cool its datacentre modules only to 24°C, while most run at around 18-20°C. According to Red Engineering director Nick Vaney, all servers are certified to run hotter than this, so there's still plenty of headroom.
Kitchens - datacentres of distinction?
Vaney said that during testing he found that the native PUE of the design sits between 1.03 and 1.06. However, the datacentre includes a UPS and transformer that reduce efficiency by one per cent. In addition, standby generators incorporate heaters to warm lubricants and water in winter; between them, these push efficiency down to its final figure of 1.13.
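Treating each overhead as a fraction of the IT load, the stack-up works out roughly as follows. This is a sketch: only the native range and the one per cent UPS/transformer loss come from Red Engineering; the heater share is inferred as whatever remains.

```python
# Rough PUE stack-up, treating each overhead as a fraction of IT load.
native_pue = 1.06            # top of the measured 1.03-1.06 range
ups_transformer_loss = 0.01  # "reduces efficiency by one per cent"
final_pue = 1.13

# The remainder is what the generator heaters must account for (inferred,
# not a published figure).
heater_overhead = final_pue - native_pue - ups_transformer_loss
print(round(heater_overhead, 2))
```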
The air-cooling system consists of a series of modules that condition incoming air in a number of ways. First is a set of silencers - thick sound-deadening material followed by filters designed to remove particles larger than one micron, the F9 filtration standard.
The air is then pulled by fans into a cooler that works using the adiabatic process - effectively, evaporative cooling. The cooler consists of panels containing corrugated sheets of material made of glass fibre combined with a china clay compound. This is soaked from the top with ordinary mains water, which trickles down, cooling the air as it passes through gaps in the material.
The number of panels wetted determines the amount of cooling provided; the hotter the incoming air, the more panels the system will automatically activate. In normal operation, the system's main energy usage is the 200W water pump that keeps the cooler wet.
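The staging principle - hotter intake, more wetted panels - can be sketched as below. The panel count, the cooling per panel, and the thresholds are all hypothetical; only the principle is Bladeroom's.

```python
import math

def panels_to_wet(intake_temp_c, target_c=24.0, panels=8, degrees_per_panel=2.0):
    """Sketch of the staging logic: wet one more panel for every couple of
    degrees the intake runs above the 24C target. Panel count and per-panel
    cooling are hypothetical figures for illustration."""
    excess = intake_temp_c - target_c
    if excess <= 0:
        return 0  # over-cooling case: the bypass damper handles this instead
    return min(panels, math.ceil(excess / degrees_per_panel))

print(panels_to_wet(22.0))  # 0 - ambient already below target
print(panels_to_wet(30.0))  # 3 - moderate heat, partial wetting
print(panels_to_wet(45.0))  # 8 - everything wet
```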
In the event of over-cooling, a bypass damper allows some or all of the air to enter at ambient temperature. According to Vaney, under-cooling has not been an issue, apart from in rare climates such as Singapore's, which combines very high temperatures with very high humidity.
But does it trap cooking odours?
If that becomes an issue, traditional chilling coils are in the air's path and can be activated if necessary. But according to Vaney, "we get 99.74 percent free cooling in the UK, or if humidity reaches 80 percent, all cooling is free".
After that, the air is pulled through further silencing material, making it one of the few such facilities in which you can hold a conversation without shouting. And because they're warmer than most, you don't freeze either.
Once in the datacentre, the air passes along cold corridors and through louvred doors. These louvres are designed to open and close to keep incoming air at the right temperature. The cold aisle is completely enclosed - there's no raised floor - so that all the cooled air passes through the racks, impelled by the servers' fans into the hot aisle, from where it is exhausted into the environment.
The system has been independently tested with ambient air temperatures of up to 35°C, says Vaney, and it works in freezing ambient temperatures by mixing warm exhaust air with incoming air.
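The freezing-weather trick is a straightforward mixing balance: blend enough hot-aisle exhaust into the intake to hit the target temperature. The temperatures below are illustrative, not Bladeroom's setpoints, and the calculation assumes simple linear mixing of equal-density air:

```python
def exhaust_mix_fraction(ambient_c, exhaust_c, target_c):
    """Fraction of warm exhaust air to recirculate so the blend reaches the
    target intake temperature, from T_mix = f*T_exhaust + (1-f)*T_ambient."""
    if exhaust_c <= ambient_c:
        return 0.0  # no useful heat in the exhaust
    f = (target_c - ambient_c) / (exhaust_c - ambient_c)
    return max(0.0, min(1.0, f))

# -10C outside, 35C hot-aisle exhaust, 20C desired intake:
print(round(exhaust_mix_fraction(-10.0, 35.0, 20.0), 2))  # 0.67
```

So on a -10°C day, roughly two-thirds of the intake would be recirculated exhaust in this example.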
The facility is littered with sensors and feedback air passages that allow the control system to understand exactly what's happening to the air at all stages of the process, as a result of which it needs little human intervention. Rogers says the company is claiming patents on several of the techniques used, and so he was loath to tell me exactly how it works. Among the benefits of this tight control is that the system delivers only the volume of air the servers demand, saving more energy still.
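Since the patented details are under wraps, here is one common way such demand matching is done in enclosed cold aisles - not necessarily Bladeroom's approach: hold the cold aisle at neutral pressure relative to the hot aisle, so the supply fans move exactly as much air as the server fans pull through.

```python
def supply_fan_step(pressure_diff_pa, gain=0.05, current_duty=0.5):
    """Generic demand-matching sketch (not Bladeroom's patented scheme).
    pressure_diff_pa is cold aisle minus hot aisle. If it goes negative the
    servers are pulling more air than we supply, so raise fan duty; if it
    goes positive we are over-supplying, so lower it. Gain is arbitrary."""
    new_duty = current_duty - gain * pressure_diff_pa
    return max(0.0, min(1.0, new_duty))  # clamp duty cycle to 0-100%

print(supply_fan_step(-2.0))  # starved cold aisle -> duty rises to 0.6
print(supply_fan_step(+2.0))  # over-supplied -> duty drops to 0.4
```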
Bladeroom makes its datacentre available as modules weighing up to 20 tonnes, everything wired up and ready to go, including servers. If you want one, you buy the number of modules you need, and they arrive on the back of a series of lorries, shipped from the company's HQ on the outskirts of a Gloucestershire village. And if you want to make it bigger, you just buy more modules.
Is this the future of datacentre design? ®