Computer data centres are under pressure to accommodate ever more information – and keep it safe from dangers such as temperature fluctuations. Facilities designers could profit from new modelling software, 6Sigma

As you sit at your computer whizzing emails about and making use of all the data at your fingertips, you probably never spare a thought for the vast data centres that exist behind the scenes making it all possible.

But while these anonymous-looking buildings, home to row upon row of servers, switches and routers that enable modern-day computerised data storage and communication, may be out of sight and out of mind for most people, they are in ever increasing demand. The upshot is added pressure on services designers to deliver facilities that allow the IT equipment to run at optimum performance 24 hours a day, 365 days a year.

In the past, it was common for data centres to tick over at a relaxed 30-40% capacity. But as volumes of data traffic increase, and with the modern breed of blade servers with their higher heat loads, that kind of luxurious margin is eroding away.

“It used to be that data centres were typically designed for loads of around 1,500 W/m2,” says Andrew Harrison, a director at Arup. “But very few were run even close to full capacity.”

The main reason for these comfort margins was the need for resilience. Servers are unable to withstand a break in power of more than 10-20 milliseconds and require closely controlled environments, typically 21°C ±1°C and 50% relative humidity. High heat loads also lead to problems such as local overheating and hot spots.

Any breakdown in conditions could result in loss of data and have a serious knock-on effect for business. As such, uninterruptible power supplies are essential, and back-up power for the mechanical systems is also vital. If the cooling system failed in, for example, a 30,000 m2 centre with a 1,600 W/m2 load and a floor-to-ceiling height of 4.25 m, the space would experience a 1°C rise in temperature every second – with a drastic effect on the IT equipment inside.
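As a rough sanity check, and not a figure from the article, suppose the heat goes only into the room air, taking an air density of about 1.2 kg/m³ and a specific heat of about 1005 J/(kg·K). The bulk rate of rise then follows from a simple energy balance:

\[
\frac{\Delta T}{\Delta t} = \frac{q}{\rho \, h \, c_p} = \frac{1600\ \mathrm{W/m^2}}{1.2\ \mathrm{kg/m^3} \times 4.25\ \mathrm{m} \times 1005\ \mathrm{J/(kg\,K)}} \approx 0.3\ \mathrm{K/s}
\]

Even on this optimistic bulk-air basis the room warms by several degrees within ten seconds, and the air recirculating at server inlets warms faster still, which is why back-up for the cooling plant is treated as seriously as the electrical supply.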

“A lot of clients, especially in financial services, get really nervous when you get over 70% utilisation,” says Harrison. “But that’s where you’re now starting to be.”

So for designers and data centre managers, the pressure is really on to optimise and manage facilities much more carefully. Harrison says they have seen a couple of incidents recently where, on a day-to-day basis, everything seemed OK, but the failure of a power distribution unit (PDU) led to electrical loads being transferred and the system being overloaded, which in turn caused a cascade failure.

“When you were operating the facility at only 30-40%, you had a bigger margin, but now that you’re using it harder, that margin is reduced and the probability of a cascade increases.”

But it’s not all doom and gloom. Help is at hand in the form of a software tool called 6Sigma. Developed by Future Facilities, the tool was originally created to carry out computational fluid dynamics (CFD) analysis at the design stage and has since been extended into a complete data centre modelling and management package.

In the first stages, the tool can be used to prepare 3D layouts, which are populated with server cabinets, computer room air conditioners (CRACs), PDUs and floor grilles. These layouts can then be tested to see how well they work and to check for projected problems with floor grille layouts, cabinet heat loads, air temperature and pressure distribution in the floor void, air movement within the void, overheating risks, or any risk of air short-circuiting.
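Conceptually, the model behind these checks is a structured inventory of the room plus some physics. The sketch below is not the 6Sigma file format or API; the class names, fields and figures are invented for illustration. It simply shows, in Python, the kind of layout data involved and the crudest of the checks listed above: comparing total cabinet heat load against installed CRAC capacity with one unit held in reserve.

```python
from dataclasses import dataclass

@dataclass
class Cabinet:
    name: str
    heat_load_kw: float        # electrical load dissipated as heat
    grid_position: tuple       # (row, column) on the raised floor

@dataclass
class CRACUnit:
    name: str
    cooling_capacity_kw: float

def cooling_headroom(cabinets, cracs, units_in_reserve=1):
    """Room-level balance: installed cooling (minus reserved units)
    against total cabinet heat load. A CFD model goes much further,
    resolving local airflow, pressure and hot spots that a simple
    balance like this cannot see."""
    load = sum(c.heat_load_kw for c in cabinets)
    capacities = sorted((u.cooling_capacity_kw for u in cracs), reverse=True)
    available = sum(capacities[units_in_reserve:])  # assume the largest unit(s) are standby
    return available - load

cabinets = [Cabinet("A1", 6.0, (0, 0)), Cabinet("A2", 8.5, (0, 1))]
cracs = [CRACUnit("CRAC-1", 12.0), CRACUnit("CRAC-2", 12.0), CRACUnit("CRAC-3", 12.0)]
print(f"Headroom with one CRAC out of service: {cooling_headroom(cabinets, cracs):.1f} kW")
```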

Harrison says the software is relatively quick and easy to use. “You can do the CFD analysis a lot earlier in the process and the benefit for us is we can make it more of an engineering task than a CFD specialism.

“The modelling tool is particularly useful for highlighting potential surprises. It picks up things you wouldn’t expect, such as hot air returning into the floor because the pressure in the floor void is not stable enough, or air short-circuiting. People know these things exist, and this helps you understand them better than just sticking a finger in the air.”

The other advantage of 6Sigma software is its ability to provide an ongoing management tool for the facility. Data centres are dynamic spaces, with new servers regularly being added and older units replaced with more powerful versions.

The software can be used as a database to keep track of what IT equipment is where in the data centre – something that, in such a fast-changing environment, has not always been done very well.

It can also be used to assess the impact that, for example, introducing new servers or rearranging the cabinets might have on the environmental conditions within the space. Importantly, it allows solutions to be explored before any physical changes are made.
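As an illustration of how such a what-if assessment can be kept separate from the live record (again a sketch with invented names and numbers, not the 6Sigma interface), the idea is simply to apply the proposed change to a copy of the inventory and check the outcome before anything is physically installed:

```python
import copy

# Hypothetical current inventory: heat load per cabinet (kW) and the cooling
# capacity available with one CRAC unit held in reserve.
current_loads = {"A1": 6.0, "A2": 8.5, "B1": 7.2}
available_cooling_kw = 24.0

def assess_change(loads, proposed_additions, cooling_kw):
    """Apply a proposed change to a copy of the inventory and report the
    resulting cooling headroom, leaving the live record untouched."""
    trial = copy.deepcopy(loads)
    trial.update(proposed_additions)
    headroom = cooling_kw - sum(trial.values())
    return trial, headroom

# What if two new high-density blade cabinets are added to row C?
trial, headroom = assess_change(current_loads, {"C1": 9.0, "C2": 9.0}, available_cooling_kw)
print(f"Projected headroom after the change: {headroom:.1f} kW")
if headroom < 0:
    print("The proposed change would overload the cooling plant; rework it before installing.")
```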

Harrison says that in the past there has been a serious lack of connection between the design stage and the ongoing operation of facilities.

“Now we’re seeing the opportunity of integrating that monitoring data with design data so you can test what happens in certain scenarios. It seems ridiculous that we do all this design work and it gets thrown away. But there is the realisation now that a much more integrated solution is needed because we’re working the facility a lot harder than in the past.”

Arup is currently co-developing two more modules for the software with Future Facilities. One concerns thermal transients. The other is an electrical module that will keep track of which servers are in which cabinets, and connected to which uninterruptible power supply (UPS), as well as allowing failures to be simulated to check for overloading. The additional modules should be available later in the year.
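It is easy to see in outline how such a failure study might work, although the sketch below is only a guess at the principle, not the design of the forthcoming module, and every name and number in it is invented: record which cabinet loads hang off which PDU, remove one PDU, transfer its load to its designated partner, and flag any unit pushed past its rating.

```python
# Hypothetical electrical model: each PDU has a rating, a set of cabinet
# loads (kW) and a partner PDU that picks up its load if it fails.
pdus = {
    "PDU-A": {"rating_kw": 80.0, "loads": [18.0, 22.0, 15.0], "partner": "PDU-B"},
    "PDU-B": {"rating_kw": 80.0, "loads": [20.0, 19.0, 17.0], "partner": "PDU-A"},
}

def simulate_pdu_failure(pdus, failed):
    """Transfer the failed PDU's load to its partner and report the result."""
    partner = pdus[failed]["partner"]
    transferred = sum(pdus[failed]["loads"])
    partner_load = sum(pdus[partner]["loads"]) + transferred
    overload_kw = partner_load - pdus[partner]["rating_kw"]
    return partner, partner_load, overload_kw

partner, load, overload = simulate_pdu_failure(pdus, "PDU-A")
status = "OVERLOADED" if overload > 0 else "within rating"
print(f"If PDU-A fails, {partner} carries {load:.0f} kW ({status})")
```

The interesting cases are exactly the ones Harrison describes: arrangements that look comfortable in day-to-day operation but tip into overload the moment load is transferred.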