
Reduce the cost of cooling


Facility teams and data centre managers know that to survive in a world dominated by low-cost cloud infrastructure, they need to cut costs to the bone. The hardest cost to cut has always been cooling. As the air temperatures permitted inside the data centre creep up, techniques such as free air cooling and airside economisers become effective cooling solutions.

The two biggest improvements in data centre cooling have been the introduction of aisle containment and the ability of servers to tolerate higher input temperatures. Containment is an air management solution that can be retrofitted to existing facilities. It has both extended the life of older data centres and enabled higher densities without expensive refits of cooling systems.

Higher input temperatures have been promoted by ASHRAE, the industry body responsible for data centre thermal guidelines. Just 10 years ago, many data centres were still cooling input air to 65°F (18°C); today they work at 79°F (26°C) and even higher. This ability to manage higher input temperatures has also been helped by new generations of silicon and motherboards.

Using natural resources to supplement and replace cooling

Despite these changes, more still needs to be done to help cut the costs of cooling. This has led to a group of techniques known as free air cooling. The idea is to use as much ambient air as possible to remove the heat from the data centre. The stated goal of most of these systems is to have no mechanical cooling at all.

It sounds great, but the reality is that there are few places on the planet where the outside air temperature is low enough to cool most data centres all year round. Nor is ambient temperature the only challenge: whichever technology under the free air cooling banner is chosen comes with additional challenges of its own, from data centre design to particulate matter.

The challenge of just using ambient air inside the data centre

Using pure ambient air inside the data centre is not a technique that can be retrofitted to existing facilities. The first challenge is getting a large enough volume of air below the room; a large volume is needed to create the pressure to push the air through the data hall. The Hewlett Packard data centre in Wynyard, UK, uses a five-metre hall to create the required volume of air.

To help create the right pressure to draw ambient air into the data hall, the hot air has to be expelled via a chimney. This needs careful design so that it not only extracts all the hot air but does so in a way that creates a partial vacuum, which then draws in the cold air behind it.
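
The scale of that draught can be approximated with the stack effect: the pressure difference depends on the height of the chimney and the density gap between cold intake air and warm exhaust air. Below is a minimal Python sketch of that calculation, assuming dry air as an ideal gas; the chimney height and temperatures are illustrative, not figures from any particular facility.

```python
# Minimal sketch of the stack (chimney) effect that drives air through the hall.
# Assumes dry air behaves as an ideal gas; all figures are illustrative.
G = 9.81           # gravitational acceleration, m/s^2
R_AIR = 287.05     # specific gas constant for dry air, J/(kg*K)
P_ATM = 101_325.0  # atmospheric pressure, Pa

def air_density(temp_c: float) -> float:
    """Density of dry air at atmospheric pressure, kg/m^3."""
    return P_ATM / (R_AIR * (temp_c + 273.15))

def stack_draught(height_m: float, t_outside_c: float, t_exhaust_c: float) -> float:
    """Pressure difference (Pa) created by a column of warm exhaust air."""
    return G * height_m * (air_density(t_outside_c) - air_density(t_exhaust_c))

# A 10 m chimney with 12C outside air and 35C exhaust air:
print(f"{stack_draught(10, 12, 35):.1f} Pa")  # roughly 9 Pa of natural draught
```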

To ensure that the air does not contain particulates that would impair the equipment in the data hall, very large filters are needed. Air inside cities tends to carry high levels of lead and other particulates, especially from diesel vehicles and general dust. It also tends to be warmer than country air, which can severely limit the number of days when ambient air can be used without secondary cooling.

The air in country areas can be even dirtier from a data centre perspective. Pollen, dust and insects, even swarms of bees and wasps, have been reported caught on the filters that guard the large air halls. The ambient temperature is often lower than in cities, but wind can be a problem: high winds can force small particles of dust through the filter screens.

Humidity and dew point

Data centre managers are acutely aware of the risks of humidity inside the data centre. Too little humidity and the risk of static electricity rises; when it discharges across electronic equipment it causes havoc and destroys circuit boards. Too much humidity leads to condensation, which can not only short out systems but also cause corrosion, especially in power systems.

When conditioned air is used inside the data centre, this problem is handled by the chillers and dehumidifiers. Free air cooling, however, creates its own problems. The most obvious arise on rainy days, or when very cold ambient air is drawn into a hot data centre. In both cases, water in the air condenses very quickly and, if not handled properly, is a disaster waiting to happen.
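
A useful way to quantify that risk is the dew point of the incoming air: any surface at or below it will collect condensation. The sketch below uses the widely used Magnus approximation; the coefficients are standard, but the example temperatures and the simple surface check are illustrative.

```python
import math

# Minimal sketch: Magnus-formula dew point, to check whether incoming air
# risks condensing on cold surfaces. Example values are illustrative.
A, B = 17.62, 243.12  # Magnus coefficients for water over a liquid surface

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (C) from dry-bulb temperature and RH%."""
    gamma = math.log(rel_humidity_pct / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)

def condensation_risk(surface_temp_c: float, air_temp_c: float, rh_pct: float) -> bool:
    """True if a surface is at or below the dew point of the surrounding air."""
    return surface_temp_c <= dew_point_c(air_temp_c, rh_pct)

# Rainy day: 14C air at 95% RH; any surface at or below ~13.2C will sweat.
print(round(dew_point_c(14, 95), 1))    # 13.2
print(condensation_risk(12.0, 14, 95))  # True: a 12C surface will condense
```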

Some data centres are being built near the sea to take advantage of the natural temperature difference between land and sea. This looks like a good strategy at first, but salt in the air can destroy data centre equipment in a fairly short period of time. Any use of free air cooling in these environments requires a significant investment in technologies to remove all the salt from the air before it is used for cooling.

To get around this problem, free air cooling systems mix return air from the data centre with the air drawn in from outside. Where the outside air is extremely cold, this warms it and reduces the risk of cold, damp air condensing on processors, storage devices or power systems. Where the air is heavy with external humidity, dehumidifiers must be available to bring online, even though this adds extra cost to the power budget.
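
The mixing itself is straightforward arithmetic. Here is a minimal sketch that treats it as a sensible-heat blend, ignoring humidity and density differences for clarity; the setpoints and function names are assumptions made for illustration.

```python
# Minimal sketch of return-air mixing: a weighted blend of outside and return
# air, solved for the outside-air fraction that hits a supply setpoint.
def mixed_temp(t_outside_c: float, t_return_c: float, outside_fraction: float) -> float:
    """Supply temperature for a given outside-air fraction (0..1)."""
    return outside_fraction * t_outside_c + (1 - outside_fraction) * t_return_c

def outside_fraction_for(t_supply_c: float, t_outside_c: float, t_return_c: float) -> float:
    """Outside-air fraction needed to reach a supply setpoint, clamped to 0..1."""
    if t_return_c == t_outside_c:
        return 1.0
    frac = (t_return_c - t_supply_c) / (t_return_c - t_outside_c)
    return max(0.0, min(1.0, frac))

# 2C winter air, 36C return air, 24C supply setpoint:
frac = outside_fraction_for(24, 2, 36)
print(f"outside-air fraction: {frac:.2f}")       # ~0.35
print(f"check: {mixed_temp(2, 36, frac):.1f}C")  # ~24.0C
```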

Airside economisers – Direct and Indirect

Airside economisers are a technology that gets the most out of free air, reduces the particulate and dew point issues, and can be retrofitted into an existing facility. They bring ambient air in, filter it and mix it with exhaust air to raise the temperature if required. The air then passes either through an air-to-air heat exchanger (indirect) or straight into the facility (direct), and through a backup water or DX cooling coil where needed, to reach the right input temperature for the room.

The advantage of airside economisers is that they are not a single approach to free air cooling. By dealing with the issues identified above, and by being able to filter, exchange heat directly or indirectly, and additionally cool or heat the air, they can reduce the cost of cooling and get the most out of ambient temperatures.
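
In operation, that flexibility comes down to control logic: choose the cheapest mode that still meets the supply setpoint. The following is a hedged sketch of such a decision; the thresholds, mode names and dew point limit are illustrative assumptions, not any vendor's control scheme.

```python
# Minimal sketch of an economiser mode decision, assuming a simple hierarchy:
# free cooling first, then mixing, then mechanical trim. All values illustrative.
from dataclasses import dataclass

@dataclass
class Conditions:
    t_outside_c: float
    t_supply_setpoint_c: float
    dew_point_c: float
    max_supply_dew_point_c: float = 15.0  # illustrative moisture limit

def select_mode(c: Conditions) -> str:
    """Pick the cheapest mode that can still meet the supply setpoint."""
    if c.dew_point_c > c.max_supply_dew_point_c:
        # Too humid for direct use: isolate the hall behind a heat exchanger.
        return "indirect free cooling + dehumidification"
    if c.t_outside_c <= c.t_supply_setpoint_c - 2:
        # Cold enough that mixing with return air reaches the setpoint.
        return "direct free cooling (mix with return air)"
    if c.t_outside_c <= c.t_supply_setpoint_c:
        return "direct free cooling (full outside air)"
    # Outside air is too warm: trim with the backup water/DX coil.
    return "mechanical trim (water/DX coil)"

print(select_mode(Conditions(t_outside_c=8, t_supply_setpoint_c=24, dew_point_c=6)))
# direct free cooling (mix with return air)
```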

The Green Grid estimates that even data centres in hot environments such as Florida, Texas, Mexico, Portugal, southern Spain and the Middle East should be able to manage 2,500 to 4,000 hours per year of free air cooling. Much of this will be at night and during the winter months.

In more temperate climates such as the UK, northern France, the Netherlands, New York and parts of California, this can rise to 6,500 hours. Further north, data centre owners should expect up to 8,000 hours, although there will be additional costs in removing excess humidity and heating the air before injecting it.
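
Estimates of this kind come from counting the hours in a typical year when ambient air, after mixing, can meet the supply setpoint. A minimal sketch, using a synthetic temperature profile rather than real weather data, with an assumed setpoint and mixing margin:

```python
import math

# Minimal sketch: count free cooling hours from a year of hourly ambient
# temperatures. Assumes air below the supply setpoint (less a small mixing
# margin) can be used directly; the temperature data here is synthetic.
def free_cooling_hours(hourly_temps_c: list[float], supply_setpoint_c: float = 24.0,
                       margin_c: float = 2.0) -> int:
    """Count hours where ambient air alone can meet the supply setpoint."""
    threshold = supply_setpoint_c - margin_c
    return sum(1 for t in hourly_temps_c if t <= threshold)

# Synthetic year for a temperate climate: 12C mean, +/-8C seasonal swing,
# +/-5C daily swing, over 8,760 hours.
temps = [12 + 8 * math.sin(2 * math.pi * h / 8760) + 5 * math.sin(2 * math.pi * h / 24)
         for h in range(8760)]
print(free_cooling_hours(temps))  # around 8,000 usable hours for this mild profile
```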

To get the most from airside economisers, however, it is essential that users understand the requirements of the technology. One of the most common failure points is poor pressure management: if there is insufficient pressure to draw air through the data halls, the air stagnates and simply rises in temperature because it is poorly circulated.

It is also important to place the temperature sensors effectively. They should be integrated with the data centre infrastructure management (DCIM) system so that operators can quickly identify any hotspots. One problem caused by poorly placed sensors is that too much return air is added to the airflow, causing input temperatures to rise unexpectedly. The opposite is also true: sensors located too close to a heat source cause the air to be given additional cooling, creating a large difference between hot and cold and exacerbating the risk of condensation.
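
Once the sensors feed a DCIM system, spotting both failure modes can be as simple as flagging readings that drift out of band. A minimal sketch, with the sensor names, setpoint and tolerance invented for illustration:

```python
# Minimal sketch: flag supply-temperature anomalies from DCIM sensor feeds.
# Sensor ids, setpoint and tolerance are illustrative assumptions.
SETPOINT_C = 24.0
TOLERANCE_C = 2.0  # alert when a sensor drifts beyond setpoint +/- tolerance

def find_anomalies(readings: dict[str, float]) -> dict[str, str]:
    """Map sensor id -> 'hotspot' or 'overcooled' for out-of-band readings."""
    alerts = {}
    for sensor_id, temp_c in readings.items():
        if temp_c > SETPOINT_C + TOLERANCE_C:
            alerts[sensor_id] = "hotspot"     # e.g. too much return air mixed in
        elif temp_c < SETPOINT_C - TOLERANCE_C:
            alerts[sensor_id] = "overcooled"  # e.g. sensor sited too near a heat source
    return alerts

print(find_anomalies({"aisle-1-top": 27.5, "aisle-1-mid": 24.1, "aisle-2-top": 20.9}))
# {'aisle-1-top': 'hotspot', 'aisle-2-top': 'overcooled'}
```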

When integrating airside economisers into modular solutions, it is essential to allow enough exterior space to install the equipment. This is why modular equipment manufacturer Cannon Technologies has designed its own solution specifically for modular data centres. Depending on the climate, the target PUE can be as low as 1.1.

Conclusion – Money is there to be saved

In one study, Intel looked at the impact of airside economisers where outside air temperatures reached 90°F (32°C). It estimated that a 10MW facility would save almost $3m per year, and that the risk of increased equipment failure was so low as to be insignificant.
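
The shape of the arithmetic behind such estimates is simple: the saving is the cooling overhead removed, multiplied by running hours and the electricity price. The sketch below illustrates it; the PUE figures and tariff are assumptions chosen to land near the reported figure, not Intel's published inputs.

```python
# Minimal sketch: energy cost saved by reducing cooling overhead on a 10 MW
# IT load. The tariff and both PUE figures are illustrative assumptions.
IT_LOAD_MW = 10.0
HOURS_PER_YEAR = 8760
PRICE_PER_KWH = 0.10  # USD, assumed

def annual_cost_musd(pue: float) -> float:
    """Total annual electricity cost (million USD) for a given PUE."""
    total_mw = IT_LOAD_MW * pue
    return total_mw * 1000 * HOURS_PER_YEAR * PRICE_PER_KWH / 1e6

saving = annual_cost_musd(1.45) - annual_cost_musd(1.10)
print(f"annual saving: ${saving:.1f}m")  # about $3.1m under these assumptions
```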

In the temperate climates of the UK and the mid US, free air cooling can deliver a Power Usage Effectiveness (PUE) as low as 1.05, against an industry average of around 2.0. This means that for every 1kW of power consumed in the data halls during the winter months, just 1.05kW of energy is consumed in total, so non-IT equipment accounts for roughly 5% of the total energy bill.
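
A one-line sanity check of that arithmetic, using the same 1kW of IT load:

```python
# Quick check of the PUE arithmetic above: 1kW of IT load at PUE 1.05.
it_kw, pue = 1.0, 1.05
total_kw = it_kw * pue
overhead_share = (total_kw - it_kw) / total_kw
print(f"{total_kw:.2f}kW total, overhead {overhead_share:.1%}")  # 1.05kW, 4.8%
```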

The type of free air cooling used is a mixture of flood-cooled/hot-plenum and chimney cabinet deployments. For the summer months, when temperatures can rise above 20°C, a water or DX cooling solution is also required for the times when only mechanical cooling will do.
