There has been a surge of investment recently in data centres in the remote north of Sweden, including Hydro66 at Boden and Facebook at Luleå. The obvious reason for this is the cold climate, which supports ‘free’ cooling and so reduces the considerable cost of taking the heat out of modern data centres.
This article describes how Hydro66 has implemented a direct fresh-air cooling system, supported by evaporative cooling, to produce one of the most efficient data centres in the world. Other factors in this region can reduce the capital and operating costs of a data centre, and will also be discussed.
Challenges and Actions
Hydro66 has recently completed and populated the first phase of a 20MW data centre in Boden, which – together with the towns of Luleå, Älvsbyn and Piteå – has formed the Node Pole Alliance. The aim of this association is to make the most of the vast natural resources of this area to attract data centre investment. The key advantages are low-cost cooling and power, and practically unlimited renewable generation capacity and grid infrastructure to support large centres.
A key objective of Hydro66 was to design the data centre to operate at a power usage effectiveness (PUE) – the ratio of total amount of energy used by a data centre to the energy delivered to computing equipment – of less than 1.05. This could only be achieved using fresh-air cooling supported by the most efficient uninterruptible power supply (UPS) and power distribution.
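Since PUE drives the whole design, it is worth making the ratio concrete. The sketch below is a minimal illustration, not Hydro66's measured data; the load figures are hypothetical, chosen only to match the sub-1.05 target:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power.

    A PUE of 1.0 would mean every watt drawn goes to computing;
    cooling, UPS losses and power distribution push it higher.
    """
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Illustrative figures: a 1,000kW IT load with 50kW of cooling,
# UPS and distribution overhead gives a PUE of 1.05.
print(pue(1050, 1000))  # 1.05
```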
Boden lies at latitude 66°N and, consequently, is cold, with a record low temperature of -40°C and a maximum of 32°C, although it rarely exceeds 25°C. In this climate, a simple ventilation system can maintain compliant temperatures for much of the time, but for the few periods each year when the ambient exceeds 25°C, a supplementary cooling system is required. The options for this are chilled water, direct expansion (DX), and direct or indirect evaporative cooling.
A well-designed chilled water system can be very efficient but expensive. It is also only viable for larger cooling loads. The market for a data centre such as Hydro66 is in ‘colocation’, where equipment, space, and bandwidth are available for rental to retail customers. The final cooling loads are not known at the time of construction, so it is impossible to size a chilled water system to reflect an unknown load.
IT equipment is normally installed in 600mm-wide racks, and rack loads can vary between 2kW and 15kW, depending on the type of equipment installed. This is why most data centres take the DX route; units of up to 100kW of cooling load can be added in a modular way, reflecting the cooling load as the computer racks are populated.
Indirect cooling is the next option; outside air passes on one side of a heat exchanger and the hot air from the centre passes on the other side. The amount of heat extracted is dependent on the outside temperature. On warm days, the ambient (outdoor) air is adiabatically cooled using wetted pads or sprays, and – on very hot days – a DX coil is added to maintain compliant conditions.
The obvious key advantage of an indirect ventilation system is that any contaminants in the external air are not brought into the data centre. Because indirect cooling systems require large heat exchangers, the equipment is big and can be expensive. Also, in the case of Hydro66, the Boden planning authorities preferred solutions with no external plant.
Hydro66 decided to use a direct ventilation system supplemented by evaporative cooling (EcoCooling ECT10800 Nordic range coolers). The equipment is modular and installed internally, thereby avoiding planning issues. Electronically commutated (EC) axial fans are used for air movement because they are easier to install and take up less space than centrifugal fans. At the very low system pressures involved, axial fans can also accommodate the large flow rates required, and their motors are efficient, quiet and simple to speed-control.
Lessons & Results
The power consumed by a fan is approximately proportional to the cube of its speed. Data centres require redundancy of N+1, 2N or 2(N+1), so equipment is operated at part capacity. By controlling all of the EC fans as a group – and reducing the air flow rate to that required by the IT equipment – substantial reductions in consumed fan power can be achieved, producing remarkable efficiencies.
On average, 1MW of IT equipment requires an airflow of 90m³/s at compliant temperatures. A ventilation system based on EC axial fans can support 1MW of cooling for approximately 40kW of fan energy use, adding just 0.04 to the PUE of the data centre. If – as in the case of Hydro66 – this is used in conjunction with a rotary UPS solution (a flywheel driven by an electric motor), where losses are below 1%, a PUE of 1.05 is attained. Since the data centre has both redundancy and spare capacity, the ventilation rate is reduced and further savings are made. For example, running a fan at 80% speed reduces its energy use by roughly half and, at 50% speed, to 12.5%.
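The cube-law arithmetic above can be checked with a short sketch; the function names are illustrative, and the figures are those quoted in the text:

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Fan affinity law: power varies with the cube of speed."""
    return speed_fraction ** 3

def pue_contribution(fan_kw: float, it_kw: float) -> float:
    """Addition to PUE from fan energy relative to IT load."""
    return fan_kw / it_kw

# 80% speed -> ~51% power (roughly half); 50% speed -> 12.5% power.
print(fan_power_fraction(0.8))
print(fan_power_fraction(0.5))  # 0.125

# 40kW of fan power supporting 1MW (1,000kW) of IT adds 0.04 to PUE.
print(pue_contribution(40, 1000))  # 0.04
```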
Hydro66 uses an intelligent control system to continuously optimise fan energy use to reflect actual cooling requirements in a dynamic environment. On warmer days, adiabatic cooling is enabled, bringing the supply air down to approach the wet-bulb temperature of the ambient air. In Boden, this means the supply air will never exceed 22°C, which is compliant with all standards without the need for additional mechanical refrigeration. Adiabatic cooling increases the moisture content of the air while reducing its dry-bulb temperature, so increasing its relative humidity.
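How closely the supply air approaches the wet-bulb temperature is set by the pad's saturation effectiveness. The standard relationship can be sketched as below; this is a generic evaporative-cooling formula, not Hydro66's control algorithm, and the 0.85 effectiveness and the example temperatures are assumptions for illustration only:

```python
def evap_supply_temp(dry_bulb_c: float, wet_bulb_c: float,
                     effectiveness: float = 0.85) -> float:
    """Direct evaporative cooling: supply temperature approaches the
    wet-bulb temperature. 'effectiveness' is the pad's saturation
    effectiveness (0.85 assumed here as a typical wetted-pad figure).
    """
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

# Hypothetical warm day: 30°C dry bulb, 19°C wet bulb.
# The supply approaches, but does not reach, the wet bulb (~20.7°C).
print(evap_supply_temp(30.0, 19.0))
```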
With reference to the ASHRAE 2011 Thermal Guidelines, high relative humidity (RH) will normally only cause corrosion in combination with other contaminants in the air. If gaseous sulphur or chlorine compounds are present in the ambient air, these, combined with high RH, can cause corrosion. Boden has ‘clean’ air because there are no local industries producing such contaminants.
The combination of high RH and dust or particulates can also create problems, so all incoming and recirculating air is filtered. In relatively clean conditions such as those in Boden, EU4 is a suitable level of filtration. Increasing this can result in significant increases in capital cost, maintenance requirements and fan energy use.
A direct fresh-air system operating in arctic conditions at the coldest time of the year can result in very low RH in the data centre. Low RH, in conjunction with other factors, can cause problems with electrostatic discharge (ESD), which can damage IT equipment. The Hydro66 cooling system therefore incorporates a recirculation loop where – in low-RH conditions – the warm air from the data centre is passed over the adiabatic pads to humidify it above the ASHRAE 2011 Thermal Guidelines’ minimum allowable level of 20% RH. This novel solution uses the adiabatic pads for two functions – cooling in hot weather and humidification in cold weather.
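The dual role of the pads can be summarised as a simple mode selection. The sketch below is an assumed simplification of the control logic described above – the thresholds and mode names are illustrative, not Hydro66's actual setpoints:

```python
def pad_mode(supply_temp_c: float, room_rh_pct: float,
             temp_limit_c: float = 25.0, rh_min_pct: float = 20.0) -> str:
    """Illustrative dual-role pad logic: wet the pads to cool when the
    ambient air is too warm; wet them on the recirculation path to
    humidify when room RH falls below the ASHRAE lower limit;
    otherwise run dry fresh-air ventilation.
    """
    if supply_temp_c > temp_limit_c:
        return "evaporative cooling"
    if room_rh_pct < rh_min_pct:
        return "recirculate and humidify"
    return "dry ventilation"

print(pad_mode(28.0, 45.0))  # evaporative cooling
print(pad_mode(18.0, 12.0))  # recirculate and humidify
print(pad_mode(20.0, 40.0))  # dry ventilation
```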
Hydro66 has constructed a low capital cost, flexible data centre, which has achieved a PUE of less than 1.05. The direct fresh-air cooling system complements the commercial strategy with a modular system that supports this progressive development.