Essential data centre cooling systems explained

An overview of key components and how they work together.

Photo Credit: Paul Mah

How do data centres keep cool? This miniature model shows all the key components you need to know about.

Keeping data centres cool

Keeping data centres cool is critical. Servers, networking equipment and GPUs emit a lot of heat that must be dissipated, or hardware will overheat, fail or crash.

In 2023, a mistake at the Equinix SG3 data centre interrupted the flow of chilled water. The result? Servers in affected sections failed en masse.

This triggered a catastrophic failure at DBS Bank, downing services such as ATMs, digital payments, credit cards and online banking - an outage that took many hours to resolve.

Chilled liquid and measuring heat

Did we just mention chilled water? Yup! It's cheap, readily available, and extensively used in most data centres, where it's circulated to absorb and remove heat. More on this below.

You might have noticed a strong emphasis on kilowatts (kW) per rack for tracking both electricity consumption and cooling needs.

Why use kW to measure cooling requirements? Because nearly all electricity used by IT systems turns into heat. So, knowing a rack's power draw tells you exactly how much cooling is needed.
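This one-to-one relationship makes the maths simple. Here's a minimal sketch (the 10 kW rack is a hypothetical figure) that takes a rack's power draw and expresses the matching cooling load in the units HVAC engineers commonly use:

```python
def cooling_load(rack_power_kw: float) -> dict:
    """Nearly all IT power becomes heat, so cooling load ~= power draw."""
    BTU_PER_KW = 3412.14  # 1 kW is roughly 3,412 BTU/hr
    KW_PER_TON = 3.517    # 1 ton of refrigeration is roughly 3.517 kW
    return {
        "kw": rack_power_kw,
        "btu_per_hr": rack_power_kw * BTU_PER_KW,
        "tons": rack_power_kw / KW_PER_TON,
    }

# Hypothetical example: a 10 kW rack.
load = cooling_load(10.0)
print(f"{load['kw']:.0f} kW -> {load['btu_per_hr']:,.0f} BTU/hr "
      f"(~{load['tons']:.1f} tons of refrigeration)")
```

A 10 kW rack therefore needs roughly 10 kW of cooling - about 34,000 BTU/hr, or just under three tons of refrigeration.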

Flow of chilled water

Data centre cooling is typically two-stage: mechanical plant systems to produce chilled water, and in-room cooling equipment within data halls.

Plant systems
⇩ ⇧
In-room cooling equipment

Chilled water from the plant systems is at its coldest when it enters the data hall. After absorbing heat, the water returns warmer, ready to be re-cooled.
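The heat load and the supply/return temperature difference together determine how much water must flow. A rough sketch, using the standard Q = m&#775; x cp x dT relationship (the 100 kW load and 7°C/13°C temperatures are hypothetical figures for illustration):

```python
def chilled_water_flow(heat_kw: float, delta_t_c: float) -> float:
    """Mass flow of water (kg/s) needed to absorb heat_kw given a
    supply-to-return temperature rise of delta_t_c, from Q = m_dot * cp * dT."""
    CP_WATER = 4.186  # kJ/(kg*K), specific heat of water
    return heat_kw / (CP_WATER * delta_t_c)

# Hypothetical: a 100 kW data hall, 7C supply and 13C return water.
flow = chilled_water_flow(100.0, 13.0 - 7.0)
print(f"~{flow:.1f} kg/s of chilled water (roughly the same in L/s)")
```

Note the trade-off this exposes: a wider temperature difference means less water needs to be pumped for the same heat load.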

Key components

Let's take a deeper look at key components.

(a) Plant systems

  • Chiller: Removes heat via mechanical refrigeration.
  • Cooling tower: Removes heat by evaporation.

In places like Southeast Asia, many data centres use both chillers and cooling towers in tandem to cool water to the desired temperature.

It's possible to go with chillers only or cooling towers only - but it gets complicated quickly. There are also systems like StatePoint Liquid Cooling, which is in a category of its own.
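One reason chillers and cooling towers work in tandem: the tower must reject not just the IT heat but also the work the chiller's compressor adds. A minimal sketch, assuming a simple efficiency model based on the chiller's coefficient of performance (COP); the 500 kW load and COP of 5 are hypothetical figures:

```python
def tower_heat_rejection(it_load_kw: float, chiller_cop: float) -> float:
    """Heat (kW) a cooling tower must reject: the IT load plus the
    chiller's compressor work, i.e. Q_reject = Q_it * (1 + 1/COP)."""
    return it_load_kw * (1 + 1 / chiller_cop)

# Hypothetical: 500 kW of IT load through a chiller with COP 5.
print(f"~{tower_heat_rejection(500.0, 5.0):.0f} kW rejected at the tower")
```

In this sketch, 500 kW of IT load becomes roughly 600 kW of heat at the tower - the more efficient the chiller (higher COP), the smaller that overhead.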

(b) In-room cooling

  • CRAC: Cools data halls by blowing cold air in.
  • CDU: Distributes liquid coolant for liquid cooling.

The chilled water is piped to in-room cooling equipment such as CRAC units, which do the actual cooling of rack-mounted IT equipment.
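The same heat-balance idea applies to the air a CRAC moves: the rack's heat load and the air temperature rise determine the airflow required. A rough sketch (the 10 kW rack and 20°C/32°C air temperatures are hypothetical figures):

```python
def crac_airflow(heat_kw: float, air_delta_t_c: float) -> float:
    """Volumetric airflow (m^3/s) a CRAC must move to carry away
    heat_kw with an air temperature rise of air_delta_t_c."""
    RHO_AIR = 1.2   # kg/m^3, approximate density of air
    CP_AIR = 1.005  # kJ/(kg*K), specific heat of air
    return heat_kw / (RHO_AIR * CP_AIR * air_delta_t_c)

# Hypothetical: a 10 kW rack, air in at 20C, out at 32C.
flow_m3s = crac_airflow(10.0, 32.0 - 20.0)
print(f"~{flow_m3s:.2f} m^3/s (~{flow_m3s * 2118.88:.0f} CFM)")
```

Because air carries far less heat per unit volume than water, the volumes involved are large - which is part of why high-density racks push operators toward liquid cooling.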

With liquid cooling, Coolant Distribution Units (CDUs) manage a closed secondary loop that circulates coolant directly to individual servers.

And that's all I can squeeze into this post. Comments or questions?