5 Data Center Cooling Mistakes to Avoid

There's more to data center cooling than simply blowing air around to cool equipment. Here's how to avoid five common data center cooling oversights.

Christopher Tozzi, Technology Analyst

September 13, 2023

5 Min Read

Cooling your data center may seem simple enough. You install conventional HVAC equipment, blow air through your data center, and call it a day, right?

Well, not necessarily. Data center cooling is a complex topic, and it can be easy to make mistakes that reduce the efficiency and effectiveness of cooling systems.

With that reality in mind, here's a look at common data center cooling oversights and tips on how to avoid them.

1. Settling for Air Cooling

Air cooling — which means circulating air inside a data center to dissipate heat from servers and other equipment — is the traditional way to cool a data center. It's also the simplest and, in terms of upfront cost, the cheapest, because air-circulation equipment is relatively inexpensive to install.

But there is an alternative to air cooling you should also consider: liquid cooling, a method that uses fluids to dissipate heat. In fact, liquid cooling can be up to 10 times more effective at moving heat from data center equipment than air. The downside is that liquid cooling systems are considerably more expensive to install and more complicated to maintain.

Thus, when determining whether liquid cooling is right for you, you need to factor in your budget, as well as how much heat your data center hardware generates and how quickly you need to dissipate it.
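To make that tradeoff concrete, the quick Python sketch below estimates per-rack airflow using a common sensible-heat rule of thumb (roughly CFM = 3.16 x watts / temperature rise in degrees F). The 10 kW rack and 25-degree rise are illustrative assumptions, not figures from any particular facility.

    # Rough airflow estimate using the common sensible-heat rule of
    # thumb: CFM ~= 3.16 * watts / delta-T (in degrees F). The rack
    # wattage and temperature rise below are illustrative assumptions.

    def required_cfm(heat_load_watts: float, delta_t_f: float) -> float:
        """Approximate airflow (cubic feet per minute) needed to carry
        away heat_load_watts at a given intake-to-exhaust rise."""
        return 3.16 * heat_load_watts / delta_t_f

    rack_watts = 10_000  # a hypothetical 10 kW rack
    delta_t_f = 25       # assumed 25 degree F rise from intake to exhaust

    print(f"{required_cfm(rack_watts, delta_t_f):,.0f} CFM per rack")
    # ~1,264 CFM; airflow demands like this are one reason dense racks
    # push operators toward liquid cooling.

As per-rack power density climbs, the airflow needed to keep up grows in step, which is part of why high-density racks often end up on liquid cooling.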


The point is that it's a mistake to assume that air cooling is the only solution available. Be sure to look into liquid cooling systems, too, when planning how to cool your data center.

2. Placing Too Many Servers in Each Rack

If you use air cooling, the ability of air to circulate inside server racks is critical for efficient heat dissipation. Cramming too many servers into each rack could impede air circulation.

For this reason, think about the cooling impact before filling each rack to full capacity. While you also want to avoid wasting too much rack space, leaving a few open slots (especially if they are distributed throughout the rack) helps prevent constrictions that impede airflow. Aiming for 85% to 90% rack space utilization is a reasonable goal.
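As a back-of-the-envelope illustration of that math, here's a short Python sketch; the 42U rack and 2U server sizes are assumptions chosen for the example:

    # A minimal sketch of the utilization math described above. The
    # 42U rack and 2U server height are assumptions for illustration.

    RACK_UNITS = 42            # standard full-height rack
    SERVER_HEIGHT_U = 2        # hypothetical 2U servers
    TARGET_UTILIZATION = 0.90  # the ~90% ceiling suggested above

    max_servers = int(RACK_UNITS * TARGET_UTILIZATION) // SERVER_HEIGHT_U
    open_units = RACK_UNITS - max_servers * SERVER_HEIGHT_U

    print(f"Install up to {max_servers} servers, leaving {open_units}U open")
    # 18 servers and 6U open; ideally spread the open slots through the
    # rack rather than clustering them at the top or bottom.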

3. Suboptimal Rack Placement

The arrangement of server racks on a data center floor also impacts air cooling efficiency in a major way. There are different ways to optimize rack placement for cooling purposes, and the best one for you depends on the extent to which your data center facility can contain airflow.


The traditional strategy for optimizing cooling efficiency is known as hot aisle/cold aisle. Under this approach, rows of racks are arranged so that the fronts of servers face each other across "cold" aisles, where chilled air is supplied, while the backs of servers, where hot exhaust air typically exits, face each other across "hot" aisles.

Hot aisle/cold aisle on its own is usually the best approach if hot air is simply absorbed into the data center facility as a whole. But if you have built air containment into your facility, you can enclose the hot aisles so that exhaust air is directed into closed-in spaces and removed there, rather than mixing back into the room.

The point here is that you must think about the design of your overall data center facility to determine how to place your racks. If you have a wide open space, a hot aisle/cold aisle rack layout is the best way to manage cooling, but more advanced data centers offer air containment methods that provide more efficient alternatives to hot aisle/cold aisle arrangements.
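If it helps to visualize the pattern, the toy Python sketch below prints a schematic of alternating rack rows. It's purely illustrative, not a layout tool:

    # A purely schematic sketch of the hot aisle/cold aisle pattern:
    # each rack row is flipped relative to its neighbors so that
    # intakes (fronts) share cold aisles and exhausts (backs) share
    # hot aisles.

    def print_hot_cold_layout(rows: int = 4) -> None:
        for i in range(rows):
            # Aisles alternate; a cold aisle precedes the first row.
            print("=== COLD aisle (supply) ===" if i % 2 == 0
                  else "=== HOT aisle (exhaust) ===")
            # Even rows face "up" (front toward the aisle just printed);
            # odd rows face "down" (back toward the aisle just printed).
            side_up = "front" if i % 2 == 0 else "back"
            side_down = "back" if i % 2 == 0 else "front"
            print(f"    rack row {i + 1}: {side_up} up / {side_down} down")
        print("=== COLD aisle (supply) ===" if rows % 2 == 0
              else "=== HOT aisle (exhaust) ===")

    print_hot_cold_layout()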

4. Lack of Cooling Data

The way you expect your cooling systems to perform when you design them may not align with how they actually perform. But unless you track cooling performance on an ongoing basis, you won't know it.

That's why it's important to monitor temperatures across your data center. Collecting temperature data from many locations within the facility allows you to pinpoint bottlenecks for heat dissipation — which could occur if, for example, a constriction inside a server rack is causing hot air to collect in a place where it shouldn't. Temperature monitoring also helps identify instances where equipment failure (like a broken fan) is causing cooling inefficiencies.
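As a minimal sketch of what such monitoring logic might look like, here's some Python with made-up readings and assumed thresholds standing in for whatever your DCIM or sensor platform actually provides:

    # A minimal monitoring sketch. Data collection is left out; the
    # readings dict and thresholds below are hypothetical stand-ins
    # for whatever your DCIM or sensor platform provides.

    from statistics import mean

    ALERT_F = 90.0         # assumed per-location alert threshold (deg F)
    SPREAD_ALERT_F = 15.0  # assumed max acceptable spread across the room

    def check_readings(readings: dict) -> None:
        """Flag hot spots and uneven cooling across sensor locations."""
        for location, temp_f in readings.items():
            if temp_f >= ALERT_F:
                print(f"HOT SPOT: {location} at {temp_f:.1f} F")
        spread = max(readings.values()) - min(readings.values())
        if spread > SPREAD_ALERT_F:
            print(f"Uneven cooling: {spread:.1f} F spread vs. "
                  f"{mean(readings.values()):.1f} F average; check for "
                  f"airflow constrictions or failed fans")

    # Example with made-up readings:
    check_readings({"rack-12-top": 94.2, "rack-12-mid": 78.5,
                    "rack-07-top": 76.9, "cold-aisle-3": 68.0})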

5. Forgetting About the Roof

Your data center's roof might not seem like an important consideration for data center cooling, but it is. The color and material of the roof impact cooling efficiency, especially in regions where outside temperatures and exposure to the sun are high.

Optimizing the roof for cooling purposes is less important than measures like optimizing server rack layouts, but the roof should nonetheless be on your list of items to consider when planning a cooling strategy.

Conclusion

Data center cooling is more complicated than it often appears. To do it right, you must consider a variety of factors — like which type of cooling system to use, how to arrange equipment inside your data center, and how to collect data about cooling performance. Simply blowing air around to cool equipment might get the job done, but probably not in the most cost-effective or energy-efficient way.

About the Author

Christopher Tozzi

Technology Analyst, Fixate.IO

Christopher Tozzi is a technology analyst with subject matter expertise in cloud computing, application development, open source software, virtualization, containers and more. He also lectures at a major university in the Albany, New York, area. His book, “For Fun and Profit: A History of the Free and Open Source Software Revolution,” was published by MIT Press.

