[Image: Sets of cooling towers at a data center building. Credit: wei cao / Alamy Stock Photo]

Data Centers Are Losing Their Cool

Are data centers really losing their cool? The latest findings from the State of the Data Center report showcase some fascinating stats. Learn how data centers are getting their cool back!

Now that the title above has your attention, I want to be very clear: I still think data centers are super cool! But data centers being incredible isn't entirely what we're here to discuss. I've had the chance to review the latest AFCOM State of the Data Center results. You can expect the full report in about a month; I'm still writing it. In the meantime, I want to share some key findings worth thinking about now.

In the report, we retired some old questions and added new ones. This year, we asked about cooling and cooling capacity, and the results were indeed interesting. When asked about the current state of their cooling capacity, only 46% of respondents said their current cooling solution meets all of their requirements. Hang on a sec.

If only 46% say their cooling meets every requirement, then the other 54%, a majority, are falling short somewhere. Does that mean that most respondents are running out of cooling capacity?

Depending on whom you ask, it's a mixed answer. Based on conversations with data center leaders and analysis of cooling capacity versus cooling load, problems with cooling effectiveness are often the result of poor cooling resource management rather than a lack of cooling capacity.  
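To make that distinction concrete, here's a minimal Python sketch of the kind of capacity-versus-load comparison described above. The halls and kilowatt figures are entirely hypothetical; the point is that a low utilization ratio signals stranded or poorly managed capacity rather than a true shortage of cooling.

```python
# Hypothetical illustration: compare installed cooling capacity to IT load.
# Low utilization paired with hot spots usually indicates an airflow
# management problem, not a lack of cooling capacity.

rooms = {
    # room: (cooling_capacity_kw, it_load_kw), made-up numbers
    "Hall A": (1200, 540),
    "Hall B": (800, 690),
    "Hall C": (500, 475),
}

for room, (capacity_kw, load_kw) in rooms.items():
    utilization = load_kw / capacity_kw
    if utilization > 0.9:
        status = "near capacity: consider load balancing or added cooling"
    else:
        status = "headroom exists: look at airflow and containment first"
    print(f"{room}: {utilization:.0%} of cooling capacity in use; {status}")
```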

"With this growing awareness, data center leaders are becoming much more creative about managing density and improving airflow management," says Lars Strong, Senior Engineer and Company Science Officer at Upsite Technologies. "However, demand in this industry is robust. So, leaders must find ways to ensure optimal cooling capacity and airflow." 

The fascinating thing about increasing cooling capacity is the creativity we're seeing from data center leaders. More are looking at converged systems, improving airflow with unique containment solutions, and even turning to liquid cooling to help offset traditional cooling requirements. Let's stay on some of these points for a minute. 

Bringing the Cool Back to Data Centers 

Creativity in digital infrastructure is the means to get ahead in this industry. So, it's essential to consider what leaders are doing to offset cooling challenges and overcome capacity issues.  

First, liquid cooling is gaining momentum. In the AFCOM State of the Data Center report, we saw that about 40% of data center respondents had adopted liquid cooling, and almost half of vendor respondents reported seeing liquid cooling adopted in their customers' facilities.

According to a recent Data Center Knowledge post, enthusiasm for liquid cooling is being driven by data center operators, particularly cloud service providers. Revenue for the liquid cooling market could top $3 billion by 2026, growing at a 50.4% compound annual growth rate (CAGR) from 2021 to 2026.

That's according to the Data Center Thermal Management Market Analysis - 2022 report issued by Omdia (whose parent company, Informa, also has Data Center Knowledge in its portfolio of publications).
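As a back-of-the-envelope check on those numbers, here's a minimal Python sketch. It assumes the 50.4% CAGR applies across the full 2021-to-2026 window (my reading of the projection, not Omdia's stated methodology), which implies a 2021 base of roughly $390 million.

```python
# Back-of-the-envelope check on the projection above (illustrative only):
# if the liquid cooling market tops $3B in 2026 after growing at a 50.4%
# CAGR from 2021, the implied 2021 base is revenue_2026 / (1 + cagr)^years.

cagr = 0.504
years = 5             # 2021 -> 2026
revenue_2026 = 3.0e9  # $3 billion

implied_2021_base = revenue_2026 / (1 + cagr) ** years
print(f"Implied 2021 market size: ${implied_2021_base / 1e6:.0f}M")  # ~$390M

# Year-by-year projection from that implied base
revenue = implied_2021_base
for year in range(2021, 2027):
    print(f"{year}: ${revenue / 1e9:.2f}B")
    revenue *= 1 + cagr
```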

"Fueled by the need to run compute-intensive workloads, cloud service providers' investment has had a particularly positive impact on the market's growth potential," said Dr. Moises Levy, senior principal analyst for data center power and cooling for London-based Omdia. As it relates to liquid cooling, remember these two points: 

But maybe you're not quite ready to dive into liquid cooling. Are there other options? You bet. 

A Rising Temp Helps All Servers 

Very recently, Equinix announced that it would be raising the temperature of its facilities to 80 degrees Fahrenheit. Was this a smart move? Some data center experts say the change might damage equipment. However, that risk is much less of a concern when you keep your gear up to date. Remember, new infrastructure hardware can operate just fine within the range recommended by ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers): 64.4 to 80.6 degrees Fahrenheit (18 to 27 degrees Celsius).
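For reference, that recommended envelope is easy to sanity-check in code. Here's a minimal illustrative helper; the bounds come from the ASHRAE range quoted above, while the function itself is just a demonstration, not anything from ASHRAE or Equinix.

```python
# Check whether a setpoint falls inside the ASHRAE recommended envelope
# (18-27 C, i.e., 64.4-80.6 F) quoted above. Illustrative helper only.

ASHRAE_MIN_C, ASHRAE_MAX_C = 18.0, 27.0

def c_to_f(celsius: float) -> float:
    return celsius * 9 / 5 + 32

def within_ashrae_recommended(temp_f: float) -> bool:
    return c_to_f(ASHRAE_MIN_C) <= temp_f <= c_to_f(ASHRAE_MAX_C)

# Equinix's announced 80 F setpoint sits just inside the 80.6 F upper bound.
print(within_ashrae_recommended(80.0))  # True
print(within_ashrae_recommended(82.0))  # False
```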

"We are aligning with ASHRAE because its standard range has been extensively tested and proven safe. The ASHRAE A1 Allowable (A1A) is an internationally recognized standard operating envelope for mission-critical IT equipment. It has been widely adopted by IT equipment manufacturers since 2011," Equinix told Data Center Knowledge

Equinix clearly believes in the concept. By increasing the temperature at each of its 240-plus data centers, the company said it would reduce energy consumption by 10% in some locations, decrease its carbon impact, and help its customers reach their supply chain sustainability goals.

"We are anticipating a long-term benefit in reduction of infrastructure energy costs from efficiency improvements," Equinix told Data Center Knowledge. "Our actions to improve energy efficiency in the data center should help further insulate our customers against a volatile energy market and advance their sustainability goals." 

Here's to a Cooler Tomorrow 

Part of the challenge in achieving greater cooling capacity revolves around supply chain constraints. While we hope those issues will ease soon, data center leaders still face cooling problems today.

It's a big reason why data center operators are examining existing systems to gauge their efficiency. Just because a piece of infrastructure works doesn't mean it's bringing you any value. Legacy infrastructure is one of the most significant barriers to adopting more efficient systems that require less cooling capacity. Here's what Omdia's Dr. Levy thinks on the topic:

"Changes to existing environmental conditions – temperature, humidity, airflow – need to consider the IT equipment requirements," Dr. Levy told Data Center Knowledge. "Legacy IT equipment may have different operating environmental conditions established by the manufacturer when compared to newer technologies. Compliance with the manufacturer's temperature guidelines is important to guarantee the desired equipment's performance and lifespan."

I recall recently speaking with a major university on this very topic of raising temperatures. Deep into our conversation, they mentioned that most of their gear was seven to nine years old. When we examined the infrastructure, we found that almost none of their equipment could handle higher temperatures. So they went back to the drawing board on how to improve density and efficiency.

Bill Kleyman, Data Center Knowledge contributing editor, and Data Center World program chair 


As you look at your systems, take a moment to understand the impact of simply having hardware that works. There's a good chance you're paying far too much to keep those servers and storage arrays operational. And when coupled with savings from power, cooling, and density, the efficiency gains from new infrastructure can offset the purchase price of new servers, networking gear, and storage.

If you're scrambling to find ways to cool down your data center, get creative. Some will look to better containment solutions or simple airflow modifications to improve how cooling is delivered. Others will need to take a harder look at their digital infrastructure to determine the best course of action to deliver cooling capacity.
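To put that ROI argument in concrete terms, here's a simple payback sketch. Every figure is invented purely for illustration; substitute your own costs and savings.

```python
# Hypothetical payback calculation for a hardware refresh: annual savings
# from power, cooling, and density gains versus the upfront purchase price.
# All figures are made up for illustration.

new_gear_cost = 250_000          # upfront spend on servers/network/storage
annual_power_savings = 60_000    # newer gear draws less power
annual_cooling_savings = 35_000  # less heat to reject
annual_density_savings = 30_000  # fewer racks, less floor space

annual_savings = (annual_power_savings
                  + annual_cooling_savings
                  + annual_density_savings)
payback_years = new_gear_cost / annual_savings
print(f"Payback period: {payback_years:.1f} years")  # 2.0 years
```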
