The Green Grid Backs Warmer Data Centers

In a white paper released this week, The Green Grid makes the case that servers are indeed tougher than widely believed, and counsels data center operators to consider raising temperatures and humidity just a little more to save on cooling.

Are servers tougher than widely believed? Can providers up the temperatures and humidity just a little more to save on cooling? Relaxing tight control over temperature and humidity means less power spent cooling the data center; but historically, higher temperatures have been viewed as potentially detrimental to equipment reliability and service availability.

In a white paper released this week, The Green Grid makes the case that servers are indeed tougher than widely believed. The Green Grid is perhaps the most prominent industry group on the energy efficiency front, with a goal of improving the resource efficiency of IT and data centers worldwide. It also has developed and championed Power Usage Effectiveness (PUE), the leading metric for data center power efficiency.
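
As a quick refresher, PUE is simply the ratio of total facility energy to the energy that actually reaches the IT equipment, so a value of 1.0 would mean zero overhead. A minimal sketch of the arithmetic, using hypothetical numbers:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    A PUE of 1.0 would mean every watt goes to IT gear; real values run
    higher because of cooling, power distribution, and lighting overhead.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: 1,800 MWh total facility use, 1,200 MWh to IT.
print(pue(1_800_000, 1_200_000))  # 1.5
```

Cooling is typically the largest contributor to the overhead above 1.0, which is why raising temperatures targets it directly.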

Drawing on recently published data from the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), The Green Grid provides guidance to its members based on the influential ASHRAE recommendations and examines some of the reasons users resist higher temperatures. It’s an attempt to dispel misconceptions about cranking up the temps a bit. The paper asserts that data center efficiency can be further improved by employing a wider operational range without substantive impact on equipment reliability or service availability.

Questioning historic practices

The environment inside a data center has historically been tightly controlled. However, rising energy costs and carbon taxation, coupled with equipment advances, have many questioning these previously accepted safe operating ranges. For many years, a band between 20°C and 22°C (68°F to 72°F) has been considered the optimal operating temperature.

Just how old is this optimal range? Think punch cards. Yes, the report finds evidence that this operating range was initially selected because it would keep punch cards from becoming unusable. “Historically, many of these perceived tight thermal and humidity tolerances were based on data center practices dating back to the 1950s,” says the report. In all this time, “possible effects have rarely been quantified and analyzed.”

The first edition of the ASHRAE guidelines, in 2004, set a recommended temperature upper limit of 77 degrees Fahrenheit. The second edition, in 2008, raised that upper limit to 81 degrees. ASHRAE previously defined two operational ranges: “Recommended” and “Allowable.” When Data Center Knowledge last covered ASHRAE, we mentioned that some in the data center industry have asserted that ASHRAE TC 9.9 wasn’t moving fast enough to recognize the potential gains from higher temperatures. The Green Grid report points out that many vendors support temperature and humidity ranges wider than the ASHRAE 2008 allowable range, and it outlines the potential gains.

ASHRAE is the “agreed-upon intersection between multiple vendors,” says the report. The new ASHRAE 2011 class definitions expand well beyond “Recommended” and “Allowable,” adding two new classes with higher allowable operating temperatures (classes A1 through A4, with A1 and A2 corresponding to the two previous classes).
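
For reference, the 2011 allowable dry-bulb ranges are roughly 15–32°C for A1, 10–35°C for A2, 5–40°C for A3, and 5–45°C for A4. A minimal sketch of a lookup against those published boundaries (the helper function itself is hypothetical):

```python
# ASHRAE 2011 allowable dry-bulb temperature ranges (deg C) per class.
# A1 and A2 correspond to the two classes defined in earlier editions.
ASHRAE_2011_ALLOWABLE_C = {
    "A1": (15.0, 32.0),
    "A2": (10.0, 35.0),
    "A3": (5.0, 40.0),
    "A4": (5.0, 45.0),
}

def classes_allowing(temp_c: float) -> list[str]:
    """Return the ASHRAE 2011 classes whose allowable range covers temp_c."""
    return [name for name, (lo, hi) in ASHRAE_2011_ALLOWABLE_C.items()
            if lo <= temp_c <= hi]

print(classes_allowing(33.0))  # ['A2', 'A3', 'A4'] -- above A1's 32 C ceiling
```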

Guidance, Not Blanket Statements

The wisdom of raising the thermostat really depends on your data center. There is wide variability among vendors’ supported thermal and humidity ranges, which presents a challenge in data centers with products from multiple vendors – i.e., most of them. Vendor support for these ranges alone is not enough: operators cite a lack of clarity, even though new equipment often supports a wider temperature and humidity range, and there has been little good data on the implications of running in the higher “allowable” range (as opposed to “recommended”). Few users are in a position to quantify the risks or impact of using allowable or even recommended ranges, so the prevailing approach has been conservative. Instead, most efficiency gains have come through better airflow management rather than reduced cooling.

The ultimate conclusion is that there is a wide range of cases where efficiencies are not being captured. Looser environmental controls, within the wider vendor-established temperature and humidity limits, can mean big savings with no substantial loss of reliability or availability. Industry improvements in IT equipment efficiency have made this possible. Some emerging IT solutions are specifically designed to operate at higher temperatures with little or no increase in server fan energy consumption.
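
To put rough numbers on “big savings”: a commonly cited rule of thumb is on the order of 4 percent cooling-energy savings per degree Fahrenheit of setpoint increase. The figure and the back-of-envelope function below are illustrative assumptions, not numbers from the report:

```python
def estimated_cooling_savings(annual_cooling_kwh: float,
                              setpoint_increase_f: float,
                              savings_per_degree_f: float = 0.04) -> float:
    """Rough cooling-energy savings estimate for a higher supply-air setpoint.

    The ~4% per degree F figure is a widely cited rule of thumb, not a
    guarantee; actual savings depend on climate, economizer hours, and
    the cooling plant. Treat this as a back-of-envelope sketch.
    """
    fraction = min(savings_per_degree_f * setpoint_increase_f, 1.0)
    return annual_cooling_kwh * fraction

# Hypothetical: 500 MWh/year of cooling, setpoint raised 4 F -> ~80 MWh saved.
print(estimated_cooling_savings(500_000, 4))  # 80000.0
```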

So archaic boundaries for temperature and humidity have generally remained the rule, despite these advances. Data center operators have been reluctant to change, despite increasing evidence that higher temperatures can be both safe and cost-saving.

Cloud Computing is Driving Changes

So why now? The paper cites cloud computing as one major driver: its rise has been “the catalyst for innovation across the whole spectrum of IT activities.”

“[Cloud services] triggered a radical change in the way IT services are being delivered, and they have changed underlying cost structures and cost-benefit approaches in IT service delivery,” says the Green Grid report. Early cloud innovators see data centers as a major element of service delivery costs. “Any opportunity to reduce overhead and facility costs can reduce net unit operating costs and potentially enable the data center to operate more economically.”

Change is afoot. The Green Grid reports that it has observed a slow but steady increase in the adoption of wider operating envelopes. One example given is Deutsche Bank, which recently built a production data center in the New York market capable of handling nearly 100 percent of its cooling load with year-round air-side economization. Through a combination of facilities innovations and a willingness to operate IT equipment across an expanded environmental range, the bank can cool the facility without mechanical cooling at least 99 percent of the time.
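
To make the economization idea concrete, here is a deliberately simplified, hypothetical sketch of the control decision: admit outside air whenever it is at or below the supply setpoint, and fall back to mechanical cooling otherwise. Real economizer controls also weigh humidity, dew point, and air quality:

```python
def cooling_mode(outside_air_c: float, supply_setpoint_c: float) -> str:
    """Pick a cooling mode for one interval (simplified: temperature only)."""
    if outside_air_c <= supply_setpoint_c:
        return "air-side economizer"
    return "mechanical cooling"

# Hypothetical hourly outside-air temperatures (deg C) for one day:
hourly_temps = [12, 11, 10, 10, 9, 9, 10, 13, 16, 19, 22, 25,
                27, 28, 28, 27, 25, 22, 19, 17, 15, 14, 13, 12]

# Raising the allowable supply temperature widens the economizer window:
for setpoint in (22, 27):
    free_hours = sum(1 for t in hourly_temps
                     if cooling_mode(t, setpoint) == "air-side economizer")
    print(f"setpoint {setpoint} C: {free_hours}/24 hours on free cooling")
```

The toy numbers only illustrate the trend: the higher the allowable supply temperature, the more hours of the year the chillers can stay off.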

The report is chock-full of information. It’s an incremental but positive step toward energy savings through raising the temperature a bit.
