ASHRAE: Data Centers Can be Even Warmer
February 22nd, 2011 By: John Rath
The leading industry group for heating and cooling professionals, ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers), says that it will continue to widen the temperature and humidity ranges for servers through a soon-to-be-published third edition of the datacom book, “Thermal Guidelines for Data Processing Environments.”
For some time now, leading players in the data center industry have been raising the temperatures in their data centers, saving hundreds of thousands of dollars in cooling costs in the process. The list of companies singing the praises of savings through higher baseline temperature settings in the data center includes Google, Intel, Microsoft and HP.
History of the Issue
Data center management has advanced quite a bit since 2004, when the first edition of “Thermal Guidelines” was published by ASHRAE’s Technical Committee 9.9, which examines the needs of data center and telecom facilities. The first edition of the datacom book established a recommended temperature upper limit of 77 degrees Fahrenheit, promoting the use of higher temperatures with the endorsement of all of the IT manufacturers. The second edition in 2008 raised the recommended upper limit to 81 degrees.
ASHRAE’s press release did not indicate what the new recommended range will be, but predicts the new guide will be “as equally groundbreaking as the previous editions,” in that it will enable compressor-less cooling (all cooling through economizers) in many applications.
“The numbers are still being worked on,” said Don Beaty, founder of DLB Associates and active member of TC 9.9. “We know that there will be opportunities for equipment outside the current range but the details are still being finalized.
“Different locations, applications and business philosophies make it ineffective to force all equipment to be capable of the same high temperature tolerance (in some cases higher thresholds would negatively impact the return on investment),” Beaty added. “To address this, the third edition creates multiple server classes and therefore provides freedom of choice. This is particularly important since the thermal guidelines are used throughout the world.”
But some in the data center industry say ASHRAE will simply be acknowledging advances that are already taking place in some of the largest working data centers. “Most companies in the cloud business are already procuring servers that operate well outside of the ASHRAE specs to allow for aggressive economization to drive much greater efficiencies than what is achievable using the ASHRAE specs,” said Christian Belady, the General Manager of Data Center Research at Microsoft Global Foundation Services. “My guess is that they realize now that they are no longer driving the industry environmentals, and are now going to broaden to what the cloud providers have already made as the de facto standard.”
Belady was among those who encouraged ASHRAE to adopt a more aggressive revision of data center operating temperatures. “At the time I was arguing that (ASHRAE) should be leading the industry and drive vendors to broaden operating ranges well beyond what they ultimately published in 2009 so that the industry can aggressively adopt economization,” Belady said, adding that ASHRAE “elected to be conservative.”
ASHRAE’s embrace of economizers has not been without its rough spots. Last year ASHRAE experienced tension with leading data center operators when a proposal to establish fresh air cooling as the preferred method for cooling prompted an unusual joint statement opposing the proposal. Leading executives from Google, Microsoft, Amazon, Digital Realty Trust, DuPont Fabros Technology and Nokia (and later Facebook) urged ASHRAE to establish performance-based efficiency goals for data center cooling, rather than favoring one economizer methodology over others.
ASHRAE said that its committee that formulates its key guidelines had made a commitment to “work with TC 9.9 to develop appropriate performance criteria. … There is an ongoing effort to establish a working group with interested parties and stakeholders to ensure that appropriate technical expertise is available to generate such criteria.”
Rich Miller contributed to this story.
While ASHRAE TC9.9 wrestles with widening the recommended thermal envelope for data centers, it must pay very close attention to the total effect. Increasing temperatures to enhance cooling efficiency may prove to have an unintended and adverse effect on IT equipment energy consumption. Studies have been conducted that suggest a sharp increase in server fan power at about 77°F to 80°F. At or above these temperatures, server fans may dramatically increase airflow to maintain silicon junction temperatures. This increase in airflow comes with a significant increase in energy consumption. While the PUE may improve (infrastructure power went down while IT power went up), the total power consumption may actually increase.
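The arithmetic behind that caveat is worth spelling out. PUE is total facility power divided by IT power, so anything that shifts load from the cooling plant into the servers themselves (such as fans ramping up) flatters the ratio even when the building draws more power overall. A minimal sketch, using purely hypothetical kilowatt figures chosen to illustrate the effect:

```python
def pue(it_kw, infra_kw):
    """Power Usage Effectiveness: total facility power / IT power."""
    return (it_kw + infra_kw) / it_kw

# Hypothetical numbers for illustration only, not measured data.
# Before raising the temperature setpoint:
it_before, infra_before = 1000.0, 500.0   # kW
# After raising it: cooling load drops, but server fans spin up
# past the ~77-80F threshold and IT power rises.
it_after, infra_after = 1100.0, 450.0     # kW

print(round(pue(it_before, infra_before), 3))   # 1.5
print(round(pue(it_after, infra_after), 3))     # 1.409 -- PUE "improves"
# ...yet total facility power goes from 1500 kW to 1550 kW.
print(it_before + infra_before, it_after + infra_after)
```

The metric improves while the utility bill worsens, which is exactly the trap the comment above describes: PUE measures the split between IT and infrastructure power, not absolute consumption.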
The 90.1 Energy Standard does not specifically require fresh air economizers, and it does provide for other economizer architectures, including water-side systems. It is anticipated that there will continue to be pressure to move data centers from a prescriptive compliance path to a more performance-based path, without the complexity of the Energy Cost Budget Method.