
Google on Data Center Efficiency: 'Just Do It'

Projects using best practices to capture "low-hanging fruit" efficiency improvements were the theme of the Google European Data Center Efficiency Summit held Tuesday in Zurich, Switzerland.

A look at networking gear in a Google facility after the company completed an energy efficiency retrofit in which it installed clear curtains at the ends of the aisles to separate hot and cold airflow.

ZURICH - The state of affairs inside one of Google's network operating centers was "not very Googley." Cool air from floor tiles was rushing past the equipment and cooling the upper areas of the data center. Exhaust heat from one rack was venting near the intake for the computer room air conditioner (CRAC), warming the return air and making the CRAC believe the room was hotter than it was - causing it to waste energy as it over-cooled the room.
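
A minimal sketch of that failure mode, with a toy proportional controller and illustrative temperatures (none of these numbers or the controller gain come from Google):

```python
# Sketch of the over-cooling failure mode described above.
# All temperatures and the controller span are illustrative assumptions.

SETPOINT_C = 24.0        # CRAC return-air setpoint
TRUE_ROOM_C = 23.0       # actual room temperature: no extra cooling needed
RECIRC_BIAS_C = 4.0      # exhaust heat leaking into the CRAC return intake

def cooling_demand(sensed_return_c, setpoint_c, span_c=5.0):
    """Proportional cooling demand (0..1) based on sensed return-air temp."""
    error = sensed_return_c - setpoint_c
    return max(0.0, min(1.0, error / span_c))

clean = cooling_demand(TRUE_ROOM_C, SETPOINT_C)
skewed = cooling_demand(TRUE_ROOM_C + RECIRC_BIAS_C, SETPOINT_C)

print(f"demand with clean return air:     {clean:.0%}")   # 0%
print(f"demand with recirculated exhaust: {skewed:.0%}")  # 60%
```

The room needs no cooling at all, but the skewed sensor reading drives the CRAC to run anyway, wasting energy exactly as described above.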

While Google is renowned for its data center efficiency, this particular networking center needed work. "Almost everything about it was wrong," said Urs Hoelzle, Senior Vice President of Operations at Google. The company invested $25,000 in efficiency improvements, and expects to see savings of $67,000 a year from the project. A similar process was implemented at four other network equipment hubs around the world.
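
Those figures imply a strikingly short payback period; a quick back-of-the-envelope check:

```python
# Figures from the article: $25,000 invested, $67,000/year saved.
investment = 25_000
annual_savings = 67_000

payback_months = investment / annual_savings * 12
print(f"Simple payback: {payback_months:.1f} months")  # ~4.5 months
```

At that rate the retrofit pays for itself in under five months.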

Efficiency projects like these, which capture "low-hanging fruit" opportunities, were the theme of the Google European Data Center Efficiency Summit held Tuesday in Zurich, Switzerland. The morning sessions focused on proven techniques for improving the efficiency of data center cooling systems, which remain a key area for capturing savings.

"It doesn't take cool or sexy technology that just Google or Amazon or Microsoft can use," said Joe Kava, the Senior Director of Data Center Construction at Google "Just do it."

'Really No Magic'

The most important step? Taking action on the easy, known techniques to improve data center operations. "There's really no magic in data center efficiency," said Hoelzle. "Many of the things that get you from a PUE of 2.3 to 1.5 are very simple applications of best practices."
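
PUE (power usage effectiveness) is total facility power divided by IT power, so a PUE of 2.3 means 1.3 watts of overhead for every watt of IT load. A quick sketch of what the drop to 1.5 buys, assuming a hypothetical 100 kW IT load:

```python
def overhead_kw(it_load_kw, pue):
    """Non-IT load (cooling, power distribution) implied by a given PUE."""
    return it_load_kw * (pue - 1.0)

IT_LOAD_KW = 100.0  # hypothetical IT load, not a figure from the summit

before = overhead_kw(IT_LOAD_KW, 2.3)  # 130 kW of overhead
after = overhead_kw(IT_LOAD_KW, 1.5)   #  50 kW of overhead
print(f"overhead cut: {before - after:.0f} kW "
      f"({(before - after) / before:.0%} of the original overhead)")
```

The same IT work gets done with roughly 62 percent less overhead power.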

Managing the airflow inside a data center is critical to efficiency. Understanding the airflow is often the most expensive portion of an efficiency project. Google conducted an airflow analysis using computational fluid dynamics (CFD) software, which provides a detailed 3-D analysis of how cold air is moving through a data center, identifying potential "hot spots" where equipment is receiving too little airflow. Thermal mapping can also find areas in a data center that are receiving more cold air than needed, wasting cooling and energy.
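
The triage a CFD study or thermal map enables can be pictured as a simple threshold pass over measured inlet temperatures; a toy sketch with invented readings and an assumed acceptable band:

```python
# Invented per-rack inlet temperatures (deg C) from a hypothetical survey.
inlet_temps = {
    "rack-A1": 21.0,
    "rack-A2": 29.5,   # starved of cold air: a hot spot
    "rack-B1": 16.0,   # over-cooled: wasted cooling energy
    "rack-B2": 23.5,
}

LOW_C, HIGH_C = 18.0, 27.0  # assumed acceptable inlet band

for rack, temp in sorted(inlet_temps.items()):
    if temp > HIGH_C:
        print(f"{rack}: {temp} C -> hot spot, needs more cold airflow")
    elif temp < LOW_C:
        print(f"{rack}: {temp} C -> over-cooled, airflow can be trimmed")
```

A real CFD model adds the 3-D flow detail needed to explain why each spot is hot or cold, but the output is ultimately a list like this one.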

Operators of smaller data centers may believe CFD is not affordable, but Kava said the information it provides can easily pay for itself in savings. "We believe it is a really important component," said Kava, who noted that a CFD analysis could run from $5,000 to $10,000. "It's not really that expensive. There are engineering firms who can offer this if you don't have it in-house."

Rigid Containment vs. Curtains

Many times, airflow analysis will showcase the benefits of containment strategies that isolate the hot and cold air within the data center. This can be accomplished by capping the aisle, but that also creates challenges with fire suppression systems, as Google learned when it began the project at its networking center.

Local fire officials informed Google that they could not use containment that used fusible links, which allow sections of containment systems to fall away when heat is present, allowing overhead sprinkler systems to put out fires that might occur within an aisle. Instead, fire officials said that Google would have to extend its sprinkler systems below the containment system and into the aisle - which is problematic in a working data center.

Google opted instead to use clear plastic curtains at the end of each cold aisle, providing a barrier that prevents hot and cold air from mixing. The company also used curtains to surround a bank of batteries for its uninterruptible power supply (UPS) systems, making it easier to keep them cool and stable.

More of a Choice Than a Skill

The details of efficiency projects will vary from facility to facility, but examples of best practices are available for a wide range of designs and configurations. Seizing that opportunity is particularly important for smaller data centers, which account for up to 70 percent of the industry's power use, Google executives said.

"It's a waste of energy and money to operate things inefficiently," said Hoelzle. "These companies don't know how simple it is. They think it's complicated or they need to be rocket scientists to do it. Efficiency is more of a choice than a skill."
