A diagram of an "air wand" indicating the location of cooling vents in the wand, a key feature of a patent application by Google data center engineers.
Google has revealed some of the secret technology inside its mighty data centers, but its engineers are busy cooking up new secrets.
An example: Google is seeking to patent an advanced data center cooling system that provides precision cooling inside racks of servers, automatically adjusting to temperature changes while dramatically reducing the energy required to run chillers.
The cooling design, which could help Google slash the power bill for its servers, reinforces Google’s focus on its data centers as a competitive advantage in its battle with Microsoft and other rivals for leadership in cloud computing. The company has customized much of the operation of its data centers, which serve as the engines powering its massive Internet business. Google builds its own servers and networking switches, and now appears to be customizing the racks that hold them.
Precision Cooling via ‘Air Wands’
The innovative rack cooling design features an adjustable piping system, including “air wands” that provide small amounts of cold air to components within a server tray. The chilled air enters the top of a rack through two vertical standpipes, which branch off into air wands – long, thin pipes lined with vents that release cold air.
The air wands can pivot to target cold air on specific components, or be swung to one side to allow equipment to be removed from the rack. Dampers on each standpipe can open and close to regulate the volume of air flowing into the pipe and air wands, while the vents on each individual air wand can be adjusted to point up or down, allowing for a highly configurable system. (See A Closer Look at Google’s New Cooling Design for a diagram.)
Exaflop and Its History
It’s not clear whether Google is already using the cooling system. But the patent application was submitted by Exaflop LLC, whose 2008 patent for a UPS system integrating batteries with server power supplies helped Google achieve 99.9 percent UPS efficiency and record-low Power Usage Effectiveness (PUE) scores. The address for Exaflop is 1600 Amphitheatre Parkway in Mountain View, Calif., which is Google’s headquarters. The inventors listed on the patent are Google employees Jimmy Clidaras and Winnie Leung.
The system designed by Clidaras and Leung addresses many of the most vexing challenges in data center energy efficiency. It allows Google to apply small amounts of cold air precisely where it is needed, rather than cooling an entire server room and seeking to steer the airflow into each rack and across the hot server components.
Going Beyond Containers
Google has used data center containers to isolate hot and cold air and gain greater control over airflow to its servers. The new design takes this concept to a more granular level of management. The air wands can apply cool air directly to the “hot spots” inside a server tray, meaning less air is wasted or misdirected in the server room or container. This could allow Google to use a smaller chiller plant in its data centers, saving energy in the process.
Chillers, which are used to refrigerate water for use in data center cooling systems, require a large amount of electricity to operate. With the growing focus on power costs, many data centers are trying to reduce their reliance on chillers.
This has boosted adoption of “free cooling,” the use of fresh air from outside the data center to support the cooling systems. This approach allows data centers to use outside air when the temperature is cool, while falling back on chillers on warmer days. The new design could be used as supplemental cooling in a data center using free cooling, or in facilities located in areas where fresh air cooling isn’t feasible.
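The switchover logic described above can be sketched in a few lines. This is a hedged illustration only: the function name, threshold, and temperature values are assumptions for the example, not figures from the article or from Google's designs.

```python
# Illustrative sketch of free-cooling ("economizer") switchover logic:
# use outside air when it is cool enough, fall back on chillers otherwise.
# The 18 C limit is an assumed threshold, not a figure from the article.

def cooling_mode(outside_temp_c, economizer_limit_c=18.0):
    """Pick a cooling source based on the outside air temperature."""
    if outside_temp_c <= economizer_limit_c:
        return "free-cooling"   # cool outside air supports the cooling system
    return "chiller"            # warmer days fall back on mechanical chillers

print(cooling_mode(12.0))  # a cool day: free cooling suffices
print(cooling_mode(28.0))  # a warm day: chillers take over
```

In practice the decision also weighs humidity and air quality, but the temperature cutover is the core of the approach.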
Limitations of Free Cooling
Google is operating a chiller-less data center in Belgium, where the climate allows nearly year-round use of free cooling. But this strategy will only work in cooler regions, and Google’s global ambitions may eventually require data centers in hotter climates unsuitable for free cooling.
Google can gain additional control over its cooling system through automated monitoring and management, as the system is designed to respond to changes within the rack as temperatures fluctuate. “The temperature sensor output can be fed to a computer program that triggers air distribution in the event of the board temperature crossing a threshold,” the patent reads. “Each temperature sensor may be connected to a PID control loop with a damper, so the corresponding damper is opened … with an increase in temperature sensed for a particular area.”
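The per-sensor control scheme the patent describes can be sketched as a standard PID loop that opens a damper as the sensed board temperature rises past its setpoint. This is a minimal illustration of the general PID technique, not Google's implementation; the class name, setpoint, and gain values are all assumptions.

```python
# Hedged sketch: each temperature sensor feeds a PID loop whose output
# drives a damper position. Gains and setpoint are illustrative assumptions.

class DamperPID:
    """PID controller mapping a board-temperature error to a damper opening (0-100%)."""

    def __init__(self, setpoint_c, kp=5.0, ki=0.1, kd=1.0):
        self.setpoint_c = setpoint_c   # target board temperature, deg C
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured_c, dt=1.0):
        # Positive error means the board is hotter than the setpoint.
        error = measured_c - self.setpoint_c
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        output = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to the physical damper range: fully closed (0) to fully open (100).
        return max(0.0, min(100.0, output))


pid = DamperPID(setpoint_c=30.0)
print(pid.update(measured_c=35.0))  # board runs hot, so the damper opens
print(pid.update(measured_c=29.0))  # board cools below setpoint; damper closes down
```

The clamping step reflects the physical constraint the patent implies: a damper can only modulate between fully closed and fully open, whatever the controller requests.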
Some Secrets Revealed, While Others Incubate
Google’s data center designs were kept secret for many years, consistent with the company’s belief that its data center innovations gave it a competitive advantage. In April Google discussed its data center operations for the first time, joining a growing industry conversation about best practices for energy efficiency.
The company revealed its data center containers, custom server design and on-board UPS, among other innovations. But some industry observers concluded that there was more in the pipeline that Google wasn’t discussing.
“Both the board and the data center designs shown in detail were not Google’s very newest but all were excellent and well worth seeing,” James Hamilton noted at the time. “I like the approach of showing the previous generation technology to the industry while pushing ahead with newer work. This technique allows a company to reap the potential competitive advantages of its R&D investment while at the same time being more open with the previous generation.”
Article printed from Data Center Knowledge: http://www.datacenterknowledge.com
URL to article: http://www.datacenterknowledge.com/archives/2009/11/30/google-patent-reveals-data-center-innovations/
URLs in this post:
 Google: http://www.google.com
 servers: http://www.datacenterknowledge.com/archives/2009/04/01/googles-custom-web-server-revealed/
 networking switches: http://www.datacenterknowledge.com/archives/2007/11/16/report-googles-secret-10gbe-switch/
 A Closer Look at Google’s New Cooling Design: http://www.datacenterknowledge.com/a-closer-look-at-googles-new-cooling-design/
 patent application: http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.html&r=1&f=G&l=50&s1=%2220080204999%22.PGNR.&OS=DN/20080204999&RS=DN/20080204999
 UPS system: http://www.datacenterknowledge.com/archives/2008/02/15/google-files-patent-on-ups-architecture/
 99.9 percent UPS efficiency: http://www.datacenterknowledge.com/archives/2009/04/01/efficient-ups-aids-googles-extreme-pue/
 Power Usage Effectiveness: http://www.datacenterknowledge.com/archives/2008/10/01/google-the-worlds-most-efficient-data-centers/
 data center containers: http://www.datacenterknowledge.com/archives/2009/04/01/google-unveils-its-container-data-center/
 chiller-less data center: http://www.datacenterknowledge.com/archives/2009/07/15/googles-chiller-less-data-center/
 PID control loop: http://en.wikipedia.org/wiki/PID_controller
 James Hamilton: http://perspectives.mvdirona.com/2009/04/05/DataCenterEfficiencySummitPosting4.aspx
 Rich Miller: http://www.datacenterknowledge.com/archives/author/richm/