
How Google Cools Its Armada of Servers

One of Google's best-kept secrets has been the details of the cooling system that lets the company pack tens of thousands of servers into its data centers. Google's Joe Kava discusses the design, in which all the magic is focused on the hot aisle.

Here's a rare look inside the hot aisle of a Google data center. The exhaust fans on the rear of the servers direct server exhaust heat into the enclosed area. Chilled-water cooling coils, seen at the top of the enclosure, cool the air as it ascends. The silver piping visible on the left-hand side of the photo carries water to and from cooling towers. (Photo: Connie Zhou)

Google has shared some of its best practices over the years, but other parts of its data center operations have remained under wraps. One of the best-kept secrets has been the details of its cooling system, which allows Google to pack tens of thousands of servers into racks.

Google Senior Director of Data Centers Joe Kava discussed the design of its cooling system with Data Center Knowledge in connection with the company's publication of a photo gallery and a Street View app that give Google's millions of users a look inside its data centers. If you're one of those data center managers who worries about having water in close proximity to the IT equipment, what you're about to read might make you nervous.

In Google's data centers, the entire room serves as the cold aisle. There's a raised floor, but no perforated tiles. All the cooling magic happens in enclosed hot aisles, framed on either side by rows of racks. Cooling coils using chilled water serve as the "ceiling" for these hot aisles, which also house large stainless-steel pipes that carry water to and from cooling towers housed in the building's equipment yard.
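
To put rough numbers on the water side of that design, the sketch below uses the standard chilled-water sizing relation (GPM = BTU/hr ÷ (500 × ΔT°F)). The article gives no loads or water temperatures, so the 200 kW enclosure load and 15-degree water rise here are illustrative assumptions, not Google's figures.

```python
# Back-of-the-envelope sketch of the chilled-water side of a hot-aisle enclosure.
# The load and water delta-T are assumed values for illustration only.

WATTS_TO_BTU_HR = 3.412  # 1 W = 3.412 BTU/hr

def coil_water_flow_gpm(heat_load_watts: float, water_delta_t_f: float) -> float:
    """Chilled-water flow (US GPM) needed to absorb a given heat load.

    Standard HVAC relation: GPM = BTU/hr / (500 * dT_F), where
    500 ~= 8.33 lb/gal * 60 min/hr * 1 BTU/(lb*F).
    """
    btu_per_hr = heat_load_watts * WATTS_TO_BTU_HR
    return btu_per_hr / (500.0 * water_delta_t_f)

if __name__ == "__main__":
    # Assumed: a 200 kW hot-aisle enclosure and a 15 F rise in the coil water.
    print(f"{coil_water_flow_gpm(200_000, 15.0):.0f} GPM")  # ~91 GPM
```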

Following the Airflow

Here's how the airflow works: The temperature in the data center is maintained at 80 degrees, somewhat warmer than in most data centers. That 80-degree air enters the server inlet and passes across the components, becoming warmer as it removes the heat. Fans in the rear of the chassis guide the air into an enclosed hot aisle, which reaches 120 degrees as hot air enters from rows of racks on either side. As the hot air rises to the top of the chamber, it passes through the cooling coil, is cooled to room temperature, and is then exhausted through the top of the enclosure. Flexible piping connects to the cooling coil at the top of the hot aisle, descends through an opening in the floor, and runs under the raised floor.
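
A quick sanity check on those temperatures: the sensible-heat relation for air (BTU/hr ≈ 1.08 × CFM × ΔT°F) tells you how much airflow the server fans must move for a given load. The 80- and 120-degree figures come from the article; the 10 kW rack load below is an assumed example.

```python
def server_airflow_cfm(power_watts: float, inlet_f: float, exhaust_f: float) -> float:
    """Airflow (CFM) needed to carry a heat load at a given air delta-T.

    Sensible-heat relation for air near sea level:
        BTU/hr ~= 1.08 * CFM * dT_F  =>  CFM = (W * 3.412) / (1.08 * dT_F)
    """
    delta_t_f = exhaust_f - inlet_f
    return (power_watts * 3.412) / (1.08 * delta_t_f)

if __name__ == "__main__":
    # Assumed 10 kW rack running between the article's 80 F inlet
    # and 120 F hot-aisle temperatures.
    print(f"{server_airflow_cfm(10_000, 80, 120):.0f} CFM")  # ~790 CFM
```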

Despite the long history of water-cooled IT equipment, which dates to IBM mainframes, some managers of modern data centers are wary of having water piping adjacent to servers and storage gear. Many vendors of in-row cooling units, which sit within a row of cabinets, offer the option of using either refrigerant or cooled water.

Kava is clearly comfortable with Google's methodology, and says the design incorporates leak detection and fail-safes to address piping failures.

"If we had a leak in the coils, the water would drip straight down and into our raised floor," said Kava, who said pinhole leaks and burst coils could be slightly more problematic. "We have a lot of history and experience with this design, and we've never had a major leak like that."

Focused on Efficiency, Not Frills

Kava says the design, known as close-coupled cooling, is significantly more efficient than facilities that use a ceiling plenum to return the hot exhaust air to computer room air conditioners (CRACs) housed around the perimeter of the raised floor area. "The whole system is inefficient because the hot air is moved across a long distance as it travels to the CRACs," said Kava.
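
The efficiency argument comes down to fan energy: moving the same volume of air across a longer path means working against more static pressure, and fan power scales with flow times pressure. The sketch below illustrates the point with assumed pressure drops and fan efficiency, not measured Google values.

```python
def fan_power_watts(flow_cfm: float, static_pressure_in_wg: float,
                    fan_efficiency: float = 0.6) -> float:
    """Fan input power for a given airflow and static pressure.

    Air horsepower = CFM * in_wg / 6356; 1 hp = 745.7 W; divide by efficiency.
    """
    return (flow_cfm * static_pressure_in_wg / 6356.0) * 745.7 / fan_efficiency

if __name__ == "__main__":
    flow = 100_000  # CFM for a block of racks (assumed)
    short_path = fan_power_watts(flow, static_pressure_in_wg=0.3)   # close-coupled
    long_path = fan_power_watts(flow, static_pressure_in_wg=1.0)    # long return path
    print(f"close-coupled: {short_path/1e3:.1f} kW, perimeter path: {long_path/1e3:.1f} kW")
```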

Nearly all facets of Google's design are focused on efficiency, and that doesn't just mean efficiency with power. It also includes efficiency with cost. An example: Google's use of plastic curtains instead of rigid containment systems to manage the airflow in its networking rooms.

Google's custom servers also have a bare-bones look and feel, with components exposed for easy access as they slide in and out of racks. That design lets admins replace components quickly, and it also avoids the cost of the cosmetic trappings common to OEM servers.

"When you pull out one of our server trays, it's like a cookie sheet with a couple of sides," said Kava. "It's inexpensive. We're not going for fancy covers or sheet metal skins."

Kava said Google-watchers can expect more information on the company's best practices in coming weeks. "Our intention is to follow this up with a series of blogs highlighting our technology," he said.

The server area in Google's data center in Mayes County, Oklahoma, provides a look at Google's no-frills servers. (Photo: Connie Zhou for Google)
