A Glimpse Inside Google's Data Centers

Google has released several new photos showing servers in its facilities. What can these images tell us about Google's data center operations and production environment?

Rich Miller

May 22, 2012

A photo Google released last week showing servers in one of its facilities. (Photo: Google)

It's been a while since we've had a good look at the inner realms of a Google data center, where the company houses its servers. Oh sure, we've seen networking rooms, chillers, piping and water treatment plants. But we haven't seen one of the company's production server environments since 2009, when Google published video of containers packed with servers. By then Google had already moved away from containerized designs, and some Google-watchers indicated that the servers and designs featured at the event were already several generations old.

Last week Google published photos of some of its 900,000 servers as part of a larger presentation, The Story of Send, that explains how an email makes its way across the Internet. The images (one shown above, another at the end of this article) show rows of racks, fully packed with servers and bathed in the green light of the LEDs on each server tray. So let's take a closer look and see what else these photos might be able to tell us about Google's data center environment:

Servers

Each rack in the photo is populated with 20 servers, a density typical of 2U servers. Google designs its own servers, which include an on-board battery in each tray, a configuration that allows it to save power by avoiding the AC-to-DC conversions required with a centralized UPS and battery banks. Google uses its newest and most powerful servers in its search service.

"We optimize our fleet by repurposing older servers for services that don’t require the same processing power, such as Gmail or Picasa," Google says. This strategy has translated into huge savings in Google's hardware budget, allowing it to avoid buying up to 90,000 servers.  For context, only a handful of companies admit to having more than 90,000 servers.

Racks

An interesting wrinkle: the racks are on wheels. We've heard lots about "rack-and-roll" deployments, in which servers arrive pre-packaged in their racks and are rolled into place. Leaving the wheels on the racks allows for mobility and ease of configuration, but raises other questions.

For example: how would this configuration perform during earthquakes? Seismic isolation systems provide some leeway for rack movement during temblors, but having them on wheels could make life interesting during a major quake. It's possible that these racks are either deployed at a Google data center in an area where few earthquakes occur (such as the Carolinas or Atlanta) or may not represent a full production environment.

Raised Floor

One of the photos shows rows of racks on either side of a narrow corridor, with a row of perforated floor tiles. This is typical of a design in which cold air enters the server area through a sub-floor plenum in the "cold aisle", with air pressure guiding the cool air up and through the racks. As the air passes through the racks, it removes heat from the components and exits through the back of the rack into the "hot aisle," where it is then routed to cooling equipment for recirculation.

But is this Google's primary cooling design? What we've heard in recent years is that Google data centers featured raised floors, but used them primarily to house water piping rather than as a cool-air plenum. The water fed a cooling system with cooling units in the hot aisle, which function much like rear-door heat exchangers, removing the heat from the air and returning it to the room at the supply air temperature to prevent recirculation of hot exhaust air. The water used in the heat exchangers then goes to a cooling tower and is recirculated. But the reports we've heard don't appear to align neatly with the configuration shown in these latest photos.

For what it's worth, Google has also sought to patent an adjustable cooling piping system, including “air wands” that provide small amounts of cold air to components within a server tray. The company has also patented a design for a “server sandwich” in which two motherboards are attached to either side of a liquid-cooled heat sink.

Mysteries Remain

The question remains: Is this Google's production environment, or a computing lab or testbed? Google has always treated the details of its data centers as proprietary, saying its expertise and innovation in its data centers represent a competitive advantage. Given that, and the fact that the company's 2009 disclosures showcased older technology, it's possible that these photos may not represent the current state of Google's production environment.

Or perhaps the Story of Send marks the beginning of a shift, in which Google discloses more of its current technology, as Facebook has done with the Open Compute Project.

If not, we'll continue to analyze and speculate until more photos emerge. And yes, we're still waiting for the photos of the moats filled with sharks with friggin' laser beams on their heads.

Here's the second photo:

Another view of Google servers. (Photo: Google)
