
The Illustrated Data Center

Over the past eight years, Data Center Knowledge has taken our readers inside many of the world's most advanced data centers. On one level, these facilities are all about functionality - moving bits from one place to another and keeping servers online and working. But they have also become striking examples of the beauty of the built environment, with a look and feel unlike other buildings. With the right perspective, the inside of a data center is a visual feast.

Today we kick off The Illustrated Data Center, a regular series that showcases some of the most unique and visually striking data centers we have seen. We begin with a look at the world of blinking lights that keep the Internet running, followed by photos illustrating the "Four Cs" of the inside of a data center - Corridors, Cabling, Cooling and Containment.

If your data center is among those featured here, we invite you to share this page on Twitter (#illustratedDC) and other social media channels.

Into Darkness

Why are many data centers dark? Some attribute it to the desire for secrecy, to cloak systems in darkness so competitors can't see what kind of equipment is running in your cabinet or cage. Some folks just think it looks cool. The darkness is interrupted by the lights on the servers, which blink away as they work.


A close-up of a row of lights illuminating equipment inside a Savvis data center in Slough, outside of London. (Photo of Savvis Slough Campus by Luben Solev)


Blue LED lights adorn servers along rows of racks inside a Google data center in Douglas County, Georgia. Google uses LEDs because they are energy efficient, long lasting and bright. (Photo by Connie Zhou for Google)

A look at the blue-lit servers inside the cold aisle of the Facebook data center in Prineville, Oregon. Facebook opted for blue lights over slightly cheaper green ones because they looked cooler. (Photo: Facebook)


Red and green lights provide a Christmas-style contrast on equipment for the correlator in the ALMA Array Operations Site Technical Building, the world's highest data center at 16,500 feet in the Chilean Andes. (Photo Credit: ALMA, S. Argandoña)


A field of blue lights on high-density servers is just part of the color palette at SuperNAP 7, the massive data center operated by Switch in Las Vegas. The facility's three redundant power systems are all color coded. (Photo: Switch)

The interior of a 40-foot  data center container inside the Microsoft Chicago data center, packed with servers on either side of a center aisle (Photo: Microsoft).


Corridors

Some are brightly lit, and others are cloaked in darkness, with dim blue or purple light to guide you to your destination. As data centers have increased in size, these interior walkways have grown so long that some produce a "vanishing point" effect, with the corridor appearing to narrow toward the horizon.


The corridor of an Equinix data center in Silicon Valley, framed by colocation cages on either side and cable trays overhead. (Photo: Equinix)

The long main hallway of the NYSE Euronext data center in Mahwah, New Jersey provides a sense of the immense scale of the 400,000 square foot facility. (Photo: Rich Miller)


The SuperNAP in Las Vegas can deliver up to 84 MVA of UPS power to customers in the 400,000 square foot building. Here's a look at one of the walkways between data halls. (Photo: Switch)


The main hallway along the center of the DuPont Fabros ACC5 data center in Ashburn, Virginia runs a quarter-mile down the length of the building. (Photo: DuPont Fabros)


Cabling

The humble network cable may not be particularly high tech. But these cables make the Internet possible, connecting servers, storage and networking gear to speedily deliver data from the data center to your browser. Keeping all this cabling organized is no small feat, and we've featured some of the masters of this art.


One of the distinctive features of the RagingWire data center in northern Virginia is the attention to detail on cabling management, which is on display in this view of one of the tenant equipment areas. (Photo: RagingWire)


A look at the full cable trays in an Equinix data center, part of the "symphony of cables" within the company's facilities. As one of the world's largest providers of interconnection services, Equinix makes cable management a priority. (Photo: Equinix)


The network connection area is a busy Internet intersection within ServerCentral's data center in Elk Grove Village, Illinois. (Photo: ServerCentral)

There is also visual appeal and symmetry as the cables branch out and make their way to their destination at a server, switch or patch panel. Recognize the yellow cabling? Yep, it's Equinix, this time from the new DC11 data center in northern Virginia. (Photo: Equinix)


Here’s a look at some of the overhead cabling infrastructure at a data center supporting Codero’s services. (Photo: Codero)


Cooling

Servers generate heat. When you assemble thousands of them in a small space, keeping servers cool is a critical task. Cooling is one of the primary functions of a data center, and an area in which designs have advanced rapidly in recent years, as the industry's largest players identified cooling as the "low-hanging fruit" in improving the energy efficiency of their data centers. Here's a look at some of the cooling infrastructure at major Internet data centers, which is both functional and striking in appearance.


Here’s a rare look inside the hot aisle of a Google data center. The exhaust fans on the rear of the servers direct server exhaust heat into the enclosed area. Chilled-water cooling coils, seen at the top of the enclosure, cool the air as it ascends. The silver piping visible on the left-hand side of the photo carries water to and from the cooling towers. (Photo: Connie Zhou for Google)


A look at the infrastructure supporting the water and cooling systems in a Google data center near Atlanta, which includes facilities to clean and purify "grey water" for use in its cooling towers. Also pictured is a G-Bike, the vehicle of choice for Google staff to get around the company's data centers. (Image: Google)


Some of the rooftop cooling infrastructure at a data center for Interxion, a European provider of colocation and wholesale space, photographed at dusk. (Photo: Interxion)


The Switch SuperNAP in Las Vegas is cooled by huge, versatile units known as WDMD – short for Wattage Density Modular Design – a custom-built system housed outside the data center that can automatically switch between four different cooling options to deliver the most efficient cooling for current conditions. (Photo: Switch)


The DuPont Fabros ACC5 data center in Virginia has 16 huge chillers to provide cooling to the data halls, which house servers for some of the Internet’s largest companies. (Photo: DuPont Fabros)


Containment

The latest frontier in cooling is containment: the discipline of directing cool air to the servers and preventing any mixing with warm exhaust heat from the servers, a routine challenge in traditional "hot aisle/cold aisle" designs. Here's a look at some cool-looking containment systems we've featured here on Data Center Knowledge.

A fish-eye lens view of an example of hot aisle containment in a Savvis data center near London in the UK. (Photo of Savvis Slough Campus by Luben Solev)


An overhead view of a cold aisle containment system at Interxion, with the perforated floor tiles visible through the clear "roof" on the server area. The clear roof panels in these designs can either flip down or drop away when a heat condition is present, allowing for fire suppression within the containment area. (Photo: Interxion)


Here's a look at a containment system used in the Yahoo "Computing Coop" design. Fresh outside air enters the building through louvers and then flows into the front of the servers and exits into a contained hot aisle, which is topped by a chimney that leads into the upper chamber of the “coop.” Depending on the conditions, the warm air can either be recirculated or vented through the cupola. (Photo: Yahoo)


The SuperNAP's high-density cooling design is a hot aisle containment system known as Thermal Separate Compartment in Facility (TSCIF). Chilled air drops into the cold aisle and enters the cabinets. The hot aisle containment system delivers waste heat back into the ceiling plenum, where it can be returned to the chiller. (Photo: Switch)