Wild New Design: Data Center in a Silo


A diagram of the design of the CLUMEQ Colossus supercomputer, from a recent presentation by Marc Parizeau of CLUMEQ.

Here’s one of the most unusual data center designs we’ve seen. The CLUMEQ supercomputing center in Quebec has worked with Sun Microsystems to transform a huge silo into a data center. The cylindrical silo, which is 65 feet high and 36 feet wide with two-foot-thick concrete walls, previously housed a Van de Graaff particle accelerator. When the accelerator was decommissioned, CLUMEQ decided to convert the facility into a high-performance computing (HPC) cluster known as Colossus.

We first noted the development of the CLUMEQ site earlier this year when Marc Hamilton of Sun discussed its unique design, but offered scant details. Additional information about the design of the facility and its cooling system was discussed at the Sun HPC Consortium last month in Portland, Oregon.

The CLUMEQ Colossus cylinder features an interior “hot core” (as opposed to a hot aisle) in the center of the building and uses the outside ring of the facility as the cold air plenum. The cabinets are arranged in a ring on each floor, facing the outside of the silo. The floors supporting each ring of cabinets are composed of grates rather than solid flooring to facilitate airflow through the facility.

The cooling coils and air handlers are located in the basement. Chilled air flows upward through the outside cold aisle and through the racks of servers. The waste heat exits the rear of the racks into the hot core, and is returned to the basement via the hot core.

The airflow pattern is maintained through differential air pressure: the cold aisle is kept at a higher pressure than the hot core, which keeps air moving through the facility. The system has a blowing capacity of 180,000 CFM and can cool up to 1.5 megawatts of electrical load. Up to 300 kilowatts of cooling capacity can be supplied by free cooling, using fresh air from outside the facility.
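As a rough sanity check (our arithmetic, not figures from CLUMEQ or Sun), the quoted 180,000 CFM and 1.5 MW load imply an air-side temperature rise of about 15 °C, using standard textbook values for air density and specific heat:

```python
# Back-of-the-envelope check: what air-side temperature rise does
# 180,000 CFM imply at a 1.5 MW heat load?  Air density and specific
# heat are standard sea-level values, not figures from the article.
CFM_TO_M3S = 0.000471947              # 1 cubic foot per minute in m^3/s

airflow_m3s = 180_000 * CFM_TO_M3S    # ~85 m^3/s
rho = 1.2                             # kg/m^3, air at roughly 20 C
cp = 1005.0                           # J/(kg*K), specific heat of air
load_w = 1.5e6                        # 1.5 MW electrical load

mass_flow = airflow_m3s * rho         # ~102 kg/s of air
delta_t = load_w / (mass_flow * cp)   # from Q = m_dot * cp * delta_T
print(f"air-side temperature rise: {delta_t:.1f} K")  # ~14.6 K
```

A ~15 K rise across the racks is well within the normal operating range for air-cooled servers, so the stated airflow and load figures are mutually consistent.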

“CLUMEQ silo totally blows up the paradigm of data center design,” says Nicolas Dube of Sun, who began work on the project as a graduate student at Université Laval in Quebec. “The silo, by itself, is the CRAC (computer room air conditioner). The whole facility cools itself.”

As for computing horsepower, Colossus will have a peak of 86 teraflops of compute power. It’s equipped with a Sun Constellation HPC system featuring 10 fully loaded Sun Blade 6048 chassis, 1 petabyte of Lustre storage and Sun J4400 storage arrays.
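Dividing the stated peak across the hardware gives a rough per-blade figure. The 48-blade capacity of a fully loaded Sun Blade 6048 chassis is Sun's published spec; everything else follows from the numbers quoted above:

```python
# Rough per-blade arithmetic from the article's figures.  A fully
# loaded Sun Blade 6048 chassis holds 48 blades (Sun's published spec).
peak_tflops = 86
chassis = 10
blades_per_chassis = 48

blades = chassis * blades_per_chassis            # 480 blades total
per_blade_gflops = peak_tflops * 1000 / blades   # ~179 GFLOPS per blade
print(f"{blades} blades, ~{per_blade_gflops:.0f} GFLOPS peak per blade")
```

Roughly 180 GFLOPS per blade is in line with dual-socket quad-core x86 nodes of that era, so the 86-teraflop peak is plausible for this configuration.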

The data center racks are spread over three floors, with the switches on the second floor to keep the cable runs as short as possible.

For a full description of the CLUMEQ design, check out this video from Sun, which runs about 6 minutes.

Additional details are available in PDFs of presentations by Marc Parizeau of CLUMEQ and Nicolas Dube of Sun.



About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.

Add Your Comments



  1. Thom

    This is a good idea. Here in the States, we have many of these silos that used to house Titan ICBMs. The largest problem, and probably a deal breaker, is that they are in the middle of nowhere. Really. They were put there for that very reason. The closest civilization is usually a small farming town. Then again, if you worked there, housing would be cheap, as would the cost of living.

  2. Thom

    A second thought. Vandenberg AFB, next to Lompoc, CA, has several of these silos that are mothballed. That may be a solution. Due to Lompoc's proximity to VAFB, there are quite a few high tech personnel.

  3. Dave

    The air handling equipment would not need to work as hard if it was at the top of the structure rather than on the bottom.

  4. Dave: We're seeing a lot more data centers with the rooftop air handlers to take advantage of the natural tendency for cold air to fall and warm air to rise. Microsoft's Dublin data center is a good example of this, and Digital Realty's current design also places air handlers on the roof. In the case of the CLUMEQ silo, it may be that the design of the structure made it difficult to install and/or support the cooling equipment on an upper floor above the cabinets.

  5. Wichita, KS, and Kansas in general, have a couple hundred decommissioned silos dotting the countryside. Centrally located, cheap housing, several close universities to farm for talent (K-State, KU, UMKC, MU, WSU, OK-State, etc).

  6. Edward

    Installing equipment on top of the silo may seem like a good idea but for one reason: silos built to hold grain are not designed for vertical loads but for the pressures of containing grain. Steel mesh within the concrete walls acts as bands or belts to prevent the walls from bursting under internal pressure. At the bottom of the silo you will find very strong concrete foundations to support the weight above. Typically there would be gratings and conveyor belts to move grain from the silos to either trucks or trains. Placing heavy ventilation and cooling equipment there, or in an adjacent structure, is the best choice. Why do I know this? I looked at purchasing a decommissioned grain elevator in the mid '80s. It was a little unique in that the silos were contained within a rectangular structure, so it could support three floors of grain-cleaning and other equipment on top. I had several ideas for the silos, including multi-floored storage. A computer facility was one of them.

  7. This video is almost fluffy enough to sleep on. I'm not seeing the innovation here, beyond uniform cable lengths. It seems pretty similar to any number of other isolated cold aisle setups that utilize thermal convection, and it's a sad reclamation of an awesome space that might otherwise have been used to throw some killer campus parties.

  8. One added advantage of utilizing a decommissioned ICBM silo, besides the environmental and cost (locale) advantages, is its ability to withstand disaster situations. Tornadoes, earthquakes, and the occasional errant warhead (kidding on this one, hopefully) are little problem; they were engineered by the USAF to withstand a direct hit, either natural or manmade. Personally, I'd prefer the quality of life that comes with living, and raising a family, in a rural area. IMO urban life is overrated.

  9. PGT

    Seems like a lot of wasted space. Narrow, deep servers and cabinets weren't meant to be placed in a circle.

  10. PGT: That's why we don't see many round data centers. A bit of history: a previous approach to placing servers inside a cylindrical concrete structure was HavenCo, the colo operation housed on the Sealand platform off the coast of England, which was famously profiled in Wired but didn't end well.

  11. Nathan

    This seems backwards. I would use the core as the cold air plenum and the outside for the hot air, as the external portion could be heated by the sun.

  12. Jole

    A pretty cool design if you are stuck having to work with short InfiniBand runs, although I wonder how necessary that will be when 10Gig-E finally becomes the dominant networking technology in a few more years. As other folks have commented, it seems like a pretty darned inefficient use of space: square pegs (racks) in a round hole (silo). At least people in colder climes are starting to use outdoor cold air to their benefit. Not rocket science, but really good to see we are finally on the path to more efficient datacenter cooling designs.

  13. I wonder if the cost of building these silos from scratch would be any better than just building the typical warehouse-style datacenter, which would allow more room. I also agree with putting the air handlers in the upper region of the silo as opposed to the bottom. I'm sure there are plenty of other abandoned war facilities that would be equally useful for a datacentre, but as Thom suggested, they are all probably in the middle of nowhere. Although I'm sure there are dozens of paranoid admins who would love that idea!

  14. Although the video states that the bar has been raised when constructing data centers or clusters, I beg to differ. Besides the obvious equal cable lengths (not new; Cray has been doing this for years), there is a fundamental flaw with the design: cooling is not at the "U" level. Temperature gradients within each rack still exist. In fact, I would suspect that the top-level nodes are hotter than nodes lower in the infrastructure. Lastly, reclaiming or reusing silos has no value here. The cost of building new structures would be lower than trying to use a corn or bean drying silo, not to mention the cost of bringing sufficient power to remote locations.

  15. That's a really good "out of the box" thinking design. Maybe they can do the same here in the UK and save some energy in the process.

  16. jimbonics

    Master Control Panel.

  17. Interesting, and are all the silos above ground? Perhaps the coil system is more eco-friendly indeed... I wonder if the servers sound louder in silos?

  18. Intrigued by the circular design. A few questions: 1.) As noted, cold air falls; why not introduce the cold air at the top? 2.) Separation: is there absolute separation from the cold surround to the heated core? 3.) Volumes: do you see expansion of the air as it's heated as a factor? If so, how does it affect air movement? 4.) Did you calculate the air exchanges per hour based on the BTU output potential of the occupying equipment? If so, how did this affect the chosen volume of the cold and/or hot areas? (One could vary the diameter of the racks to vary volumes.) 5.) Solar orientation: did you elect the south/east side for the stairwell to mitigate thermal bridging/sinking of the concrete outer layer? 6.) Sensors: is the operation of the system/facility adjusted based on sensor readings? If so, how? Thanks in advance! Great project. Corey