NOAA Targets Hurricanes With Computer Power


Darren Smith, the NESCC Project Director for NOAA, is overseeing the completion of a new data center in West Virginia that will house the agency's next-generation supercomputer.

A new supercomputing facility in West Virginia could help improve weather forecasters’ ability to predict the power of huge hurricanes, which could eventually help public officials make better decisions about when to call for the kind of mass evacuations seen during Hurricane Irene.

Next month, the National Oceanic and Atmospheric Administration (NOAA) will bring a new data center online. The facility in Fairmont, West Virginia, will support a 383-teraflop supercomputer designed to develop more powerful tools for analyzing the behavior of hurricanes. NOAA staffers discussed the project in a presentation last week at the AFCOM Data Center World fall conference in Orlando.

Computer Models Improving

Hurricane scientists have developed sophisticated computer models to analyze the threat posed by hurricanes. This has helped forecasters make significant strides in projecting the path a hurricane will take. But it remains more difficult to predict the intensity of hurricanes, which can fluctuate in strength as they approach landfall. Hurricane intensity is ranked on the Saffir-Simpson scale, which assigns a category from 1 to 5, with Category 5 storms being the most powerful.

“The damage caused by the hurricane has almost everything to do with intensity,” said NOAA’s Darren Smith, who said predicting the intensity requires granular data. “The physics of a hurricane occur at a 1 kilometer resolution.”

The new supercomputer in West Virginia will bring more horsepower to test new models designed to better measure intensity. But the new model will require about five years of development before it is ready for use in NOAA forecasts.

More Capacity Required

Analyzing hurricane data requires lots of computing power. NOAA has 9 supercomputers already, but none of its existing facilities had the space or power capacity to house the new machine. Smith and his team began an unusual site selection process in which they picked a location for the new facility without knowing what kind of supercomputer it would house.

They didn’t know, for example, whether the supercomputer would require water cooling or air cooling. The agency eventually chose an SGI Altix ICE cluster in which some cabinets will be water-cooled (through a rear-door cooling unit) while others will use air cooling.

Cabinets in the high performance computing cluster will use up to 33 kilowatts of power per rack. The data center will use full outside air cooling (air economization) for 40 percent of the year, while using a mix of economizers and chillers the remainder of the time. The economizers are expected to save about $800,000 a year in power costs.

The 54,000 square foot facility will house 16,000 square feet of computer room space on a 4-foot raised floor and a 6,000 square foot tape archive, with room for expansion of both. The building has 6 megawatts of power capacity and is supported by a flywheel UPS system.

Thermal Storage for Ride-Through Cooling

The NOAA facility is supported by a 32-foot tall thermal storage tank, which holds 25,000 gallons of water chilled at 55 degrees. At full load, the thermal storage will provide about 10 minutes of cooling in the event of a power outage to allow time to restart the chillers.
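As a rough sanity check on that 10-minute figure, here is a back-of-envelope calculation. The tank capacity comes from the article; the chilled-water supply/return temperature difference (12°F) and the full-build heat load (roughly 4 MW) are assumptions for illustration, not numbers NOAA has published.

```python
# Back-of-envelope check of the ~10 minute thermal-storage ride-through.
# Tank size is from the article; delta-T and heat load are assumed.

GALLONS = 25_000          # tank capacity (from the article)
LB_PER_GALLON = 8.34      # weight of one gallon of water
DELTA_T_F = 12            # assumed supply/return temperature difference
HEAT_LOAD_MW = 4.0        # assumed heat load at full build-out
BTU_PER_WH = 3.412        # BTU per watt-hour

# 1 BTU raises 1 lb of water by 1 degree F
stored_btu = GALLONS * LB_PER_GALLON * DELTA_T_F
load_btu_per_hr = HEAT_LOAD_MW * 1e6 * BTU_PER_WH
ride_through_min = stored_btu / load_btu_per_hr * 60

print(f"Stored cooling capacity: {stored_btu / 1e6:.2f} million BTU")
print(f"Estimated ride-through:  {ride_through_min:.1f} minutes")
```

Under those assumptions the tank buys about 11 minutes of cooling, which lines up with the roughly 10 minutes Smith cited.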

Smith said the growth of the NOAA supercomputing program is part of a larger effort to offer more granular information at a time when instances of severe weather are on the rise.

“We need climate modeling down to local environments,” said Smith. “The public is demanding a lot more service from NOAA.”

About the Author

Rich Miller is the founder and editor-in-chief of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.


7 Comments

  1. Howard

    Good detail here, Rich. Any more publicly available info on costs?

  2. The Fairmont facility had a budget of $27.6 million, not including ongoing power costs. NOAA used a design-build approach, and finished the project in less than 18 months to meet its deadline.

  3. 33 watts per rack. *Wow*, that's efficient.

  4. john reedy

    I think you mean 33 kW per rack. Otherwise it sounds boringly standard design.

  5. Rich... Great post I found on this link... http://www.theregister.co.uk/2011/09/20/noaa_data_center_design_considerations/