How Much Time, Once the Cooling Fails?

Uh oh. All of your chillers have just shut down at once. How much time do you have to diagnose and fix the problem before the data center heats up and equipment begins to fail? For most data center managers, the answer is “not nearly enough.” That was the case for the staff at Hosting365, one of Ireland’s largest hosting companies, when all seven of their chillers shut down at 1 pm yesterday.

Hosting365’s main data center is at capacity (the company is currently expanding it) and packed with customer equipment. Almost immediately the temperature began to rise by about 3.5 degrees Fahrenheit (2 degrees Celsius) per minute. Within 15 minutes, areas of the data center were experiencing heat above 40 degrees Celsius – 104 degrees Fahrenheit. Servers began to shut down, and staff turned off the rest to protect the equipment.
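The reported rise rate makes it easy to estimate how short the diagnostic window really is. A minimal back-of-the-envelope sketch, assuming a typical starting temperature of about 22 degrees Celsius (a figure not given in the incident report):

```python
# Estimate the diagnostic window after a total cooling failure,
# using the ~2 C/minute rise rate reported in the Hosting365 incident.
# The 22 C starting temperature is an assumption (a common cold-aisle
# setpoint), not a figure from the article.

def minutes_until(threshold_c, start_c, rise_c_per_min):
    """Minutes until the room reaches threshold_c at a constant rise rate."""
    return (threshold_c - start_c) / rise_c_per_min

start_c = 22.0   # assumed normal operating temperature
rise = 2.0       # ~2 degrees C per minute, as reported

print(minutes_until(40.0, start_c, rise))  # -> 9.0 minutes to reach 40 C
```

Under those assumptions the room hits 40 C in roughly nine minutes – consistent with the article's report that parts of the floor exceeded 40 C within 15 minutes, since the rise is unlikely to be uniform across the whole suite.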

Hosting365 had figured out the problem – an electrical short in a fan coil, which then fried a fuse that supported the other chillers – within 10 minutes of the original failure. Within 20 minutes, staff had replaced the blown fuse and brought the chillers back online. By then it was already too late.

“It’s clear from this issue that the suite cannot tolerate even an 18 minute failure of the chillers,” Hosting365 Managing Director Steven McCarron wrote in a status update for customers. “We have an excellent electrical and facilities team and we’ll be looking at ways to beef up the cooling capacity and redundancy. Ironically, we’ve spent more money than the company makes in a year on improving and adding redundancy to our infrastructure in the last two years, and then, as is often the case, something small springs up and causes a problem.”

To its credit, Hosting365 acted quickly and provided customers with a prompt review of the incident. “As managing director, founder and owner, the buck stops here,” McCarron wrote. “So, let me detail exactly what happened today, the order of events, how we reacted and handled the problems and what we’re going to do about it tomorrow.”


About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.

One Comment

  1. Data Center Aficionado

    Does anyone know how many watts/sqft this Hosting365 facility was designed for? Recent conventional wisdom is to provide uninterruptible cooling (pumps/fans on UPS, chilled water storage for ride-through) in facilities designed upwards of ~180w/sqft. It would be interesting for the industry to get some real-world knowledge from this unfortunate event - especially if this occured at a density of around 100w/sqft which is pretty common in Dublin. Kudos to the facility operations team on their speed and obvious competence to resolve this issue in under 20 minutes.