Few tools offer a better way to predict how a data center will behave when something in its environment changes, before the change is actually applied, than a sophisticated, accurate virtual model of the facility.
As people increasingly rely on software running in data centers for nearly every aspect of their lives, the issue of data center power consumption becomes more and more acute. Using data center modeling is one of the best ways to ensure a new or existing facility will use energy in the most efficient way.
Because data center energy efficiency is such an important topic, Facebook, Bloomberg, IBM, Comcast, Intel, and Verizon, among others, have partnered with the U.S. National Science Foundation and four universities to study ways to improve efficiency at the Center for Energy-Smart Electronic Systems (ES2) at Binghamton University in New York State.
ES2 recently built a data center lab for use in its research, and one of the latest research papers to come out of the department examines data center modeling.
The research project compared the predictions of a data center model created with a software solution to empirical measurements taken in the facility over time. The researchers found that a model can be made substantially more accurate if it is adjusted based on monitoring data collected from the operational facility.
“The models without experimental validation are questionable,” said Husam Alissa, a Binghamton PhD candidate and the paper’s lead author, who was also deeply involved in designing the ES2 lab.
Using a Computational Fluid Dynamics model to design a data center before it is built is a valid approach, he said. Once the facility is built, however, the model will be a lot more useful if it’s adjusted based on operational data.
“If you really want to understand your data center… you definitely need to spend some time on the validation process.”
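The validation process Alissa describes can be thought of as a comparison loop: take the model's predicted value at each measurement point, compare it to the sensor reading, and flag locations where the mismatch is too large for the model to be trusted there. A minimal sketch of that idea follows; the tile names, flow values, and 10 percent tolerance are all hypothetical, invented for illustration, and not taken from the study.

```python
# Hypothetical sketch: compare CFD-predicted tile flow rates (CFM)
# to measured values and flag tiles where the model needs adjustment.
# All names and numbers are illustrative, not from the ES2 study.

predicted = {"tile_A1": 520.0, "tile_A2": 480.0, "tile_A3": 450.0}
measured  = {"tile_A1": 505.0, "tile_A2": 530.0, "tile_A3": 310.0}

TOLERANCE = 0.10  # flag tiles where the model is off by more than 10%

def validate(predicted, measured, tol):
    flagged = {}
    for tile, p in predicted.items():
        m = measured[tile]
        error = abs(p - m) / m  # relative error vs. the measurement
        if error > tol:
            flagged[tile] = round(error, 3)
    return flagged

print(validate(predicted, measured, TOLERANCE))  # → {'tile_A3': 0.452}
```

In this toy example, the first two tiles agree within tolerance, but the model overpredicts the third tile's flow by roughly 45 percent, which is exactly the kind of discrepancy that would prompt recalibration.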
The team used Future Facilities’ CFD modeling software and consulted with the vendor’s representatives extensively. Here are three key things they learned that can help improve a data center model:
1. Measure Airflow Everywhere
One of the discrepancies between the model and reality was the rate of airflow through perforated floor tiles. The farther a tile was from a Computer Room Air Handling (CRAH) unit, the higher its flow rate, for example, since some of the air the unit supplied bypassed the tiles closest to it.
They also measured a drop in flow rate in the middle of one of the aisles, which they suspected was being caused by a vortex elsewhere under the aisle. CFD modeling combined with empirical data can be very helpful in identifying such phenomena, they wrote.
Alissa and his team also traced some of the flow deficit to things like unsealed floor holes used to route cooling pipes and power conduits, tile cuts around the air conditioning unit, seams, and the point where the raised floor met the side walls.
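The flow deficit they traced is essentially a mass-balance problem: air the CRAH unit supplies must either exit through the perforated tiles or escape through leakage paths such as unsealed holes, tile cuts, and seams. A hypothetical sketch of that accounting, with every flow value invented for illustration:

```python
# Hypothetical mass-balance check: CRAH supply vs. tile delivery.
# Any air not accounted for by the tiles must be leaking through
# unsealed holes, cable cutouts, seams, etc. Numbers are illustrative.

crah_supply_cfm = 4200.0  # total airflow supplied by the CRAH unit
tile_flows_cfm = [520, 505, 490, 470, 460, 455, 440, 430]  # per-tile readings

delivered = sum(tile_flows_cfm)               # 3770 CFM through the tiles
leakage = crah_supply_cfm - delivered         # the unaccounted-for remainder
leakage_fraction = leakage / crah_supply_cfm

print(f"Delivered through tiles: {delivered} CFM")
print(f"Unaccounted (leakage):   {leakage:.0f} CFM "
      f"({leakage_fraction:.1%} of supply)")
```

If measured leakage like this is fed back into the model as a distributed loss, the model's per-tile predictions stop assuming all supplied air reaches the tiles.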
2. Start With as Clean a Slate as Possible
The simpler the room’s conditions are, the easier it is to understand its behavior. If possible, it helps a lot to eliminate factors that complicate airflow and take measurements in that simplified state. That can mean anything from shutting down IT equipment to equalizing room pressure or identifying floor leakage.
Having an idea of how a room behaves under such simplified conditions can be very helpful in understanding raised-floor behavior, plenum flow patterns, and their effect on the delivery of cold air through the tiles and server temperature.
3. It’s Not Just a Box
It’s important to not oversimplify the physical aspects of the cold-air plenum. Details like cutouts, floor jacks, and supply-vent locations all have an effect on airflow, and a good model should account for all of them.
Floor jacks, for example, the vertical columns that support the raised floor, affect airflow significantly simply because there are so many of them. Without accounting for jack resistance, the data center model overestimated both plenum pressure buildup and predicted tile delivery. In some spots closer to the walls, where pressure buildup is higher, it overshot predicted tile delivery by as much as 30 percent compared to measured data.
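The effect of ignoring jack resistance can be illustrated with a very simplified lumped model: if pressure losses along the flow path grow with the square of the flow (a common turbulent-loss approximation), then leaving out the jacks' resistance term inflates the static pressure the model thinks is available at the far tiles. All coefficients below are invented for illustration; the real CFD model resolves the plenum geometry in three dimensions.

```python
# Hypothetical lumped illustration of why ignoring floor-jack
# resistance inflates predicted plenum pressure. Pressure available
# at a far tile is fan static pressure minus quadratic path losses.
# Coefficients are invented; a real CFD model resolves full 3D geometry.

def plenum_pressure(fan_pressure_pa, flow_cms, k_path, k_jacks=0.0):
    """Static pressure remaining at a far tile (Pa)."""
    loss = (k_path + k_jacks) * flow_cms ** 2
    return fan_pressure_pa - loss

flow = 5.0  # plenum flow toward the wall, m^3/s (illustrative)

no_jacks   = plenum_pressure(60.0, flow, k_path=1.2)              # jacks ignored
with_jacks = plenum_pressure(60.0, flow, k_path=1.2, k_jacks=0.5)

print(no_jacks, with_jacks)  # the jack term removes 0.5 * 25 = 12.5 Pa
```

In this toy example the jack-free model predicts 30 Pa where the jack-aware model predicts 17.5 Pa, and since tile delivery scales roughly with the square root of available pressure, a pressure overestimate of that size translates to roughly 30 percent excess predicted tile flow, of the same order as the discrepancy the team measured.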
It took Alissa and his team about seven months to complete the work. By the end, they had recalibrated nearly every aspect of airflow in their model, including cooling units, servers, tiles, plenums, leakage, room geometry, and more. Once this calibration was done, the model became truly predictive, representative of the measurements taken in experimentation.
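Conceptually, that recalibration amounts to tuning model parameters, such as leakage coefficients, unit flow curves, and tile loss factors, until the model's output matches the measurements. A minimal hypothetical sketch of the idea, fitting a single leakage coefficient by one-dimensional grid search (the real calibration spans many coupled parameters inside the CFD tool, and the toy model here is invented for illustration):

```python
# Hypothetical sketch of one calibration step: pick the leakage
# coefficient that minimizes the mismatch between modeled and
# measured total tile flow. The model and all values are illustrative.

def modeled_tile_flow(supply_cfm, leakage_coeff):
    """Toy model: tiles receive whatever supplied air doesn't leak out."""
    return supply_cfm * (1.0 - leakage_coeff)

measured_tile_flow = 3770.0   # illustrative field measurement, CFM
supply = 4200.0               # illustrative CRAH supply, CFM

# 1-D grid search over candidate leakage coefficients, 0.0% .. 30.0%
candidates = [i / 1000 for i in range(0, 301)]
best = min(candidates,
           key=lambda c: abs(modeled_tile_flow(supply, c) - measured_tile_flow))

print(f"calibrated leakage coefficient: {best:.3f}")  # → 0.102
```

Repeating this kind of fit across every adjustable parameter, against measurements taken throughout the room, is what turns a design-stage model into the predictive, validated model the team ended up with.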