Density – More than a Metric (Part 1)


Tate Cantrell is CTO of Verne Global with primary responsibilities for product design and development and data center operations.


Density. It seems like a simple concept – mass per unit of volume. Lead is dense – air is not. (Although one could argue…) End of discussion: unless you are discussing the topic with a data center aficionado, in which case you'd best grab a tall mug of coffee, as it could be a long, drawn-out conversation.

Density in a Data Center

In a data center developer’s parlance, as in physics, the unit of data center density is pretty straightforward – server input power per unit of area – kilowatts per square meter or watts per square foot. There are, however, plenty of other variables that shape a discussion on the topic. Does the area include mechanical cooling equipment? Is the area just the footprint of the server cabinets? Does the input power include the losses of the UPSes? The electrical distribution? The power supplies? To frame this discussion properly, we look not at the critical environment as a whole, but at the hardware itself: the individual bits of silicon that perform true computational work and create all of that heat as a result. More computational work means more heat. A higher concentration of silicon within a server cabinet means more heat in a smaller physical space, and thereby a higher density. But density alone is not the objective.
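Those boundary questions matter more than they might appear to. A quick sketch (all figures below are hypothetical, chosen only to illustrate the point) shows how the same IT load produces very different "density" numbers depending on what you count as area and what you count as input power:

```python
# Illustrative sketch: the same IT load, measured three ways.
# All figures are hypothetical and chosen only for illustration.

def density_w_per_sqft(it_load_kw, area_sqft,
                       ups_efficiency=1.0, distribution_efficiency=1.0):
    """Input power per unit of area, in watts per square foot.

    Efficiencies below 1.0 inflate the measured input power
    relative to the useful IT load at the server plug.
    """
    input_power_w = it_load_kw * 1000 / (ups_efficiency * distribution_efficiency)
    return input_power_w / area_sqft

# A hypothetical 500 kW IT load:
cabinet_only = density_w_per_sqft(500, 2_500)    # cabinet footprint only
whole_room = density_w_per_sqft(500, 10_000)     # aisles and cooling included
with_losses = density_w_per_sqft(500, 10_000,    # also count UPS and
                                 ups_efficiency=0.94,       # distribution losses
                                 distribution_efficiency=0.97)

print(f"cabinet footprint: {cabinet_only:.0f} W/sqft")
print(f"whole room:        {whole_room:.0f} W/sqft")
print(f"with losses:       {with_losses:.1f} W/sqft")
```

The spread between the first and last figure is why any quoted density number needs its boundary conditions stated alongside it.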

The goal of any data center operation is to achieve maximum benefit to a company or an individual at the least cost. The benefits to a company should be the flexibility to innovate and grow, and the standardization to perform and sustain. Thanks to the practices of virtualization and shared storage pools (cloud, if you must), the benefits of flexibility and standardization are mutually bound at the platform level in a way that would make James Clerk Maxwell think of electricity and magnetism. Understanding that computational work and true benefit are proportional is golden, but the design of the data center facility – high density or low density – becomes a question of cost.

Providing Power, Removing Heat

Many organizations – both large and small – have examined this dilemma and ended up on both sides of the conversation; ultimately it comes down to preference. For some companies, the form factor of a 2U server allows for better or less expensive network options. For others, parallel computing is paramount and tight form factors allow for super-fast intra-server networks. Regardless of the hardware choice, the data center facility must be able to provide continuous power and to remove the heat, however focused and however dense it may be.

Within many organizations, the high likelihood is that both low- and high-density solutions will be required. The result is the basic conundrum that is inevitably presented to a data center designer. Provide a design that can appropriately cool high-intensity cabinets at 15–20 kW, up to 30–40 kW per position (or more), while allowing an option for deploying lower-intensity devices at 6–12 kW per position. Couple that with a simple request not to strand cost, and make sure that the space can respond to a last-minute decision to go either way. Ultimately the benefit of the data center is in how quickly it is available for use, not just in the dependability of the engineering once the power, cooling and network services are engaged. This convergence of requests makes you want to consider easier, more predictable tasks – like putting on a blindfold and solving a Rubik's Cube.
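The stranded-cost question can be posed concretely: given a fixed power and cooling budget for a row, does a given mix of low- and high-density positions still fit? A minimal sketch, with an assumed per-row budget and hypothetical cabinet mixes, shows the kind of check a designer would run before committing a layout:

```python
# Hypothetical sketch: testing whether a mixed-density row plan fits a
# fixed per-row power/cooling budget. The budget and cabinet mixes are
# assumptions, not figures from any real facility.

def row_load_kw(positions):
    """Total row load for a plan given as (count, kW-per-position) pairs."""
    return sum(count * kw for count, kw in positions)

ROW_BUDGET_KW = 300  # assumed power/cooling budget for one row

low_density_plan = [(20, 10)]          # 20 positions at 10 kW each
mixed_plan = [(10, 8), (5, 30)]        # low-density cabinets plus a
                                       # high-density block

for name, plan in [("low-density", low_density_plan), ("mixed", mixed_plan)]:
    load = row_load_kw(plan)
    verdict = "fits" if load <= ROW_BUDGET_KW else "exceeds budget"
    print(f"{name}: {load} kW – {verdict}")
```

A late swap from the low-density plan to the mixed plan only works if the row was provisioned for the higher of the two loads – which is exactly the flexibility (and the potential stranded cost) the designer is being asked to balance.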

Seeking the Simplest Solution

The best solution to complexity comes when you aim for simplicity. Simple solutions are found by bringing people together to create common benefit across the organization. The stakeholders must start their search for simplicity with information and options.

You can start by modeling the use cases. No decisions should be made until there is some direction on the hardware profile and the mix within the facility. It is incumbent upon the facility experts and the IT magi to work together up front on the needs of the business, the personal preferences, and ultimately the basic hardware requirements.

Stakeholders can also push for standardization. Standardizing on a high- or a low-density environment will save up front in design complexity, and in the future with operational simplicity. There is a simple reason – cost – why the data center environment is moving rapidly along the curve from Innovation and Customization up to Commodity and Utility Services, as pointed out so eloquently by Simon Wardley at OSCON in 2010.

In my next post, I’ll continue to explore the impact of density as we look at additional key considerations for data center planners as they try to maximize the benefits for their data center operations.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena.
