Hamilton: Servers Dominate Data Center Costs


James Hamilton of Amazon Web Services has done a lot of interesting research about the cost of operating a large-scale data center. James recently updated his data center cost model to account for recent advances in efficiency and separate out the cost of networking equipment.

James estimates that it costs about $3.5 million a month to operate a data center that can hold 45,000 to 50,000 servers. The largest chunk of that is the servers themselves, which account for 57 percent of the operating cost, while networking gear represents another 8 percent. On the facilities side, power distribution and cooling account for 18 percent of the monthly costs, while other infrastructure adds another 4 percent. Finally, there’s the cost of power, which works out to 13 percent of the total.
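To see what those percentages imply in dollar terms, here is a minimal sketch in Python that simply applies the shares quoted above to the roughly $3.5 million monthly figure. It uses only the numbers in this article and is illustrative, not Hamilton's actual spreadsheet model.

```python
# Illustrative only: applies the percentages quoted in the article to the
# ~$3.5M/month figure. This is not Hamilton's spreadsheet model.
monthly_total = 3_500_000  # dollars per month, from the article

shares = {
    "servers": 0.57,
    "networking gear": 0.08,
    "power distribution and cooling": 0.18,
    "other infrastructure": 0.04,
    "power (electricity)": 0.13,
}

for item, share in shares.items():
    print(f"{item:32s} ${monthly_total * share:>12,.0f}")
```

Run as written, this puts the server line item at roughly $2 million a month, far ahead of any other single category.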

The updated results, which are available on a spreadsheet, reinforce Hamilton’s point that despite all the talk about the cost of electricity and power/cooling infrastructure, servers remain the largest single cost component in major data centers.

“The key point to keep in mind is the amortization periods are completely different,” James writes. “Data center amortization periods run from 10 to 15 years while server amortizations are typically in the three year range. Servers are purchased 3 to 5 times during the life of a data center so, when amortized properly, they continue to dominate the cost equation.”
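A rough way to see why the shorter amortization period matters is to spread each capital outlay over its amortization window and compare the resulting monthly charges. The capital figures below (server fleet cost, facility build-out cost, a 12-year facility life) are hypothetical placeholders chosen only to illustrate the effect Hamilton describes; they are not values from his model.

```python
# Hypothetical capital costs, used only to illustrate how amortization
# periods shape the monthly cost picture; not Hamilton's actual inputs.
facility_capex = 200_000_000     # one-time facility build-out (assumed)
server_fleet_capex = 60_000_000  # one full server refresh (assumed)

facility_life_months = 12 * 12   # facility amortized over ~10-15 years
server_life_months = 3 * 12      # servers amortized over ~3 years

facility_monthly = facility_capex / facility_life_months
servers_monthly = server_fleet_capex / server_life_months

print(f"facility per month: ${facility_monthly:,.0f}")
print(f"servers per month:  ${servers_monthly:,.0f}")
# Even though one server refresh costs far less than the building itself,
# the 3-year cycle (repeated 3 to 5 times over the facility's life) makes
# the servers' monthly amortized charge the larger line item.
```

With these assumed inputs, the building works out to about $1.4 million a month against roughly $1.7 million a month for servers, which is the amortization effect Hamilton is pointing to.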

About the Author

Rich Miller is the founder and editor-in-chief of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.
