Hamilton: Servers Dominate Data Center Costs

Servers are the largest single cost component of operating an Internet-scale data center, according to updated data from James Hamilton of Amazon Web Services.

James Hamilton of Amazon Web Services has done a lot of interesting research on the cost of operating a large-scale data center. Hamilton recently updated his data center cost model to account for advances in efficiency and to break out the cost of networking equipment separately.

James estimates that it costs about $3.5 million a month to operate a data center that can hold 45,000 to 50,000 servers. The largest chunk of that is the servers themselves, which account for 57 percent of the operating cost, while networking gear represents another 8 percent. On the facilities side, power distribution and cooling account for 18 percent of the monthly costs, while other infrastructure adds another 4 percent. Finally, there's the cost of power, which works out to 13 percent of the total.
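A quick sketch of how those percentages translate into dollars at Hamilton's $3.5 million monthly figure (the shares are from the article; the category labels are paraphrased):

```python
# Hamilton's estimated monthly operating cost for a 45,000-50,000 server facility
monthly_total = 3_500_000

# Cost shares from Hamilton's updated model
shares = {
    "servers": 0.57,
    "networking_gear": 0.08,
    "power_distribution_and_cooling": 0.18,
    "other_infrastructure": 0.04,
    "power": 0.13,
}

# Dollar amount per category per month
dollars = {name: monthly_total * share for name, share in shares.items()}

for name, amount in dollars.items():
    print(f"{name}: ${amount:,.0f}/month")
```

The shares sum to 100 percent, and servers alone work out to roughly $2 million of the $3.5 million monthly total.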

The updated results, which are available on a spreadsheet, reinforce Hamilton's point that despite all the talk about the cost of electricity and power/cooling infrastructure, servers remain the largest single component of the cost of major data centers.

"The key point to keep in mind is the amortization periods are completely different," James writes ."Data center amortization periods run from 10 to 15 years while server amortizations are typically in the three year range. Servers are purchased 3 to 5 times during the life of a data center so, when amortized properly, they continue to dominate the cost equation."
