Data Center Power Drain an ‘Urban Myth’?
Is the data center power drain an urban myth? That’s the provocative title of a blog post from Joe McKendrick, an editor at Database Trends and Applications magazine. Reviewing data from recent studies by Lawrence Berkeley National Laboratory, McKendrick notes that data center power represented just 0.6 percent of U.S. electricity usage in 2005, and argues that a broader cost/benefit analysis shows the savings generated by that equipment more than justify that level of use.
Yes, your data center may be running up some huge electric bills, and it’s important to seek ways to cut this consumption.
I think it’s important to read McKendrick’s entire post, because he adds at the end that we should still look for ways to be more efficient. His major point, it seems, is that IT is getting a bad rap as an energy hog, when in fact IT has saved far more energy than it has wasted.
Thanks for the link, Rich. I ended up writing about it, too.
The panic is cost-driven, not resource-driven.
Power in a data center environment costs more than double what the same electricity costs anywhere else. It used to be that you could move into a rack and use a single 120V/20A circuit, maybe two. Not anymore. Fill up a rack with servers and you’ll need a few 30A circuits, possibly at 208V. You pay for the amperage capacity AND you pay for the actual usage. Colocation facilities are EXPECTED to have power capacity, with full redundancy of that capacity. So a circuit price doesn’t just include the price of grid power; it includes the price of backup power. Backup power has a VERY high cost, most of it in consumables such as batteries and diesel fuel. Batteries have a limited lifespan, and diesel fuel must be periodically tested and replaced.
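To put rough numbers on the circuit sizes above, here is a back-of-the-envelope sketch. It assumes the common practice of derating a breaker to 80% of its rating for continuous loads (the exact derating policy varies by facility, so treat the factor as an assumption):

```python
def usable_watts(volts, amps, derate=0.8):
    """Usable continuous power on one circuit, in watts.

    Assumes the circuit is derated to 80% of its breaker
    rating for continuous loads (a common, but not universal,
    facility policy).
    """
    return volts * amps * derate

# A legacy 120V/20A circuit:
legacy = usable_watts(120, 20)   # 1920 W

# A modern 208V/30A circuit:
modern = usable_watts(208, 30)   # 4992 W

print(f"120V/20A: {legacy:.0f} W usable")
print(f"208V/30A: {modern:.0f} W usable")
```

A single 208V/30A circuit delivers roughly two and a half times the usable power of the old 120V/20A drop, which is why a dense rack of modern servers needs several of them.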