
Follow-the-Moon Scheduling to Lower Energy Costs

Lowering energy consumption is clearly a primary goal for just about every data center manager on this planet. The answer, however, might come in the form of a combination of multiple emerging technologies, writes Jeff Klaus of Intel.

Industry Perspectives

October 22, 2012


Jeff Klaus is the director of Intel Data Center Manager (DCM). He leads a global team that designs, builds, sells, and supports Intel® DCM.


Power and cooling continue to strain even the most modern data centers. Industry analysts estimate that data centers consume roughly 1.5 percent of the total available energy supply worldwide, and that share is rising.

In addition to rising costs and the limited capacity of local utilities to meet their needs, data center managers face other pressures to curb energy consumption. For example, "cap-and-trade" programs, already in effect in 27 European countries and, more recently, in California and other states and provinces in North America, are being introduced to force energy conservation.

Lowering energy consumption is clearly a primary goal for just about every data center manager on this planet. Online group discussion forums, blogs, and publications like this one have been debating the various approaches and proposed solutions to the energy crisis in the data center. The answer, however, might come in the form of a combination of multiple emerging technologies.

Cloud Computing: Energy Cost Saver?

Cloud computing, when combined with a power management solution, can effectively lower the total energy costs tied to the data center’s workloads.

For example, consider a company with a large, central data center in New Jersey. The company also operates a West Coast colocation site for disaster recovery and for handling overflow requests from the main data center. The colo's energy rates are slightly lower than New Jersey's. The company's private cloud can flexibly meet service requests, distributing workloads between New Jersey and the West Coast on the fly for optimal server utilization and performance.

What does this have to do with driving down energy costs? Consider how service request scheduling could be adjusted if data center managers knew, with fine-grained accuracy, the energy requirements for the various workloads and service requests. Energy costs could be compared at the time of any service request.

As a first step, the New Jersey-based data center team might shift some applications or service requests to the West Coast during the three hours when New Jersey pays peak power rates while the West Coast data center is still billed at off-peak rates. The next step would be to offshore service requests to another hemisphere. A follow-the-moon strategy would allocate tasks to data centers where the cheapest nighttime power rates drive down the cost of each service request.
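To make the idea concrete, here is a minimal sketch of how a follow-the-moon scheduler might compare rates across sites at request time. The site names, utility rates, peak windows, and UTC offsets are all illustrative assumptions for this sketch, not real tariffs or any particular vendor's API:

```python
from datetime import datetime, timezone

# Hypothetical electricity rates ($/kWh) and peak windows per site.
# All values below are illustrative assumptions, not real utility tariffs.
SITES = {
    "new_jersey": {"utc_offset": -5, "peak": (13, 19), "peak_rate": 0.17, "offpeak_rate": 0.11},
    "west_coast": {"utc_offset": -8, "peak": (12, 18), "peak_rate": 0.14, "offpeak_rate": 0.09},
    "singapore":  {"utc_offset": 8,  "peak": (9, 17),  "peak_rate": 0.15, "offpeak_rate": 0.08},
}

def rate_at(site: dict, utc_now: datetime) -> float:
    """Return the site's $/kWh rate at the given UTC time."""
    local_hour = (utc_now.hour + site["utc_offset"]) % 24
    start, end = site["peak"]
    in_peak = start <= local_hour < end
    return site["peak_rate"] if in_peak else site["offpeak_rate"]

def cheapest_site(utc_now: datetime) -> str:
    """Follow the moon: pick the site whose current rate is lowest."""
    return min(SITES, key=lambda name: rate_at(SITES[name], utc_now))
```

At 18:00 UTC, New Jersey is mid-peak while both offshore sites are off-peak, so the scheduler routes work away from New Jersey; at 06:00 UTC the picture flips. A production version would also have to weigh network latency and data-transfer costs, which this sketch ignores.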

Technology Hurdles

Taking advantage of the cheapest energy rates, wherever they may be offered, is not an unrealistic scenario, technologically speaking. Cloud computing is advancing rapidly, especially as security technologies evolve for virtualized servers and high-speed networks are built out. Some regions still lack the reliable, affordable network bandwidth needed to avoid application latency, but plenty of offshore regions are building out infrastructure at a rapid pace: regions where power is affordable and the service provider industry is subsidized by government infrastructure investments. The data centers popping up in those areas can offer lower compute rates for offshored service requests and workloads.

So the critical component becomes the power management solution, or power dashboard, that gives the data center manager visibility into, and control over, a workload's power profile. Real-time power management has evolved rapidly over the last decade. Today, many enterprises already use thermal and power maps to drive down power consumption, avoid equipment-damaging power spikes, and throttle down server performance when necessary to stay under the power ceiling for the data center, or for a row or rack of servers within it.

These same power management solutions can be used to profile workloads and to make intelligent decisions about which workloads to offshore at which times of day.
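One way to picture that decision: combine a workload's measured average power draw with each site's current rate to estimate the energy bill per placement. The workload names, draw figures, and rates below are illustrative assumptions, not output from any specific power management product:

```python
# Hypothetical power profiles: average draw (kW) per workload, as a
# power dashboard might report. All numbers are illustrative assumptions.
WORKLOAD_PROFILES = {"batch_etl": 4.2, "web_tier": 1.8}

# Current $/kWh at each candidate site (assumed values).
SITE_RATES = {"new_jersey": 0.17, "west_coast": 0.09}

def energy_cost(workload: str, site: str, hours: float) -> float:
    """Estimated energy cost of running a profiled workload at a site."""
    return WORKLOAD_PROFILES[workload] * hours * SITE_RATES[site]

def best_site(workload: str, hours: float) -> str:
    """Place the workload where its estimated energy bill is lowest."""
    return min(SITE_RATES, key=lambda site: energy_cost(workload, site, hours))
```

A three-hour ETL run drawing 4.2 kW costs about a dollar less on the cheaper coast in this toy scenario; multiplied across thousands of daily service requests, fine-grained profiles like these are what make the scheduling decision worth automating.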

Leveraging the Cloud

Cloud computing is a natural progression from today's highly virtualized data centers. It promises enterprises unprecedented agility in cost-effectively meeting the compute needs of dynamic organizations and fast-changing markets. When the cost of power is treated as yet another resource in the business case for cloud computing, the evolving cloud models become even more exciting in terms of their potential for cutting costs.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena.
