Looking over blog topics my peers have published during the last few years, I can trace the migration of interests and “hot” topics across the data center industry. When we realized how wasteful we were with energy in data centers (along with the rest of the world), we were all caught a bit off guard by the magnitude of the waste. Now that a plethora of vendors offer products and services to address the first steps, such as generating data to see what is actually happening, end users have many choices. The current question is: are we deploying the solutions that make data centers more sustainable?
Well-run data center operators are measuring and monitoring. One client of mine estimated that 30% of end users are working diligently at lowering their energy use, an additional 10% have done all they can, and the remaining 60% are still waiting for someone to tell them what to do. Among the 40% who have given active consideration to energy efficiency and started monitoring programs, the results are interesting.
Data center managers are finding:
1. Wireless sensors work – more data to analyze means more customized solutions follow.
2. Homegrown spreadsheets work as well – everyone is familiar with the layout and the price is right.
3. Refining and finessing the temperature and pressure in your data center is an iterative process of pushing the envelope a bit at a time – not rocket science, just good management practices.
4. Comparison websites are ubiquitous – you can find them from trade organizations, and the Department of Energy wants to know whether you are managing your Power Usage Effectiveness (PUE), too.
5. As companies increase their use of virtualization, consolidating more workload onto less IT equipment, PUE gets worse – the IT load in the denominator shrinks while much of the facility overhead (cooling, UPS losses, lighting) stays fixed.
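The virtualization effect in point 5 is easy to see with the definition PUE = total facility power / IT power. A minimal sketch, using hypothetical power figures chosen purely for illustration:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """PUE = total facility power / IT equipment power (dimensionless, >= 1.0)."""
    return total_facility_kw / it_kw

# Assumed fixed overhead (cooling, UPS losses, lighting) that does not
# scale down with the IT load -- an illustrative number, not a benchmark.
OVERHEAD_KW = 400.0

# Before consolidation: 1,000 kW of lightly loaded servers.
before = pue(1000.0 + OVERHEAD_KW, 1000.0)   # 1400 / 1000 = 1.40

# After virtualization: the same work runs on 600 kW of IT equipment,
# but the facility overhead barely changes.
after = pue(600.0 + OVERHEAD_KW, 600.0)      # 1000 / 600 ≈ 1.67

print(f"PUE before consolidation: {before:.2f}")
print(f"PUE after consolidation:  {after:.2f}")
```

The site consumes 400 kW less overall after consolidating, yet its PUE rises, which is why PUE alone rewards the wrong behavior here.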
One data center operator recently said, “The PUE is supposed to tell us how efficient we are, but it really does not, because it is based purely on electrical energy and doesn’t consider other energy sources like natural gas or diesel fuel (or water consumption!). So I can run my absorption chillers all day long and make my electrical PUE look better than it is . . . and my absorption chillers use 500% more energy (natural gas) to make chilled water than my centrifugals. This makes no sense.”
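The operator’s complaint can be quantified by comparing an electrical-only PUE against one that folds the fuel input back in. The sketch below is a simplified stand-in, not The Green Grid’s official source-energy weighting, and all power figures (1,000 kW of IT load, 250 kW of electric chiller draw, gas input at five times that, per the quoted “500%”) are illustrative assumptions:

```python
def electrical_pue(total_elec_kw: float, it_kw: float) -> float:
    """PUE computed from metered electricity only -- blind to fuel inputs."""
    return total_elec_kw / it_kw

def all_energy_pue(total_elec_kw: float, fuel_kw: float, it_kw: float) -> float:
    """Naive all-inputs ratio: folds non-electric energy (e.g. natural gas
    feeding absorption chillers) back into the numerator, unweighted."""
    return (total_elec_kw + fuel_kw) / it_kw

IT_KW = 1000.0

# Centrifugal plant: chilled water is made with electricity.
centrifugal = electrical_pue(IT_KW + 250.0, IT_KW)              # 1.25

# Absorption plant: chiller electricity nearly vanishes from the meter...
absorption_elec = electrical_pue(IT_KW + 50.0, IT_KW)           # 1.05

# ...but the gas input (assumed 5x the centrifugal's electric draw)
# makes the true energy picture far worse.
absorption_all = all_energy_pue(IT_KW + 50.0, 1250.0, IT_KW)    # 2.30

print(f"Centrifugal (electrical PUE):   {centrifugal:.2f}")
print(f"Absorption  (electrical PUE):   {absorption_elec:.2f}")
print(f"Absorption  (all energy in):    {absorption_all:.2f}")
```

On the electric meter alone the absorption plant looks more efficient (1.05 vs. 1.25), even though it consumes roughly twice the total energy per unit of IT load, which is exactly the distortion the operator describes.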
Preventing Energy Loss through Grid Heating
At the University of Notre Dame in South Bend, Indiana, Paul Brenner is working on a novel application of the heat produced by his servers. This approach, called “grid heating,” leverages distributed computing concepts to relocate and network processing units to heat sinks where the waste heat is needed. The heat sinks vary from industrial processes to large facilities. Grid heating not only maps the compute load to the thermal requirement, but also considers the entire compute environment, with principal focus on operating temperatures, humidity, air particulates, and noise production. Primary waste heat transfer can occur through air or liquid media, depending on the target heat sink.
Paul said: “At Notre Dame (ND), we have constructed an EOC (Environmentally Opportunistic Computing) infrastructure called the Green Cloud, bridging a traditional data center and a containerized data center located at the South Bend City Conservatory and Greenhouse. During cold weather, the heat generated by the facility is vented into the greenhouse, saving both cooling costs for the data center and heating costs for the greenhouse. During hot weather, heat production and delivery is balanced by services migration from the greenhouse to multiple ND operated facilities both on and off the ND campus. This prototype is used for high-throughput batch computing, allowing us to gain operational experience with the experimental system without placing critical services at risk.
“Apart from not using energy-consuming air conditioning cooling at a centralized data center, the Green Cloud further improves energy efficiency by harvesting the waste hot air coming from the work in servers for the adjacent greenhouse facility. The success of this technique makes the EOC concept even more attractive for sustainable cloud computing setups.”
Are Innovative Solutions In Widespread Use?
Then what is the issue in the industry? It is all happening very slowly.
The top 30% of organizations and institutions have innovated, but as an industry we haven’t reacted with the urgency this opportunity deserves. Global warming is in a big hurry; too many data center operators aren’t. How do we speed up the savings in energy and water, our precious resources?
Key points to consider:
1. Changes in attitude about urgency – A decade to save a few hundred megawatts is a bucket of water in the ocean.
2. Embrace organizational change – IT needs to absorb, or merge with, facilities engineering to keep tweaking the data center infrastructure.
3. Stay ahead of compliance – Work to do better than the minimum. Compliance is aimed at the lowest common denominator and is just a holding pattern.
4. Share for the pleasure; the rewards will come – Translation: don’t hoard how you achieved higher energy savings. Open dialogue with colleagues helps everyone.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.