Inside the Yahoo Computing Coop
Yahoo! is holding a ribbon-cutting today to mark the opening of its new data center in Lockport, New York, which uses the company’s innovative Yahoo Computing Coop (YCC) design to dramatically reduce electricity use. Yahoo CEO Carol Bartz and New York Gov. David Paterson will be on hand for the event.
Readers of Data Center Knowledge are familiar with the YCC building’s design from our previous coverage of the Lockport project. Today Yahoo has released the first photos of its rack design (see above), which provide additional insight into the airflow management that drives the efficiency of the new facility.
Hot Aisle Containment
Each 120-foot by 60-foot Yahoo Computing Coop module has louvers along the side of the building that allow fresh air to enter the data center. In the equipment area, the fresh air enters the front of the servers and then exits into a contained hot aisle, which is topped by a chimney leading into the upper chamber of the “coop.” Depending on conditions, the warm air can either be recirculated or vented through the cupola.
“The building itself is an air handler,” Scott Noteboom, the Director of Data Center Operations for Yahoo, told DCK earlier this year. “The entire building is meant to breathe, and there’s a lot of louvers and dampers to control the airflow.”
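The article describes three airflow regimes: fresh air through the louvers, recirculation of warm exhaust, and (as noted below) evaporative cooling on the hottest days. A minimal sketch of how a damper controller might choose among them is shown here; the function name and temperature thresholds are hypothetical, not Yahoo’s actual control logic.

```python
# Illustrative sketch of airflow-mode selection for a free-cooled
# facility like the YCC. Thresholds are assumptions for illustration.

def select_airflow_mode(outdoor_temp_c: float,
                        free_cooling_max_c: float = 27.0,
                        recirculation_min_c: float = 10.0) -> str:
    """Return the airflow mode a damper controller might choose."""
    if outdoor_temp_c > free_cooling_max_c:
        # Too warm for outside air alone: evaporatively cool the intake.
        return "evaporative"
    if outdoor_temp_c < recirculation_min_c:
        # Too cold: mix warm exhaust back in instead of venting it.
        return "recirculate"
    # In the comfortable band, the building just breathes: intake
    # through the louvers, exhaust through the cupola.
    return "fresh-air"

print(select_airflow_mode(18.0))   # fresh-air
print(select_airflow_mode(33.0))   # evaporative
print(select_airflow_mode(2.0))    # recirculate
```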
Yahoo Goes Chiller-Less
The Lockport data center will operate without chillers, which provide refrigerated water for cooling systems and are among the most energy-intensive components of a data center. The facility will use an evaporative cooling system during those 9 days a year when it is too warm to use fresh air. The buildings were positioned on the Lockport property to allow Yahoo to bring in cool air from either side of the coop, based on the prevailing winds.
Yahoo projects that the new facility will operate at a Power Usage Effectiveness (PUE) of 1.08, placing it among the most efficient facilities in the industry.
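For readers unfamiliar with the metric: PUE is total facility power divided by IT equipment power, so a PUE of 1.08 implies just 8 watts of overhead (cooling, power distribution losses, lighting) per 100 watts of IT load. The figures in this sketch are made up for illustration, not Yahoo’s measurements.

```python
# PUE = total facility power / IT equipment power.
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

# Hypothetical example: 1,000 kW of IT load plus 80 kW of overhead.
print(round(pue(total_facility_kw=1080.0, it_load_kw=1000.0), 2))  # 1.08
```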
“With the Yahoo! Chicken Coop design, we are spending less than one cent for cooling for every dollar spent on electricity,” said David Dibble of Yahoo. “Significantly reducing our electricity usage is not only good for the environment, but also good for our bottom line, giving Yahoo! a competitive advantage.”
The Yahoo construction phase employed up to 500 workers at the 155,000 square foot site. The company expects to employ up to 125 workers in New York, who will also support a Yahoo! Operations Center, which monitors the Yahoo! infrastructure to ensure consistent uptime, and a Global Service Desk, a 24/7 IT support center for Yahoo! employees.
Yahoo says the Lockport data center will save enough energy to power more than 9,000 New York state households annually, and save enough water in one year to provide drinking water for 200,000 people.
“We’re thrilled to unveil our world-class data center in Lockport,” said Bartz. “Yahoo! is serious about sustainability and is leading efforts to address climate change. That’s why we believe in creating highly efficient data centers that minimize the impact on the environment.”
Joe Parrino · Posted September 20th, 2010
PUE of 1.08?? Not on your best day Yahoo! So you’re saying while having to move all that air, your Fan Energy + Transformer Losses + UPS losses + Feeder Losses + Lighting = 0.08 of your I.T. Load? PUE isn’t about counting only the power you want to count. C’mon…lose the greenwashing already!!
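The commenter’s objection can be made concrete: a full PUE tally must count every overhead term. With hypothetical loss figures for each component on the commenter’s list (none of these are Yahoo’s actual numbers), the overheads alone exceed the 0.08 headroom a 1.08 PUE allows.

```python
# Hypothetical overhead figures, one per term in the commenter's list.
it_load_kw = 1000.0
overheads_kw = {
    "fans": 40.0,
    "transformer_losses": 15.0,
    "ups_losses": 20.0,
    "feeder_losses": 10.0,
    "lighting": 5.0,
}
total_kw = it_load_kw + sum(overheads_kw.values())
print(round(total_kw / it_load_kw, 2))  # 1.09, already above 1.08
```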
Huang · Posted September 21st, 2010
PUE = 1.08? That may be a calculated figure rather than one achieved in practice. The only way to get there is to use new energy sources, such as solar and wind. But what is the PUE formula?
Rajesh · Posted September 21st, 2010
chowo · Posted September 21st, 2010
UPS losses? There is a possibility that Yahoo is not using a UPS to power most of the servers, which are classified as non-critical.
My request for Yahoo to publish its PUE methodology and data has been ignored, and censored on the Yahoo blog…
Anyone having any luck getting them to share the data?
[...] ‘Yahoo Computing Coop’, which uses fully passive cooling and higher operating temperatures. eBay, together with DatacenterPulse, took this a step further by building a data center in Phoenix, with an average of 38°C in the summer, cooled entirely by free cooling. Here too, higher temperatures were used for the IT systems. [...]
So do we build data centres in cold climates only to spend much more overall energy on the cable infrastructure to distribute data?
Do they have fans in their servers?
How about the UPS? What sort of UPS are they using? From my research, the minimum power loss through a UPS is 8%.
Honestly, I have a hard time believing this.
[...] validated this concept with the air conditioning system in its newly erected Yahoo Computing Coop. Its cooling system regulates server temperatures using outdoor air for most of the year. According [...]
PCon · Posted August 29th, 2012
Huang, you’re too pessimistic. Innovation and success require a more optimistic attitude. And, thinking inside a new paradigm. What others treat as waste heat, and spend enormous amounts of energy and water to cool, can be used as a mechanical lifting force to move air. Heat rises, remember? What do you think the coop design is for? Moving air. And what moves the air? To a large degree, heat. So, not only is energy not being consumed to absorb waste heat, the heat is not wasted either, and the energy exhausted by rows of servers in the form of hot air is also an input into the system. If you’re wrong, and Yahoo can cut 90% off the energy requirements to maintain an operating environment, then what we’re talking about is indeed revolutionary, in both technology, and business.
PCon · Posted August 29th, 2012
Sorry, I tagged Huang with that last comment when I should have tagged Joe. Whoops!
PCon · Posted August 29th, 2012
RE: “Data center – A Green Building | VERT.COM | Data Center Québec: [...] validated this concept with the air conditioning system”
A/C does not necessarily mean chillers. There are other means of cooling air than via large inputs of energy.