A view from the catwalk entrance to the enormous Phoenix ONE data center in Phoenix, Arizona.

Ready to Super-Size the Enterprise

A view from the catwalk of the first phase of the massive Phoenix ONE data center, which features a 34-foot ceiling and raised mezzanine for infrastructure.

PHOENIX - The sign above the entrance to the raised-floor area at the Phoenix ONE data center makes a bold declaration in capital letters: “NOT ALL DATA CENTERS ARE CREATED EQUAL.” That’s the corporate motto for i/o Data Centers, and Phoenix ONE is the company’s effort to put an exclamation mark on it.

At 538,000 square feet, the mammoth Phoenix ONE site is one of the world’s largest data centers. The facility opened for business this month, less than six months after i/o Data Centers took ownership of the property, and several substantial customers have already been installed in the first phase, which features 180,000 square feet of raised floor.

But it’s not only the scope of the facility that makes Phoenix ONE distinctive. The huge data center features a number of design innovations:

  • A high-density cabinet that can support computing power loads of up to 32 kilowatts per rack (2,500 watts per square foot). The patent-pending ThermoCabinet is sealed for complete isolation of hot and cold air. Cool air moves directly from the raised floor into a chamber in the front of the cabinet, through the servers, and then exits through a hot-air chimney at the rear of the cabinet.
  • A custom ThermoPower strip offering a range of power options for customer cabinets.    
  • A thermal storage system that will allow i/o Data Centers to run chillers for its cooling systems at night, when power rates are lower, and then store cold water for use during daylight hours.  
  • An enormous rooftop array of solar panels, which will eventually generate as much as 4.5 megawatts of power for the data center – nearly three times the capacity of Google’s rooftop solar array at its California headquarters.
  • A variety of energy efficiency features, including low-power LED lighting on the data center floor, ultrasonic humidifiers for climate control, high-efficiency computer room air handlers (CRAHs) using plug fans, high-efficiency chillers, and perimeter flooring made from recycled car tires.
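The density figures above can be sanity-checked with simple arithmetic. The 32 kW per rack and 2,500 watts per square foot are the article's numbers; the footprint per cabinet derived below is an implication of those figures, not something i/o Data Centers has stated.

```python
# Back-of-the-envelope check of the quoted ThermoCabinet density figures.
RACK_POWER_W = 32_000        # up to 32 kilowatts per cabinet (quoted)
DENSITY_W_PER_SQFT = 2_500   # quoted power density in watts per square foot

# Implied floor area attributed to each cabinet at that density
# (the cabinet itself plus its share of surrounding aisle space).
implied_footprint_sqft = RACK_POWER_W / DENSITY_W_PER_SQFT
print(f"Implied footprint per cabinet: {implied_footprint_sqft:.1f} sq ft")
# → 12.8 sq ft
```

The two quoted numbers are consistent with each other: 32 kW spread over roughly 12.8 square feet of floor works out to exactly 2,500 watts per square foot.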

i/o Data Centers says it expects the Phoenix ONE facility to be certified under the LEED (Leadership in Energy and Environmental Design) program from the U.S. Green Building Council (USGBC).

“We are constantly trying to innovate,” said Anthony Wanger, president of i/o Data Centers, which is approaching capacity on its Scottsdale ONE data center. Wanger sees in-house research and development as a key differentiator for the company.

Air Economizers in 1999
That focus on innovation dates back to the carrier hotel at 120 East Van Buren in Phoenix, which brought together the team that would later form i/o Data Centers. Fresh-air cooling using air economizers has been a hot trend over the past two years, but the Van Buren facility began using air economizers in 1999.

After selling 120 East Van Buren to Digital Realty Trust in a 2006 deal valued at $175 million, i/o Data Centers picked Scottsdale as the site of its first project. The Scottsdale facility hosts corporate data centers from 20 of the largest companies in the Phoenix area, said Wanger. It’s carrier-neutral, offering access to multiple bandwidth providers, and doesn’t charge for cross connects.

Favorable Disaster Profile
i/o Data Centers has tenants who are web hosts and Internet companies. But its bread and butter is the enterprise data center. Phoenix has an attractive disaster risk profile, with no hurricanes and low exposure to earthquakes and tornadoes, making it a favored market for corporate data storage and disaster recovery.

This has helped i/o Data Centers fill the 125,000-square-foot Scottsdale facility nearly to capacity in two years. As the company looked for its next property, Wanger was intrigued by a huge facility built as a combined headquarters/warehouse for a water bottling company that went out of business.

The site had a 69kV electrical sub-station, 18 internal 480V sub-stations, and redundant chiller plants and water feeds. Wanger estimates that the existing infrastructure saved the company about $20 million in site improvements. The Phoenix ONE project is fully funded by a $56 million equity investment made in January by Sterling Partners.

Equipment Galleys
Both the Scottsdale and Phoenix data centers segregate the mechanical, electrical and power (MEP) infrastructure from the IT equipment. UPS systems and cooling air handlers are housed in equipment galleys off the data center floor. Wanger said this is useful for restricting access to ensure that MEP vendors and technicians can’t access IT equipment and vice versa.  

Some of the cooling and power distribution equipment at Phoenix ONE is housed in equipment galleys on a raised mezzanine. This shortens the runs for high-voltage power cabling – an important consideration with a huge data floor – and also creates more floor space for colocation.

i/o Data Centers sells colocation by the rack or cage, as well as enclosed data suites. The ThermoCabinet offers a new approach to airflow containment for high-density cabinets. It shares some features of existing containment systems, but creates a sealed system at the cabinet level – rather than the pod or row – and adapts the concept to a raised floor rather than a slab.

About the Author

Rich Miller is the founder and editor at large of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.
