December 22nd, 2011 By: Rich Miller
What were the major trends in the data center industry during 2011? We’ve identified 10 trends that had a significant impact on the sector. Here’s our list:
1. The Cloud = Business for the Data Center Industry
About once a week I still see goofy headlines asserting that cloud computing is bad news for data centers. The reality, which became crystal clear in 2011, is that the growth of cloud computing means big business for the data center industry. Virtual servers don’t magically float in the clouds. They all live in physical servers, inside data centers. Cloud technologies have driven demand for more efficient data center space that can support higher-density computing workloads. That trend manifests itself in many ways – a hardware refresh, a data center retrofit, outsourcing to a cloud specialist, or leasing colocation space or wholesale data center suites. Cloud growth at Rackspace means more leasing for DuPont Fabros, international expansion for Salesforce.com means more business for NTT, and Twitter’s need for improved latency and redundancy means business for QTS. Not to mention that the data center providers who were most aggressive about moving into enterprise cloud, Terremark and Savvis, were both acquired this year. On virtually all fronts, 2011 was the year in which cloud computing moved from discussion to dollars, and the data center industry was a major beneficiary.
2. Modularity Goes Mainstream
Another technology that saw adoption shift gears was the modular data center. The trend was solidified by a steady stream of announcements of new projects and new customers – something that had been conspicuously absent during the first few years of containerized offerings. It wasn’t just the number of modules, either. Adoption moved beyond “hyper-scale” players like Microsoft, Amazon and eBay. The new projects featured enterprise customers, and even financial firms – including a 21-container “data center park” installation for a European financial firm, a 4.5 megawatt modular install for Allianz Global Investors, and interest in modular designs from Wells Fargo. Service providers like Datapipe, Phoenix IT and Logicalis also went public with their use of modular solutions to expand. “The modular approach works very well,” said Sean McAvan, Vice President of EMEA Sales for Datapipe. “The modular approach gives us the comfort level that we can continue to acquire clients and build gradually.” Modular data centers are not for everyone, so how big is the potential market? A survey of Data Center Knowledge readers found that 35 percent are currently either using modular products or evaluating them with an eye toward adopting the technology in the next 12 to 24 months.
3. Buying Binge by Telcos
This was the year in which the telecom companies went cloud shopping. Kicking off the year was a blockbuster deal in which Verizon paid $1.4 billion to buy Terremark, followed by CenturyLink’s $2.5 billion acquisition of Savvis. There was also action at the local and regional level, like Windstream’s deal for PAETEC. The icebreaker was the 2010 deal in which Cincinnati Bell bought CyrusOne for $525 million, which has produced strong returns from CyrusOne’s colocation business. “How many disappointed buyers are there?” asked Mark Thorsheim, Partner and Managing Director of DH Capital, at the Tier 1 conference in September. “Just about every deal has produced strong results. We’re in a very robust sector. Where else are private equity firms and lenders going to do better? They’ve really done exceedingly well.”
4. Data Centers Go Further Afield
Data center hubs used to live in the central business districts of major cities. This was a year in which some of the biggest projects wound up in places you’ve never heard of before – places like Prineville, Oregon; Forest City, North Carolina; and Lulea, Sweden – three towns where Facebook is investing about a billion dollars in data center projects. The largest cloud computing providers are focused on cheap power, tax incentives and a cool climate to support free cooling. This year’s hot destination is Oregon, which meets all three requirements. In addition to the new sites for Facebook in Prineville and Amazon in Umatilla, there are new projects in Hillsboro for Adobe, NetApp and Fortune Data Centers – along with rumors that Apple and Rackspace are scouting sites. This trend toward rural locations, which began in 2007, continues to remake the data center map. With more states than ever offering incentives, expect the industry’s geography to continue to diversify in 2012.
5. “Open” Comes to the Data Center
The Open Compute Project was launched in April to publish data center designs developed by Facebook for its Prineville, Oregon data center, as well as the company’s custom designs for servers, power supplies and UPS units. Facebook’s decision to open source its designs prompted expectations that the move could democratize data center infrastructure, making cutting-edge designs available to companies that can’t afford their own design team. The Open Compute Project is just one of a handful of initiatives hoping to bring standards and repeatable designs to IT infrastructure. These include the Open Data Center Alliance, the Open Networking Foundation, the Open Source Routing Forum and OpenStack, an effort to develop an open source cloud computing platform. What’s driving all this openness? “Some of the ‘rules’ that drive our industry are wrong, and sharing data will help change that,” said James Hamilton, a Distinguished Engineer at Amazon Web Services, who noted shifts in industry practice on data center temperature and humidity. “Progress happens when people get frustrated with something,” added Andy Bechtolsheim, founder of Sun Microsystems and now Arista Networks.