In another sign of the momentum for modular data center designs, the federal government is developing a guide to help agencies choose among the growing number of container-based offerings. Industry consultant Mark Bramfitt says he is working with the General Services Administration (GSA) and the Department of Energy’s Lawrence Berkeley National Laboratory (LBNL) to develop a guide to evaluating container data centers and next-generation modular designs.
“We are interested in providing a clear snapshot of the industry today, with the primary goal of describing a specification and deployment planning process that will be relevant in the future,” Bramfitt writes. He says the guide will include a case study on a container deployment at the University of San Diego, and he is seeking input from vendors and additional case studies from end users.
A growing number of vendors are now introducing data center container products, with recent entries including Datapod, i/o Data Centers and Colt. The growing interest in containers dovetails with an intensifying focus on modular designs that can provide predictable, repeatable components for IT and power systems.
The federal government has already begun what looms as the largest data center consolidation in history, hoping to dramatically reduce IT operations that are currently distributed among more than 1,100 data centers. The Obama administration recently ordered the heads of federal agencies to accelerate efforts to sell or consolidate underused real estate, singling out data centers for special attention.
Will a container-powered cloud computing offering prove to be a compelling cost-cutting strategy for the Obama IT team? It remains to be seen. But the creation of a buying guide for these products suggests that modular data center designs will, as Bramfitt puts it, be “relevant in the future.”