In the past decade, the number of data centers operated by the U.S. government has skyrocketed from 432 to more than 1,200. The agencies building these new facilities must have been adding capacity because they had filled their existing servers and space, right?
Wrong. “One of the most troubling aspects about the data centers is that in a lot of these cases, we’re finding that server utilization is actually around seven percent,” Federal Chief Information Officer Vivek Kundra said Wednesday in an address at the Brookings Institution. That means some agencies have been investing in new data centers instead of tapping the 93 percent of capacity available on their existing servers.
Boosting The Case for Cloud Computing
“That’s unacceptable when you think about all the resources that we’ve invested,” said Kundra, who once again made the case for shifting government IT to a cloud computing model.
Server utilization is sometimes called a “dirty secret” of IT operations. Analysts cite average utilization figures typically ranging from 10 percent to 30 percent of available capacity. Low utilization is often attributed to servers being dedicated to a particular operating system, or to the need for critical applications to be isolated from other workloads. A common solution is virtualization, which allows users to run multiple operating systems on the same hardware, but it is not appropriate for some applications and workloads.
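The consolidation math behind virtualization is simple. As an illustrative sketch: the 7 percent figure comes from Kundra’s remarks, while the 60 percent target utilization and the use of 10,000 servers are assumptions for the example, not figures from any agency plan.

```python
import math

def hosts_needed(current_hosts: int, avg_utilization: float,
                 target_utilization: float) -> int:
    """Physical hosts needed to carry the same aggregate load
    at a higher post-virtualization utilization level."""
    total_load = current_hosts * avg_utilization  # aggregate work, in "host units"
    return math.ceil(total_load / target_utilization)

# 10,000 servers at 7% utilization, consolidated to an assumed 60% target:
print(hosts_needed(10_000, 0.07, 0.60))  # → 1167
```

Under those assumptions, the same workload would fit on roughly one-ninth of the hardware, which is the kind of gap Kundra is pointing at.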
Kundra used the utilization data to argue for the cloud computing model. We’ve previously noted the “data centers are evil” narrative running through some of Kundra’s presentations. On Wednesday he discussed a specific instance in which he said the cloud model saved the government money.
$600,000 to Launch A Blog?
“We’ve already begun our shift to cloud computing,” said Kundra. “We started with a strategy on looking at a ‘cloud first’ policy in terms of areas where we’re not compromising national security in any way or the privacy of the American people. An example with TSA was that they were going to spend approximately $600,000 to stand up a blog until the CIO came in and said, well, wait a second, why do we need to spend all this money on creating a blog when all the software is available online for free?”
Yes, you read that correctly. A government agency was going to spend $600,000 to set up a blog. Cloud hosting is one solution, but the issue could also have been addressed by installing free blog software on some of that underutilized server capacity, an option that would likewise have cost nothing.
Saving $1.7 Million on USA.gov
But other early government cloud projects have produced more clearly defined benefits. The General Services Administration recently said it is saving $1.7 million a year by hosting the USA.gov federal information portal web site on Terremark’s Enterprise Cloud.
The GSA previously paid $2.35 million in annual costs for USA.gov, including $2 million for hardware refreshes and software re-licensing and $350,000 in personnel costs, compared to the $650,000 annual cost to host the site with Terremark.
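The GSA’s savings arithmetic checks out; here is the calculation using only the figures cited above:

```python
# Annual USA.gov hosting costs, per the GSA figures cited above (in dollars).
old_hardware_and_licensing = 2_000_000  # hardware refreshes and software re-licensing
old_personnel = 350_000
old_total = old_hardware_and_licensing + old_personnel  # $2.35 million

new_cloud_hosting = 650_000  # annual cost with Terremark's Enterprise Cloud

annual_savings = old_total - new_cloud_hosting
print(f"${annual_savings:,}")  # → $1,700,000
```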
Kundra made it clear that one of the goals of the government’s impending data center consolidation effort will be to identify projects that can run in the cloud “instead of just webifying our brick and mortar institutions.”
Moving Beyond Hardware Migration
“It makes no sense if consolidation is nothing more than taking 10,000 servers and moving them from ten data centers to one data center,” said Kundra. “Part of what we’re trying to do in the process of consolidating data centers is to figure out where cloud computing makes sense for the federal government.”
Kundra also announced the creation of a joint authorization board to provide security certification for cloud computing apps. The effort will be led by NIST, which will work with the Departments of Defense and Homeland Security.