Sun Unveils ‘Data Center In A Box’
October 17th, 2006 By: Rich Miller
So that’s what Jonathan Schwartz was talking about. Sun Microsystems is introducing an energy-efficient, water-cooled turnkey data center housed in a shipping container that can be quickly deployed to expand existing IT infrastructure. An official unveiling will take place today, but Sun gave an early preview to John Markoff of the New York Times and CNet. The company isn’t commenting on whether its new product is related to a November 2005 report by PBS tech pundit Robert X. Cringley of a data center in a container sighted at Google. The system is planned for commercial availability in the second half of 2007, with prices beginning around $500,000. An excerpt:
Painted black with a lime green Sun logo, the system can consist of up to seven tightly packed racks of 35 server computers based on either Sun’s Niagara Sparc processor or an Opteron chip from Advanced Micro Devices. The system includes sensors to detect tampering or movement and features a large red button to shut it down in an emergency. Once plugged in, it requires just five minutes to be ready to run applications. Sun has applied for five patents on the design of the system, including a water-cooling technique that focuses chilled air directly on hot spots within individual computing servers. The technique, which Sun refers to as “cyclonic cooling,” makes it possible to create a data center that is five times as space-efficient as traditional data centers, and 10 percent to 15 percent more power-efficient, Mr. Schwartz said.
Sun’s concept envisions the containerized data centers being deployed in warehouse space to provide rapid expansion for fast-growing IT infrastructure. The units might be of particular interest to companies transitioning to new data centers, or those who reach capacity at their existing data center before a new one is completed.
“We are targeting customers who are concerned about saving space, power and getting to market quickly,” said Schwartz. Sun is leasing the units as well as selling them. The possibility of delays at any of the many “greenfield” new construction projects now underway could also make this appealing to customers.
It’s not completely plug-n-play, however. The “data center in a box” requires chilled water to support the cooling system, in addition to Internet connectivity and appropriate power infrastructure. Markoff’s story notes that the prototype “sits in a container case adjacent to a Sun office building here (Menlo Park, Calif.), connected to two large fire hoses for water cooling and 500 kilowatts of redundant power.”
As has been discussed often here, some data center managers have deep reservations about water cooling, and the fire hoses are bound to give these folks a major case of the willies. But Sun’s new offering extends the boundaries of the data center universe, and gives additional options to managers of fast-growing enterprises.
The CNet story notes, correctly, that the concept of containers for server equipment is not new. “If they’re thinking they invented it, they’re wrong,” Jerald Murphy, a Robert Francis Group analyst, told CNet’s Steven Shankland. Murphy said he designed shipping container-based computer systems for customers six years ago.
There’s a Russian saying that “there’s nothing more permanent than temporary.” Instead of deploying Blackboxes as an interim fix while traditional facilities are being built, might enterprises follow Jonathan Schwartz’s advice and “revisit basic assumptions”? Or could “trailer park computing” (as Nicholas Carr puts it – http://www.roughtype.com/archives/2006/10/trailerpark_com.php) be swept away by a Google/Microsoft powered computing grid? Carr compares Blackboxes to Edison’s off-the-shelf power plants, which were ultimately replaced by the electrical grid.
Hi Isabel. I think there are several factors that place limits on the potential user base of Sun’s “data center in a box.” Given the space constraints, I don’t imagine it could work without water cooling, and many data center managers simply aren’t ready for liquid cooling. It also doesn’t eliminate the requirement for redundant power infrastructure. Customers and IT executives want 24×7 uptime, and no matter how you house the computing hardware, you still need generators and UPS units (and fire hoses, in this case).
Consider the schoolhouse analogy. My son’s school houses a number of classes in trailers, but they don’t include them in the tour for parents, and when the school district builds new schools they don’t design the project with trailers included.
It strikes me as an interesting niche offering rather than a game-changer.
Daniel Thygesen
Posted October 19th, 2006
I hope that someone will help me with a question I have…
On the Sun website, they state that Blackbox can handle 10,000 simultaneous users… Where do they get this number from? Do they have a formula or is it practical stress testing?
Thanks in advance