Microsoft’s mad-scientist data center research crew appears to have liked the results of submerging a relatively small underwater data center pod somewhere off the coast of California last year as a test. The team has stepped up its underwater data center ambitions, the project’s lead told a conference in New York Wednesday.
While still in preliminary planning stages, the next underwater deployment may be about four times the size of the first pod, or about the size of a shipping container, Ben Cutler, the project’s manager, said, according to Data Center Frontier.
The first pod, a 10-by-7-foot cylindrical shell that contained a single rack of servers, went underwater around August of last year. The Project Natick team pulled it out and brought it back to the Microsoft headquarters in Redmond, Washington, in December to collect experimental data.
There are multiple motivations behind the experiment, one of them being that half of the world’s population lives close to ocean shores, and it’s a lot easier to get permits to submerge equipment underwater than it is to construct data centers on land, Cutler told DatacenterDynamics.
He presented on the project at this week’s DatacenterDynamics Enterprise conference in New York.
Microsoft also likes the predictability of deploying close to the ocean floor, where water temperature is relatively stable and storms and surface currents cause little disturbance. “The ocean is more of a standard place,” Cutler told DCD. “It’s more consistent, both physically, and in the laws in the ocean, which are more consistent.”
There’s also a sustainability aspect to the project. The vision, in the long run, is to have underwater data centers that are connected to shore only by a network cable. They would be powered by turbines that leverage tidal energy and cooled by ocean water. These submarine IT facilities could also serve as artificial reefs, supporting ecosystems filled with marine life.
After 105 days of operation 30 feet deep, the results Cutler’s team has observed from the first capsule are encouraging. None of the hardware in the pod failed, and its cooling system performed more efficiently than the researchers expected.
They are now planning a much larger experimental deployment, which could have as much as half a megawatt of IT capacity, he said. In the long run, Cutler envisions entire 20 MW seaborne server farms providing low-latency cloud services to densely populated coastal areas around the world.
There are many barriers Project Natick will have to overcome before the idea becomes a viable solution.
One potential problem could be warming of the water around a larger server farm. While the artificial-reef aspect of it may be good for marine life, as our contributor Mark Monroe writes, the heat it would generate could create microclimates and attract unexpected species.
Cutler told DCD that the next Natick deployment will probably be in deeper waters, where there is less marine life, and have a “spaghetti-like” heat exchanger, making it less attractive for plants and animals.
Another problem the researchers will have to address is slack tide. They will have to figure out how to power the servers during the two periods each day when there is no tidal energy to move the turbines.
There’s also the question of cost. According to Monroe, watertight modules can cost 10 to 20 percent more to build than the on-land modular data centers in use today.