Wrapping up a three-month-long experiment operating an underwater data center, Microsoft claims to have found a solution to one of the biggest problems facing tech companies worldwide in the 21st century: finding space for their massive data centers and, most important of all, cooling them. Microsoft’s underwater experiment was a small-scale proof of concept: the company assembled a server rack inside a watertight steel cylinder and lowered the contraption into the Pacific Ocean just off the coast of California.
According to Microsoft, the server was actually in use by consumers during the experiment, without their knowledge: as part of the underwater venture, Microsoft routed some of its customers’ computing workloads through the submarine server. The experiment was such a success that the company ran the server two months longer than originally planned.
According to Peter Lee, vice president of Microsoft’s research group Research NExT, the next experiment will replicate the first one with a few key changes. The underwater data center will be some four times larger, or at least house four times the computing power, and, in an effort to increase the efficiency of the underwater server farm, it will be equipped with turbines that convert ocean currents into electricity, reports CNN Money.
“Our first experiment was like dipping our pinkie toe in the water, and now we’re going for the big toe,” Peter Lee told CNN Money.
Data centers, the conventional dry-land variety, are huge. They’re built wherever land and energy are cheap, and sometimes near bodies of water that can be used to cool the hundreds, sometimes thousands, of individual servers they house. Google, for instance, has a massive data center in Eastern Oregon with cooling towers sunk deep into the industrial complex; at dusk, they quietly vent water vapor into the air. One of Google’s data centers even uses toilet water to cool its servers, according to Wired.
It’s a problem all of the tech industry’s major players face, not just Microsoft and Google. Apple, Intel, any company with a logo on a personal computer or mobile phone probably has a massive data center in some remote locale chosen for its cheap land and cheap cooling rather than its convenience to consumers or employees. And with remote data centers, companies have to pay premiums to upgrade local Internet infrastructure to handle the speeds those data centers require.
Microsoft says its underwater data center could change all that. According to Ars Technica, around half of the world’s population lives within a few miles of an ocean, and if Microsoft could move its data storage to the ocean floor, bringing high-density data storage closer to major population centers, it could more easily serve its consumers. Essentially, you could expect higher-speed connections to Microsoft products like cloud storage or the company’s search engine, Bing.
Microsoft as a company could also enjoy some big savings. The underwater server farms wouldn’t require the massive infrastructure that conventional data centers do, such as onsite staff or physical buildings. According to Ars Technica, an underwater data center can be built and deployed in around 90 days; a traditional data center can take up to two years to build and staff.
The underwater data centers, code-named Project Natick, are still in their infancy, but Microsoft’s Peter Lee is enthusiastic about the possibilities, particularly since they would potentially be more environmentally friendly than typical dry-land data centers. Microsoft claims the data centers’ net heat output would be around zero, since they would give off heat while drawing their energy from ocean currents. They’re also less noisy and environmentally disruptive than typical data centers, Microsoft claims; apparently, nearby shrimp and crabs produce more noise than the experimental underwater data center did.
[Photos via Microsoft]