Microsoft shelves its underwater data center — Project Natick had fewer server failures compared to servers on land
“My team worked on it, and it worked,” says CO+I Head Noelle Walsh.
Microsoft has quietly discontinued Project Natick, its underwater data center (UDC) experiment that began in 2013. The company confirmed the news to DatacenterDynamics, with Noelle Walsh, head of Microsoft’s Cloud Operations + Innovation (CO+I), saying, “I’m not building subsea data centers anywhere in the world.” She later added, “My team worked on it, and it worked. We learned a lot about operations below sea level and vibration and impacts on the server. So, we’ll apply those learnings to other cases.”
Data center demand is expected to grow rapidly in the coming years; Nvidia alone sold over 3.76 million data center GPUs last year. Those cards are expected to consume 14.3 TWh of electricity per year, a figure that doesn’t include cooling. According to DataSpan, roughly 40% of a data center’s power consumption goes to cooling systems, so if Microsoft can find a way to reduce or even eliminate that overhead, it could substantially cut the power its data centers require.
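As a rough illustration of how much that cooling share matters, the back-of-the-envelope sketch below (our own arithmetic, not Microsoft's or DataSpan's) treats the 14.3 TWh GPU estimate as the IT load and assumes the 40% figure refers to a facility's total power draw:

# Back-of-the-envelope: total facility power implied by a given IT load
# when cooling consumes an assumed 40% of the total.
GPU_CONSUMPTION_TWH = 14.3   # estimated yearly draw of data center GPUs (from the article)
COOLING_SHARE = 0.40         # assumed share of total facility power spent on cooling (DataSpan)

# If cooling is 40% of the total, the IT load is the remaining 60%,
# so total = IT load / (1 - cooling share).
total_twh = GPU_CONSUMPTION_TWH / (1 - COOLING_SHARE)
cooling_twh = total_twh * COOLING_SHARE

print(f"Total facility draw: {total_twh:.1f} TWh/year")   # ~23.8 TWh
print(f"Of which cooling:    {cooling_twh:.1f} TWh/year")  # ~9.5 TWh

By that math, eliminating cooling entirely would save roughly 9.5 TWh per year for those GPUs alone.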
Aside from the potential energy savings, Microsoft learned other things from the servers it installed off the coast of Scotland in 2018. The company lost only six of the 855 submerged servers, versus the eight servers (out of 135) that needed replacement in the parallel experiment Microsoft ran on land. That equates to a 0.7% failure rate in the sea versus 5.9% on land.
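Those percentages follow directly from the raw counts; the short sketch below simply reproduces the arithmetic:

# Reproduce the failure-rate comparison from the article's raw counts.
subsea_failed, subsea_total = 6, 855
land_failed, land_total = 8, 135

subsea_rate = subsea_failed / subsea_total * 100   # ~0.7%
land_rate = land_failed / land_total * 100         # ~5.9%

print(f"Subsea failure rate:  {subsea_rate:.1f}%")
print(f"On-land failure rate: {land_rate:.1f}%")
print(f"Land servers failed roughly {land_rate / subsea_rate:.0f}x as often")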
The company said that the primary reasons for this longevity were the temperature stability of the surrounding seawater and the inert nitrogen atmosphere used to protect the servers. When asked whether Microsoft would use robots in its data centers as part of those learnings, Walsh said, “We’re looking at robotics more from the perspective that some of these new servers will be very heavy. How can we automate that versus having people push things around? We are learning from other industries on robotics, but we’re also very cognizant that we need people. I don’t want people worried about their jobs.”
While Microsoft has concluded its undersea data center research, China began its own submerged server project in 2023, lowering 68,000 square meters’ worth of servers off the southern coast of Hainan. Microsoft, meanwhile, did not indicate whether it would start another UDC project in the future.
Walsh said, “I would say now we’re getting more focused. We like to do R&D and try things out, and you learn something here and it may fly over there. But I’d say now, it’s very focused.” Nevertheless, Microsoft isn’t stopping its data center development projects. The company is reportedly partnering with OpenAI to build a $100 billion AI supercomputer data center, and it has nuclear ambitions to build modular reactors for projects like these.
Jowi Morales is a tech enthusiast with years of experience working in the industry. He has been writing for several tech publications since 2021, focusing on tech hardware and consumer electronics.
bit_user Sounds like we're safe from boiling the oceans for just a little bit longer, but I think this idea probably has enough attractive elements that it's going to come back. China has taken to it, and there are others besides Microsoft that could take it up.
Actually, one interesting possibility would be if you could combine a coastal datacenter with a desalinization plant. Maybe use the datacenter's cooling system to evaporate large amounts of sea water? Then, use more seawater to re-condense it. They could use solar and offshore wind as supplementary power sources. The heated effluent might be usable for aquaculture applications.
thestryker This was an interesting concept but as always feasibility has to come into play which I assume is why we haven't had a ton of companies trying to jump on the bandwagon.
bit_user said:
Actually, one interesting possibility would be if you could combine a coastal datacenter with a desalinization plant. Maybe use the datacenter's cooling system to evaporate large amounts of sea water? Then, use more seawater to re-condense it. They could use solar and offshore wind as supplementary power sources. The heated effluent might be usable for aquaculture applications.
This is a fantastic point and realistically the way it should be approached to minimize waste.
Reminds me of farms where they were installing solar panels and chose to do it over irrigation channels which ended up not only generating electricity, but lowering evaporation.
mcfridgeguy23
bit_user said:
Actually, one interesting possibility would be if you could combine a coastal datacenter with a desalinization plant. Maybe use the datacenter's cooling system to evaporate large amounts of sea water? Then, use more seawater to re-condense it. They could use solar and offshore wind as supplementary power sources. The heated effluent might be usable for aquaculture applications.
This is a common practice, but not that extreme. Many data centers use 'heat pumps' for their cooling systems. In most cases, the waste heat isn't hot enough with synthetic refrigerants to boil water. However, with the increase of more natural refrigerants in data centers, they are able to provide near-boiling water temperatures and, in some cases, steam.
cryoburner The company lost only six of the 855 submerged servers, versus the eight servers (out of 135) that needed replacement in the parallel experiment Microsoft ran on land. That equates to a 0.7% failure rate in the sea versus 5.9% on land....
...The company said that the primary reasons for this longevity were the temperature stability of the surrounding seawater and the inert nitrogen atmosphere used to protect the servers.
You can't really compare the reliability of the on-land experiment with the under-sea one when they were clearly using different setups. Did being underwater have anything to do with the difference in failures whatsoever? Or was it just the fact that they knew the hardware was going to be more difficult to access in the event of a failure, so they built in additional redundancy and other protective measures? Was the server hardware itself even similar, and did both sets of servers see a similar amount of load on that hardware? None of this is answered here, though at the very least we know that one set of hardware was in a sealed nitrogen environment, while the other was likely just in regular air.
Most importantly, I suspect the underwater servers were found to simply not be cost effective, hence why the project is getting scrapped. Even if being underwater had anything to do with increasing reliability or reducing server operating costs, it doesn't matter if the cost to build, install and retrieve these underwater server systems more than negates those savings. If it actually provided some tangible benefits deemed to be worth the increased cost, you would have seen them expanding on the concept, not scrapping it after a decade.
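One hedged way to frame cryoburner's point: the raw counts do differ by more than chance would easily explain, but that says nothing about the cause. The sketch below (not from the thread) runs SciPy's Fisher exact test on the article's counts; it assumes both groups ran for comparable periods, and it only tests whether the gap could be random, not whether the nitrogen atmosphere, different hardware, or the sea itself produced it.

from scipy.stats import fisher_exact

# 2x2 contingency table: [failed, survived] for subsea vs. on-land servers
# (counts from the article; assumes comparable deployment periods).
table = [[6, 855 - 6],
         [8, 135 - 8]]

odds_ratio, p_value = fisher_exact(table)
print(f"Odds ratio: {odds_ratio:.2f}")
print(f"p-value:    {p_value:.2g}")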
bit_user
cryoburner said:
You can't really compare the reliability of the on-land experiment with the under-sea one when they were clearly using different setups.
If the on-land setup was intended as a control, then it probably was comparable in their minds. Even if that meant using different hardware, if the hardware differences were deemed necessary for underwater deployment, then I'd still expect they would regard it as a valid control.
cryoburner said:
Did being underwater have anything to do with the difference in failures whatsoever? Or was it just the fact that they knew the hardware was going to be more difficult to access in the event of a failure, so they built in additional redundancy and other protective measures?
Good questions. I'm sure they considered these factors, but I doubt they published a detailed breakdown.
cryoburner said:
Was the server hardware itself even similar, and did both sets of servers see a similar amount of load on that hardware?
The hardware shouldn't be more different than necessary, if the on-land deployment was intended as a control.
cryoburner said:
Most importantly, I suspect the underwater servers were found to simply not be cost effective, hence why the project is getting scrapped. Even if being underwater had anything to do with increasing reliability or reducing server operating costs, it doesn't matter if the cost to build, install and retrieve these underwater server systems more than negates those savings. If it actually provided some tangible benefits deemed to be worth the increased cost, you would have seen them expanding on the concept, not scrapping it after a decade.
Yes, good points.
tamalero I wonder if in the future we will be able to see a fully enclosed system where the water goes all the way to boiling point, powers a turbine, cools down, and supplies part of what they consume in power.
Integr8d
thestryker said:
This was an interesting concept but as always feasibility has to come into play which I assume is why we haven't had a ton of companies trying to jump on the bandwagon.
This is a fantastic point and realistically the way it should be approached to minimize waste.
Reminds me of farms where they were installing solar panels and chose to do it over irrigation channels which ended up not only generating electricity, but lowering evaporation.
Which always sounds good. But then the higher humidity drives mold growth which requires an antifungal (which can only be made by XYZ using Yadda Yadda chemicals which leak into the water and OMGGGG).
There’s no free lunch.
JTWrenn It is just the next generation's version of corporate farms. Destroy public resources to decrease cost and pass it on to consumers in ways other than raising prices. We need to stop it so companies can't use public resources to sell things. You want to cool your servers? You pay for everything they do to our public lands. Then pass that on to customers and we see if they still want it at the true cost.
bit_user
tamalero said:
I wonder if in the future we will be able to see a fully enclosed system where the water goes all the way to boiling point, powers a turbine, cools down, and supplies part of what they consume in power.
I wonder if that would capture enough energy to be worth the trouble. Anyway, it will remain a hypothetical question, as long as there's no phase-change cooling system in place where condensation happens above the boiling point of water.
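As a ballpark for whether such a turbine could capture meaningful energy, here is a minimal sketch under assumed temperatures (90°C waste heat, 20°C ambient, both hypothetical) computing the Carnot upper bound; real recovery cycles extract considerably less.

# Upper bound (Carnot) on work extractable from data center waste heat.
T_HOT_C = 90.0    # assumed waste-heat temperature in °C; most facilities run cooler
T_COLD_C = 20.0   # assumed ambient/condenser temperature in °C

t_hot_k = T_HOT_C + 273.15
t_cold_k = T_COLD_C + 273.15

carnot_limit = 1 - t_cold_k / t_hot_k
print(f"Carnot limit with a {T_HOT_C:.0f}°C source: {carnot_limit:.1%}")  # ~19%

Even with an optimistic 90°C source, under a fifth of the waste heat could theoretically be converted back to electricity.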
bit_user
Integr8d said:
Which always sounds good. But then the higher humidity drives mold growth
Oh noesss!!!111 High humidity in irrigation channels? Who'd have expected such a thing? Plus, I mean, fungus in the dirt? How'd that get there?? What ever will we do if there's fungus contaminating our nice, clean soil??
IMO, it'd be a good idea to try fact-checking some of these thoughts, before being so reflexively negative.
Integr8d said:
There’s no free lunch.
Unlocking efficiencies in systems from better design is what engineers do. There's not always a free lunch, but it can usually be cheaper or better tasting - sometimes both!