Intel Invests $700 Million in Immersion Liquid Cooling Solutions

(Image credit: Submer)

Intel on Thursday said it would invest $700 million in its new research and development facility that will design next-generation immersion liquid cooling solutions and other data center-oriented technologies. In addition, the company rolled out the industry's first open intellectual property (open IP) immersion liquid cooling solution and reference design that enables data centers to start using immersion liquid cooling without investing in expensive custom solutions.

The first step towards democratizing immersion liquid cooling came today, when Intel introduced the industry's first open IP reference design for an easy-to-deploy, easily scalable total immersion liquid cooling solution. The reference design is a proof of concept that will be completed in collaboration with Intel Taiwan and the broader Taiwanese ecosystem in a phased approach. Many server OEMs are based in Taiwan, so working closely with them will let Intel reach both server suppliers and server users.

But Intel is not stopping at a single reference design. The company plans to establish a new Oregon Research and Design Mega Lab at its Jones Farm campus dedicated to immersion cooling, water usage effectiveness, and heat recapture and reuse. Construction of the new center starts today, and it is set to begin operations in late 2023.

(Image credit: Intel)

The new lab will ensure that Intel's future data center products, including Xeon CPUs, Optane memory, network interfaces, switch gear, Agilex FPGAs, Xe accelerators, Habana accelerators, and other products under development, are ready for immersion liquid cooling. Essentially, Intel wants immersion liquid cooling (ILC) to become as widespread as traditional air and liquid cooling systems.

Modern Intel Xeon Scalable CPUs have a thermal design power of around 270W per socket, while artificial intelligence and high-performance computing accelerators can consume up to 700W per OAM or SXM5 socket. With some machines dissipating around 6,000W each, air and liquid cooling are losing their appeal in terms of cost and efficiency: power consumed by today's chillers accounts for 35% to 40% of total data center power consumption, according to data from 2CRSi.
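
To put those percentages in perspective, here is a rough back-of-the-envelope sketch in Python. The roughly 6,000W per machine and the 35% to 40% chiller share come from the figures above; the 1MW IT load and the simplifying assumption that everything other than the chillers is IT equipment are hypothetical illustrations, not Intel's numbers.

```python
# Illustrative back-of-the-envelope estimate only; not from Intel's announcement.
MACHINE_HEAT_W = 6_000     # heat dissipated per machine (figure cited above)
IT_LOAD_W = 1_000_000      # hypothetical 1 MW of IT equipment

machines = IT_LOAD_W / MACHINE_HEAT_W

for chiller_share in (0.35, 0.40):
    # If chillers draw `chiller_share` of *total* facility power and the
    # rest is IT load, then total = IT / (1 - chiller_share).
    total_w = IT_LOAD_W / (1 - chiller_share)
    cooling_w = total_w - IT_LOAD_W
    pue = total_w / IT_LOAD_W
    print(f"chiller share {chiller_share:.0%}: ~{cooling_w / 1000:.0f} kW of "
          f"chiller power for ~{machines:.0f} machines (PUE ~{pue:.2f})")
```

At those shares, every watt of compute drags along roughly 0.54 to 0.67 watts of chiller power, which is the overhead immersion cooling is meant to shrink.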

Intel believes that immersion liquid cooling with energy reuse could cut the power consumption of data center cooling systems and the associated carbon emissions, making data centers cheaper to operate and lowering the pollution emitted by power plants. But there is a problem with ILC: virtually all deployments so far rely on expensive proprietary hardware designs. To make the technology more accessible to mainstream customers, Intel has been working with various ILC specialists for the past year or so.

"Intel's dedication to its global partnerships is evident with these announcements today," said Sandra L. Rivera, Intel executive vice president, and general manager of the Datacenter and AI Group. "The future of the data center and data center design is based on innovative and sustainable technologies and practices, and I am proud of the work we are doing every day to help make a sustainable future a reality."

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • hotaru251
    i know its not, but it would be funny if this was due to how hot their new chips run.


    just wish it would get more attention as the idea of an immersion system is great.

    every few yrs u hear progress but in the end nothing comes of it x_x
  • thisisaname
    hotaru251 said:
    i know its not, but it would be funny if this was due to how hot their new chips run.


    just wish it would get more attention as the idea of an immersion system is great.

    every few yrs u hear progress but in the end nothing comes of it x_x

    I was thinking it could be how hot their new graphics chips run :)
  • InvalidError
    hotaru251 said:
    i know its not, but it would be funny if this was due to how hot their new chips run.
    If it were only specific chips getting warm, then that could be taken care of with cooling blocks.

    For datacenters though, immersion cooling would be considerably more power-efficient than air-cooling racks, since 1L of coolant carries as much heat as ~1000L of air while using a tiny fraction of the space for heat exchange between the heat source and coolant. The coolant-to-refrigerant heat exchangers for chilled liquid would also be much smaller than air conditioning coils.
  • Alvar "Miles" Udell
    InvalidError said:
    For datacenters though, immersion cooling would be considerably more power-efficient than air-cooling racks, since 1L of coolant carries as much heat as ~1000L of air while using a tiny fraction of the space for heat exchange between the heat source and coolant. The coolant-to-refrigerant heat exchangers for chilled liquid would also be much smaller than air conditioning coils.

    And the coolant can be used for other ventures, such as aquaculture, which could minimize cooling costs or even turn a profit.

    A Japanese data center is using waste heat to farm eels | TechSpot
  • edzieba
    It doesn't appear the open standard is actually available yet. The document attached to the press release does clarify that this is a single-phase immersion system (i.e., the fluid never boils, having a boiling point above 150°C) rather than a phase-change system like the 'common' DIY Fluorinert immersion systems.
  • Krotow
    Intel will now sell their branded kitchen kettles. For obscene prices.
  • gg83
    If we are gonna put fans on every component, we might as well just immerse the whole thing.
  • -Fran-
    Alvar Miles Udell said:
    And the coolant can be used for other ventures, such as aquaculture, which could minimize cooling costs or even turn a profit.

    A Japanese data center is using waste heat to farm eels | TechSpot
    I was going to say. This is getting to the point where putting your data center next to a nuclear plant may be a good idea, as both could use the same water treatment plants for cooling water intake and disposal. Although I don't know if water coming out of a nuclear plant is actually usable; I suppose it is unless there's a radiation leak.

    Anyway, putting data centers closer and closer to the shores may not be a bad idea going forward.

    Regards.
  • KyaraM
    -Fran- said:
    I was going to say. This is getting to the point where putting your data center next to a nuclear plant may be a good idea, as both could use the same water treatment plants for cooling water intake and disposal. Although I don't know if water coming out of a nuclear plant is actually usable; I suppose it is unless there's a radiation leak.

    Anyway, putting data centers closer and closer to the shores may not be a bad idea going forward.

    Regards.
    https://cdn.britannica.com/62/162162-050-586ADA35/diagram-nuclear-power-plant-reactor.jpg
    Well, in theory it might be usable if you can condense it down, since it would be absolutely pure water vapor without radiation. However, considering what a risk factor nuclear plants are, I would rather not have them (read: shut them down) and use the cooling water for only the data centers instead if possible.
  • -Fran-
    KyaraM said:
    https://cdn.britannica.com/62/162162-050-586ADA35/diagram-nuclear-power-plant-reactor.jpg
    Well, in theory it might be usable if you can condense it down, since it would be absolutely pure water vapor without radiation. However, considering what a risk factor nuclear plants are, I would rather not have them (read: shut them down) and use the cooling water for only the data centers instead if possible.
    Ah! Right. I forgot the simple fact that most nuclear plants evaporate the water instead of recirculating it. That makes it a moot point for "re-use", but at least the entry point can be shared. A data center can certainly benefit from refreshing the coolant every now and then instead of recirculating the same coolant over and over in a closed system. Probably way more expensive to build, though, but I'd argue it would run for a lot of years with no problem. Upgrade cycles could be a problem, I guess? Oh welp.

    Regards.