U.S. Injects $112M into Supercomputing to Enable Fusion Future

(Image credit: Shutterstock)

They say that good things come in threes, and the U.S. is definitely banking on Lawrence Livermore National Laboratory (LLNL) to deliver just that when it comes to hot fusion. Having achieved its second successful fusion ignition with an energy surplus (meaning that more energy was produced than was required to trigger the fusion reaction itself) on July 30th, the U.S. now aims to spur research and facilitate a successful third ignition, and beyond. To do that, the country is ready to invest a further $112M into a dozen supercomputing projects.

Fusion (short for nuclear fusion) refers to fusing two light atomic nuclei into a single, heavier one: a process that, when successful, releases massive amounts of energy, mostly as the kinetic energy of the reaction products (in the deuterium-tritium reaction, a fast neutron and a helium nucleus). Unlike fission (which works by splitting heavy elements such as uranium or plutonium), nuclear fusion is expected to be a safe, nearly unlimited source of energy. Done right, fusing two light nuclei (such as deuterium and tritium, each a hydrogen isotope that carries additional neutrons compared to "plain" hydrogen) releases roughly four times as much energy per kilogram of fuel as fission, and about four million times the energy released by burning coal (on a per-kilogram basis). Its merits are obvious.
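As a quick sanity check on those per-kilogram figures, here's a back-of-the-envelope sketch. The reaction energies and the coal heating value are my own assumptions (standard textbook numbers, not from the article): ~17.6 MeV per D-T fusion, ~200 MeV per U-235 fission, and ~24 MJ/kg for coal.

```python
# Back-of-the-envelope check of the energy-density comparison.
# All constants are assumptions (rounded CODATA/textbook values), not from the article.
MEV_TO_J = 1.602176634e-13   # joules per MeV
U_TO_KG  = 1.66053907e-27    # kilograms per atomic mass unit

# D-T fusion: one reaction releases ~17.6 MeV from 5 u of fuel (D = 2 u, T = 3 u).
fusion_j_per_kg = 17.6 * MEV_TO_J / (5 * U_TO_KG)

# U-235 fission: one reaction releases ~200 MeV from 235 u of fuel.
fission_j_per_kg = 200 * MEV_TO_J / (235 * U_TO_KG)

# Coal combustion: roughly 24 MJ per kg (typical bituminous coal).
coal_j_per_kg = 24e6

print(f"fusion vs fission, per kg: {fusion_j_per_kg / fission_j_per_kg:.1f}x")
print(f"fusion vs coal, per kg:    {fusion_j_per_kg / coal_j_per_kg:.1e}x")
```

The fusion-to-fission ratio comes out at roughly 4x, matching the claim above; the coal comparison lands in the millions, on the same order as the "four million times" figure (the exact multiple depends on which coal heating value you assume).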

It's on the back of that promise that the latest round of the Scientific Discovery through Advanced Computing (SciDAC) program combines two pre-existing Department of Energy programs, with the aim of streamlining efforts to solve complex fusion energy problems using supercomputing resources, including exascale systems.

"The modeling and simulation work of these partnerships will offer insight into the multitude of physical processes that plasmas experience under extreme conditions and will also guide the design of fusion pilot plants," said DoE Associate Director of Science for Fusion Energy Sciences (FES), Jean Paul Allain.

There's still a lot of work left to achieve a sustained, surplus-energy ignition that actually rockets humanity into a clean, energy-abundant future, however. The July 30th fusion ignition did provide a higher energy output than was delivered into the light-atom fuel capsule (although it's unknown how much better it was than the 2.05 megajoules in, 3.15 megajoules out achieved in December of last year), but that only accounts for the energy delivered to the pellet itself. Unfortunately, the way that energy is delivered to the pellet (via 192 lasers) is still extremely inefficient: LLNL needed a staggering 322 megajoules to fire the lasers themselves, which still left the process at an overall energy deficit.

But the way forward hinges on a better understanding of the quantum-scale processes surrounding fusion. Until quantum computers can provide a viable computing platform capable of cracking that code (and there's no telling how long that will take, though it's likely a decade or more away), supercomputers based on classical computing are the best tool we have for peering into the ordered chaos that unfolds when the lasers strike the pellet.

The $112M will certainly be a boon there, but it definitely won't be enough. Yet we humans have this strange way of looking farther ahead, of chasing the carrot, rather than simply focusing on what is right in front of us. This grant is part of that, and a healthy injection into the High-Performance Computing (HPC) landscape, however small a slice of the total it ultimately turns out to be.

Francisco Pires
Freelance News Writer

Francisco Pires is a freelance news writer for Tom's Hardware with a soft side for quantum computing.

  • InvalidError
    322MJ to pump the lasers to deliver 2MJ of energy to the ignition point and get only 3MJ back doesn't look like an energy gain to me... and that doesn't even count the energy required to produce the fuel pellet, set up the ignition chamber and clean up after half of the test fixture gets destroyed by the ignition event.

    They are going to need a lot more than AI and supercomputers to achieve the 120+X efficiency increase needed to just break even. If anything useful comes out of this, it will be things like improved lasers and optical amplifiers, not the fusion method itself.
  • evdjj3j
Deuterium doesn't have an extra electron; it has a neutron in its nucleus, while normal hydrogen does not have any neutrons.

    https://en.wikipedia.org/wiki/Deuterium
    The nucleus of tritium has two neutrons instead of none.

    https://en.wikipedia.org/wiki/Tritium
    All three isotopes only have one electron.
  • JTWrenn
    We need to look to different tech instead of just building bigger and bigger versions of things that have never gotten anywhere near real output. Not to mention the issues with their fuel source and breaking of their system with high energy particles.

I think going with a Helion-type pulsed heating system makes a lot more sense when you dig down into the long-term usage and fuel production/procurement side of things. All the current giant laser or tokamak type reactors are really science experiments to learn from. I just don't see them ever being fully functional production reactors.
  • weber462
    BOINC Project. Support science
  • InvalidError
    JTWrenn said:
All the current giant laser or tokamak type reactors are really science experiments to learn from. I just don't see them as ever being fully functional production reactors.
    I think the stellarator is the most likely one to become economically viable for GW-scale baseline power production.

    Helion's overall simplicity looks interesting but these things are only projected to produce about 50MWe a pop assuming the Microsoft project is only one reactor and the project is actually successful. At that scale, they may be most suitable as substation stabilizers: operate them at ~50% as the baseline and use their practically instantaneous response time to provide +/-25MW of local buffer against source (renewables) and load fluctuations. These could be handy for substations that feed Tesla v4 supercharger stations and competing EV charging providers once EVs become more commonplace.
  • thisisaname
    InvalidError said:
    322MJ to pump the lasers to deliver 2MJ of energy to the ignition point and get only 3MJ back doesn't look like an energy gain to me... and that doesn't even count the energy required to produce the fuel pellet, set up the ignition chamber and clean up after half of the test fixture gets destroyed by the ignition event.

    They are going to need a lot more than AI and supercomputers to achieve the 120+X efficiency increase needed to just break even. If anything useful comes out of this, it will be things like improved lasers and optical amplifiers, not the fusion method itself.
    Ignoring most of the energy input plus support infrastructure and only counting the direct energy is how they turn less than 2% yield into 150% yield.
  • JTWrenn
    InvalidError said:
    I think the stellarator is the most likely one to become economically viable for GW-scale baseline power production.

    Helion's overall simplicity looks interesting but these things are only projected to produce about 50MWe a pop assuming the Microsoft project is only one reactor and the project is actually successful. At that scale, they may be most suitable as substation stabilizers: operate them at ~50% as the baseline and use their practically instantaneous response time to provide +/-25MW of local buffer against source (renewables) and load fluctuations. These could be handy for substations that feed Tesla v4 supercharger stations and competing EV charging providers once EVs become more commonplace.
That is per reactor I believe, so the cost of the reactor and its longevity/cost to run is the real issue. If those work out, then you scale and refine and just get better and better. The issue is how large it can scale and how many you can run safely in a given footprint, say versus a wind or solar farm. If they are extremely safe you could spread them out and have a much better grid, but the question is how safe.

Safety is the biggest thing, and why nuclear is such a hassle. Security and distance make it hard to do, but if these are safe even in the worst possible explosive failure, then you could have distributed power right where you need it.
  • bit_user
    Lawrence Livermore National Laboratory (LLNL) to deliver just that when it comes to cold fusion.
    I think somebody still has room-temperature superconductors on the brain. The referenced article said nothing about cold fusion - just regular, hot fusion!

    The key thing is to have energy-producing fusion (i.e. that emits more energy than is required to trigger it) as a practical power-generation source.
  • bit_user
    InvalidError said:
    They are going to need a lot more than AI and supercomputers to achieve the 120+X efficiency increase needed to just break even.
    Since there are some pretty smart people working on this stuff, I'd guess they're aware of that issue. Perhaps if the reaction could be self-sustaining, the ignition energy wouldn't matter too much. Or, maybe they find better materials that are easier to ignite and provide a better yield.

    InvalidError said:
    If anything useful comes out of this, it will be things like improved lasers and optical amplifiers, not the fusion method itself.
    The experiment cited in the article represents a milestone, but not on par with the best ideas currently under development. Its significance was reproducing an experiment from a couple years prior, to help prove that it wasn't merely a fluke or bad measurements. Simply confirming others' results is an important part of scientific process.

    Apparently, there's enough confidence that big breakthroughs are around the corner that there are several startup companies with their own fusion techniques currently under development. In just the past year, 13 fusion energy startups have been founded:
    https://sciencebusiness.net/news/nuclear-fusion/german-private-fusion-firms-multiply-government-pledges-fresh-research-money
  • InvalidError
    JTWrenn said:
    If they are extremely safe you could spread them out and have a much better grid, but the question is how safe.
Safety-wise, the whole spiel about hydrogen fusion is that worst case, you have a deuterium/tritium gas leak which is going to disperse in the atmosphere and rain back down into the water it likely originated from, no big deal. Spreading them out might be an issue with trucking the fuel to however many locations there are, which means that much more fuel storage infrastructure and staff to deal with. For a grid-scale deployment, you'd likely still end up with concentrated deployments near major substations for convenience, management and maintenance efficiency.