PhysX quietly retired on RTX 50 series GPUs: Nvidia ends 32-bit CUDA app support

PhysX
(Image credit: Nvidia)

Nvidia has quietly retired 32-bit PhysX support on RTX 50 series GPUs, ending a physics simulation technology for games that was advertised heavily during the 2000s and early 2010s. Nvidia confirmed the technology's end-of-life status (at least for the 32-bit version) on the Nvidia forums, a consequence of deprecating support for 32-bit CUDA applications starting with the RTX 50 series.

As far as we know, there are no 64-bit games with integrated GPU PhysX, which effectively ends the technology on RTX 50 series GPUs and newer. RTX 40 series and older cards will still be able to run 32-bit CUDA applications, and thus PhysX, but the technology is now officially retired, starting with Blackwell.

PhysX is one of Nvidia's oldest technologies, nearly as old as CUDA itself. It is a proprietary physics simulation SDK capable of processing ragdolls, cloth simulation, particles, volumetric fluid simulation, and other physics-focused effects.

Since its inception in 2004, PhysX (which Nvidia acquired as part of its Ageia purchase and then adapted to run on GeForce GPUs) has been integrated into a decent number of games. It was used in several notable AAA titles, including the Batman Arkham trilogy, Borderlands: The Pre-Sequel, Borderlands 2, Metro 2033, Metro: Last Light, Metro: Exodus, Mirror's Edge, The Witcher 3, and some older Assassin's Creed titles.

PhysX was built around the idea of running physics calculations on the GPU (originally a dedicated Ageia PPU) rather than the CPU. Offloading physics to the GPU usually delivered significantly better performance for physics-heavy effects, enabling higher frame rates and richer effects than a CPU could achieve. The catch was that PhysX acceleration on Nvidia GPUs was only possible because it relied on CUDA, Nvidia's proprietary platform for running general-purpose code on the GPU.
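To illustrate what "physics on the GPU" means in practice: each particle's update is independent of the others, which is exactly the kind of work a GPU parallelizes across thousands of threads. Below is a minimal CPU-side sketch of such a per-particle integration step (illustrative only; this is not PhysX or CUDA code, and the constants are arbitrary):

```python
# Illustrative only: the kind of per-particle update that a GPU physics
# engine runs in parallel, one thread per particle. Here we just loop
# over the particles on the CPU instead.

GRAVITY = -9.81   # m/s^2, downward acceleration
DT = 1.0 / 60.0   # one frame at 60 fps

def step(particles):
    """Advance each (pos_y, vel_y) particle by one frame (semi-implicit Euler)."""
    out = []
    for pos, vel in particles:
        vel += GRAVITY * DT          # integrate acceleration into velocity
        pos += vel * DT              # integrate velocity into position
        if pos < 0.0:                # crude ground plane: bounce with damping
            pos, vel = 0.0, -vel * 0.5
        out.append((pos, vel))
    return out

# Two particles: one dropped from rest, one tossed upward.
particles = [(10.0, 0.0), (5.0, 2.0)]
for _ in range(3):
    particles = step(particles)
```

Because the loop body touches only one particle at a time, a GPU version simply assigns each iteration to its own thread, which is why effects with thousands of particles ran so much faster on the GPU than on the CPUs of that era.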

By the late 2010s, PhysX adoption had slowed significantly in favor of more flexible alternatives, including solutions that run on both CPUs and GPUs. The biggest problem that plagued PhysX was its strict requirement for an Nvidia GPU, which prevented it from being used on competing GPUs, consoles, and smartphones. On top of this, Nvidia also began dropping support for some PhysX features later in its life cycle. For example, in 2018, Warframe transitioned from PhysX to a homebrewed physics simulation framework (based on PhysX) after Nvidia dropped support for GPU particle simulation.

The only way to run PhysX on RTX 50 series GPUs (or newer) now is to install a secondary RTX 40 series or older graphics card and dedicate it to PhysX duty in the Nvidia Control Panel. As far as we are aware, Nvidia has not disabled this functionality. But the writing is on the wall for PhysX, and we doubt any future games will attempt to use the API.

Aaron Klotz
Contributing Writer

Aaron Klotz is a contributing writer for Tom’s Hardware, covering news related to computer hardware such as CPUs and graphics cards.

  • artk2219
Just a quick note: Nvidia didn't invent PhysX. They bought Ageia, and even Ageia didn't develop it themselves; the engine was developed by a Swiss company named NovodeX AG, which was acquired by Ageia. Ageia then produced and released standalone physics accelerator cards using the engine from NovodeX. Nvidia eventually bought Ageia in 2008 and implemented an API that would allow only their GPUs to use the PhysX engine, and that's where it died.

    https://en.wikipedia.org/wiki/PhysX
    https://ethz.ch/en/industry/entrepreneurship/explore-startup-portraits-and-success-stories/exits/novodex.html
    https://en.wikipedia.org/wiki/Ageia
    Reply
  • Jabberwocky79
    PhysX was just the coolest.... Like, it wasn't a gimmick, it genuinely made a huge impact in the quality of the experience. I remember playing Arkham Asylum, first w/o PhysX, and then later with it (once I upgraded my GPU). My jaw was literally dropping. But then again, I don't think Raytracing is a gimmick, so what do I know.
    Reply
  • Jame5
    Man, seems like I want to stick with a 40 series card then. Why upgrade for more drama and less functionality?
    I like the Arkham games, I like Mirror's Edge.
    Reply
  • Jabberwocky79
    So, genuine question... Does this mean that modern non-PhysX games have less physics capabilities than those old PhysX-supported titles, or was PhysX basically made obsolete due to increased capabilities of modern CPUs?
    Reply
  • rluker5
    Jabberwocky79 said:
    So, genuine question... Does this mean that modern non-PhysX games have less physics capabilities than those old PhysX-supported titles, or was PhysX basically made obsolete due to increased capabilities of modern CPUs?
    I think PhysX was made obsolete by lack of use.
    Most of the implementations I've seen were bad, but there were a few pretty good ones. Like the smoke in the first two Metro games, or the fire in The Witcher 3 before they stopped using GPU PhysX, or Batman: Arkham Knight. My GT730 PCIe x1 card running the PhysX could keep up with SLI 1080 Tis running raster in PhysX-supporting games. When GPU PhysX stopped working in W3, my CPU load went way up and the flames dropped their refresh rate. There are a ton of games that use a lot of GPU processing to do a worse job at smoke than some passive trash-tier card could manage doing better smoke with PhysX. It could have helped games look better on cheaper hardware, but it just wasn't used much. A lot of uses were things like chips that would appear that you could scatter, or some out-of-place piece of paper strangely flapping in some very localized wind.

    But if you really want to run PhysX in some old game with a new 50 series GPU, you could probably just pick up a $20 Quadro K620 off eBay and have the PhysX run off of that card. I'm not totally sure on this one, since there might be some driver lock preventing you from doing it with a 50 series, but it works with my 3080.
    Reply
  • Jabberwocky79
    Kind of like stereo 3D Vision - awesome tech, never caught on, eventually obsolete. I cared about that more than PhysX, but both were awesome
    Reply
  • renz496
    Jabberwocky79 said:
    So, genuine question... Does this mean that modern non-PhysX games have less physics capabilities than those old PhysX-supported titles, or was PhysX basically made obsolete due to increased capabilities of modern CPUs?
    This issue is about GPU PhysX specifically. Personally, I think GPU PhysX became less relevant once Nvidia came out with PhysX 3, which had improved performance and much better multi-core CPU support. Some of the effects that Nvidia claimed could only be done effectively on the GPU with the older PhysX 2 can now be done on the CPU with PhysX 3, as long as you don't go crazy/excessive with it. Personally, I think it always could have been done on the CPU, like what Havok did with the original Red Faction game. Some GPU PhysX effects were excessive to the point that I think they only ruined the look and performance rather than enhancing it (like the massive flying bricks in the Scarecrow level in Arkham Asylum).
    Reply
  • edzieba
    People seem to be very confused about what PhysX actually is.

    The PhysX everyone knows from marketing: some proprietary thing that uses the GPU to make flappy fabrics and some extra particles.

    The PhysX actually implemented by hundreds of games: an open source CPU-based physics engine (like Havok et al) used by several different game engines.

    The former is affected by the driver no longer supporting that particular branch of the API, but it was also barely used by any games in the first place, well over a decade ago. The latter is unaffected, because it never relied on the GPU in the first place.
    Reply
  • AkroZ
    Jabberwocky79 said:
    So, genuine question... Does this mean that modern non-PhysX games have less physics capabilities than those old PhysX-supported titles, or was PhysX basically made obsolete due to increased capabilities of modern CPUs?
    There are many games that use physics extensively via game engines like Unity 3D or Unreal Engine; under the hood it's not necessarily CPU-bound, as they can use OpenCL (or CUDA) to do the same as PhysX.
    But it's true that CPUs weren't powerful enough to run that physics at a good framerate back then, and now they are (I tested it when it was released by Ageia).

    PhysX, when bought by Nvidia, gave them a marketing tool for CUDA, like DLSS has been the marketing tool for RTX.
    They contracted many game developers to integrate it into their games, and it worked: when ATI Radeon cards were more powerful than Nvidia GeForce cards, people were buying GeForce for PhysX.
    Reply
  • beyondlogic
    Honestly, it's extremely short-sighted of Nvidia to drop this; it makes the 5000 series even less attractive:

    Less functionality
    More expensive
    Broken out of the gate.

    Nvidia's response is just the typical "we don't care," lol.

    I can see a lot of goodwill being flushed down the toilet. There are quite a few games that use that engine. Some are popular even today.
    Reply