PhysX feature unlocked for RTX 5090 with RTX 3050 'helper' to enable full performance

GeForce RTX 5090 Founders Edition
(Image credit: Nvidia)

A few days ago, it came to light that Nvidia has dropped support for 32-bit CUDA applications with its latest RTX 50-series (Blackwell) GPUs. Support for PhysX has gradually faded over the years; however, it can still be offloaded to an RTX 40-series (Ada Lovelace) or older GPU, and that's exactly what a user on Reddit has done. In addition, we've gathered some interesting benchmarks, courtesy of VerbalSilence on YouTube and the same Reddit user, in which the GTX 980 Ti handily outperforms Nvidia's latest RTX 50-series GPUs in 32-bit PhysX games.

PhysX is fully functional in 64-bit applications like Batman: Arkham Knight, so Nvidia hasn't abandoned the technology entirely. However, the GPU maker has retired 32-bit CUDA support for RTX 50-series GPUs (and likely beyond). Given the age of the technology, most games with PhysX were compiled using 32-bit CUDA libraries. This is a software limitation, for the most part, though maintaining support for legacy environments is easier said than done.
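If you're curious whether a particular game falls into the affected 32-bit bucket, you can inspect the PhysX DLL it ships with. Below is a minimal sketch (not from the article) that reads the Machine field of a DLL's PE header to report whether the binary is x86 or x64; the file path is hypothetical and will differ per game.

```python
# Minimal sketch: report whether a game's bundled PhysX DLL is 32- or 64-bit
# by reading the Machine field of its PE (COFF) header.
import struct
from pathlib import Path

MACHINE_TYPES = {0x014C: "x86 (32-bit)", 0x8664: "x64 (64-bit)"}

def pe_architecture(dll_path: Path) -> str:
    data = dll_path.read_bytes()
    # e_lfanew (offset 0x3C in the DOS header) points at the "PE\0\0" signature.
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_offset:pe_offset + 4] != b"PE\0\0":
        raise ValueError(f"{dll_path} is not a valid PE file")
    # The 2-byte Machine field sits right after the 4-byte signature.
    machine = struct.unpack_from("<H", data, pe_offset + 4)[0]
    return MACHINE_TYPES.get(machine, f"unknown (0x{machine:04X})")

if __name__ == "__main__":
    # Hypothetical install path -- substitute your game's folder.
    dll = Path(r"C:\Games\Mafia II\pc\PhysXCore.dll")
    print(dll.name, "->", pe_architecture(dll))
```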

As the news dropped, a Redditor snagged a separate RTX 3050 to pair with the primary RTX 5090 and keep PhysX working in older 32-bit titles. Using the Nvidia Control Panel, you can offload PhysX computations to a secondary GPU or the CPU, something that's rarely necessary these days. Around 20 years ago, dedicated processors for physics calculations were dubbed PPUs (Physics Processing Units). Ageia made such devices before Nvidia acquired the company in 2008.
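Before assigning the helper card on the Control Panel's "Configure Surround, PhysX" page, it's worth confirming the driver actually enumerates both GPUs. A quick sketch, assuming nvidia-smi is installed and on your PATH:

```python
# Sketch: list the GPUs the Nvidia driver currently sees, so you can verify
# both the primary card and the PhysX helper are detected.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=index,name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout
for line in out.strip().splitlines():
    print(line)  # e.g. "0, NVIDIA GeForce RTX 5090" and "1, NVIDIA GeForce RTX 3050"
```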

Game                               RTX 5090 + RTX 3050 (FPS)    RTX 5090 alone (FPS)
Mafia II Classic                   157.1                        28.8
Batman: Arkham Asylum              390                          61
Borderlands 2                      122                          N/A
Assassin's Creed IV: Black Flag    62                           62

In older 32-bit titles, there's a substantial gap between running with the RTX 3050 handling PhysX and without it. With legacy PhysX no longer supported, RTX 50-series GPUs either crash when the setting is enabled or fall back to CPU processing. The user mentions that despite setting the RTX 3050 as the dedicated PhysX processor, 64-bit games utilize the RTX 5090 anyway. As mentioned above, modern PhysX implementations, at least the handful that exist, should still run fine on Blackwell.
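For a sense of scale, here's the speedup implied by the table above, computed directly from those numbers (Borderlands 2 is omitted because the RTX 5090-only run produced no result):

```python
# Speedup of the RTX 5090 + RTX 3050 setup over the RTX 5090 alone,
# using the FPS figures from the table above.
results = {
    "Mafia II Classic": (157.1, 28.8),
    "Batman: Arkham Asylum": (390, 61),
    "Assassin's Creed IV: Black Flag": (62, 62),
}
for game, (with_helper, alone) in results.items():
    print(f"{game}: {with_helper / alone:.1f}x")
# Mafia II Classic: 5.5x
# Batman: Arkham Asylum: 6.4x
# Assassin's Creed IV: Black Flag: 1.0x
```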

Another test, conducted by VerbalSilence, reveals a striking difference in Mirror's Edge: in some scenes, the RTX 5080 plummets to under 10 FPS while the GTX 980 Ti sits comfortably at almost 150 FPS. The performance delta depends heavily on each game's PhysX implementation; still, Borderlands 2 sees the GTX 980 Ti lead the RTX 5080 by almost 2x, and that's telling. Note that the GTX 980 Ti system is paired with a Core i5-4690K, while the Ryzen 7 9800X3D is reserved for the RTX 5080 setup.

It's unlikely that Nvidia will reinstate compatibility for legacy CUDA applications. If you genuinely wish to play your favorite 32-bit titles with PhysX, maybe it's time to dust off that old GPU in your cabinet and restore it to service.

Hassam Nasir
Contributing Writer

Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.

  • nimbulan
    Personally I'm hoping that somebody can figure out how to make a compatibility mod of sorts that can basically bridge a game's 32 bit PhysX over to 64 bit so it can use hardware acceleration without requiring a second old GPU.
    Reply
  • FunSurfer
    nimbulan said:
    Personally I'm hoping that somebody can figure out how to make a compatibility mod of sorts that can basically bridge a game's 32 bit PhysX over to 64 bit so it can use hardware acceleration without requiring a second old GPU.
They could take the RTX 4000-series PhysX drivers and modify them to work on the RTX 5000 series; these GPU generations are similar enough.
    Reply
  • aberkae
    And here I was hoping we would get rtx support on a secondary card to offload all the rt workload. 😅
    Reply
  • PixelAkami
    aberkae said:
    And here I was hoping we would get rtx support on a secondary card to offload all the rt workload. 😅
    I'd love this idea. Use my 2080 to run RT in its entirety then use whatever new gpu i got to do raster.
    Reply
  • qwertymac93
    nimbulan said:
    Personally I'm hoping that somebody can figure out how to make a compatibility mod of sorts that can basically bridge a game's 32 bit PhysX over to 64 bit so it can use hardware acceleration without requiring a second old GPU.
    Imagine someone using ROCm to run these old physx games in Linux on an RX7900XT and blowing a 5090 out of the water. Lol.

    Physx was always a scam to sell CUDA anyway, it didn't HAVE to run like crap on CPUs, it just did because Nvidia put the minimal possible effort in making it work so their GPUs would look good. A source port to real multithreading with AVX2 (or even AVX512) would easily play any of these old physx games on modern hardware.
    Reply
  • -Fran-
    PixelAkami said:
    I'd love this idea. Use my 2080 to run RT in its entirety then use whatever new gpu i got to do raster.
Well, that used to be a thing with early generations of PhysX, but nVidia blocked that almost immediately when it became somewhat popular to use (back then) an ATI card for rendering and an nVidia card only for PhysX. So in the modern world, you can bet anything nVidia will block any "hybrid" setups.

    --
In regards to the news/article: well, since nVidia is now for people with lots of money, they just sell you the solution to a problem they created themselves. Why optimize their drivers or add a wrapper for 32->64 when they can just tell you to buy a secondary card for it? Also, damn the devs that took that sweet PhysX money over Havok or Bullet.

    The Way It's Meant To Play You All.

    Regards.
    Reply
  • LibertyWell
    -Fran- said:
Well, that used to be a thing with early generations of PhysX, but nVidia blocked that almost immediately when it became somewhat popular to use (back then) an ATI card for rendering and an nVidia card only for PhysX. So in the modern world, you can bet anything nVidia will block any "hybrid" setups.

    --
In regards to the news/article: well, since nVidia is now for people with lots of money, they just sell you the solution to a problem they created themselves. Why optimize their drivers or add a wrapper for 32->64 when they can just tell you to buy a secondary card for it? Also, damn the devs that took that sweet PhysX money over Havok or Bullet.

    The Way It's Meant To Play You All.

    Regards.

    This captures the entire story. It would be TRIVIAL for Nvidia (and it's unlikely they don't already have this fully fledged) to code around the 32bit interface.

    This is it guys. What else do you need to know about this company? They artificially limit backwards compatibility to the tune of a 10 year old card performing 3 times better on games that people STILL PLAY, in order to trap you in their ecosystem so they can sell you a solution later.

    Beyond anything I've learned about the 5090 this week, this, above all, signals how utterly wretched this unbridled corporation is. How much they absolutely f*king hate their customers and consider them nothing more than a cash register spilling their hard earned FRNs into their coffers like the idiots they know we are.

    Bring back the Sherman AntiTrust Act, shatter and splinter and burn this company into a million falling ashes of confetti, and then maybe, just maybe, we can get back to some semblance of sanity.
    Reply
  • 8086
In my opinion, today's news is one reason why I think nVIDIA should never have killed off the dedicated PPU. Having the option to use a PPU with any graphics card could have prevented these compatibility issues.
    Reply
  • SiNmaster
It has come full circle. PhysX was originally an add-on card, back in the day. I was debating buying one, never did, and was glad I didn't when Nvidia acquired Ageia for their PhysX and added it onto their cards.
    Reply
  • -Fran-
    This segment from MLiD interviewing Matt Alderon (game developer with proper insight) is very telling. Totally worth a watch.

View: https://www.youtube.com/watch?v=eUY-EIclo40

    Regards.
    Reply