Analysis: PhysX On Systems With AMD Graphics Cards
Rarely does an issue divide the gaming community like PhysX has. We go deep into explaining CPU- and GPU-based PhysX processing, run PhysX with a Radeon card from AMD, and put some of today's most misleading headlines about PhysX under our microscope.
Summary And Conclusion
CPU-Based PhysX Summary
Summarizing the headlines of the last few months and our test results, we can conclude the following:
- The CPU-based PhysX mode mostly uses only the older x87 instruction set instead of SSE2 (see the sketch after this list).
- Testing different builds of the Bullet benchmark shows at most a 10% to 20% performance increase when SSE2 is used.
- In a purely single-threaded application, the gains from this optimization would thus be marginal.
- Contrary to many reports, CPU-based PhysX supports multi-threading.
- There are scenarios in which PhysX is better on the CPU than the GPU.
- A game like Metro 2033 shows that CPU-based PhysX could be quite competitive.
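To make the x87-versus-SSE2 point concrete, here is a minimal sketch of the same operation written both ways. This is our illustration, not PhysX source code, and the function names are invented for the example.

```cpp
// Illustrative sketch only -- not PhysX source code. It shows why x87 vs.
// SSE2 matters: the scalar loop compiles to x87 instructions (one float at
// a time) when built with MSVC's /arch:IA32 or GCC's -mfpmath=387, while
// the intrinsic version uses packed single-precision SSE math (present on
// every SSE2-capable CPU) to process four floats per instruction.
#include <emmintrin.h>

// Scalar version: candidate for x87 code generation.
void scale_scalar(float* v, float s, int n) {
    for (int i = 0; i < n; ++i)
        v[i] *= s;
}

// Vectorized version: four floats per iteration via packed multiply.
void scale_sse(float* v, float s, int n) {
    __m128 vs = _mm_set1_ps(s);                   // broadcast s to all four lanes
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 x = _mm_loadu_ps(v + i);           // load four floats (unaligned)
        _mm_storeu_ps(v + i, _mm_mul_ps(x, vs));  // multiply and store them
    }
    for (; i < n; ++i)                            // scalar tail for leftovers
        v[i] *= s;
}
```

A fourfold theoretical throughput gain sounds dramatic, but as the Bullet numbers above show, real physics code is dominated by branchy, memory-bound work that SIMD does not touch, which is one plausible reason the measured gains stay in the 10% to 20% range.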
Then why is the performance picture so dreary right now?
- With CPU-based PhysX, game developers are largely responsible for thread allocation and management, while GPU-based PhysX takes care of this automatically (see the sketch after this list).
- This is a time and money issue for the game developers.
- The current situation is also architected to help promote GPU-based PhysX over CPU-based PhysX.
- With SSE2 optimizations and good threading management for the CPU, modern quad-core processors would be highly competitive compared to GPU PhysX. Predictably, Nvidia’s interest in this is lackluster.
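As a rough illustration of what "developers are responsible for threading" means in practice, here is a sketch using the public API of the newer PhysX 3.x SDK. The games discussed here ship the older 2.8 SDK, but the division of labor is the same: on the CPU path, the application itself has to create and size the worker-thread pool.

```cpp
// Sketch of the developer-side threading work the article describes, using
// the newer PhysX 3.x public API for illustration (2.8-era titles expose
// the same responsibility through a different interface).
#include <PxPhysicsAPI.h>
using namespace physx;

PxScene* createCpuScene(PxPhysics& physics, PxU32 workerThreads) {
    PxSceneDesc desc(physics.getTolerancesScale());
    desc.gravity      = PxVec3(0.0f, -9.81f, 0.0f);
    desc.filterShader = PxDefaultSimulationFilterShader;

    // This is the line the developer must get right: size the CPU worker
    // pool for the target machine. Ship it with one thread (or skip the
    // tuning) and a quad-core sits mostly idle, while GPU-based PhysX
    // schedules its batches automatically.
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(workerThreads);

    return physics.createScene(desc);
}
```

A call such as createCpuScene(physics, 3) on a quad-core would leave one core free for the renderer; getting numbers like this right for every machine is the time-and-money burden described above.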
The AMD Graphics Card + Nvidia Graphics Card (As Dedicated PhysX Card) Hybrid Mode
Here, too, our verdict is a bit more moderate compared to the recent hype. We conclude the following:
Pros:
One can claim that adding the dedicated card results in a huge performance gain if PhysX was previously running on the CPU instead of the GPU. In such cases, a Radeon HD 5870 with a dedicated PhysX card is far superior to a single GeForce GTX 480. Even if you combine the GTX 480 with the same dedicated PhysX card, the GTX 480's lead is very small. GPU-based PhysX is thus possible for all AMD users, provided the dedicated Nvidia PhysX-capable board is powerful enough. Mafia II shows that there are times when even a single GeForce GTX 480 reaches its limits, and that "real" PhysX at highly playable frame rates is only possible with a dedicated PhysX card.
Cons:
On the other hand, we have the fact that Nvidia incorporates strategic barriers in its drivers to prevent these combinations and performance gains whenever a non-Nvidia card is installed as the primary graphics solution.
It's good that the community does not take this lying down, but instead continues to produce pragmatic countermeasures. But there are more pressing drawbacks. In addition to the high costs of buying an extra card, we have added power consumption. If you use an older card, this is disturbingly noticeable, even in idle mode or normal desktop operation. Everyone will have to decide just how much money an enthusiast project like this is worth. It works, and it's fun. But whether it makes sense for you is something only you can decide for yourself.
eyefinity: So it's basically what everybody in the know already knew - nVidia is holding back progress in order to line their own pockets.
Emperus: Is it 'PhysX by Nvidia' or 'PhysX for Nvidia'?! It's a pity to read those lines saying that Nvidia is holding back performance when a non-Nvidia primary card is detected.
It looks like the increase in CPU utilization with CPU PhysX is only 154%, which could be one thread plus synchronization overhead with the main rendering threads.
eyefinity: The article could barely spell it out more clearly.
Everyone could be enjoying CPU-based physics, making use of their otherwise idle cores.
The problem is, nVidia doesn't want that. They have a proprietary solution which slows down their own cards, and AMD cards even more, making theirs seem better. On top of that, they throw money at game devs so they don't include better CPU physics.
Everybody loses except nVidia. This is not unusual behaviour for them; they are doing it with tessellation now too - slowing down their own cards because it slows down AMD cards even more, when there is a better solution that doesn't hurt anybody.
They are a pure scumbag company.
rohitbaran: In short, a good config to enjoy PhysX requires selling an arm or a leg, while the game developers and nVidia keep screwing users to save money and further their business interests, respectively.
iam2thecrowe: The world needs OpenCL physics NOW! Also, while this is an informative article, it would be good to see which single Nvidia cards make games using PhysX playable. Will a single GTS 450 cut it? Probably not. That way budget gamers can make a more informed choice, as there's no point choosing Nvidia for PhysX and finding it doesn't run well anyway on mid-range cards, when they could have just bought an ATI card and been better off.
archange: Believe it or not, this morning I was determined to look into this same problem, since I just upgraded from an 8800 GTS 512 to an HD 6850. :O
Thank you, Tom's, thank you Igor Wallossek, for making it easy!
You just made my day: a big thumbs up!
jamesedgeuk2000: What about people with dedicated PPUs? I have 8800 GTX SLI and an Ageia PhysX card; where do I come into it?
skokie2: What fails to be mentioned (and if what I see is real, it's much more predatory) is that simply having onboard AMD graphics, even if it's disabled in the BIOS, stops PhysX from working. This is simply outrageous. My main hope is that AMD finally gets better at Linux drivers so my next card does not need to be nVidia. I will vote with my feet... so long as there is another name on the slip :( Sad state of graphics generally, and it's been getting worse since AMD bought ATI... it was then that this game started... nVidia just takes it up a notch.