I'll be doing SLI soon, and with PhysX starting to be utilized more and more, I was wondering about the effects of dedicating one of your two GPUs to nothing but PhysX. Some of my main thoughts are:
- If I dedicate one to PhysX, how much of a performance hit will I take in an application?
- What exactly does the dedication improve? Does it just provide more calculations for a greater number of visible effects? Do the effects last longer, etc.?
- Has anyone ever had any trouble or hiccups when trying to do this?
If you already have an SLI setup, then if I remember properly you will actually take a performance hit by disabling it to dedicate one card to PhysX. There will also be no difference in the amount, duration, or look of the effects; the only change would be your overall frame rate.
Helltech is also correct: I don't see GPU-accelerated PhysX being used more lately, nor would I expect its usage to increase. Since Batman: AA came out a year ago, only Metro 2033 has used it to any worthwhile effect, and you would need an insane setup to make it worth enabling. The only game I've heard of in the near future that will use it in notable fashion is Mafia 2, and apparently you also need an extremely high-powered setup there to make it worth enabling.
Don't bother with a PhysX-only card. And what do you mean "utilized more and more"? PhysX is in the process of being phased out.
There was a thread floating around somewhere that showed the performance gain from using a dedicated PhysX card was so small that it wasn't even worth it.
Do you have a link about this fading out? You're spreading information that counters everything Nvidia is currently promoting.
The benefit of having two identical Nvidia cards for SLI is enhanced by Nvidia's new control panel, which lets you experiment with using a certain GPU for PhysX. This could come in handy in games that don't support SLI yet, allowing you to fully utilize the second GPU for PhysX.
For my recent upgrade, I was finally able to experience PhysX acceleration on Unreal Tournament 3. I was literally laughing at all the gore and debris flying around. It definitely added more "splat" to the gameplay. If you do not have a decent card to do both graphics and PhysX, adding a dedicated PhysX card is a relatively cheap upgrade (~$100). If you are going for SLI with two good cards, I would let both cards handle both the graphics and the PhysX.
Sorry, I don't have a link to the article I read yesterday, but PhysX could be phased out eventually (no time soon, though) as the CPU takes over much more of the physics calculations. This is at the heart of the disagreement between Intel and Nvidia at the moment.
The CPU has always done the physics calculations, even most of them in games with GPU PhysX. What should eventually replace GPU-accelerated PhysX are DirectCompute- or OpenCL-based physics APIs, because they are non-proprietary and can run on all video cards regardless of brand. Of course, when those take off, Nvidia will likely port PhysX from CUDA to one of them, but it will take a while.
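To see why physics maps so naturally onto GPGPU APIs like CUDA, DirectCompute, or OpenCL: a physics step is just the same small function applied to every particle independently, which is exactly the shape GPUs are built for. Here's a toy sketch (plain Python standing in for a per-thread kernel body; the function name and numbers are made up for illustration):

```python
def step_particle(pos, vel, dt=0.016, gravity=-9.81):
    """One Euler integration step for a single particle in 2D (x, y)."""
    vx, vy = vel
    vy += gravity * dt          # apply gravity to vertical velocity
    x, y = pos
    x += vx * dt                # advance position by velocity
    y += vy * dt
    if y < 0.0:                 # crude bounce off the ground plane
        y, vy = 0.0, -vy * 0.5
    return (x, y), (vx, vy)

# On a GPU, one thread would run this body per particle in parallel;
# on the CPU we just loop over them.
particles = [((0.0, 10.0), (1.0, 0.0)) for _ in range(4)]
particles = [step_particle(p, v) for p, v in particles]
```

Since each particle's update touches no other particle, the work is embarrassingly parallel, which is why it doesn't much matter whether the kernel is written in CUDA or a vendor-neutral API.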
Found an in-depth review of Mafia II: http://www.hitechlegion.com/reviews/gaming-software/526...
The demo is out and I've seen comments about how it was meh graphically. It seems the game is highly optimized for PhysX and 3D. There are all sorts of comparison pictures and videos showing the added graphics, plus charts and tests with multi-GPU setups and dedicated PhysX cards.
Unlike most PC PhysX titles, Mafia II relies completely on PhysX to render an immersive gangster's paradise. Turning PhysX off results in humdrum gameplay that resembles third-person shooters from the early part of the decade, but with ambient occlusion and higher-res textures. I would rather trade graphical quality than play the game at its highest settings with PhysX disabled. It simply is not the same game without the dust, debris, smoke and, of course, the stylistic trench coats!
I played the Mafia 2 demo and it looked great without PhysX. I ran the benchmark with PhysX enabled, and aside from a few pieces of concrete falling to the ground, it wasn't anything that made me want to go out and buy a PhysX card.
Quote:
"What should eventually replace GPU accelerated PhysX is DirectCompute or OpenCL based physics APIs because they are non-proprietary and can run on all video cards regardless of brand. Of course when those take off Nvidia will likely port PhysX from CUDA to one of them but it will take a while."
Here we go again with the misconception. DirectCompute/OpenCL weren't designed for gaming just to deliver realistic physics. DirectCompute/OpenCL are GPGPU languages; hence, if they replace anything, it will be CUDA, not PhysX.
And DirectCompute isn't non-proprietary; it's part of DirectX. You can't run it out of the box on a Linux/Unix system even if you have a Radeon or a GeForce card, the same way you can't do DirectX on any Linux system without using Wine.
No, you don't get it. This thread is specifically about GPU-accelerated PhysX in GAMING. I was talking about OpenCL/DirectCompute in the context of GAMING, like everything else in this thread. Every single thing you have complained about in my statement comes from you purposely removing it from this context so you can nitpick me. Please, leave me alone. I find your method of debate tedious and offensive, and I got more than enough of it in the other thread.
You do not use terminology such as "non-proprietary" to describe something as prolifically proprietary as DirectCompute, whether in the context of gaming or not.
I do, I did, and I would again, because within the context it is in fact appropriate, whether you like it or not. It is an open standard that MS allows all to use in the context of a Windows-based system with DirectX installed (AKA every gaming system). You don't need to pay Microsoft to use it, you don't need a license, you don't need permission, and they couldn't prevent anyone from writing code that uses it even if they wanted to.
As for me using the term "open source": I didn't use it in this thread or the other one, so you are being delusional and presumptuous, and quite frankly you are simply harassing me with made-up nonsense at this point. Just stop it.