
Nvidia PhysX Question

Last response: in Graphics & Displays
May 4, 2011 8:43:37 PM

Hello, I have two GTX 460 1GB cards in SLI and I want to know if I will notice a performance boost with a GT 430 as a dedicated PhysX card, or maybe it's pointless. Thanks


May 4, 2011 10:31:23 PM

The rule of thumb for adding a dedicated PhysX card is roughly a 25% boost in performance. BUT that is only when playing PhysX games. The GT 430 is not the best dedicated PhysX card, but would provide a boost nonetheless. What PhysX games do you play?
May 5, 2011 6:37:30 AM

But if I play a game that doesn't support the technology, is it pointless?
I just play COD: Black Ops online, plus some new single-player games.
May 5, 2011 8:54:09 AM

Correct, only a few games actually support PhysX. If you don't play a PhysX game, then it is pointless.
May 5, 2011 4:29:29 PM

What would be interesting to know, then, is the minimum card to get just for PhysX that would make a difference worth its price. GT 240? GT 440?
May 5, 2011 4:33:50 PM

An optimal PhysX card will have around 100 CUDA cores, maybe a little more. I think a GTX 260 is the absolute highest you need, with little to no gain beyond that. Below that, a GT 240 is great at just under 100 cores, but the further down you go, the smaller the gains get.

OP: As stated, there are only about 21 games that support PhysX acceleration, so you might play a whole five of them, tops. Probably not worth the price for you. I spent $65 on a GT 240 and played Metro 2033, Cryostasis, Mafia 2, Mirror's Edge, and some Batman: AA... probably wasn't worth it, but at the same time I don't regret it, because the eye candy and effects it added were really nice. Still, it was an extra $13 per game.
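The per-game figure in that post is simple division; a quick sketch, using only the numbers quoted above:

```python
# Cost-per-game arithmetic from the post above.
card_cost = 65.0     # GT 240 purchase price
physx_games = 5      # Metro 2033, Cryostasis, Mafia 2, Mirror's Edge, Batman: AA
print(f"${card_cost / physx_games:.0f} per game")  # $13 per game
```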
May 5, 2011 7:15:05 PM

I saw a video of a guy doing this test... but he tested a 240, then a 550, 560, and 580 as dedicated PhysX cards... no leap in power there or anything, nooo.

@wolfram: any game based on the Unreal Engine will have PhysX too, so it's not going to stay at twenty-some-odd games.
May 5, 2011 7:40:53 PM

shin0bi272 said:
[...] any game based on the unreal engine will have physx too so its not going to stay at 20 some odd games.


Yes, it will. There's a difference between having PhysX and having hardware-accelerated PhysX. Unreal uses the PhysX SDK, but that's no different than, say, Havok.

http://physxinfo.com/
May 5, 2011 8:09:06 PM

Except that if you don't have hardware-accelerated PhysX, you can't play certain levels that they released for it, because it brings your GPU to its knees.

Plus, PhysX has always been an SDK like Havok... it's a physics engine. The only difference is that Havok charges $250k for theirs and it's scripted physics, while Ageia... er, now Nvidia, gives theirs out for free and it's hardware accelerated. If you want to see why it has to be hardware accelerated, download the water demo from Nvidia's website and turn off the texture mapping. It will show you that all the droplets of water are actually spheres being rendered in real time.

When Ageia bought it from the original manufacturer (the name escapes me right now, but it began with an N), they changed it to add hardware acceleration and gave the engine away for free so they could push their hardware cards. It didn't go so well (though if you had one of those Ageia PPU cards back in '03 or '04, you were able to play CellFactor, which had massive amounts of debris in it, and it ran smooth... or you could get 1 fps on your 6800gtx). Then, a few years after everyone thought they were dead, Nvidia bought them and integrated the hardware acceleration into their CUDA software. (I might be wrong, but I think Nvidia was also putting a dedicated chip on their cards for physics, to push their own version of GPU-accelerated physics back when the 8800 first came out, but don't quote me on that.)

So if a game supports PhysX at all, it has the PhysX engine in it and can be hardware accelerated. The degree to which most games implement PhysX varies, though, due to not wanting to abandon and/or piss off the 50-60% of gamers buying their game. Physics engines aren't like object models in a game... if it runs on one engine, that's the only engine you have. That's why you don't see PhysX acceleration in Havok games... the two can't coexist in the same game engine.
May 5, 2011 8:21:54 PM

shin0bi272 said:
except that if you dont have hardware accellerated physx you cant play certain levels that they released for it because it brings your gpu to its knees. [...]


The PhysX SDK, like Havok, has a lot of features available to the developers using it. Just because the SDK has the features to add GPU-accelerated PhysX does not mean that all games developed with the SDK will offer that feature. The fact is, the vast majority of developers using the PhysX SDK do not include those features. As a result, you gain nothing from having a PhysX-capable card.

The list of games that benefit from a PhysX-capable GPU is about 21 games long, while the list of games that use the PhysX SDK is well over a hundred (I believe it's over 200).

To put it simply: the PhysX SDK offers devs the ability to add GPU-accelerated PhysX; it does not automatically add it. Most devs choose not to add GPU-accelerated PhysX when using the PhysX SDK.
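The opt-in point can be illustrated with a toy sketch (hypothetical names, not the real PhysX API, which is a C++ SDK): GPU acceleration is a feature a developer must explicitly enable, and the default path is CPU-only.

```python
# Toy illustration (hypothetical API, not the real PhysX SDK): GPU-accelerated
# physics is an opt-in feature of the SDK, not something every licensee gets.
class PhysicsScene:
    def __init__(self, enable_gpu_effects=False):
        # Default is CPU-only simulation -- what most "uses PhysX" titles ship with.
        self.gpu_accelerated = enable_gpu_effects

typical_game = PhysicsScene()                          # uses the SDK, never opts in
showcase_game = PhysicsScene(enable_gpu_effects=True)  # one of the ~21 accelerated titles

# A dedicated PhysX card only helps the scene that opted in.
print(typical_game.gpu_accelerated, showcase_game.gpu_accelerated)  # False True
```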
May 5, 2011 8:35:34 PM

shin0bi272 said:
except that if you dont have hardware accellerated physx you cant play certain levels that they released for it because it brings your gpu to its knees. [...]

I can't say that I recall that; the only card I can think of that used a dedicated PhysX GPU was a one-off from EVGA, but that was a long time after the 8800s came out.
May 5, 2011 8:37:11 PM

I wish they would release the PCIe x1 PhysX cards they used to have. That would make all our lives easier! I have plenty of those slots to use up!
May 5, 2011 8:42:30 PM

What bystander said.

I compared the PhysX SDK to Havok because they're both simply physics engines... the difference is that PhysX can be run with GPU acceleration for very specific effects. Accelerated PhysX can add particle, smoke, and cloth effects, as well as slightly more advanced physics calculations. Beyond that, the vast majority of PhysX implementations are not accelerated. Did you know PhysX is available on consoles? Do you think consoles have GPU acceleration?
May 5, 2011 8:44:56 PM

crewton said:
I wish they would release the PCIex1 physx cards that they used to have. That would make all our lives easier! I have plenty of those slots to use up!

I don't recall there ever being a PCIe x1 PhysX card; I only know of the ones that took up a PCIe x16/x8/x4 slot, like this one:
May 5, 2011 8:47:04 PM

bradkman said:
This may help...

http://www.youtube.com/watch?v=cbww3dhzK0M


That is one of the worst examples of a dedicated card for PhysX, and a snuff film of the tech community. :pfff:

Shame on you for posting this. :non:

May 5, 2011 8:52:43 PM

shin0bi272 said:
except that if you dont have hardware accellerated physx you cant play certain levels that they released for it because it brings your gpu to its knees. [...]



You are OK on some things but wrong on the rest. First, there is no 6800GTX; only the 6800, 6800GT, and 6800 Ultra, for both AGP and PCIe. Second, the 8800 era never had dedicated PhysX. There were some limited but rare versions of the GT200-era cards that did, such as the GTX 275 Co-op, which was a single GT200 plus a G92 for dedicated PhysX. Some also used the GT215 for dedicated PhysX, but they are very rare and were only intended for short production and retail periods, like around the holidays.
May 5, 2011 8:56:28 PM

crewton said:
I wish they would release the PCIex1 physx cards that they used to have. That would make all our lives easier! I have plenty of those slots to use up!


http://www.newegg.com/Product/Product.aspx?Item=N82E168...

You had a chance to buy one at retail, unlike the second gen that was intended to replace what people knew as Ageia, before Nvidia killed off the cards.

http://www.behardware.com/news/9058/physx-2-in-pictures...

Interesting how a little $20 PCI Ageia PhysX card is still able to outperform an i7 920 in PhysX games and apps. :lol:
May 6, 2011 6:20:51 AM



Those are the ones! Supposedly you can cut current graphics cards down to fit into PCIe x1 slots and they'll still run PhysX. I'd much rather just buy a modern card like that, though. Make it like 40 bucks and I'd be happy.
May 6, 2011 1:23:05 PM

crewton said:
Those are the ones! Supposedly you can cut current graphic cards to fit into the PCIe x1 lanes and still be able to perform physx with them. I'd much rather just buy a modern card like that though. Make it like 40 bucks and be happy.


Sorry, the only Nvidia thing that pops into an x1 slot these days is the Ion, and PhysX by driver requires a card with at least 32 shaders. 32 shaders, by the way, isn't enough to run PhysX comfortably in modern games; a 9600GT is the bare minimum for the job. The Ageia PPU cards can still be used, but only in older games with older drivers. That's good for old systems that don't have a PCIe slot, like a P4 or Socket A rig. When I replace my GTX 460 and go SLI or CrossFire with whatever replaces it, I will be running out of both x16 and x1 slots, thanks to other intended upgrades, so that I am not using old PCI slots.
May 10, 2011 5:11:25 AM

ghostrider5 said:
Hello, i have 460 gtx 1gb SLI and i want to know if i will notice a preformance boost with 430 GT like dedicated physX card or maybe its pointless. thanks

------------------------------
I use two superclocked EVGA GTX 460's in SLI, and today I put in a GTS 450 for PhysX only, and my score went way up... almost double over just SLI when running the tough Metro benchmark with PhysX on. With PhysX on the third card only, I get about the same score as if I had turned PhysX off when using only the SLI cards.

Note that I had all three cards tracked by the expanded EVGA Precision program, and it showed the 450 only using about 30% of its capacity... I can only guess that the 450 I got for $99.00 shipped is maybe a little overkill, but I wanted a cool-running card that exhausted out the back.

I suspect the third card would not be all that helpful if I had a couple of 580's in SLI, or a 590 <g>
I do not know if someone running other cards will get the same mileage as me on the benchmark with SLI plus PhysX.
But again... it doubled my average FPS scores.
Bullmoose
May 10, 2011 5:20:27 AM

bullmoosetom said:
I use two superclocked EVGA GTX 460's in SLI and today I put in a GTS 450 for PHYSX only and my score went way up... [...]


Have you tried running the game, or the benchmark, with and without PhysX enabled? I cannot notice a difference myself, so I just don't bother with PhysX in that game. Best I can tell, the difference comes down to whether the physics are guessed or truly calculated, and the two happen to look very close.

That's not to say PhysX is no good; I've found it pretty noticeable in Sacred 2, although I'm not sure how important that is to me either.
May 10, 2011 5:52:10 AM

bystander said:
Have you tried running the game with and without PhysX enabled, or the benchmark? [...]

----------------
Yes, bystander, thanks for asking, I did.

But I can't get real, quantitative, repeatable values from any game. Sort of like people agonizing over watts who won't just get a cheap Kill-A-Watt device to tell them the truth. <g>

The benchmark for Metro does give me those things. My little cards are hard pressed in the benchmark with everything as high as I can get it... I run a little over 50 fps average with SLI and PhysX off. With PhysX on I get about 28 average in SLI only. The third card brings me back up to over 50 with PhysX on.

These overclocked 460's in SLI scale up better than 90% in most games, like Fallout: New Vegas, and I get frame rates of 50 or 60 that run well over 100 in most places in most games other than Metro (it's a killer). It's just that Metro and the Metro benchmark will tax these cards more than anything except maybe Prime95.

Another thing no one speaks much of is the speed of the CPU. My 2600K is clocked and stable at 4.6 GHz+, and from all the Passmark tests I have run with the only change being clock speed, I have all the proof I need that CPU speed will improve graphics and games as much as anything else (and a lot cheaper).
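As a sanity check on the benchmark figures in this post (all numbers are the averages quoted above):

```python
# Average FPS figures quoted above for the Metro benchmark.
sli_physx_off = 50    # two GTX 460s, PhysX disabled
sli_physx_on = 28     # PhysX falls back onto the rendering GPUs
sli_plus_gts450 = 50  # GTS 450 handling PhysX

speedup = sli_plus_gts450 / sli_physx_on
print(f"{speedup:.2f}x faster with the dedicated card")  # 1.79x -- "almost double"
```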
August 6, 2012 9:13:11 AM

This topic has been closed by Mousemonkey