
Performance and PhysX

August 16, 2008 9:55:35 AM


I have been reading a lot in the press lately about these PhysX drivers Nvidia has released as beta, which, as we have covered before, were not given to Futuremark, so they are not valid as benchmark drivers.
My question is: who cares, and why? 3DMark and Vantage are synthetic benchmarks, and at the end of the day it's all down to how the card handles actual games in the real world. As these drivers do increase performance in the tests I have seen, why would anyone not want to use them? I'm guessing ATI are working on a version? Also, I can't see how these drivers can be kept out of things even if Futuremark say they are not official (I'm guessing Nvidia will submit a set soon). The performance increase is there, and reviews will have to at least include them for comparison, wouldn't they?

Your thoughts and any info please.
Mactronix


August 16, 2008 10:30:55 AM

I'd use them, but I don't have any PhysX-based games :( 
August 16, 2008 5:37:12 PM

The problem is, they decrease FPS in actual games. Most games are GPU limited, not CPU limited. It's better to have physics calculations done on a CPU that is mostly idle than on a GPU that is already working at high load. When you're running a game, it doesn't matter if that quad core is working at 10% or 30% load; it won't impact frame rates. If you keep that already underused CPU idle while taxing the already busy GPU, frame rates will decrease.

In other words, GPU-based physics calculations only improve synthetic benchmarks while tanking real-world performance.
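
A rough way to picture that argument is a back-of-the-envelope frame-time model. This is only an illustrative sketch with made-up millisecond costs, not measured numbers: a frame can't finish until both processors are done, so piling extra work onto whichever one is already the bottleneck lowers the frame rate.

def fps(cpu_ms, gpu_ms):
    # A frame is limited by whichever processor takes longer, so the frame
    # time is roughly the slower of the two per-frame costs.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical GPU-bound system: the CPU has headroom, the GPU does not.
print(fps(cpu_ms=8.0 + 4.0, gpu_ms=20.0))   # physics on the idle CPU  -> ~50 FPS
print(fps(cpu_ms=8.0, gpu_ms=20.0 + 5.0))   # physics on the busy GPU  -> ~40 FPS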
August 17, 2008 6:20:18 AM

Yeah, I know PhysX reduces performance, but it does so much less when running on the GPU. Therefore, if you are going to use it, you should offload it from the CPU as much as possible.
August 17, 2008 10:12:54 AM


So am I getting this right, then?
PhysX reduces the load on the CPU. But on a top-end system that needs the card running at its best, it causes a reduction in FPS at high resolutions, because the card is obviously doing more. So that leads me to think that at lower resolutions, or on a machine with a high-end GPU and a lower-than-ideal CPU, this would help reduce any CPU restrictions (bottlenecks) in the system.
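
Putting purely hypothetical per-frame costs on that reasoning (an illustrative sketch, not measurements): on a CPU-limited machine the picture flips, and offloading the physics work to a GPU that still has headroom can raise the frame rate.

# Made-up per-frame costs in milliseconds, purely to illustrate the bottleneck flip.
cpu_frame, cpu_physics = 20.0, 6.0   # slow CPU running game logic, plus physics
gpu_frame, gpu_physics = 12.0, 4.0   # fast GPU at low resolution, with headroom

fps_cpu_physics = 1000.0 / max(cpu_frame + cpu_physics, gpu_frame)
fps_gpu_physics = 1000.0 / max(cpu_frame, gpu_frame + gpu_physics)
print(fps_cpu_physics, fps_gpu_physics)   # ~38 FPS vs 50 FPS: offloading helps here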

So, long story short, are Nvidia billing these drivers as something they are not, or has the industry been hoodwinked into doing it for them? I have seen reviews, like the FiringSquad ones, that show increases in performance, and I have seen magazine reviews that are getting upset about the increase and are suggesting Nvidia are trying to cheat in the FPS wars again (3DMark, Vantage, etc.).

Again, your thoughts or any info appreciated.

Mactronix
August 17, 2008 11:49:25 AM

@mactronix, I haven't had a chance to play with them yet, but I have been wondering whether I will have access to the showcase level in GRAW2. I'll get back to you.
August 17, 2008 11:57:22 AM

Quote:
undeniable proof

Err, such as? [:mousemonkey:2]
August 17, 2008 12:17:08 PM


OK, thanks for the info so far, guys. I'm with strangestranger on this; I won't trust them until it's proven one way or another. I'm still surprised the industry seems to be sucked in by this, though.
@mousemonkey, yes, it will be interesting to know if you will get access to things originally designed to run with the add-in card; it could open up a small back catalogue of titles for some people.

Mactronix
August 17, 2008 1:33:58 PM

mactronix said:
OK, thanks for the info so far, guys. I'm with strangestranger on this; I won't trust them until it's proven one way or another. I'm still surprised the industry seems to be sucked in by this, though.
@mousemonkey, yes, it will be interesting to know if you will get access to things originally designed to run with the add-in card; it could open up a small back catalogue of titles for some people.

Mactronix

It works :) 

August 17, 2008 1:47:08 PM

One thing: everyone's always saying there aren't any challenging games out now to really push a lot of these newer top-end cards. To me, that'd be a perfect scenario to use PhysX, as you'd still have acceptable FPS while having PhysX as well.
August 17, 2008 3:26:42 PM

Ah! I really fouled up that one. :na: 

I forgot that Nvidia PhysX is artificially made multithreaded to the extreme in order to run on the GPU. In that state, running it on the CPU would be like running graphics, which is extremely multithreaded in nature, on the CPU.

The light added CPU load applies only to original, unaltered physics calculations, like those used in Crysis, which are based on an engine developed in-house. To be honest, blowing up a shed in Crysis and seeing a hundred differently shaped individual pieces falling looks better than bumping a single solid box in UT3. It's hard to notice that UT3 even uses physics calculations at all if you're not careful.

Besides, the FPS hit PhysX puts on GPU performance can be significant, while the CPU load from generic physics calculations does not affect FPS.
http://www.guru3d.com/article/physx-by-nvidia-review/4
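
To make the "artificially multithreaded" point concrete, here is a toy sketch (my own illustration, not NVIDIA's code): a conventional engine steps bodies one after another on the CPU, while a GPU-friendly formulation has to express the same update as one identical operation applied to thousands of bodies at once, which is exactly the shape of work a GPU is built for and a CPU gains little from.

import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])
DT = 1.0 / 60.0  # one 60 Hz frame

# CPU-style physics: a plain serial loop over individual bodies.
def step_serial(positions, velocities):
    for i in range(len(positions)):
        velocities[i] += GRAVITY * DT
        positions[i] += velocities[i] * DT

# GPU-style physics: the same update written as one wide, data-parallel
# operation over every body at once.
def step_parallel(positions, velocities):
    velocities += GRAVITY * DT
    positions += velocities * DT

positions = np.zeros((10000, 3))
velocities = np.zeros((10000, 3))
step_serial(positions, velocities)
step_parallel(positions, velocities)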
August 17, 2008 4:55:22 PM

Like I posted in another thread, it's too bad nVidia doesn't embrace DX10.1 and thus all the TWIMTBP games, because some of that impact would be offset by better FPS using DX10.1.
August 17, 2008 5:13:10 PM

JAYDEEJOHN said:
Like I posted in another thread, it's too bad nVidia doesn't embrace DX10.1 and thus all the TWIMTBP games, because some of that impact would be offset by better FPS using DX10.1.

Didn't Nvidia's old architecture prevent it from utilizing DX10.1? Wasn't that why they bribed Microsoft to push the optimizations from DX10 to DX10.1? :p 
August 17, 2008 5:16:15 PM

Yeah, too bad too, but at the time the only "DX10" cards out were the G80s, so they both punted, not just nVidia, because M$ had to have something to showcase their Vista OS.
August 17, 2008 7:29:00 PM


In that respect M$ really ballsed up. By allowing the DX10 spec to be what it was and not what it should have been, they not only shot themselves in the foot by making Vista undesirable performance/FPS-wise to gamers, they also held up graphics development being passed on to the end user. In their haste to rake in the bucks they released a new OS with half-arsed graphical support, and Nvidia were laughing all the way to the bank.
If they had waited and left things as they were meant to be, who knows where we could be now. Personally, having seen the tests with DX10.1, if that had been the spec behind the Vista release performance, I would have bought into it.

Mactronix
August 23, 2008 9:54:33 AM

Go get the drivers if you have a GPU that will use them. Not only did my Vantage score go up by just over 3000, but I also get about an extra 8-12 FPS in Crysis.

So they actually have real-world gaming advantages, which is the main thing.
August 24, 2008 2:26:47 AM

spac13 said:
Go get the drivers if you have a GPU that will use them. Not only did my Vantage score go up by just over 3000, but I also get about an extra 8-12 FPS in Crysis.

So they actually have real-world gaming advantages, which is the main thing.

8-12 FPS more? And you do realize that, like most games, Crysis uses a generic physics engine instead of PhysX, right?
August 24, 2008 2:51:25 AM

I said drivers... not just PhysX...
August 24, 2008 2:55:50 AM

I haven't heard of such increases from this driver, though I've heard it's good. Oh, and yeah, it doesn't have anything to do with the PhysX either, as Crysis isn't being run on PhysX. Dagger's right.
August 24, 2008 3:06:41 AM

JAYDEEJOHN said:
I haven't heard of such increases from this driver, though I've heard it's good. Oh, and yeah, it doesn't have anything to do with the PhysX either, as Crysis isn't being run on PhysX. Dagger's right.

If it did use PhysX, it would just tax the already heavily loaded GPU and decrease performance instead of increasing it. :p 
August 24, 2008 3:30:01 AM

Unfortunately. I wish it came for free, but that's not the case.
August 24, 2008 3:43:13 AM

JAYDEEJOHN said:
Unfortunately. I wish it came for free, but that's not the case.

Actually, I was surprised that the performance cost is so low. Physics calculations aren't as multithreaded as graphics by nature, and natively run far better on a CPU than a GPU. Generic physics engines like the one used in Crysis produce heavy effects at relatively low CPU load. The only reason PhysX can take advantage of the GPU's parallel processing at all is because it has been artificially restructured to do so. In the process, the CPU, which can't handle that degree of multithreading, performs poorly if it's made to emulate it.

They really did a hell of a job turning an apple into an orange. It's impressive. :p 
August 24, 2008 3:58:42 AM

That it is, but the average Joe doesn't understand all that, and thinks it's just another part of the game with no loss. Part of it can be blamed on hype. I've noticed the average Jill knows better though, heheh. OK, not so average.
August 24, 2008 7:17:02 AM

JAYDEEJOHN said:
That it is, but the average Joe doesn't understand all that, and thinks it's just another part of the game with no loss. Part of it can be blamed on hype. I've noticed the average Jill knows better though, heheh. OK, not so average.


Yeah, tell me about the hype, and the confusion this seems to be causing. It's not just your average Jill and Joe, either. Reviewers and journalists alike seem to have swallowed the whole GPU speed increase from PhysX, and as far as I can tell it's all based on Vantage results. These people really should know better. I'm looking into how it works so I can understand what's going on at a hardware-usage kind of level.
Mactronix

August 24, 2008 7:41:22 AM

People get swallowed up in hype all the time: R600, Crysis, Atom, now PhysX. We'll soon see if PhysX lives on long after its official release.
August 24, 2008 9:17:51 AM

PhysX doesn't run on the CPU; it never did and never will.
August 24, 2008 10:53:39 AM

PhysX runs in 3 modes:
Standard - one GPU renders both graphics + PhysX (not ideal, as you'll need a lot of GPU horsepower).
SLI mode - two GPUs render both graphics + PhysX.
Multi-GPU mode - GPU1 renders graphics and GPU2 renders PhysX (no SLI board needed).

So I think, to get the best performance, the last option is best if you really do want PhysX on your computer.
August 24, 2008 2:02:39 PM

iluvgillgill said:
PhysX runs in 3 modes:
Standard - one GPU renders both graphics + PhysX (not ideal, as you'll need a lot of GPU horsepower).
SLI mode - two GPUs render both graphics + PhysX.
Multi-GPU mode - GPU1 renders graphics and GPU2 renders PhysX (no SLI board needed).

So I think, to get the best performance, the last option is best if you really do want PhysX on your computer.

Keep in mind that running PhysX will increase CPU load alongside, no matter how you run it. Some parts of physics processing, like reintegration, simply cannot be rigged to work off the GPU. Traditional physics processing runs entirely off the CPU; PhysX works both the CPU and the GPU. Think of the GPU's role in it as more of an acceleration.
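
A toy sketch of that division of labour, with hypothetical function names rather than the real PhysX API: the CPU still owns the per-frame bookkeeping, and only the bulk solving step gets handed to the accelerator, so some CPU cost remains however you run it.

# Hypothetical sketch only; these names are illustrative, not real PhysX calls.
def broad_phase(bodies):
    # CPU-side bookkeeping: pair up bodies that might collide (here, all pairs).
    return [(a, b) for i, a in enumerate(bodies) for b in bodies[i + 1:]]

def solve_on_cpu(pairs, dt):
    return [0.0 for _ in pairs]          # stand-in for the expensive contact solve

def solve_on_gpu(pairs, dt):
    return [0.0 for _ in pairs]          # same result, produced by the accelerator

def step_physics(bodies, dt, gpu_available):
    pairs = broad_phase(bodies)          # stays on the CPU either way
    solve = solve_on_gpu if gpu_available else solve_on_cpu
    impulses = solve(pairs, dt)          # only this bulk step is offloaded
    return impulses                      # applying the results is CPU work again

step_physics(bodies=list(range(8)), dt=1.0 / 60.0, gpu_available=True)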
August 24, 2008 2:09:47 PM

Obviously, in-game physics is a good thing, but I'm not convinced that Nvidia's PhysX implementation is going to become the standard. I tried the drivers and some of the demos and was seriously underwhelmed. The demo with the water looks like goop. I've seen good in-game water (BioShock in DX10) without PhysX, I've clear-cut forests in Crysis, and I still recall being impressed with the destructible environments in Company of Heroes almost 2 years ago.

While recently deciding on a GPU upgrade, PhysX and CUDA almost convinced me to buy a GTX 280. In the end, I don't yet see a reason for me to need CUDA, because by all reports Badaboom sucks. As I previously stated, PhysX is underwhelming at this point. In the end, Nvidia's attempt to sell a second-rate card on the basis of added features wasn't convincing.

Having said that, Nvidia's strategy may pan out, but in the hardware race you don't buy hardware today gambling on future feature support.

I'd be very happy if, in the future, my old 8800 GTX could sit one slot down from my 4870X2 and handle physics.
August 24, 2008 2:14:19 PM

No1sFanboy said:
Obviously, in-game physics is a good thing, but I'm not convinced that Nvidia's PhysX implementation is going to become the standard. I tried the drivers and some of the demos and was seriously underwhelmed. The demo with the water looks like goop. I've seen good in-game water (BioShock in DX10) without PhysX, I've clear-cut forests in Crysis, and I still recall being impressed with the destructible environments in Company of Heroes almost 2 years ago.

While recently deciding on a GPU upgrade, PhysX and CUDA almost convinced me to buy a GTX 280. In the end, I don't yet see a reason for me to need CUDA, because by all reports Badaboom sucks. As I previously stated, PhysX is underwhelming at this point. In the end, Nvidia's attempt to sell a second-rate card on the basis of added features wasn't convincing.

Having said that, Nvidia's strategy may pan out, but in the hardware race you don't buy hardware today gambling on future feature support.

I'd be very happy if, in the future, my old 8800 GTX could sit one slot down from my 4870X2 and handle physics.

You're not supposed to be impressed. PhysX isn't inherently superior to existing physics processing that works off the CPU; it just works differently.

As for running Nvidia and ATI cards on the same system, you might get more driver problems than it's worth. :p 
August 24, 2008 2:34:29 PM

dagger said:
You're not supposed to be impressed. PhysX isn't inherently superior to existing physics processing that works off the CPU; it just works differently.

As for running Nvidia and ATI cards on the same system, you might get more driver problems than it's worth. :p 


Why should I not want to be impressed? Nvidia's selling point is that a GPU can do better physics than a CPU. What I wanted was to see the added power of a GPU show me something better than what I've seen before. Otherwise, their purchase of Ageia and their push for physics support in future titles that may only work with their hardware seems like a lot of waste.
August 24, 2008 8:57:29 PM

Quote:

I don't play too many games, and to me HL2 is still the benchmark for physics integration in a game.

The amount of physics in HL2 is tiny compared to games like Crysis. It's getting old. :p 
August 24, 2008 10:02:59 PM

No1sFanboy said:
Why should I not want to be impressed? Nvidia's selling point is that a GPU can do better physics than a CPU. What I wanted was to see the added power of a GPU show me something better than what I've seen before. Otherwise, their purchase of Ageia and their push for physics support in future titles that may only work with their hardware seems like a lot of waste.



Erm, actually, you may have misunderstood! It's meant to do physics BETTER/FASTER and NOT!! NOT!! do MORE. So you shouldn't see MORE of what you have already seen before. :lol: 
August 24, 2008 10:45:31 PM

dagger said:
The amount of physics in HL2 is tiny compared to games like Crysis. It's getting old. :p 

What use is the physics in Crysis? It doesn't make the game any more involved. There isn't even any ragdoll like in Far Cry; what were the idiots at Crytek thinking when they "forgot" about that? HL2's physics is more polished, too. I don't see entire building roofs in HL2 being held up at one corner by a thin wooden pole. I don't see trees getting stuck as they fall, spinning around and around, and then flying off. Yep, look at this vid I took: http://www.youtube.com/watch?v=X__IoCHTGPE

And what's with the floating "umbrella"?
August 24, 2008 11:03:43 PM

iluvgillgill said:
Erm, actually, you may have misunderstood! It's meant to do physics BETTER/FASTER and NOT!! NOT!! do MORE. So you shouldn't see MORE of what you have already seen before. :lol: 



What are you on about? Where you quoted me, I used the word "better" twice and never used the word "more". Either way, I have no clue what distinction you're trying to make.

The point is, once upon a time dedicated graphics became desirable because it brought "more" graphical detail, which was "better". Now Nvidia, picking up where Ageia left off, needs to show us "more" and "better" physics to make a case for PhysX. In case anyone does ever offend you by using the word "more", here it is in the words of Nvidia:
Quote:
On the PC, PhysX technology harnesses the power of any CUDA-enabled general-purpose parallel computing processor, including any NVIDIA GeForce® 8 Series or higher GPU, to handle 10-20 times more visual complexity than what’s possible on today’s traditional PC platforms.


http://www.nvidia.com/object/io_1218533603421.html

August 25, 2008 1:53:16 AM

But it never said it will somehow calculate/produce more with the same given workload.

It's like the Core 2 Quad compared to the dual core, where Intel made the claim that you can do twice as much as a dual-core system can. So do you expect a quad core to convert an MP3 and get 2 output files just because Intel said it can do twice as much?

edit:
Quote:
On the PC, PhysX technology harnesses the power of any CUDA-enabled general-purpose parallel computing processor, including any NVIDIA GeForce® 8 Series or higher GPU, to handle 10-20 times more visual complexity than what’s possible on today’s traditional PC platforms.


Yes, BUT that's only if 10-20 times more workload is being put on the GPU. It said it can HANDLE that, but that does not mean it will give 10-20 times more of something out of nothing. Say the game needs to calculate leaf movement: the CPU can only calculate 10 trees at the same time, whereas the GPU can calculate 100-200 trees at the same time before slowing down. But what if the game only needs to render 10 trees? Will the GPU physics render another 90-190 trees when it wasn't even asked to, just because Nvidia said it can HANDLE 10-20 times more?
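
That capacity-versus-demand point can be put in a couple of lines (hypothetical tree counts, just for illustration): the hardware only simulates what the game submits, up to its ceiling, so spare capacity produces nothing extra on its own.

# Hypothetical capacities, purely for illustration.
CPU_TREE_CAPACITY = 10    # trees the CPU could animate per frame before slowing down
GPU_TREE_CAPACITY = 200   # trees the GPU could animate per frame before slowing down

def trees_simulated(trees_in_scene, capacity):
    # The hardware never invents extra work: it simulates what the game asks
    # for, capped by its capacity. Spare capacity only matters if it gets used.
    return min(trees_in_scene, capacity)

print(trees_simulated(10, CPU_TREE_CAPACITY))  # 10 trees on the CPU
print(trees_simulated(10, GPU_TREE_CAPACITY))  # still 10 trees on the GPU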
August 25, 2008 2:30:09 AM

My question here is: are they developing with the help of *real* physicists? I mean, physics can get as nasty as your assumptions about what "real physics" is. Independent of how the hardware/software handles the issue, you can have a simplistic way of modelling a fall (e.g. simple gravity) or a relativistic, overly realistic fall which will only add more junk and give a close result in practical terms. That being said, maybe that's why the HL2 "physics" engine is still so good in my eyes too, because it was polished from the assumptions up. Having hyper-complex formulas just to show off is kinda silly even to point out ("visual complexity"?). The real brain work happens at the design table of the engine itself, not in the hardware that runs that engine. nVidia is playing a dangerous game here, and focusing on the hardware and not the engine itself is going to hurt badly.

Also, how much "thread-ability" can paper physics formulas get? Are they using relativistic or plain old Newtonian? Well, it shouldn't matter. Physics in games is a joke so far. Maybe for, let's say, AutoCAD, SolidWorks or Pixar it is indeed important to have this kind of flexibility, but then again, they don't have to render in real time, just produce accurate results.
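
For reference, the "simplistic" Newtonian fall that game engines actually use amounts to a couple of lines of per-frame integration. A minimal sketch, not any particular engine's code:

# Minimal Newtonian free fall, stepped once per 60 Hz frame with explicit Euler integration.
GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # seconds per frame

height, velocity = 10.0, 0.0
while height > 0.0:
    velocity += GRAVITY * DT
    height += velocity * DT
print("hit the ground at about %.1f m/s" % abs(velocity))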

My 2 pesos.

Cheers!