PhysX, Nvidia, and Reviewers.

August 18, 2008 9:53:13 AM

Hya mates

I guess creating this topic or "discussion" in the graphics area is the most appropriate. My question is quite simple.

Why don't we get reviews on PhysX? Why is almost every reviewer keeping their paws off it?

We've got free games, paid games, several apps, and tons of compatible cards. Every reviewer praises CUDA and all the advances CUDA is bringing to the table. So why don't we get a review on that? I don't mean only Tom's; the biggest sites just didn't touch it. It seemed like a freaking blank page.

PC gaming and gaming overall will evolve that way. In this case Nvidia is on the edge, and if the reviews are about "cutting-edge" hardware, why not test the software? Are reviewers afraid that they might like it? We already had one decent discussion about it here on the forums that drifted into a bit of a tech discussion, but a good one nonetheless.

But it was just one, and short-lived at that. I won't be buying an Nvidia card anytime soon, but honestly, is nobody touching it?
Has anybody (among the forum strollers, like myself) with a G80 to G92 already tried the pack? If so, what are your opinions on the matter?

Really, I think physics is the future of gaming, whether it be Nvidia's PhysX, Havok, or anything else.


August 18, 2008 10:04:02 AM

I think PhysX sucks. Always did and always will.
August 18, 2008 10:28:22 AM

bpogdowz said:
I think PhysX sucks. Always did and always will.


That is a technical, logical and enlightened opinion.

Not wanting to sound like a Nazi, but that opinion is of no value or consequence.


August 18, 2008 10:29:10 AM

It's still too early. As more games and apps make use of it, we will see those reviews. I'm thinking once a retail video converter using CUDA comes out, we'll see them.
August 18, 2008 9:12:40 PM

bpogdowz said:
I think PhysX sucks. Always did and always will.


I hate to say it, but I kind of agree. I mean, PhysX is a fad and its only purpose is to make you buy more GPUs.

The only logical step is to have extra CPU cores take care of that. And don't tell me the extra cores aren't powerful enough to do it.....
August 18, 2008 9:16:51 PM


As I mentioned in this thread I posted earlier http://www.tomshardware.co.uk/forum/253934-15-performan... there are reviews out there, and it seems that as far as PhysX goes, the actual real-life gaming performance doesn't stack up with the synthetic results (oh, what a surprise). The thing that gets me is how easily the reviews I have seen take it at face value and start slinging mud at Nvidia, accusing them of trying to distort benchmarks with dodgy drivers [again (allegedly)]. Vantage has a PhysX component to it, so I don't see the problem.

What I would like your thoughts on is an idea I had at work today: the ATI cards should lend themselves to doing this without a noticeable loss of performance, due to the way they work. With 800 SPs and, in the worst case, only 1 in 5 actively doing something, that leaves 640 SPs that could be doing something else, no? I believe the Nvidia cards use their SPs more fully, so they don't have the spare capacity, which in turn would hurt performance.

Mactronix :) 
August 18, 2008 9:36:10 PM

I agree it's too early to tell. Current Nvidia cards can run physics on the card, but they simply suck at it at the moment.
August 18, 2008 9:42:31 PM

invisik said:
I agree it's too early to tell. Current Nvidia cards can run physics on the card, but they simply suck at it at the moment.

But now you can play the 'Ageia Island' level in GRAW2, so all is right with the world. :whistle: 
August 18, 2008 9:51:11 PM

lol. I have 2 GTX 260s in SLI, and when I play UT3 PhysX maps I get like 25-40 fps. Without physics enabled I get over 100 fps.
August 18, 2008 11:13:35 PM

mactronix said:
The thing that gets me is how easily the reviews I have seen take it at face value and start slinging mud at Nvidia, accusing them of trying to distort benchmarks with dodgy drivers [again (allegedly)]. Vantage has a PhysX component to it, so I don't see the problem.


Well, the fact that something is doing double duty, when the test was created for something dedicated to just one task, doesn't strike you as producing inaccurate results?
If S3 were to use CPU co-processing in order to do a graphics-only test, would you not cry foul?
When ATi re-arranged the order in which shaders were loaded (the end-result image was correct, the shaders were just loaded more efficiently for that hardware, ADCB rather than ABCD), Futuremark said that was a violation of their terms for the benchmark. So since the nVidia drivers re-organize the way that Futuremark processes the test, isn't that the same?

When PPU support was added to Vantage it was done as a specific hardware implementation; emulating it in the drivers messes up the tests, since it allows the GPU to do 100% graphics one moment and then 100% physics the next, while in the real world some portion would be graphics and some portion physics. Even in SLI, where you could have one card for graphics and one for physics, during the Vantage tests the drivers let both cards run as 100% GPU and then as 100% PPU, which is not what would happen with a PPU and not what would happen in real gaming.

If you could lock GPU 0, 1, 3, or 4 into a PPU-only mode, then that would make it a realistic test; otherwise it simply over-inflates the numbers in a way that no application would experience when running PhysX, other than something non-graphical like Vantage, and how many of those apps would you see or even care about?

It's not technically cheating, since nV never applied to certify the drivers (although they claimed they would, and FM said they would approve them originally); however, since then FM has said they will not allow any GPU-PhysX-enhanced results in the Hall of Fame:
http://www.xtremesystems.org/forums/showthread.php?t=19...

I think that if no one had noticed the change of the DLLs, I wouldn't be surprised if the results had been approved and then later removed, after all the reviews had already been done with those earlier drivers.

The thing that a lot of people take issue with is that FM removed the DX10.1 component of their tests because they didn't want to favour one IHV over the other, then allowed this to go on as long as it did despite it doing precisely that. Add to that the statements at the beginning of all this, that 'all is well... and oh, by the way, Futuremark will be releasing a game with GPU physics sometime in the future....', and people started to question their neutrality.

I don't think it's cheating, but it's definitely not kosher, and it further relegates 3DMark to a benchmark that is only internally valid and does even less to reflect real-world applications/games.
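
To put rough numbers on the over-inflation point, here's a back-of-envelope sketch in plain C++; the figures are invented purely for illustration, not measured from Vantage or any real card:

#include <cstdio>

int main()
{
    // Invented peak figures for one hypothetical GPU running each job alone:
    float gfx_only_fps   = 60.0f;   // graphics sub-test: the GPU is 100% graphics
    float physx_only_ops = 600.0f;  // physics sub-test:  the GPU is 100% PhysX

    // A composite score effectively credits the card with both peaks at once.
    // In a real game the same card has to split its time, say 70% / 30%:
    float gfx_share = 0.70f, physx_share = 0.30f;
    printf("benchmark sees  : %.0f fps and %.0f physics ops/s (run one after the other)\n",
           gfx_only_fps, physx_only_ops);
    printf("in-game estimate: %.0f fps and %.0f physics ops/s (one GPU shared)\n",
           gfx_only_fps * gfx_share, physx_only_ops * physx_share);
    // A dedicated PPU, or a GPU locked into PPU-only mode, wouldn't inflate like this.
    return 0;
}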
August 18, 2008 11:26:54 PM

invisik said:
lol. I have 2 GTX 260s in SLI, and when I play UT3 PhysX maps I get like 25-40 fps. Without physics enabled I get over 100 fps.


Alright. But enabling physics adds something to the experience, no?

And by the way, what you're saying doesn't add up. Try disabling SLI and using one card for physics and the other for graphics. I could see you getting 40-50 fps (that's assuming the best possible scenario of a 100% gain, which neither SLI nor CF usually deliver).

In the driver you can select them; you could set your second GTX 260 to do physics only. Then come back and tell the guys here whether it sucks or not. It would be great to hear back from you.
August 19, 2008 12:17:07 AM

I'll give it a shot. Can you tell me step by step where I go so I can enable one of the cards for just physics?
August 19, 2008 12:40:19 AM

Not sure if this will help, but they mention Multi-GPU mode in the Guru3D review;
http://www.guru3d.com/article/physx-by-nvidia-review/3

Not sure whether it's strictly available with different GPUs (it wouldn't make sense if it weren't), but what you would then be doing is trading the SLI benefit of extra GPU grunt for dedicating that GPU to just the PhysX workload. Likely still a drop in performance, but not sure by how much exactly. Even Guru3D only tests the SLI setup in Multi mode; they don't show the difference between SLI GPU, SLI GPU+PhysX in Single mode and SLI GPU+PhysX in Multi mode, they only test that last option.
August 19, 2008 4:14:02 AM


Thanks for clearing that up, TGGA. I was unaware that they (Nvidia) were doing it like that, as none of the reviews/benchmarks I had seen bothered to mention it, and on the face of it using a PhysX driver to run a PhysX test seemed like a reasonable thing to do. The way some of the articles are worded, it's clear they don't/didn't know it themselves.

Mactronix
August 19, 2008 8:25:56 AM

Maybe, to the dismay of Huang, we will see multi-chip cards from nVidia: the main GPU plus a smaller one for PhysX. That's what should be done. Other apps being used in non-gaming situations will help to promote PhysX, and if I were nVidia, I'd be looking at adding a second chip, costs be damned.
August 19, 2008 9:25:42 AM

JAYDEEJOHN said:
Maybe, to the dismay of Huang, we will see multi-chip cards from nVidia: the main GPU plus a smaller one for PhysX. That's what should be done. Other apps being used in non-gaming situations will help to promote PhysX, and if I were nVidia, I'd be looking at adding a second chip, costs be damned.


Maybe not a second chip, but another core or cores. The Ageia board was made on a 130nm process, if I'm not mistaken. Shrinking it to 55nm or 65nm and stripping it down a bit would do the trick. Nvidia is getting better than Intel at making a bang out of a "renamed" product. It would be very logical for them to follow this path.

New GTX 360 and 380: die-shrunk GT200s with a PhysX PPU in there. Sounds logical!!!

Oh wait, there's a Badaboom review on Anand. I'm going to check it.

Badaboom: A Full Test of Elemental's GPU Accelerated H.264 Transcoder
http://www.anandtech.com/video/showdoc.aspx?i=3374

I know it is not a PhysX benchmark, but hey, it's something at least.
August 19, 2008 9:39:03 AM

TY for the link, I'm off reading...
August 19, 2008 9:55:06 AM

A little disappointed actually.

Kinda reminds me of ATi's AVIVO transcoding let-down.

Reduced image quality, limited options, and only effective on something noticeably better than a GF8600GTS.

Would've liked to have seen what the GF9600GT made of the tests.

It'll be interesting to see what the Cyberlink GPU-acceleration add-ons bring in the fall.

Also, I don't think adding another GPU to a card for PhysX makes sense; that adds board complexity, cost and another failure point. From nV's standpoint I don't know what would be best to do, but for most consumers buying an old GPU/card seems like the wiser choice.
August 19, 2008 10:01:46 AM

True, but mobo limitations play into that scenario, unfortunately. If nVidia goes more multi-chip in the future, this could change. And yeah, not the bang I was expecting from Badaboom heheh.
August 19, 2008 10:11:55 AM

The only thing for me is that those willing to pay the cost of an added chip on the PCB etc. will pay for a multi-PEG mobo as well (remember, a slot with likely 1-2x throughput but a physical 16x connector would be fine for the PhysX functions; it wouldn't require full SLI support).
August 19, 2008 10:25:53 AM

True, but the newbies who don't know any better could be sold on it. Who knows? Either there have to be better fps/cards, or it doesn't look that good. Though quads are a waste IMO; here's their chance to do some work.
August 19, 2008 10:33:36 AM

Well, IMO the physics should stay on the CPU, because while the GPU is rendering the game, what does the CPU do? Physics! Take that away and what's the point of having an extreme CPU?
August 19, 2008 11:23:04 AM

V3NOM said:
Well, IMO the physics should stay on the CPU, because while the GPU is rendering the game, what does the CPU do? Physics! Take that away and what's the point of having an extreme CPU?


In gaming, many parts of the rendering pipeline are still done by the CPU, and I don't see that changing soon. If it weren't so, every enthusiast would be rocking an Athlon 2000+ (the 8W part) with a 4870X2 or a GTX 280. Trust me on this one, the CPU still plays a major part.

@ JDJ

I don't mean another chip on the board, but another core or PPU on-die. It would have a faster interconnect with the GPU, skipping bridges and other "contraptions". About the Ageia chip, as I said earlier, I guess (I tried to validate my statement but couldn't find the info) it was made on 130nm or bigger. By stripping out some functionality/cores and die-shrinking it to 55nm, they could fit it within a GPU die. Of course, in this case it would make the GT200 die even bigger.... so, let's wait and see.

@Badaboom review.

Compatibility problems, somewhat decent performance, little flexibility. Yep, it ticks every box of an experimental technology. It won't take over the world, but it is a very nice start. As for "the average gamer doesn't use tomography or oil-drilling software", that's beside the point: specific workstation and server workloads that were bound to the CPU are now GPU workloads. Medically assisted IT is getting pretty common in home-based setups. I know this field, because I've worked closely with it.

For example, I built a home workstation for a neurosurgeon who did diagnostics on 3D images rendered on the fly from the patients' scans, taken from the front, left and top views. A SINGLE diagnostic could take over 4.7 GB, depending on the case and the person (yes, he was the only person I ever saw buying DVD-DL discs frequently). The machine at the time had SATA II drives in RAID 0, DDR2-800, an ATI X800 (for filters, not gaming; Nvidia image quality at the time was horrible for this job) and an Athlon X2 3800+ on Socket 939. All top stuff.

In preview mode the machine would do fine, but when working at full resolution (from the RAID 0 HDDs, not reading from DVD of course) it would stutter a bit during diagnostics. This year we upgraded it to a Q6600. It wasn't really an improvement, I might add. That Q6600 is now used for gaming, and he went back to his AMD. Let's hope for a CUDA version of this software soon.

A TL;DR version: CUDA will sell GPUs to people who would never buy them for gaming in the first place.
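
For anyone wondering what that kind of offload looks like, here is a minimal CUDA sketch of the per-slice filtering step a viewer like that runs constantly. The kernel name, image size and window/level values are my own for illustration, not from the actual software, and the upload of real scan data is left out:

// Hypothetical example: a per-pixel window/level filter moved from the CPU to the GPU.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void windowLevel(const unsigned short* in, unsigned char* out,
                            int n, float window, float level)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Map the raw 16-bit sample into an 8-bit display value centred on `level`.
    float lo = level - window * 0.5f;
    float v  = (in[i] - lo) / window;              // 0..1 inside the window
    v = v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);   // clamp
    out[i] = (unsigned char)(v * 255.0f);
}

int main()
{
    const int n = 512 * 512;                       // one slice of a scan
    unsigned short* dIn = 0;
    unsigned char*  dOut = 0;
    cudaMalloc((void**)&dIn,  n * sizeof(unsigned short));
    cudaMalloc((void**)&dOut, n * sizeof(unsigned char));
    cudaMemset(dIn, 0, n * sizeof(unsigned short)); // stand-in for the real slice upload

    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    windowLevel<<<blocks, threads>>>(dIn, dOut, n, 400.0f, 50.0f);
    cudaDeviceSynchronize();

    printf("filtered %d pixels on the GPU\n", n);
    cudaFree(dIn);
    cudaFree(dOut);
    return 0;
}

On the CPU that same loop walks every pixel of every view one after another, which is exactly the kind of work that left the Athlon and the Q6600 stuttering.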
August 19, 2008 11:36:02 AM

I didn't say where or how a second chip would be used, just that one could be. If done on the cheap, with regard to Ape's statements, it could still be done. And yes, it's still early on. I agree with Anand: open-source this, have everyone give a hand, see what comes of it. Open it up to the point that apps are so diversified it can't be ignored, which usually means someone will take advantage of it and make some money.
August 19, 2008 2:30:15 PM

I can't dedicate one card to physics; it won't let me select my other graphics card. Strange.
August 19, 2008 2:34:45 PM

Is there an Ageia PPU on the Nvidia GTX 2*0 series?
August 19, 2008 2:49:04 PM

invisik said:
I can't dedicate one card to physics; it won't let me select my other graphics card. Strange.


Do you have the latest driver? If so, I think you need to disable SLI first. Check your profiles in the Nvidia panel. I would love to help you more, but I don't know the Nvidia control panel....

@maximiza

Nope, not as far as we know. The card can do the work, but as far as I know there isn't a dedicated PPU in there.
August 19, 2008 3:31:12 PM

Yup, I have the latest drivers; I downloaded v177.83, which includes PhysX in the driver.