niz

Distinguished
Feb 5, 2003
903
0
18,980
Heh, you ATI fanbois make me laugh... just trying to find anything whatsoever bad to say about the nVidia card.

Who cares if dynamic branches take 0.95 ms on the 8800 instead of 0.5 ms on the 1950? The 8800 pounds the 1950 into the ground and dances on its grave in overall performance.
 
>> Heh, you ATI fanbois make me laugh... just trying to find anything whatsoever bad to say about the nVidia card.

NIZ, I've seen your posts, and you're the same as the guy above, only you do it for nV.
(I'll edit just to say "maybe", as I haven't noted Raystonn's posts, so I may be mischaracterizing him in relation to you; that may be insulting to him, because we know you shill for sure.)

>> Who cares if dynamic branches take 0.95 ms on the 8800 instead of 0.5 ms on the 1950? The 8800 pounds the 1950 into the ground and dances on its grave in overall performance.

Actually, re-read his post: the GF8800 doesn't do it in 0.95 ms, that's the X1800. The GF8800 does it in 11 ms, no decimal.

Dynamic branching is actually quite important, but going by the review at BeHardware:
http://www.behardware.com/articles/644-6/nvidia-geforce-8800-gtx-8800-gts.html

the impact won't be as detrimental in games, thanks to the very fast speed of the shaders, but it will likely cause issues for things like GPGPU and maybe even physics.
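
For anyone unsure what a "dynamic branch" actually is, here's a minimal made-up sketch in modern CUDA-style code (hypothetical kernel and data, nothing from the BeHardware review): the branch condition depends on per-element data, so the hardware only finds out which path to take at run time, and threads that run grouped together can disagree on it.

```cuda
// Hypothetical micro-example: a "dynamic branch" is a branch whose condition
// depends on per-thread / per-pixel data, so it can't be resolved ahead of time.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void dynamicBranchKernel(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // Data-dependent (dynamic) branch: decided at run time, per element.
    if (in[i] > 0.5f)
        out[i] = sqrtf(in[i]) * 2.0f;   // "expensive" path
    else
        out[i] = 0.0f;                  // cheap early-out path
}

int main()
{
    const int n = 1 << 20;
    float *in, *out;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = (i % 7) / 7.0f;   // mixed values -> neighbouring threads diverge

    dynamicBranchKernel<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();
    printf("out[3] = %f\n", out[3]);

    cudaFree(in); cudaFree(out);
    return 0;
}
```

On hardware with coarse branch granularity, a whole group of threads ends up paying for both the sqrtf path and the cheap path whenever its members disagree on the condition.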
 
>> The fact is, these fast shaders are taking 11 ms to do a single branch...

I understand; I was only pointing to your second post, actually, with regard to calculations, not branching.

I'm not disagreeing with your first part, but the second part is negated by the speed.

Two different items.

Scalar vs. vector is compensated for by speed, but the dynamic branching is still a major issue.

i.e. I agree with your first post, but I don't think your second matters as much for the equality; just like the Xbox 360, a unified design doesn't mean 1:1.

Anywhoo, gotta leave work, so I'll have to reply to you after I get my car from Dodge.
 

adi_alter_ego

Distinguished
Oct 28, 2006
15
0
18,510
If it's true, maybe it isn't such a big breakthrough, but I have seen benchmarks in a lot of games and they show improved image quality and improved framerates, especially at high resolution. From the end user's point of view, this is what we all want!
Fanboyism or not, the G80 is scoring high, and ultimately this is the card to go with for now, until ATI throws something into the DX10 war.

take care :)
 

Lord_CorDox

Distinguished
Nov 16, 2006
1
0
18,510
Howdy, I've been an NVIDIA user for as long as they have been around.. About three years ago I decided to get into game development, because I saw room for improvement in games in general and wasn't satisfied with the current standards..

I'm glad to see someone else is running into the same results I've been getting with a G80..

I've been searching all over looking for information on dynamic branching with a G80, thinking I must have somehow missed something.. At first I was quite excited when they announced improved dynamic branching for the G80, because it would allow me to do so much more; I already have an engine I've worked on that makes use of dynamic branching.. I finally got my 8800 GTX recently, and was quite shocked to see how my app was performing, after reading the tech briefs on the G80 and looking at the graphs they showed comparing it to ATI's latest DX9 card..

To me, when I think Dynamic Branching + DX10, I'm seeing new ways of doing things: ways to stop using scanline rasterizers and move to a new technique that doesn't have all the weaknesses of scanline, has virtually unlimited viewing distances, and overall allows deeper levels of interactivity in games. Remove the dynamic branching from that, and it's a HUGE bottleneck to all of these new things I'd like to support; I'd basically be stuck doing things the same way DX9 does, which is far too limiting for my project.
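
To make that concrete, here's a rough toy sketch (a made-up example in modern CUDA-style code, not my actual engine) of the kind of per-pixel ray march I'm talking about: every ray leaves the loop at its own data-dependent point, so if dynamic branching is slow, the whole technique is bottlenecked.

```cuda
// Toy per-pixel ray march against a signed distance function.
// The loop length and the early exits are entirely data-dependent,
// which is exactly the kind of dynamic branching a scanline rasterizer never needs.
#include <cstdio>
#include <cuda_runtime.h>

// Toy "scene": a sphere of radius 1 at the origin, as a signed distance function.
__device__ float sceneDistance(float x, float y, float z)
{
    return sqrtf(x * x + y * y + z * z) - 1.0f;
}

__global__ void rayMarch(float* depth, int width, int height)
{
    int px = blockIdx.x * blockDim.x + threadIdx.x;
    int py = blockIdx.y * blockDim.y + threadIdx.y;
    if (px >= width || py >= height) return;

    // Ray through this pixel; camera sits at (0, 0, -3) looking down +z.
    float dx = (px / (float)width  - 0.5f) * 2.0f;
    float dy = (py / (float)height - 0.5f) * 2.0f;
    float dz = 1.0f;
    float len = sqrtf(dx * dx + dy * dy + dz * dz);
    dx /= len; dy /= len; dz /= len;

    float t = 0.0f;
    // Dynamic branching: each pixel's ray exits this loop at a different point.
    for (int step = 0; step < 128; ++step) {
        float d = sceneDistance(dx * t, dy * t, dz * t - 3.0f);
        if (d < 0.001f) break;      // hit the surface -> early out
        t += d;
        if (t > 10.0f) break;       // missed everything -> early out
    }
    depth[py * width + px] = t;
}

int main()
{
    const int w = 256, h = 256;
    float* depth;
    cudaMallocManaged(&depth, w * h * sizeof(float));

    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    rayMarch<<<grid, block>>>(depth, w, h);
    cudaDeviceSynchronize();
    printf("depth at centre pixel: %f\n", depth[(h / 2) * w + w / 2]);

    cudaFree(depth);
    return 0;
}
```

A scanline rasterizer never runs that loop; a ray-based approach lives or dies by how cheaply the hardware handles it.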

I actually do consider myself a fanboi.. An NVIDIA fanboi for a long time now, which is why I hope this information gets out and NVIDIA is made clearly aware, because I want them to remain competitive. What it all comes down to is who can help make projects like mine a reality. If I have no choice, I'm willing to take a look at ATI for the sake of remaining objective.

I'm tired of seeing games that have nearly identical gameplay with a different theme or setting, or just the same game with improved visuals.. End, the clone wars must.

-Regards,
CorDox
 

celewign

Distinguished
Sep 23, 2006
1,154
0
19,280
How much does that extra couple of milliseconds mean? What is dynamic branching, anyway?

Enlightenment. Someone, please, I'm too lazy to Wiki it.
-cm
 

airalex1919

Distinguished
Jun 3, 2006
63
0
18,630
Hmmm, that's very interesting.
Thanks for the heads up!
Yeek! Wonder how they'll perform when some of the more complex-shader DX10 games come out.
Hopefully not as bad as you predict, or lots of people are going to be disappointed!
 

rodney_ws

Splendid
Dec 29, 2005
3,819
0
22,810
>> Heh, you ATI fanbois make me laugh... just trying to find anything whatsoever bad to say about the nVidia card.

>> Who cares if dynamic branches take 0.95 ms on the 8800 instead of 0.5 ms on the 1950? The 8800 pounds the 1950 into the ground and dances on its grave in overall performance.

I think the purpose of the post by the "fanboi" was to point out the possibility that future titles will really be hampered by the 8800's hardware. No one is disputing its present-tense domination... they're just questioning its position once ATI has a DX10 card.

I've got an Nvidia in my desktop and an ATI in my laptop... so I'm not sure what you can accuse me of being.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
Even if true, I don't think it directly affects the G80 as of now; however, this may be an issue once DirectX 10 is really in use. Good thing I'm waiting until March for my video card upgrade.
 

piesquared

Distinguished
Oct 25, 2006
376
0
18,780
This is interesting, as IMO it adds weight to the "see what R600 has to offer" argument. If it is like the G80 in that its unified shaders are very fast, and it also has the dynamic branching abilities of the X1000 series cards (is that possible?), then in future games that make use of complex shaders, will it be a lot better able to maintain performance in DX10 games at high settings?

What I'm wondering is, how long into the future will we have to wait for the use of complex shaders if developers back NVIDIA?
 
>> I don't think it directly affects the G80 as of now,

Well it does affect the G80 now, and likely forever (unless some driver could fix what in every way appears to be a hardware limitation). Whether it will matter in the near term or long term is another story.

It may be like the FX in that the weaknesses aren't exposed as a 'current' concern in some game until long after people have moved on to the G90/R700, but who knows for sure at this point.

One thing that would affect it now is the G80's potential usefulness for things like Folding@Home.
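
To illustrate (purely a hypothetical sketch in modern CUDA-style code, not actual Folding@Home code): GPGPU work like that leans on data-dependent cutoff tests inside hot loops, and threads running side by side routinely disagree on them, which is exactly where coarse or slow branching hurts.

```cuda
// Hypothetical sketch of a GPGPU-style workload: an N-body-ish force loop
// with a cutoff test. The cutoff is a data-dependent branch taken millions
// of times, and neighbouring threads frequently take different sides of it.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void cutoffForces(const float* pos, float* force, int n, float cutoff2)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float f = 0.0f;
    for (int j = 0; j < n; ++j) {
        float d = pos[j] - pos[i];
        float d2 = d * d + 1e-6f;
        // Dynamic branch: skip far-away pairs. Which pairs are "far" depends
        // entirely on the data, so threads in the same group diverge here.
        if (d2 < cutoff2)
            f += d / (d2 * sqrtf(d2));   // crude 1/r^2-style contribution
    }
    force[i] = f;
}

int main()
{
    const int n = 4096;
    float *pos, *force;
    cudaMallocManaged(&pos, n * sizeof(float));
    cudaMallocManaged(&force, n * sizeof(float));
    for (int i = 0; i < n; ++i) pos[i] = (i * 37 % 1000) * 0.01f;   // scattered 1D positions

    cutoffForces<<<(n + 255) / 256, 256>>>(pos, force, n, 4.0f);
    cudaDeviceSynchronize();
    printf("force[0] = %f\n", force[0]);

    cudaFree(pos); cudaFree(force);
    return 0;
}
```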
 

sir_roadkill

Distinguished
Apr 11, 2006
30
0
18,530
Might as well wait for the R600 to compare in that case. However, I do want to play FEAR with full eye candy during the uni summer break. Might get a GTS.
 

tr2448

Distinguished
Aug 6, 2006
11
0
18,510
Incredible fanboy BS. Many investigations of shader performance show the G80 superior. Not even worth a reply, except for flame-bait extinguishing.
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
For our project, the X1950XTX is about 1,000 times faster for the current scene complexity. As the scene complexity increases (which it definitely will), projections are that the G80 will be outperformed by a factor of approximately 1,000,000. We use lots of dynamic branching to manage large sets of data in the shader. The G80 just isn't up to the task. I'm sure the R600 will only make this worse.

Performance in your DX9 games is nice, but you should be looking to the future.

-Raystonn

>> X1950XTX is about 1,000 times faster
>> the G80 will be outperformed by a factor of approximately 1,000,000

Dude, you just sound ridiculous when you imply that an X1950XTX is ever gonna be faster than a G80 in overall performance, let alone by the ridiculous thousand or million times you state.

>> DX9 games is nice, but you should be looking to the future.
Yeah, it's a shame the G80 doesn't do DX10. Maybe I should buy the currently available ATI/AMD card instead </sarcasm>