Does GTX 280 SLI have an unnecessarily high CPU overhead?

July 23, 2008 12:36:24 AM

http://techreport.com/r.x/r700-preview/crysis-1920-very...

Something about this particular screenshot, which was linked in another thread, might go a long way toward explaining why the GTX 280 is quite evidently not working as it should be, in SLI or otherwise.

There is something fundamentally odd about 9800 GTX SLI having higher performance than GTX 280 SLI - and I don't think it has anything to do with driver maturity.

On that note, maybe there is a driver maturity issue - is the software driver somehow burning so much CPU time that it drags the framerates down? Is the SLI bridge hardware on this card flawed somehow? Just looking at that screenshot tells me there is something quite seriously wrong with the GTX 280's SLI performance.

Any comments?
July 23, 2008 12:40:48 AM

Maybe the new cards need more kick from the CPU to reach their full potential?
July 23, 2008 12:44:00 AM

Sure, we've seen that before.

However, the 4870 and 4870X2 aren't having this problem - which implies to me there has to be a /specific/ reason why it simply is not scaling worth a damn.

Even in some of the other tests, this setup is quite evidently being held back by CPU power - but the real question is why it's being held back this much when the AMD solution's scaling isn't. If anything, I would expect their solution to take more of a hit, all other things being equal.
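
One rough way to put a number on the scaling is to compare the dual-card FPS against double the single-card FPS. A quick Python sketch - the FPS values below are placeholders for illustration, not the actual TechReport figures:

def scaling_efficiency(single_fps, dual_fps):
    """Dual-GPU speedup expressed as a fraction of the ideal 2x."""
    return dual_fps / (2.0 * single_fps)

# (single-card fps, dual-card fps) - numbers made up for illustration
setups = {
    "GTX 280 SLI":  (30.0, 40.0),
    "9800 GTX SLI": (25.0, 44.0),
    "4870 CF":      (28.0, 50.0),
}

for name, (single, dual) in setups.items():
    print(f"{name}: {dual / single:.2f}x speedup, "
          f"{scaling_efficiency(single, dual):.0%} of ideal")

A setup scaling noticeably worse than its competitors at GPU-heavy settings like these would point toward a platform or driver limit rather than the shaders themselves.
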
July 23, 2008 12:51:27 AM

Is it possible NVIDIA is doing it on purpose? You know how NVIDIA and Intel are enemies now - maybe they're doing this deliberately, just "testing". Maybe VIA and NVIDIA have something up their sleeve to throw Intel off: they might be working on something where the cards don't perform as well on Intel CPUs, and if VIA releases a new CPU, it works well with theirs. Intel doesn't want to license SLI, so what option does NVIDIA have? And VIA is against Intel. Just my wild theory, lol.
July 23, 2008 12:54:19 AM

That is a pretty wild one - but given that NVIDIA is losing a great deal of money right now, I don't think that's the reason.

I imagine there has to be a logical explanation for why this is affecting NV's cards more than AMD's, but I just can't be damned to figure it out.
July 23, 2008 12:56:02 AM

Now when you say 4870X2... does that mean CF, or the single card that's coming out in less than a month?
July 23, 2008 12:57:42 AM

Do you think it's just an architecture problem with the GT200?
July 23, 2008 1:06:11 AM

Eklipz330 - go to techreport.com and look at the 4870X2 preview article; you will see that the 4870X2 is typically a little bit ahead of 4870 CF, and that 4850/4870 CF scaling is usually noticeable in everything but Crysis (which is very software-bound for AMD at the moment).

Invisik - it was actually something of a suspicion I had, because I've heard the phrase used before when describing SLI technology. To paraphrase: SLI is not just about throwing two cards together; it's about how it interacts with the entire platform.

NV claimed they were reluctant to give Intel a license for SLI tech for this reason. Whether that is valid or not, I really couldn't tell you.

Now that said, I use GTX 260s in SLI myself - but I run them on a 4.05 GHz C2D. I can say with absolute certainty that my min and avg framerates are higher than the GTX 280 SLI numbers from that TechReport article - but I'm limited to testing at 1680x1050, since I only own a 22" display, so I can't validate the higher resolutions.

Regardless, if I clock my processor back to stock frequencies, I notice a marked difference in my framerates, especially in Crysis.

In all practical terms this really doesn't bother me, as I have a solid and stable build that handles the 4 GHz OC quite well; it does puzzle me a great deal, however, why this configuration at least /seems/ very CPU sensitive - especially when the AMD solution is nowhere near as CPU sensitive. That obviously makes the AMD solution the better of the two for most users, but it doesn't mean I'm not still curious about why this phenomenon is so evident.
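
One way to quantify that CPU sensitivity: if a game were fully CPU-limited, FPS would scale roughly in proportion to CPU clock; if fully GPU-limited, it wouldn't move at all. A back-of-the-envelope sketch in Python (the FPS values are hypothetical, and the stock clock is assumed to be 3.0 GHz purely for the example):

def cpu_sensitivity(fps_stock, fps_oc, ghz_stock, ghz_oc):
    """~1.0 means FPS tracks the CPU clock (CPU-limited); ~0.0 means GPU-limited."""
    return (fps_oc / fps_stock - 1.0) / (ghz_oc / ghz_stock - 1.0)

# Hypothetical benchmark numbers for a stock-vs-4.05 GHz comparison
print(cpu_sensitivity(fps_stock=32.0, fps_oc=41.0,
                      ghz_stock=3.0, ghz_oc=4.05))  # ~0.80 -> heavily CPU-bound

A value near 1.0 at playable settings would back up the idea that the driver or platform, not the GPUs, is setting the pace.
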
July 23, 2008 1:12:18 AM

I get what you're saying now - it makes me curious too. Hmm, just wondering: what does the CPU do while playing games? I mean the AI, the physics? Isn't most of that being done on the GPU?
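
For reference, quite a bit still runs on the CPU every frame: AI and game logic, physics (unless it's offloaded), scene traversal, and building the draw calls the driver translates into GPU commands - and that last step is exactly where driver overhead would show up. A simplified sketch of the per-frame division of labor (the function names are illustrative, not any real engine's API):

def update_ai(state):
    pass  # pathfinding, decision making - CPU

def step_physics(state, dt):
    pass  # collision and dynamics - CPU (or offloaded, e.g. to PhysX)

def build_draw_calls(state):
    return []  # scene traversal, batching, state sorting - CPU

def submit(draw_calls):
    pass  # driver validates and translates to GPU commands - also CPU time

def frame(state, dt):
    update_ai(state)                 # CPU
    step_physics(state, dt)          # CPU
    submit(build_draw_calls(state))  # CPU-side driver work; the GPU renders asynchronously

frame({}, 1.0 / 60.0)

If the driver's submit path is expensive, every frame pays for it no matter how fast the GPUs are - which is one plausible reading of the SLI numbers in this thread.
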
July 23, 2008 1:18:14 AM

http://www.anandtech.com/video/showdoc.aspx?i=3354&p=3

Even on this site they are using the 177.34 drivers for the GT200 cards - and the 177.41 drivers were readily available when they benchmarked the 4870X2 preview.

Why do I have a feeling I'm going to be rolling back drivers to see if there are performance differences.... X_X
July 23, 2008 1:29:59 AM

Does 177.41 make that much of a difference? Usually drivers improve performance by like 5-10%.
July 23, 2008 2:27:06 AM

I'm not sure - I started out on the 177.41; I didn't get mine at launch.

I'll have to check another day to be sure - I doubt that's it, though.
July 23, 2008 9:47:05 PM


So the card that is meant to be lower down the pecking order scales better than the new all-singing, all-dancing one. And you haven't seen this before???

I don't see any logical reason, as far as the specs go, to say it should or shouldn't be the case. Either way, I can't say I'm that bothered about it.

It would seem strange if it had much to do with CPU usage, though, unless the card is really bad in that department - in which case it would affect the single-card score as well.

Mactronix
July 23, 2008 11:09:23 PM

Rig it up with a Skulltrail, dual QX9770s at 6.0 GHz, and see if it's bottlenecked by the CPU.
July 24, 2008 12:01:49 AM

sabot00 said:
Rig it up with a Skulltrail, dual QX9770s at 6.0 GHz, and see if it's bottlenecked by the CPU.



NOVEL IDEA - I'm taking hardware donations for that project :bounce:
Anonymous
July 24, 2008 2:28:38 AM

Well, I think it was always suspected that the new generation of graphics cards would be CPU bottlenecked. The faster the processor, the more frames - usually that effect dies off at higher resolutions, but here it doesn't. I'm thinking 4 GHz+ processors are needed to really drive these things, and since most people don't have 4 GHz+ processors, we don't ever really see how good these cards are. Hopefully not only will Nehalem's architecture reduce the bottlenecking, but its supposedly insane OCing abilities (which seem to be true...) should allow higher clocks that really reduce it.

Although this may not actually be the case... and the cards could just plain suck :kaola:
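
That "dies off at higher resolutions" intuition can be modeled crudely: per-frame time is roughly max(CPU time, GPU time), where the CPU side shrinks with clock speed and the GPU side grows with pixel count. A toy Python model - all constants are invented for illustration, not measured from any real card:

CPU_MS_AT_1GHZ = 75.0   # invented: ms of CPU work per frame at 1 GHz
GPU_MS_PER_MPIX = 10.0  # invented: ms of GPU work per megapixel

def fps(cpu_ghz, width, height):
    cpu_ms = CPU_MS_AT_1GHZ / cpu_ghz
    gpu_ms = GPU_MS_PER_MPIX * (width * height) / 1e6
    return 1000.0 / max(cpu_ms, gpu_ms)  # the slower side sets the frame time

for res in [(1280, 1024), (1680, 1050), (1920, 1200), (2560, 1600)]:
    print(res, f"3.0 GHz: {fps(3.0, *res):5.1f} fps,",
               f"4.0 GHz: {fps(4.0, *res):5.1f} fps")

In a model like this, a faster CPU only buys frames while the CPU term dominates; once the GPU term takes over at high resolutions, the clock stops mattering. Cards that still look CPU-bound at 1920 and above would be the suspicious case.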