Something about that screenshot linked in the other thread might go a long way toward explaining why the GTX 280 is quite evidently not working as it should, in SLI or otherwise.
There is something fundamentally odd about 9800 GTX SLI outperforming GTX 280 SLI, and I don't think it has anything to do with driver maturity.
Then again, maybe there is a driver maturity issue: is the software driver somehow dragging CPU performance into the gutter and driving down the framerates? Is the SLI bridge hardware on this card flawed somehow? Just looking at that screenshot tells me something is quite seriously wrong with the GTX 280's SLI performance.
However, the 4870 and 4870 X2 aren't having this problem, which implies to me there has to be a /specific/ reason why it simply is not scaling worth a damn.
Even in some of the other tests, this setup is quite evidently being held back by CPU power. The real question is why it's held back this much when the AMD solution's scaling isn't. All other things being equal, I would expect their solution to take a similar hit.
Is it possible NVIDIA is doing it on purpose? You know how NVIDIA and Intel are enemies now; maybe they're doing this deliberately, just "testing". Maybe VIA and NVIDIA have something up their sleeve to throw Intel off: they might be working on something where the cards don't perform as well on Intel CPUs, but work well if VIA releases a new CPU. Intel doesn't want to license SLI, so what option does NVIDIA have? And VIA is against Intel. Just my wild theory, lol.
Eklipz330 - go to techreport.com and look at the 4870 X2 preview article; you'll see that the 4870 X2 is typically a little ahead of 4870 CF, and that 4850/4870 CF scaling is usually noticeable in everything but Crysis (which is very software-bound for AMD at the moment).
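Just to pin down what "scaling" means in these comparisons, here's a quick sketch of the usual arithmetic. The framerates below are made-up examples, not numbers from the techreport article:

```python
# Multi-GPU scaling math; the FPS figures here are hypothetical
# illustrations, not measurements from any review.

def scaling_factor(single_gpu_fps, dual_gpu_fps):
    """Speedup from adding a second card (2.0 = perfect scaling)."""
    return dual_gpu_fps / single_gpu_fps

single = 40.0   # hypothetical single-card average FPS
dual = 68.0     # hypothetical dual-card (SLI/CF) average FPS

speedup = scaling_factor(single, dual)
efficiency = speedup / 2.0  # fraction of the ideal 2x

print(f"speedup: {speedup:.2f}x, efficiency: {efficiency:.0%}")
# prints "speedup: 1.70x, efficiency: 85%"
```

Anything well under 2.0x (and especially under the single card's own headroom) is what people mean by "not scaling".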
Invisik - it was actually something of a suspicion I had, because I've heard the phrase used before when describing SLI technology. I'll paraphrase: SLI is not just about throwing two cards together; it's about how it interacts with the entire platform.
NV claimed they were reluctant to give Intel a license for SLI for this very reason. Whether that's valid or not, I really couldn't tell you.
Now that said, I use GTX 260s in SLI myself, but I run them on a 4.05 GHz C2D. I can say with absolute certainty that my min and avg framerates are higher than the GTX 280 SLI numbers from that techreport article, but I'm limited to testing at 1680x1050 since I only own a 22" display, so I can't validate the higher resolutions.
Regardless, if I clock my processor back to stock frequencies, I notice a marked difference in my framerates, especially in Crysis.
In practical terms this doesn't really bother me, since I have a solid, stable build that handles the 4 GHz OC quite well. But it does puzzle me a great deal why this configuration at least /seems/ so CPU sensitive when the AMD solution is nowhere near as sensitive. That obviously makes the AMD solution the better of the two for most users, but I'm still curious why the phenomenon is so pronounced.
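For anyone wanting to put a number on that CPU sensitivity: compare framerates at two CPU clocks and see how much of the clock increase shows up as framerate. A ratio near 1.0 means fully CPU-bound, near 0.0 means fully GPU-bound. A rough sketch with hypothetical numbers (not my actual results):

```python
# Rough CPU-sensitivity estimate: fraction of a CPU clock increase that
# shows up as extra framerate. All numbers are hypothetical examples.

def cpu_sensitivity(fps_low, fps_high, clock_low_ghz, clock_high_ghz):
    """1.0 = framerate scales 1:1 with CPU clock (fully CPU-bound),
    0.0 = framerate unchanged by CPU clock (fully GPU-bound)."""
    fps_gain = fps_high / fps_low - 1.0
    clock_gain = clock_high_ghz / clock_low_ghz - 1.0
    return fps_gain / clock_gain

# Hypothetical: stock 3.0 GHz vs the 4.05 GHz overclock
print(cpu_sensitivity(45.0, 58.0, 3.0, 4.05))
```

Running the same comparison on the AMD setup and getting a much lower ratio would confirm the asymmetry described above.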
Rig it up on a Skulltrail board with dual QX9770s at 6.0 GHz and see if it's still bottlenecked by the CPU.
NOVEL IDEA. I'm taking hardware donations for that project.
Well, I think it was always suspected this new generation of graphics cards would be CPU bottlenecked. The faster the processor, the more frames; usually that effect dies off at higher resolutions, but here it doesn't. I'm thinking 4 GHz+ processors are needed to really drive these things, and since most people don't have them, we never really see how good these cards are. Hopefully Nehalem's architecture will reduce the bottlenecking, and its supposedly insane overclocking ability (which seems to be real) should allow higher clocks to reduce it further.
Although that may not actually be the case, and the cards could just plain suck.