GeForce GTX 295 Vs. GTX 275 SLI: When Two Are Better Than One

Benchmark Results: 3DMark Vantage

With clock speeds normalized, the two GeForce GTX 275s in SLI and the GeForce GTX 295 perform nearly identically. Restoring the two single-GPU cards to their standard clocks, however, yields a perceptible boost in both the overall suite and GPU scores. Given that synthetics are often worst-case indicators, we might be inclined to expect very little deviation as we move to real-world testing, but that is not the case.
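
As a rough sanity check on how much restoring stock clocks should buy, here is a quick back-of-the-envelope sketch in Python. It assumes roughly linear scaling with clock speed and plugs in what we understand to be the boards' reference clocks (they are not listed in this excerpt); treat it as an illustration, not part of our test methodology.

    # Back-of-the-envelope look at the clock headroom between GTX 295 speeds
    # and stock GTX 275 speeds (illustration only; assumes linear scaling).
    gtx295 = {"core": 576, "shader": 1242, "memory": 999}    # MHz, reference clocks
    gtx275 = {"core": 633, "shader": 1404, "memory": 1134}   # MHz, reference clocks

    for domain in ("core", "shader", "memory"):
        gain = gtx275[domain] / gtx295[domain] - 1
        print(f"{domain}: +{gain:.1%}")   # ~9.9%, ~13.0%, ~13.5%

    # A fully GPU-bound score could pick up on the order of 10-13% from the
    # clock difference alone; real results land lower because CPU overhead
    # and other fixed costs don't scale with GPU clocks.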

This thread is closed for comments
51 comments
  • been waiting for more on the 295...
    0
  • I already ordered 2 295's... $504 each.
    -9
  • "Intel Core i7 920 Extreme (Bloomfield)" (page 3)

    there's no extreme version of i7 920, nor is it bloomfield...
    -7
  • What the hell is up with the underclocked cards outperforming the others in H.A.W.X.?

    Can the author of the article comment on what they think is going on there?
    5
  • reasonablevoice: What the hell is up with the underclocked cards outperforming the others in H.A.W.X.? Can the author of the article comment on what they think is going on there?


    Happened in WiC w/o AA as well. Difficult to say what went on there, but the results are repeatable. Probably more important, though, is that when more of an emphasis is put on the graphics subsystem, you see those stock-clocked boards take the lead, as we'd expect.
    0
  • 1. Very good article. Unlike some other authors' articles on this site, this one is solid (from the test system down to the conclusion) and interesting; this is what I always expect from Chris.
    2. As for the strange issue in L4D, HAWX, and WiC where the slower 275s beat the faster ones... odd indeed. Is there any chance the normally clocked cards automatically clocked down to 2D mode or something in-game? In other words, did GPU usage drop due to a CPU bottleneck or whatever, and the cards' driver decided to clock down to save energy? I've seen NVIDIA and ATI cards do that. The monitoring utility in RivaTuner could have revealed such things, since it shows real-time clocks. BTW, what software did you use to downclock?
    It would be funny to consider downclocking our cards to 'gain' performance!
    3. I hope the new (single-PCB) 295 will drop in larger quantities; perhaps it will be more practical than the current one and will tip the balance here in its favor.
    http://www.techpowerup.com/img/09-05-12/13c.jpg
    1
  • Can you put the "online shop" section underneath the "Next" button for the next page? It's really annoying and inconvenient to have it positioned within the article as it seems to be.
    Thanks
    8
  • Please benchmark in Very High.
    4
  • rags_20: Please benchmark in Very High.


    I noticed that too. If I owned that kind of hardware, I would be playing every game at the highest settings, even if it is Crysis.
    1
  • Very good article, thnx
    1
  • Who the f$%& spends $500 on a stupid graphics card...

    Good article anyhow ;)
    -8
  • BTW, why don't you OC that CPU to 3GHz+?

    Who buys an i7 to use it at stock?
    2
  • stlunatic: BTW, why don't you OC that CPU to 3GHz+? Who buys an i7 to use it at stock?


    People who worry about voiding their warranty, and people who buy from HP/Dell, etc.
    0
  • People who buy an HP/Dell don't buy GTX 275s in SLI...

    I feel many of these games may have been bottlenecked by that CPU. Would've liked to see these tests at 3.5GHz.
    1
  • Quote:
    The most striking result here is the drop from 1920x1200 to 2560x1600. The same bug seen in Crysis manifests itself here as well.


    This isn't a bug, nor is it fixable by a driver update. It's called not having enough VRAM to handle all those MASSIVE textures at quadruple their on-screen resolution. The same thing happens when I move from 1920x1200 to 2048x1536 on my 4870. The only solutions are smaller textures or more VRAM. This is why the "professional" cards (FireGL/FireSTREAM and Quadro/Tesla) will often have 2-4 times the framebuffer of their desktop counterparts.
    4
  • Daeros: This isn't a bug, nor is it fixable by a driver update. It's called not having enough VRAM to handle all those MASSIVE textures at quadruple their on-screen resolution. The same thing happens when I move from 1920x1200 to 2048x1536 on my 4870. The only solutions are smaller textures or more VRAM. This is why the "professional" cards (FireGL/FireSTREAM and Quadro/Tesla) will often have 2-4 times the framebuffer of their desktop counterparts.


    If the problem is VRAM, then why isn't the same result replicated in the GTX 275 benchmarks? Bear in mind SLI setups can only address half the available RAM, so in this case both cards have the same amount of VRAM.
    1
  • It would have been interesting to see the performance of 2 GTX 295 in SLI. Let's face it, if GTX 275 SLI can outperform a single GTX 295, then the _ONLY_ argument left for a 295 is SLI.
    -1
  • Well, let's see: the 275 at stock clocks went from 31.3 to 7.4, the 275 at 295 speeds went from 28.2 to 8.1, and the 295 went from 28.7 to 7.9. So you tell me how it didn't happen in the 275 benchmarks.

    PS: If you want validation of my remarks, just take a look at all the benchmarks comparing the 4870 512MB to the 4870 1GB. You can see the exact same thing there. Or, if you want to stick with the green, go back to when the 8800GT 256MB came out. At low resolutions it was fine, but crank up either the resolution or the AA/AF and it chokes.
    1
  • Minor correction to my first post- I meant 1920x1440, not 1920x1200.
    0
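
For anyone following the VRAM discussion in the comments above, here is a rough Python sketch of how quickly render-target memory grows with resolution. The assumptions (4x MSAA, 32-bit color, double buffering) are illustrative rather than a description of the actual test settings, and the 896MB figure is simply the local memory each GPU has to itself, whether on a GTX 275 or on either half of a GTX 295.

    # Rough render-target footprint at the two resolutions discussed above
    # (illustration only: 4x MSAA color + depth plus resolved front/back
    # buffers; real drivers allocate considerably more than this).
    def render_target_mb(width, height, msaa=4, bytes_per_pixel=4):
        msaa_surfaces = width * height * msaa * bytes_per_pixel * 2  # color + depth samples
        resolved = width * height * bytes_per_pixel * 2              # front + back buffers
        return (msaa_surfaces + resolved) / (1024 * 1024)

    for res in [(1920, 1200), (2560, 1600)]:
        print(res, round(render_target_mb(*res)), "MB")
    # (1920, 1200) -> ~88 MB; (2560, 1600) -> ~156 MB, carved out of the same
    # ~896 MB per GPU that also has to hold textures and geometry. Once the
    # working set spills past local memory, frame rates collapse the way the
    # 2560x1600 numbers in this thread suggest.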