GeForce GTX 295 In Quad-SLI

Benchmark Results: 3DMark Vantage

3DMark Vantage performance scales almost perfectly with the number of graphics processors, giving AMD's architecture a slight advantage when “Disable PPU” is selected. The reason we disabled PhysX for this benchmark is that 3DMark equates the added feature with added speed, then folds this supposed performance gain into the CPU score.

GPU scores don’t change when 3DMark’s “Disable PPU” setting is chosen, and performance scaling is nearly identical to that of the overall score.

With 3DMark's PPU setting disabled, the CPU scores are as consistent as we’d expect them to be.
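To put "scales almost perfectly" into numbers, here is a minimal Python sketch of how multi-GPU scaling efficiency can be computed from benchmark scores; the figures in the example are hypothetical placeholders, not our measured results.

# Minimal sketch: multi-GPU scaling efficiency from benchmark scores.
# All scores below are hypothetical placeholders, not measured results.

def scaling_efficiency(single_score: float, multi_score: float, gpu_count: int) -> float:
    """Fraction of ideal linear scaling achieved (1.0 = perfect)."""
    return multi_score / (single_score * gpu_count)

# Hypothetical example: one GPU scores 10,000; four GPUs score 38,000.
print(f"{scaling_efficiency(10_000, 38_000, 4):.0%}")  # -> 95%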

Thomas Soderstrom
Thomas Soderstrom is a Senior Staff Editor at Tom's Hardware US. He tests and reviews cases, cooling, memory and motherboards.
  • shreeharsha
    Good job. (but none of these cards are in my budget)
  • JeanLuc
    I’m looking at page 9 on the power usage charts – I have to say the GTX 295 is very impressive; its power consumption isn’t that much greater than the GTX 280’s. And what’s very impressive is it uses 40% less power in SLI than the HD 4870 X2 does in CrossFire, meaning if I already owned a pretty decent PSU, say around 700-800 watts, I wouldn’t have to worry about getting it replaced if I were planning on SLIing the GTX 295 (a rough sketch of the headroom arithmetic follows the comments).

    I would have liked to have seen some temperatures in there somewhere as well. With top-end cards becoming hotter and hotter (at least with ATI), I wonder whether cheaper cases are able to cope with the temperatures these components generate.

    BTW any chance of doing some sextuple SLI GTX295 on the old Intel Skulltrail?
  • Crashman
    JeanLuc: BTW any chance of doing some sextuple SLI GTX295 on the old Intel Skulltrail?
    Not a chance: the GTX 295 only has one SLI bridge connector. NVIDIA intentionally designs its products to support a maximum of four graphics cores, which eliminates the need to make its drivers support more.
  • neiroatopelcc
    I'd like to see a board that takes up 3 slots and uses both the 1st and 3rd slots' PCIe connectors to power 4 GPUs on one board. Perhaps with the second PCIe connection being optional, so if the card can't use it, it would still run with reduced bandwidth. That way they'd have a basis to make some proper cooling. Perhaps a small h2o system, or a Peltier coupled with some propler fan and heatsink.

    I.e. a big 3x3x9" box resting on the expansion slots, dumping warm air outside.
  • neiroatopelcc
    edit: more like 2x3x9" actually

    and propler=proper
  • jameskangster
    "...Radeon HD 4870 X2 knocked the GeForce GTX 280 from its performance
    thrown." --> "throne"? or am I just misunderstanding the sentence?
  • kschoche
    So the conclusion should read:
    Congrats on quad-SLI, though: for anything that doesn't already get 100+ FPS with a single GX2, you're welcome to throw in a second and get at most a 10-20% increase. Unless, of course, you want an increase in a game that doesn't already have 100 FPS (Crysis), in which case you're screwed – don't even bother with it.
  • duzcizgi
    Why test with AA and AF turned off on such high-end cards? Anyone who pays $400+ apiece wouldn't be playing any game with AA and AF turned off or on a low-resolution display. (If I paid $800 for graphics cards, I'd of course have a display with no less than 1920x1200 resolution. Not even 1680x1050.)
    And I'm a little disappointed with the scaling of all solutions. They still don't scale well.
  • hyteck9
    The performance-per-watt chart is exactly what I wanted to see (it would be even better with some temps listed, though). Thanks, THG – this will help things along nicely. (See the per-watt sketch after the comments.)
  • duzcizgi, don't forget about the real hardcore players (those who play in tournaments, for example), who prefer to play at lower graphics settings to ensure > 100 FPS.
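For readers following JeanLuc's PSU question and hyteck9's performance-per-watt point above, here is a minimal sketch of the arithmetic involved; the wattage and frame-rate figures are hypothetical placeholders, not the numbers from our charts.

# Minimal sketch: performance-per-watt and PSU-headroom arithmetic.
# All figures are hypothetical placeholders, not measured values.

def perf_per_watt(fps: float, system_watts: float) -> float:
    """Frames per second delivered per watt of system power draw."""
    return fps / system_watts

def psu_headroom(psu_watts: float, system_watts: float) -> float:
    """Fraction of PSU capacity left unused at a given load."""
    return 1.0 - system_watts / psu_watts

print(f"{perf_per_watt(120.0, 480.0):.2f} FPS per watt")  # -> 0.25
print(f"{psu_headroom(750.0, 600.0):.0%} headroom")       # -> 20% on a 750 W unit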