
GeForce GTX 295 In Quad-SLI

Rumors of a GT200-based dual-GPU solution from Nvidia quickly began circulating after AMD’s Radeon HD 4870 X2 knocked the GeForce GTX 280 from its performance throne. Nvidia certainly had the design experience, with its GeForce 7950 GX2 and GeForce 9800 GX2 paving the way for further improvements in its multiple-GPU product line. However, those rumors were quickly quashed by the logic that two full GTX 280 processors at 65nm would require too much power and create too much heat to be combined in a single package.

A recent die-process shrink from 65nm to 55nm helped to reduce both heat and power consumption, allowing Nvidia to pursue its two-cards-in-a-brick GX2 design using the most recent variation of its high-end graphics processor. As with previous GX2 products, Nvidia further reduced heat and power consumption by slightly downgrading its twin graphics processors. The new board uses the memory interface and clock speeds of the GeForce GTX 260, but with the full 240 stream processors the GeForce GTX 280 is known for.

Given Nvidia’s propensity for recycling the names of former high-end parts, one might have expected the company to call its new product the “GTX 280 GX2” or “GTX 380 GX2”. Instead, it chose a middle road, removing the GX2 designation to title this product the GeForce GTX 295.

We saw excellent performance in our GeForce GTX 295 preview, but wondered what advancements improved drivers could bring. Also in the backs of our minds were SLI scaling issues that had plagued 7950 GX2 and 9800 GX2 Quad-SLI configurations, causing these to fall behind SLI pairs of single-GPU 7900 GTX and 8800 GTX cards at ultra-high graphics settings. With these questions in mind, we set about procuring a second GTX 295 unit and two HD 4870 X2 cards for comparing Quad SLI to CrossFireX performance, plus three GTX 280 cards to compare Nvidia’s highest-end 3-way SLI to its current Quad-SLI solution.

89 Comments. This thread is closed for comments.
  • 1
    shreeharsha , January 15, 2009 10:44 AM
    Good job. (but none of these cards are in my budget)
  • 2
    JeanLuc , January 15, 2009 10:46 AM
    I’m looking at page 9 on the power usage charts – I have to say the GTX 295 is very impressive; its power consumption isn’t that much greater than the GTX 280’s. And what’s very impressive is that it uses 40% less power in SLI than the HD 4870 X2 does in CrossFire, meaning if I already owned a pretty decent PSU, say around 700-800 watts, I wouldn’t have to worry about getting it replaced if I were planning on SLIing the GTX 295.

    I would have liked to have seen some temperatures in there somewhere as well. With top end cards becoming hotter and hotter (at least with ATI) I wonder if cheaper cases are able to cope with the temperatures these components generate.

    BTW any chance of doing some sextuple SLI GTX295 on the old Intel Skulltrail?
  • 5
    Crashman , January 15, 2009 11:07 AM
    JeanLuc: BTW any chance of doing some sextuple SLI GTX295 on the old Intel Skulltrail?


    Not a chance: the GTX 295 only has one SLI bridge connector. Nvidia intentionally designs its products to support a maximum of four graphics cores, which eliminates the need for its drivers to support more.
  • 0
    neiroatopelcc , January 15, 2009 11:11 AM
    I'd like to see a board that takes up 3 slots and uses both the 1st and the 3rd slots' PCIe connectors to power 4 GPUs on one board. Perhaps with the second PCIe connection being optional, so if the card can't be fitted that way at all, one could still run it with reduced bandwidth. That way they'd have a basis to make some proper cooling. Perhaps a small h2o system, or a peltier coupled with some propler fan and heatsink.

    i.e. a big 3x3x9" box resting on the expansion slots, dumping warm air outside.
  • -6
    neiroatopelcc , January 15, 2009 11:13 AM
    edit: more like 2x3x9" actually

    and propler=proper
  • 2
    jameskangster , January 15, 2009 11:31 AM
    "...Radeon HD 4870 X2 knocked the GeForce GTX 280 from its performance
    thrown." --> "throne"? or am I just misunderstanding the sentence?
  • 3
    kschoche , January 15, 2009 11:52 AM
    So the conclusion should read:
    Congrats on Quad-SLI, though: for anything that doesn't already get 100+ fps with a single GX2, you're welcome to throw in a second and get at most a 10-20% increase, unless of course you want an increase in a game that doesn't already have 100 FPS (Crysis), in which case you're screwed - don't even bother with it.
  • -1
    duzcizgi , January 15, 2009 12:09 PM
    Why test with AA and AF turned off, or at low resolutions, with such high-end cards? Anyone who pays $400+ times X wouldn't be playing any game with AA and AF turned off or on a low-res display. (If I'd paid $800 for graphics cards, I'd of course have a display with no less than 1920x1200 resolution. Not even 1680x1050.)
    And I'm a little disappointed with the scaling of all solutions. They still don't scale well.
  • 0
    hyteck9 , January 15, 2009 12:41 PM
    The performance-per-watt chart is exactly what I wanted to see (it would be even better with some temps listed, though). Thanks, THG, this will help things along nicely.
  • 3
    Anonymous , January 15, 2009 1:27 PM
    duzcizgi, don't forget about the real hardcore players (those who play in tournaments, for example), who prefer to play with lower graphics settings to ensure > 100 FPS.
  • 5
    topper743 , January 15, 2009 1:38 PM
    Very informative. I have one request going forward: when the testing teams do a 3DMark Vantage test, would they please include the standard 3DMark06 benchmark results? Most users won't/don't use the paid Vantage test. Vantage results have different scoring, and I (and probably others) can't interpret what those scores mean for systems tested with just the standard 3DMark06 tests.
  • 7
    L1qu1d , January 15, 2009 1:54 PM
    It's kind of funny that the 3-way GTX 280 is still on top of the food chain, although it is more expensive :) 

    I still think the best video card investment to date was an 8800 GTX or Ultra, considering they can still max out games and do well at high resolutions even compared to the 9800 GTXs and a lot of 512 MB cards.

    I mean, 2006-2009 - absolutely worth the $600 in my opinion if you want to get by in gaming :D 
  • 3
    akandy , January 15, 2009 2:01 PM
    Bah, these new cards are made just to take back the top spot from ATI. BAH! I want a GT300!
  • -1
    neiroatopelcc , January 15, 2009 2:04 PM
    L1qu1d: It's kind of funny that the 3-way GTX 280 is still on top of the food chain, although it is more expensive. I still think the best video card investment to date was an 8800 GTX or Ultra, considering they can still max out games and do well at high resolutions even compared to the 9800 GTXs and a lot of 512 MB cards. I mean, 2006-2009 - absolutely worth the $600 in my opinion if you want to get by in gaming.

    Correct - if we ignore that many G80 chips have been killed by heat since then, making it a bit less likely to make do with just one card for 3 years.

    PS: released November 2006 - so basically it's "just been" 2007 and 2008 without upgrades.
  • 0
    akandy , January 15, 2009 2:08 PM
    L1qu1d: It's kind of funny that the 3-way GTX 280 is still on top of the food chain, although it is more expensive. I still think the best video card investment to date was an 8800 GTX or Ultra, considering they can still max out games and do well at high resolutions even compared to the 9800 GTXs and a lot of 512 MB cards. I mean, 2006-2009 - absolutely worth the $600 in my opinion if you want to get by in gaming.

    The GTX 285 tri-SLI does a little better.
  • -1
    computerninja7823 , January 15, 2009 2:20 PM
    Haha, Nvidia takes the crown!! ...but truly ATI will come back and take the crown, then Nvidia, then ATI, and so forth and so on... but it's good to see the green giant back on top. I can't wait to see the new GT300 series cards! I have to agree, L1qu1d - I have an 8800 GT, and dude, it kicks some serious bootay!!! Heat was definitely the killer for the G80 series, but using RivaTuner I have been able to up the fan speed to 80%, then overclock my 8800 GT, and it ROCKS!!!
  • 1
    Pei-chen , January 15, 2009 2:24 PM
    Very, very nice review. I have been an advocate for 1.5GB+ RAM on video cards ever since I got GTA 4. It seems like ATI/Nvidia haven’t realized the need for a larger buffer on $400+ cards.

    Either way, I just bought a 4850 1GB OC to replace a fairly new G92b 9800 GT, as it only cost me about $20 to upgrade. I think the 512MB of RAM on the 9800 GT is bottlenecking a few games at 1920x1200. Hopefully the 50% faster 4850 will solve that problem.
  • 3
    NightLight , January 15, 2009 2:24 PM
    I don't know, I'm getting pretty tired of SLI... it should have died with the Monster 3D II. I for one was relieved when the GeForce series came out - a single card with enough power to play all the games you wanted on high settings - but now you have to spend a lot of cash to get the best graphics. Focus on a single card that can handle everything, dammit!
  • 1
    L1qu1d , January 15, 2009 2:38 PM
    akandy: The GTX 285 tri-SLI does a little better.


    Yeah, but I mean it's just an overclocked GTX 280; other than that, you have heat and power consumption. I really don't care about those, since I have a 1000-watt Corsair and it really doesn't affect me much (plus I don't pay for electricity).

    I mean, I got my 280s stable at a 712 MHz core clock, so really it wouldn't be worth the extra $20 or so for these cards :)  (if I do the BFG trade-up)
  • -4
    billiardicus , January 15, 2009 3:05 PM
    Nice write-up. A few comments: why include 1680x1050? Anyone who has two GTX 295s and plays at 1680x1050 should be beaten and then shot. IMO it just made the graphs busier and harder to read.

    Interesting that Crysis can still destroy even these god-like GPU setups.

    I still say the GTX 280 is the king. CrossFire/SLI and multiple GPUs on one card don't impress me. They suck too much power, cost too much, require more expensive mobos and PSUs, and simply don't deliver bang for the buck.