GeForce GTX 295 In Quad-SLI

Test Settings

The biggest problem we’ve encountered when using high-end graphics solutions is that at the highest resolutions and settings, the cards cannot get data fast enough from the CPU and RAM to keep their graphics cores busy. In an effort to reduce this so-called “CPU bottleneck,” we overclocked our Core i7 processor to 4.00 GHz at a 200 MHz base clock.
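The resulting core clock is just the base clock multiplied by the CPU multiplier. A minimal sketch of that arithmetic, assuming the Core i7 920's stock 20x multiplier (the chip's 2.66 GHz rating at the stock ~133 MHz BCLK):

```python
# Core i7 core clock = base clock (BCLK) x CPU multiplier.
# The i7 920 uses a 20x multiplier, so raising BCLK from the stock
# ~133 MHz to 200 MHz yields the 4.00 GHz used for this review.

def core_clock_ghz(bclk_mhz: float, multiplier: int) -> float:
    """Return the effective core clock in GHz."""
    return bclk_mhz * multiplier / 1000.0

stock = core_clock_ghz(133.33, 20)       # ~2.66 GHz at stock settings
overclocked = core_clock_ghz(200, 20)    # 4.00 GHz as tested here
print(f"stock: {stock:.2f} GHz, overclocked: {overclocked:.2f} GHz")
```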

Test System Configuration

CPU

Intel Core i7 920 (2.66 GHz, 8.0 MB Cache)

Overclocked to 4.00 GHz (BCLK 200)

CPU Cooler

Swiftech Liquid Cooling: Apogee GTZ water block,

MCP-655b pump, and 3x120mm radiator

Motherboard

Gigabyte GA-EX58-Extreme

Intel X58/ICH10R Chipset, LGA-1366

RAM

6.0 GB Crucial DDR3-1600 Triple-Channel Kit

Overclocked to CAS 8-8-8-16

GTX 295 Graphics

2x GeForce GTX 295

2x 576 MHz GPU, GDDR3-1998

GTX 280 Graphics

3x EVGA GeForce GTX 280 PN: 01G-P3-1280-AR

602 MHz GPU, GDDR3-2214

Radeon HD 4870 X2 Graphics

2x Sapphire HD 4870 X2 PN: 100251SR

2x 750 MHz GPU, GDDR5-3600

Hard Drives

Seagate Barracuda ST3500641AS

0.5 TB, 7,200 RPM, 16 MB Cache

Sound

Integrated HD Audio

Network

Integrated Gigabit Networking

Power

Cooler Master RS-850-EMBA

ATX12V v2.2, EPS12V, 850W, 64A combined +12V

Optical

LG GGC-H20LK 6X Blu-ray/HD DVD-ROM, 16X DVD±R

Software

OS

Microsoft Windows Vista Ultimate x64 SP1

Graphics

NVIDIA Forceware 181.20 Beta

ATI 8.561.3.0000 Beta

Chipset

Intel INF 8.3.0.1016


Cooling an overclocked Core i7 CPU isn’t a task for lightweights, so we added Swiftech’s latest Apogee GTZ water block to the liquid-cooling kit we normally use for motherboard testing.

Feeding data quickly to the CPU are three 2.0 GB DDR3-1600 modules from Crucial. These particular samples are part of an upcoming High-End Triple-Channel shootout; watch for it in a few days.

Including 3-way SLI tests required a motherboard with proper slot spacing. Gigabyte’s EX58-Extreme worked well on our open platform, though the third card does extend below the lowest slot on standard cases.

Our second card came from MSI; its N295GTX-M201792 arrived overclocked to a 655 MHz GPU clock with GDDR3-2100 memory. We actually had to underclock this card to reference speeds of 576 MHz/GDDR3-1998 to make it cooperate with our reference card in Quad SLI. MSI says it could be releasing an overclocked board soon, which we'll be eagerly awaiting.

Our GTX 295 graphics cards were set to reference clock speeds, so it made sense to use reference-speed HD 4870 X2s for an apples-to-apples comparison. Our cards came from Sapphire.

Finally, the GTX 280 sets the standard for judging GTX 295 performance improvements. We tested three reference-speed cards in single, SLI, and 3-way SLI.

Benchmark Configuration

Call of Duty: World at War

Patch 1.1, FRAPS/saved game
Highest Quality Settings, No AA / No AF, vsync off
Highest Quality Settings, 4x AA / Max AF, vsync off

Crysis

Patch 1.2.1, DirectX 10, 64-bit executable, benchmark tool
Very High Quality Settings, No AA / No AF (Forced)
Very High Quality Settings, 4x AA / 8x AF (Forced)

Far Cry 2

DirectX 10, Steam Version, in-game benchmark
Very High Quality Settings, No AA, No AF (Forced)
Very High Quality Settings, 4x AA, 8x AF (Forced)

Left 4 Dead

Very High Details, No AA / No AF, vsync off
Very High Details, 4x AA / 8x AF, vsync off

World in Conflict

Patch 1009, DirectX 10, timedemo
Very High Quality Settings, No AA / No AF, vsync off
Very High Quality Settings, 4x AA / 16x AF, vsync off

3DMark Vantage

Version 1.02: 3DMARK, GPU, CPU scores
Performance, High, Extreme Presets

89 comments
  • Good job. (but none of these cards are in my budget)
    1
  • I’m looking at page 9 on the power usage charts – I have to say the GTX 295 is very impressive; its power consumption isn’t that much greater than the GTX 280’s. And what’s very impressive is it uses 40% less power in SLI than the HD 4870 X2 does in CrossFire, meaning if I already owned a pretty decent PSU, say around 700-800 watts, I wouldn’t have to worry about getting it replaced if I were planning on SLIing the GTX 295.

    I would have liked to have seen some temperatures in there somewhere as well. With top end cards becoming hotter and hotter (at least with ATI) I wonder if cheaper cases are able to cope with the temperatures these components generate.

    BTW any chance of doing some sextuple SLI GTX295 on the old Intel Skulltrail?
    2
  • JeanLuc: BTW any chance of doing some sextuple SLI GTX295 on the old Intel Skulltrail?


    Not a chance: the GTX 295 has only one SLI bridge connector. NVIDIA intentionally designs its products to support a maximum of four graphics cores, and in doing so eliminates the need to make its drivers support more.
    5
  • I'd like to see a board that takes up 3 slots, and use both the 1st and the 3rd slot's pcie connectors to power 4 gpu's on one board. Perhaps with the second pcie being optional - so in case of not fitting the card at all, one could fit it with reduced bandwidth. That way they'd have a basis to make some proper cooling. Perhaps a small h2o system, or a peltier coupled with some propler fan and heatsink.

    ie. a big 3x3x9" box resting on the expansion slots, dumping warm air outside.
    0
  • edit: more like 2x3x9" actually

    and propler=proper
    -6
  • "...Radeon HD 4870 X2 knocked the GeForce GTX 280 from its performance
    thrown." --> "throne"? or am I just misunderstanding the sentence?
    2
  • So the conclusion should read:
    Congrats on quad-SLI, though: for anything that doesn't already get 100+ FPS with a single GX2, you're welcome to throw in a second and get at most a 10-20% increase; unless of course you want an increase in a game that doesn't already have 100 FPS (Crysis), in which case you're screwed. Don't even bother with it.
    3
  • Why test without AA and AF turned on with such high-end cards? Anyone who pays $400+ × X wouldn't be playing any game with AA/AF turned off or on a low-res display. (If I'd paid $800 for graphics cards, I'd of course have a display with no less than 1920x1200 resolution. Not even 1680x1050.)
    And I'm a little disappointed with the scaling of all the solutions. They still don't scale well.
    -1
  • The performance-per-watt chart is exactly what I wanted to see (it would be even better with some temps listed, though). Thanks, THG; this will help things along nicely.
    0
  • duzcizgi, don't forget about the real hardcore players (those who play tournaments, for example), who prefer to play with lower graphics settings and ensure > 100 FPS.
    3
  • Very informative. I have one request going forward: when the testing teams do a 3DMark Vantage test, would they please include the standard 3DMark06 benchmark results? Most users won't/don't use the paid Vantage test. Vantage results have different scoring, and I (and probably others) can't interpret what those scores mean for systems tested with just the standard 3DMark06 tests.
    5
  • It's kinda funny that the 3-way GTX 280 is still on top of the food chain, although it is more expensive :)

    I still think the best video card investment to date would be for whoever bought an 8800 GTX or Ultra, considering they can still max out games and do well at high resolutions, even compared to the 9800 GTXs and a lot of 512 MB cards.

    I mean, 2006-2009: absolutely worth the $600 in my opinion, if you want to get by in gaming :D
    7
  • Bah, these new cards are made just to take back the top spot from ATI. BAH! I want a GT300!
    3
  • L1qu1d: "Its kinda funny that the 3 way 280 GTX is still on top of the food chain [...]"


    Correct, if we ignore that many G80 chips have been killed by heat since then, making it a bit less likely you could make do with just one card for three years.

    P.S. Released November 2006, so basically it's "just been" 2007 and 2008 without upgrades.
    -1
  • L1qu1d: "Its kinda funny that the 3 way 280 GTX is still on top of the food chain [...]"


    The GTX 285 tri-SLI does a little better. http://www.guru3d.com/article/geforce-gtx-285-review--3way-sli/13
    0
  • Haha, NVIDIA takes the crown!! ...but truly ATI will come back and take the crown, then NVIDIA, then ATI, and so forth and so on... but it's good to see the green giant back on top. I can't wait to see the new GT300 series cards!... I have to agree, L1qu1d; I have an 8800 GT... dude, it kicks some serious bootay!!! And heat was definitely the killer for the G80 series, but using RivaTuner I have been able to up the fan speed to 80%, then overclock my 8800 GT, and it ROCKS!!!
    -1
  • Very, very nice review. I have been an advocate for 1.5 GB+ of RAM on video cards ever since I got GTA4. It seems like ATI/NVIDIA haven't realized the need for a larger buffer on $400+ cards.

    Either way, I just bought a 4850 1GB OC to replace a fairly new G92b 9800GT, as it only cost me about $20 to upgrade. I think the 512 MB of RAM on the 9800GT is bottlenecking a few games at 1920x1200. Hopefully the 50% faster 4850 will solve that problem.
    1
  • I don't know, I'm getting pretty tired of SLI... it should have died with the Monster 3D II. I for one was relieved when the GeForce series came out: a single card with enough power to play all the games you wanted on high settings. But now you have to spend a lot of cash to get the best graphics. Focus on a single card that can handle everything, damnit!
    3
  • akandy: The GTX285 tri sli does a little better.


    Yeah, but I mean it's just an overclocked GTX 280; other than that, you have heat and power consumption. I really don't care about those, since I have a 1000 W Corsair and it really doesn't affect me much (plus I don't pay for electricity).

    I mean, I got my 280s stable at a 712 MHz core clock, so it really wouldn't be worth the extra $20 or so for these cards :) (if I do the BFG up)
    1
  • Nice write-up. A few comments: why include 1680x1050? Anyone who has two GTX 295s and plays at 1680x1050 should be beaten and then shot. IMO it just made the graphs busier and harder to read.

    Interesting that Crysis can still destroy even these god-like GPU setups.

    I still say the GTX 280 is the king. CrossFire/SLI and multiple GPUs on one card don't impress me. They suck too much power, cost too much, require more expensive mobos and PSUs, and simply don't deliver bang/buck.
    -4