Rumors of a GT200-based dual-GPU solution from Nvidia began circulating soon after AMD’s Radeon HD 4870 X2 knocked the GeForce GTX 280 from its performance throne. Nvidia certainly had the design experience, with its GeForce 7950 GX2 and GeForce 9800 GX2 paving the way for further improvements in its multiple-GPU product line. However, those rumors were quickly quashed by the logic that two full GTX 280 processors at 65nm would require too much power and create too much heat to be combined in a single package.
A recent die-process shrink from 65nm to 55nm helped to reduce both heat and power consumption, allowing Nvidia to pursue its two-cards-in-a-brick GX2 design using the most recent variation of its high-end graphics processor. As with previous GX2 products, Nvidia further reduced heat and power consumption by slightly downgrading its twin graphics processors. The new board uses the memory interface and clock speeds of the GeForce GTX 260, but with the full 240 stream processors its GeForce GTX 280 is known for.
Given Nvidia’s propensity for recycling the names of former high-end parts, one might have expected the company to call its new product the “GTX 280 GX2” or “GTX 380 GX2”. Instead, it chose a middle road, removing the GX2 designation to title this product the GeForce GTX 295.
We saw excellent performance in our GeForce GTX 295 preview, but wondered what advancements improved drivers could bring. Also in the backs of our minds were SLI scaling issues that had plagued 7950 GX2 and 9800 GX2 Quad-SLI configurations, causing these to fall behind SLI pairs of single-GPU 7900 GTX and 8800 GTX cards at ultra-high graphics settings. With these questions in mind, we set about procuring a second GTX 295 unit and two HD 4870 X2 cards for comparing Quad SLI to CrossFireX performance, plus three GTX 280 cards to compare Nvidia’s highest-end 3-way SLI to its current Quad-SLI solution.

I would have liked to have seen some temperatures in there somewhere as well. With top end cards becoming hotter and hotter (at least with ATI) I wonder if cheaper cases are able to cope with the temperatures these components generate.
BTW any chance of doing some sextuple SLI GTX295 on the old Intel Skulltrail?
Not a chance: The GTX 295 only has one SLI bridge connector. NVIDIA designs its products intentionally to only support a maximum of four graphics cores, and in doing so eliminates the need to make its drivers support more.
i.e. a big 3x3x9" box resting on the expansion slots, dumping warm air outside.
and "propler" --> "proper"
thrown." --> "throne"? or am I just misunderstanding the sentence?
Congrats on quad-SLI, though: for anything that doesn't already get 100+ FPS with a single GX2, you're welcome to throw in a second and get at most a 10-20% increase. Unless of course you want an increase in a game that doesn't already hit 100 FPS (Crysis), in which case you're screwed - don't even bother with it.
And I'm a little disappointed with the scaling of all solutions. They still don't scale well.
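The scaling complaints above come down to simple arithmetic: doubling the GPU count rarely doubles frame rates. As a rough illustration, here is a minimal sketch of how scaling efficiency can be computed; the FPS figures are hypothetical, not measured results from this review.

```python
# Hypothetical illustration of multi-GPU scaling efficiency.
# None of these FPS numbers are measured benchmark results.

def scaling_efficiency(fps_base, fps_scaled, gpu_multiplier):
    """Fraction of ideal linear scaling achieved when multiplying GPU count."""
    return (fps_scaled / fps_base) / gpu_multiplier

# Example: one dual-GPU card at 100 FPS, a Quad-SLI pair at 115 FPS
# (a 10-20% gain, as the comment above describes).
single_card = 100.0
quad_sli = 115.0

speedup = quad_sli / single_card          # raw speedup from doubling GPUs
efficiency = scaling_efficiency(single_card, quad_sli, 2)

print(f"Speedup: {speedup:.2f}x")
print(f"Scaling efficiency: {efficiency:.3f} of ideal linear scaling")
```

With those numbers, doubling the hardware yields a 1.15x speedup, i.e. only about 57% of ideal linear scaling, which is the kind of diminishing return the commenters are objecting to.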
I still think the best videocard investment to date would be for whoever bought an 8800 GTX or Ultra, considering they can still max out games and do well at high resolutions even compared to the 9800 GTXs and a lot of 512MB cards.
I mean, 2006-2009 is absolutely worth the $600 in my opinion if you want to get by in gaming.
Correct - if we ignore that many G80 chips have been killed by heat since then, making it a bit less likely you'd make do with just one card for 3 years.
P.S. Released November 2006 - so basically it's "just been" 2007 and 2008 without upgrades.
The GTX285 tri sli does a little better.
Either way, I just bought a 4850 1GB OC to replace a fairly new G92b 9800GT, as it only cost me about $20 to upgrade. I think the 512MB RAM on the 9800GT is bottlenecking a few games at 1920*1200. Hopefully the 50% faster 4850 will solve that problem.
Yeah, but I mean it's just an overclocked GTX 280; other than that, the differences are heat and power consumption. I really don't care about those since I have a 1000 watt Corsair, so it doesn't affect me much (plus I don't pay for electricity).
I mean, I got my 280s stable at a 712 core clock, so really it wouldn't be worth the extra $20 or so for these cards.
Interesting that Crysis can still destroy even these god-like GPU setups.
I still say the GTX 280 is the king. CrossFire/SLI and multiple GPUs on one card don't impress me. They suck too much power, cost too much, require more expensive mobos and PSUs, and simply don't deliver bang/buck.