Big memory capacity hasn’t always been exclusive to the highest-end graphics cards. It's actually still fairly common for vendors to put a bunch of inexpensive, low-speed memory on entry-level boards, hoping the big number will register as better performance with folks who haven't yet wrapped their heads around the idea that a big frame buffer is of limited use to an underpowered GPU. The uninitiated readily jump for the “better” specification they know, rather than the faster graphics processor they don't understand as well.
But gamers have since become better-educated about hardware, and many of today’s mainstream cards use many of the same expensive memory components as their high-end counterparts. For mainstream and better parts, the economic incentive to inflate the memory capacity of a low-end card is almost gone.
More recently, the industry has seen an explosion in both graphics and system demands, with games becoming increasingly complex and gorgeous 30" displays supporting a 2560x1600 resolution. Marketing tricks of the past aside, the high end just might be where memory capacity really needs to expand.
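To put those resolutions in VRAM terms, here is a rough back-of-the-envelope sketch. The bytes-per-pixel, buffer-count, and MSAA figures below are illustrative assumptions (not from the article), and real memory usage is dominated by textures and other render targets on top of the color buffers counted here:

```python
# Rough color-buffer footprint at common gaming resolutions.
# Assumptions (illustrative): 32-bit color (4 bytes/pixel), triple
# buffering, and a simple per-sample multiplier for MSAA. Depth/stencil
# buffers and textures add substantially more in practice.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3, msaa=1):
    """Approximate color-buffer footprint in megabytes (MiB)."""
    return width * height * bytes_per_pixel * buffers * msaa / 1024**2

for w, h in [(1280, 1024), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h, msaa=4):.0f} MB with 4x MSAA")
```

Even under these simple assumptions, stepping up from 1280x1024 to 2560x1600 roughly triples the color-buffer footprint, which is the intuition behind testing a 2GB card at the highest resolutions.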

You won’t find a higher-end single-GPU graphics processor than Nvidia's GeForce GTX 285, and that’s why Gigabyte chose it as its first model to carry two gigabytes of super-fast GDDR3-2400 memory. Hints of a custom design can be seen in its combination of HDMI and VGA connections, rather than a fistful of adapters, and in its bold claim of a “2 ounce Copper Inner Layer” for the circuit board.
Spreading the VGA and DVI outputs across two connections rather than relying on a second DVI-I interface and adapters eliminated the space Nvidia’s reference design used for its S-Video/Composite combo interface. That’s no big loss to most of us, since those legacy TV outputs were only able to support ultra-low resolutions. Anyone putting this much effort into gaming on a home-theater display should certainly have HDMI, or at least DVI by now.

Gigabyte’s custom circuit board doesn’t look much different than the reference-design cards we’ve seen, though Gigabyte claims slight improvements in its voltage regulator components.

Hynix’s super-fast H5RS1H23MFR-N2C 1Gb GDDR3-2400 memory runs at its rated speed, and Gigabyte also overclocks the GeForce GTX 285 GPU slightly to 660 MHz, compared to the standard 648 MHz. Reference models, by contrast, underclock the RAM to a 2,322 MHz data rate.
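For context, peak memory bandwidth follows directly from the effective data rate and the bus width. The 512-bit memory interface is the GTX 285's published specification; the snippet below is just a sketch of that arithmetic:

```python
# Peak memory bandwidth = effective data rate x bus width.
# GDDR3-2400 denotes a 2,400 MT/s effective (double-data-rate) transfer
# rate; the GTX 285 pairs it with a 512-bit memory interface.

def bandwidth_gbs(data_rate_mtps, bus_width_bits):
    """Peak bandwidth in GB/s from transfer rate (MT/s) and bus width (bits)."""
    return data_rate_mtps * 1e6 * (bus_width_bits / 8) / 1e9

print(f"Gigabyte  (GDDR3-2400): {bandwidth_gbs(2400, 512):.1f} GB/s")  # 153.6
print(f"Reference (GDDR3-2322): {bandwidth_gbs(2322, 512):.1f} GB/s")  # 148.6
```

So the rated-speed memory buys roughly a 3% bandwidth edge over the reference clocks, independent of the capacity difference.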
- Wasn't 1 GB Already Enough?
- Comparison Cards: When 1 GB Is Greater Than 1.8 GB
- Test Settings
- Benchmark Results: Crysis
- Benchmark Results: Far Cry 2
- Benchmark Results: Tom Clancy’s H.A.W.X
- Benchmark Results: Left 4 Dead
- Benchmark Results: S.T.A.L.K.E.R. Clear Sky
- Benchmark Results: World In Conflict
- Benchmark Results: 3DMark Vantage
- Performance Summary and Power
- Conclusion

Is there any possibility to test this? This could actually be the only game that will use this amount of memory.
Tom's Hardware was hoping to find more 2560x1600 scenarios where the 2GB advantage would play out. When very few advantages were found, Tom's did the honest thing and published the numbers anyway.
I think you can take a lot from this article. I just spoke to a guy who asked "2GB or water cooling?" when looking at cards of the same price. He has a powerful water cooling loop, so the answer was easy.
PS: I'm thinking a slower GPU might benefit from more memory.
Which is why I said it was worth exploring. I realize you wouldn't do all this work and NOT publish your results, as mundane as they may be.
Three-way would have been best, but there's just not enough samples to go around.
Same card means same driver, regardless of RAM.
Are you the guy who bought that 256MB MX440 because it had more memory than the 128MB Ti 4200? Just kidding, but the high-end card is the one that can use the highest detail levels, which requires a greater frame buffer.
> Is there any possibility to test this? This could actually be the only game that will use this amount of memory.
Yup. And the reason we never see any gains is because the driver is written with the stock frame buffer size and the OS/games don't know how to take advantage of the extra VRAM. All the cards with higher than stock amounts of VRAM are wastes of money.
Well, first the technology is introduced to the market, and then companies like Diamond Multimedia utilize its full potential. I'm sure GTX 295 owners will not be interested in "upgrading." Perhaps Crysis 2 will show that the 2GB GTX 285 was not money poorly spent.
What I am curious about: could you buy a 1 GB card and a 2 GB card, SLI them, and set the 2 GB card as primary so that the full 2 GB gets used between the two GPUs? Or would you still have to pay for two 2 GB cards?