2x GTX 690s or 2x GTX 780s?

Solution
Two GTX 780s.

1) A GTX 690 contains TWO GPUs, so you'd have FOUR in total, which is a nightmare.

2) A GTX 690 has 2GB per GPU (4GB total), but because SLI mirrors the same data on every GPU you'd effectively only have 2GB of usable framebuffer (the 780s have 3GB each).

Also:
The 780s run fairly quiet, but if you're dead set on water cooling, get one with a built-in waterblock:
http://www.evga.com/Products/Product.aspx?pn=03G-P4-2789-KR

I'm not sure how much performance you'd gain from water cooling and overclocking; you might want to research that. Modern NVIDIA cards already boost much closer to the GPU's maximum out of the box, which reduces the advantage of overclocking (people complain that cards don't overclock as well as they used to; no, that's a good thing, people).

Also, a single GTX 780 runs pretty much any game at full spec above 60FPS (I sync to 60FPS with VSYNC ON). You might wish to just get a single GTX 780 or GTX 780 Ti and then look into a G-Sync monitor in the future (the first ones, from Asus, are due out soonish).

G-Sync is incredible. Basically, to reduce lag significantly today you have to run with VSYNC OFF (and get screen tearing) or stay above 120FPS with VSYNC ON.

Even then, there is some stutter and judder that G-Sync can fix.

You can Google G-Sync if you wish, but in short it works by having the G-Sync monitor refresh the screen on command from the graphics card (GTX 600/700 series), which eliminates the buffering that VSYNC requires.
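If it helps to see what that buffering costs, here's a rough back-of-the-envelope sketch in Python (the render times are made up, and the 16.7ms interval is just 1000/60):

import math, random

REFRESH_MS = 1000.0 / 60  # fixed 60Hz panel refresh interval
render_times = [random.uniform(10, 25) for _ in range(1000)]  # made-up frame render times (ms)

def vsync_extra_wait(render_ms):
    # With VSYNC ON, a finished frame sits in the buffer until the next fixed refresh tick.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS - render_ms

def gsync_extra_wait(render_ms):
    # With G-Sync, the monitor refreshes when told to, so the frame shows as soon as it's ready.
    return 0.0

avg_vsync = sum(vsync_extra_wait(t) for t in render_times) / len(render_times)
print(f"average extra wait: VSYNC {avg_vsync:.1f} ms, G-Sync 0.0 ms")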
 
Matsku man:

Could you send me a G-Sync monitor link, and what do you mean by it being a nightmare to have 4 GTX 690 GPUs?
 
Update:
http://www.overclockers.com/evga-gtx780-classified-hydro-copper-waterblock-review

The reviewer added the waterblock himself, but it's the EVGA block, so I'd expect similar performance from the pre-built card. He had overclocked the GPU to around 1400MHz prior to installing the waterblock, but couldn't overclock the memory at that frequency (plus it would have been really, really loud).

After the waterblock, his numbers compared to STOCK are:

GPU: +46%
Memory: +20%

Thus, I think the GPU frequency should be dropped a little, both because the memory didn't overclock as well (so there's a bit of a bottleneck there) and to maintain stability. If you got similar numbers I'd recommend roughly a 30% overclock. Overclocking also reduces the lifespan of any chip, roughly in proportion to how hard you push it (extra voltage matters more than extra frequency).
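As a rough worked example of what those percentages mean (the stock clocks below are approximate reference GTX 780 values I'm assuming, not numbers from the review):

# Approximate reference GTX 780 clocks (my assumption, not from the review)
STOCK_GPU_MHZ = 954      # typical boost clock
STOCK_MEM_MHZ = 6008     # effective memory clock

gpu_reviewed = STOCK_GPU_MHZ * 1.46   # the review's +46% GPU result
mem_reviewed = STOCK_MEM_MHZ * 1.20   # the review's +20% memory result
gpu_suggested = STOCK_GPU_MHZ * 1.30  # the ~30% I'd actually run for stability

print(f"GPU:    {STOCK_GPU_MHZ} -> {gpu_reviewed:.0f} MHz reviewed, {gpu_suggested:.0f} MHz suggested")
print(f"Memory: {STOCK_MEM_MHZ} -> {mem_reviewed:.0f} MHz reviewed")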

Other:
There's a "Classified" version of the Hydro Copper for about $30 more, which you might wish to get instead if it uses a "binned" chip (you can e-mail EVGA and ask). GPUs get tested for overclocking potential, and the best ones sometimes go into the higher-end products.
 
Update 2:

G-SYNC:
Review: http://www.slashgear.com/nvidia-g-sync-first-impressions-12308604/

"G-Sync completely changes the benchmark for the top-tier visual experience. It's like seeing a Blu-ray for the first time after relying on DVDs for generations"

G-Sync is completely about the SMOOTHNESS of the experience: it reduces lag to almost nothing, avoids screen tearing, and fixes stutter/judder. There is absolutely NO other solution that does this as well; gaming at 120FPS (with VSYNC ON and never dropping below that) is as close as you can get, yet G-Sync can give a similar experience with much less hardware at about 50FPS.

NVidia site: http://www.geforce.com/hardware/technology/g-sync

2xGTX690:
By "nightmare" I mean you are talking about FOUR GPU's. There's a Youtube video with the main spokesman from NVidia discussing this, and he says flat out to avoid Quad SLI. In theory it could work fine, but it rarely does because the driver team must create an SLI profile for games to use SLI and they really put any effort into it for Quad SLI. A good rule to live by in PC world is that the more you get away from the common configurations, the more likely you'll have issues. That goes for too new as well as too old.

Quad SLI is prone to stutter, judder and performance issues.

I'll post this now; if I can find the link I'll add it, and if not you can Google it or take my word for it.

*Also, it's a non-issue, because you definitely don't want only a 2GB framebuffer on cards this expensive. SLI works by CLONING the data, so each GPU has its own copy to process. The GTX 690 has 2GB per GPU, just like a GTX 770 does.

Thus, in a game like BATTLEFIELD 4, when you crank the visuals to MAX with 2x GTX 690 you'll run out of video RAM with only 2GB usable.
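A minimal sketch of that point, with a made-up ~2.5GB "maxed-out game" workload just for illustration:

# SLI clones the frame data to every GPU, so usable VRAM is the per-GPU
# amount, not the advertised total. (The 2500 MB workload is a made-up example.)
workload_mb = 2500

for name, per_gpu_mb, num_gpus in [("2x GTX 690", 2048, 4), ("2x GTX 780", 3072, 2)]:
    advertised_mb = per_gpu_mb * num_gpus  # what the spec sheets add up to
    usable_mb = per_gpu_mb                 # what the game can actually use
    verdict = "fits" if workload_mb <= usable_mb else "runs out of VRAM"
    print(f"{name}: {advertised_mb} MB advertised, {usable_mb} MB usable -> {verdict}")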
 

 
(I corrected my post; I meant "2x690" at the end there. Not sure if you noticed, but you had a post with no answer.)

I can't find the video link, and I'm too tired now to dig up good links. I bet if you asked around about Quad SLI you'd find many people telling you the same thing: avoid it.

I expect games to start using more than 2GB at the highest settings more frequently. I'm not concerned for a GTX 770-class card, but anything above that should have 3GB per GPU, though I don't think 4GB is needed in general.

Crysis 3 is currently an anomaly, but it can use 2769MB of video memory at the highest settings at 2560x1600.
http://www.guru3d.com/articles_pages/crysis_3_graphics_performance_review_benchmark,8.html

Most people with 2x GTX 780 will likely use a 2560x1440 monitor or a greater resolution. (I dislike triple monitors, but if you're interested, it's solely the PIXEL COUNT that matters: 3x 1920x1080 is about 69% more pixels than even that high-res monitor, which could push usage above 3GB.) That doesn't change my opinion that 3GB should be enough, though.
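The pixel-count math, plus a very rough VRAM estimate that just scales the Crysis 3 number linearly with pixels (that linear scaling is a simplification of mine, not a measured result):

single_1440p = 2560 * 1440          # one high-res monitor
triple_1080p = 3 * 1920 * 1080      # triple-monitor surround

extra = triple_1080p / single_1440p - 1
print(f"Triple 1080p has {extra:.0%} more pixels than one 2560x1440 panel")

crysis3_mb_at_1600p = 2769          # Guru3D figure at 2560x1600, max settings
estimate_mb = crysis3_mb_at_1600p * triple_1080p / (2560 * 1600)  # naive linear scaling
print(f"Rough triple-1080p estimate: {estimate_mb:.0f} MB (above a 3GB framebuffer)")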

Game codes:
If you decide on TWO cards, then you'll have two sets of CODES for the three games. You should sell one set (or give it away) before they expire.

G-Sync hardware:
If you can wait, I'd hold off for a monitor with the following specs:
- G-Sync (or the capability to add the G-Sync card)
- 2560x1440
- 3D at 1920x1080
- good reviews
- good warranty (dead-pixel policy, 3-year coverage, etc.)

*You can't get 2560x1440 monitors at 120Hz yet, for hardware-standards (bandwidth) reasons. However, you might be able to get one that runs 2560x1440 at 60Hz (or slightly higher) and that can switch to a higher refresh rate (up to 144Hz) when using 3D at 1920x1080. Maybe you even can now (but not with G-Sync).
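One way to see that limit is raw pixel-clock bandwidth; a rough sketch (the ~20% blanking overhead and the ~330MHz dual-link DVI ceiling are approximations I'm assuming, and newer DisplayPort links raise the ceiling):

# Rough pixel-clock estimate: width x height x refresh, plus ~20% blanking overhead.
# The ~330 MHz dual-link DVI ceiling and the 20% overhead are approximations.
def pixel_clock_mhz(width, height, hz, blanking=1.20):
    return width * height * hz * blanking / 1e6

DUAL_LINK_DVI_MHZ = 330
for w, h, hz in [(2560, 1440, 60), (2560, 1440, 120)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= DUAL_LINK_DVI_MHZ else "exceeds dual-link DVI"
    print(f"{w}x{h} @ {hz}Hz needs ~{clk:.0f} MHz -> {verdict}")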

As a new product, price will be an issue; despite assurances of about a $150 premium, we'll just have to see. A monitor with all the above specs may take a while to reach a reasonable price. Just something to keep in mind.