Had a nice look through the SLI FAQ etc. and thought I'd try reaching out to you guys for some advice/help.
First things first...my specs...
Vista Ultimate 32
PCP&Cooling Silencer 750w SLI/XFIRE PSU (the red one)
Q6600 stock (rev G0)
4GB 800MHz RAM stock
P5N-E SLI MoBo
2 x 8800GT 600/900/1500 stock (currently running 169.25)
1 x 500GB SATA HDD
1 x 80GB IDE HDD
1 x 250GB IDE HDD
22" LCD screen (normally at 1680x1050 & 60Hz)
OK, so recently I saw a new 8800GT for cheap (£60). I'd been seeing reviews and benchies of 8800GT SLI rigs getting nice improvements, so I thought... why the hell not, ya only live once ^^
I bought this card knowing I would have to upgrade my then 500W POS PSU, so basically, £160 later I've got a nice SLI setup.
However (there's always a 'however', ain't there?)... I've spent 2 days self-teaching, with the aid of trial and error and Google, trying to better my rig's performance.
This is where the problem starts: in virtually every game I've tried, performance is virtually identical to one card, and in some cases it's actually worse (go figure).
Now I'm not a TOTAL nub:
I've flipped the SLI bridge card on the board to DUAL.
I've enabled SLI (duh), and I can successfully view the 'Green Bars' in games (is this sufficient to confirm that SLI is indeed working correctly?). NVCP shows SLI enabled, as does GPU-Z.
Temps are not an issue; both cards remain under 60°C.
I've tried a variety of drivers, but it seems that the newer ones (from 180.xx upwards) just totally freeze/BSOD any game I try. It seems to be any driver with the PhysX option in the NVCP that bricks it.
That's why I'm currently running the oldie 169.25 drivers.
These are stable but yield seemingly no performance gain.
3DMark06 scores come in at 11,800 single GPU and around 11,400 WITH SLI (although I understand that 'linked adapters' reporting false probably has something to do with that).
Now I wasn't expecting 2x performance just cos I SLI'd, but I was kinda expecting a little better than... the same ^^ Was hoping for maybe 10-30 fps more? Realistic?
I have tried the various SLI performance modes and, other than NVIDIA Recommended, find that 'Force alternate frame rendering 2' seems to be the best.
I needed to get a new PSU anyway, so that's moot. (If I can't get better performance, I can always find a buyer for an 8800GT.)
My ultimate questions, I guess, are:
1) Is SLI broken on Vista?
2) Is SLI just... broken?
3) Are the new NVIDIA drivers SLI-unfriendly?
4) Is there something bottlenecking my system?
5) Am I expecting too much? I fear it may be my display holding me back (max res is 1680x1050, damn LCDs).
I appreciate any help or advice, guys, cos I don't really wanna have an 8800GT acting as just an extra fan ^^
By the way, if it helps: when I tried the 181.22 drivers (and indeed most of the 18x.xx), there WAS a DEFINITE fps increase in games, but the games just freeze after a few minutes.
Example: Far Cry 2 single GPU = 40-60 fps; the 181.22 drivers gave me 50-95 fps... and then froze ^^
Depending on the game, you may want to mess around with the frame rendering modes and take your pre-rendered frames to 0. I left mine at default but have seen suggestions fly on it before.
Your CPU being at stock clock definitely impacts your card performance. The sweet spot for my 260s seems to be a 3.6 GHz clock speed (so that means take it up to 3.82).
Just about any SLI issue or question I can think of has already been answered here: http://www.evga.com/forums/tm.asp?m=151488
Yeah, I'm in the process of looking for better cooling before I OC the CPU. On stock cooling I can kick out 2.8 easily, with idles of 30°C and loads of around 60°C, so I'll invest in some aftermarket cooling and crank the bugger up.
I think the reference roofus was making was in regard to SLI scaling versus CPU clock. Currently SLI scales fairly close to 1.85x for dual and 2.5x for tri-SLI on well-equipped quad-core CPUs (obviously best-case results). Not even dualies can get that scaling any more, though it's not much worse at 1.6 or 1.7. The bigger hit for dual cores is on tri-SLI/quad-SLI; they just can't keep up as well.
Actually, if you want a good comparison of scaling for both CrossFire and SLI across platforms, check this one link out...
Which brings me to what I disagree with roofus about. I think a Core 2 at 3.6 should give you good scaling; as you can see, an i7 at 3.2 gives you the best scaling to date, which is about what I mentioned above: 1.85 and 2.5 for dual/tri.
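To make those scaling numbers concrete, here's a rough back-of-the-envelope sketch using the best-case factors quoted in this thread (1.85x dual, 2.5x tri); the `SCALING` table and `projected_fps` helper are just illustrative names, not anything official:

```python
# Best-case SLI throughput multipliers quoted in the thread,
# keyed by number of GPUs (1.0 = single-card baseline).
SCALING = {1: 1.0, 2: 1.85, 3: 2.5}

def projected_fps(single_gpu_fps, num_gpus):
    """Estimate FPS with num_gpus cards, assuming the CPU isn't the bottleneck."""
    return single_gpu_fps * SCALING[num_gpus]

# Example: the Far Cry 2 numbers from earlier in the thread
# (40-60 fps on one card) would project to roughly:
low, high = projected_fps(40, 2), projected_fps(60, 2)
print(f"dual-SLI projection: {low:.0f}-{high:.0f} fps")  # 74-111 fps
```

Which lines up with the 50-95 fps the OP saw on the 181.22 drivers; the shortfall from the projection is what a stock-clocked CPU (or a broken profile) costs you.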
3DMark doesn't show that much of an improvement for me; it's the games that show the real benefit. My scores only gained 2000 pts...
But now every game runs with 2xAA, and even more, without going under 60 FPS... 1650x1080.
Virtually exactly the same 3DMark improvement for me, but that's by the by; I care about in-game performance, and again my games now run like you say above: 2x or 4xAA, with hardly any drops below 60 fps, usually at full details at 1680x1050.
I had similar issues using an MSI board with the 650 chipset. I used to have two 7800GTX cards in SLI for a long while and they worked great. I then wanted to upgrade and went with two 8800GTS cards in SLI. I didn't notice that much of a difference. Mind you, I am greatly biased toward Nvidia, and my brother is hardcore ATI; his HD3870 was killing me. I decided to move to a new board with the 780i chipset and voila, the improvements were amazing. I got the Asus 780i SLI board, and Mass Effect runs perfectly, along with anything I throw at it. Just some input. Good luck in your graphics debacle.
When you try running SLI with the load bars on, is the green bar near the center, or is it more spread out? The length of that green bar indicates the load on the cards: if it reaches near the top and bottom of the screen, both cards are working heavily; if it's tucked in the middle, the cards are taking it easy.
If you're constantly seeing the green bar in the middle, it might be a profile issue. I had a similar thing happen with Crysis, where one card performed the same as SLI enabled. I reset all my game profiles and that fixed it.