nVidia GeForce GTX 260 SLI

iode

This is my motherboard:
http://www.newegg.com/Product/Product.aspx?Item=N82E16813157134

I used to have a single nVidia GeForce GTX 260 Core 192 in this board, and I was able to run Crysis Warhead on maxed-out settings at 1440x900 resolution at about 32-38 fps. I recently got another graphics card to run in SLI, this time a GTX 260 Core 216. I went into the nVidia control panel and enabled SLI, and GPU-Z confirmed I was indeed running in SLI mode. I started up Crysis Warhead again on maxed-out settings at the same resolution and played the same part of the game at about 34-45 fps. Needless to say, I was a bit disappointed that I spent an extra $160 on another GTX 260 for such minimal performance gains :sweat: . Does anyone have any advice or opinions on why this happens (or if it's supposed to happen at all)?
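
Just to put a number on it, here's a rough sketch of the scaling I'm actually seeing (using the midpoints of the FPS ranges above; nothing scientific):

```python
# Rough SLI scaling estimate from the FPS ranges above (midpoints).
single = (32 + 38) / 2   # single GTX 260 Core 192: ~35 fps
sli = (34 + 45) / 2      # GTX 260 SLI: ~39.5 fps

gain = (sli - single) / single
print(f"Observed SLI gain: {gain:.0%}")  # ~13%
# Reviews typically cite 40-75% SLI scaling in GPU-bound games,
# so ~13% suggests something else (CPU, settings, drivers) is the limit.
```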

My system:
AMD Phenom II X3 720 @ 3.6 GHz
4 GB G.Skill DDR2 800 5-5-5-15
2x GTX 260 in SLI
750W Corsair power supply
Windows 7 Professional 64-bit

PS: Since my motherboard supports up to three graphics cards running in SLI (though they would then run at x8/x8/x8 instead of x16/x16), do you think it would be to my benefit in the future to get another GTX 260 and run triple SLI?

Any help or advice would be greatly appreciated, thanks!!
 
I have the same motherboard you are using. I've run this mobo with dual GTS 250s and with a single GTX 260. In my opinion, you would be wasting your time installing a third GPU on that board. What I would do is run 3DMark06 with a single GTX 260, then do a second test with both GPUs and compare the scores; that way you'll know the difference between having one and two GPUs.

http://www.futuremark.com/benchmarks/3dmark06/introduction/
 

iode


No, no settings were changed. Both trials were done maxed out @ 1440x900 resolution.
 

iode


Ah okay, I have 3DMark Vantage; it came with my XFX graphics card (along with this neat little doorknob sign). Will that work?
 
I just installed a 2nd GTX 260 Core 216 a couple days ago.

Q6600 @ 3.2 GHz
XFX 680i LT SLI
2x MSI GTX 260 Core 216 SLI (655 MHz)
4GB Corsair XMS2 DDR2 @ 800 MHz
Win 7 RC

I hardly saw any change in 3DMark06 (from 15k to 16k after SLI).

However, I can play Crysis Warhead with all settings @ "Gamer" at 1680x1050 (Crysis won't do 1920x1080 for some reason) with 4xAA in DX10 mode, and it showed a solid 30 FPS (Fraps) for a couple of hours.

EDIT: BTW, forget triple SLI. Waste of time, waste of money, waste of resources.
 
^ You have something wrong with your card or SLI setup.

You should be able to easily max out Crysis at 1680x1050 in Enthusiast mode with AA turned up, and I've read reviews saying two GTX 260s can also do 1920x1080 at Very High (Enthusiast) with lowered AA and still get well above 30 fps average.


1. Try running the free version of Vantage to see if there is any difference.
2. Is SLI enabled in your control panel? And do you have the SLI bridge?
3. Does the 680i LT SLI have enough PCIe bandwidth?
 


1) Haven't tried Vantage yet. Just 3DMark06 with default settings.

2) SLI is indeed enabled in the nVidia control panel. I've also used nHancer to activate 3DMark06 nVidia profiles, with the same results. The SLI bridge is indeed connected between both cards. I know they're both being used, because they both heat up and both fans spin fast and loud when gaming in Crysis.

3) I believe the 680i LT SLI has 2x 16x PCI-E slots, so theoretically I shouldn't be bottlenecked by that.

As far as Crysis Warhead goes, the game literally will not let me activate 1920x1080. It gives it to me as an option, but once I hit apply it reverts to 1024x? Very strange. It will let me Apply 1680x1050 settings, so that's where I've been playing it.

I can play FEAR 2 @ 60 FPS without a hiccup with absolutely everything maxed @ 1920x1080, though.

BTW, I'm using the latest 191.07 nVidia Forceware drivers.
 
I think his mobo is the bottleneck. I have the same mobo as the OP; with a single 260 at 3.7 GHz I get over 16k in 3DMark06. Back when I had dual 250s I would get almost 18k at 3.6 GHz. And BTW, I am using a cheap dual core.
 

iode

Bottleneck? By what? Isn't x16/x16 the best? I went out of my way to buy a board with x16 in SLI config.
 

invisik

^ I have no clue what he's talking about. PCIe x16 is enough.

No, your mobo is not the bottleneck. At such a low resolution it's more CPU dependent than GPU dependent. I have two GTX 260s in SLI running Crysis maxed out at 1680x1050 at 40 fps, and when I upped the resolution to 1920x1080 there was only a 4-5 fps drop. SLI isn't supposed to give you twice the power, but rather a 40-75% improvement depending on the game.
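
To illustrate what I mean, here's a toy model (made-up frame times purely for illustration, not measurements): SLI only splits the GPU half of each frame, so when the CPU is the slow side at low resolution, a second card barely moves the frame rate.

```python
# Toy frame-time model (illustrative numbers, not measurements):
# each frame needs CPU work and GPU work; SLI roughly splits the GPU
# work across cards (minus overhead), but the CPU part doesn't shrink.
def fps(cpu_ms, gpu_ms, n_gpus=1, sli_efficiency=0.8):
    effective_gpu = gpu_ms / (n_gpus * sli_efficiency) if n_gpus > 1 else gpu_ms
    frame_ms = max(cpu_ms, effective_gpu)  # the slower side limits the frame
    return 1000 / frame_ms

# GPU-bound case (high res): SLI helps a lot
print(fps(cpu_ms=15, gpu_ms=30))            # ~33 fps, one card
print(fps(cpu_ms=15, gpu_ms=30, n_gpus=2))  # ~53 fps, SLI (~60% gain)

# CPU-bound case (low res): SLI barely helps
print(fps(cpu_ms=25, gpu_ms=30))            # ~33 fps, one card
print(fps(cpu_ms=25, gpu_ms=30, n_gpus=2))  # ~40 fps, capped by the CPU
```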
 
My 680i LT is certainly not the latest motherboard, let alone the best of its time. However, I'm pretty certain it runs two full 16x PCI-E slots during SLI, so theoretically the bandwidth shouldn't be an issue, unless the 680i chip itself is just too slow for these GPUs or something. Other than that, it really could only be my CPU limiting these cards.

I'm using nVidia's latest driver release from 5 days ago, so I figure it's not a driver issue. I honestly haven't tried Crysis Warhead @ Enthusiast yet; I'll have to do that tonight and see if there's any FPS change.

However, why I get such low 3DMark06 scores is really what puzzles me. I logged into ORB and confirmed that my single-GPU score was 15.8k, whereas my last SLI run through 3DMark06 using nVidia profiles was 16.1k. Of course, my 15.8k run was with older drivers and Vista 64-bit.

Now I'm using newer drivers, SLI, and Win 7 64-bit, so the 300-400 point difference is likely just drivers and OS. I'd imagine 3DMark06, for whatever reason, just isn't getting to use both GPUs.
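
Put as a percentage, the jump barely registers (quick sketch using the two ORB scores above, rounded):

```python
# Percent change between my two recorded 3DMark06 runs (ORB scores above).
single_gpu = 15800   # Vista 64-bit, one GTX 260 Core 216, older drivers
sli = 16100          # Win 7 64-bit, SLI, newer drivers

print(f"Gain: {(sli - single_gpu) / single_gpu:.1%}")  # ~1.9%
# Within driver/OS noise -- nowhere near the 40-75% you'd expect
# if the second GPU were actually contributing.
```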

Everything else seems to be using them as far as I can tell (FEAR 2 and Crysis Warhead seem to benefit). And Age of Conan seems to recognize both cards, because there's a severe FPS drop in that game (which apparently is an issue for all SLI users with AoC).
 


Well, I never said that x16 was not enough; my question is why I would get higher scores using a dual core compared to his quad. I will run an instance of 3DMark06 @ 3.2 GHz just to compare scores.
 
Well, the other issue could be the difference between PCI-E 1.0 on my board versus PCI-E 2.0 on newer boards. Though even with any limitation there, I don't see why I'd get virtually no gain in 3DMark06.

Anyhow, I'll have to try the Vantage trial version tonight and see what it shows.
 
With a single GTX 260 (216):

@ 3.2 GHz the score is 14,565

@ 4.0 GHz the score is 16,621

Of course, there are two major factors here: I'm using Windows XP Pro and my CPU is a dual core. So all in all, I can't directly compare my scores to whatever you guys have.
 
Yes, but it does give some insight. Ultimately, the more relevant factors are your specific GPU scores, i.e. the SM2.0 and SM3.0 scores from 3DMark06; that way we're not factoring in the CPU as much.
 

iode

Okay, here are my full 3DMark06 settings and results:

Settings:
SM2.0 graphics tests 2/2
CPU tests 2/2
HDR/SM3.0 graphics tests 2/2
Feature tests 7/7
Batch size tests 6/6
Resolution: 1440x900
Anti-aliasing: 8 samples
Anisotropic filtering
HLSL VS target: 3_0
HLSL PS target: 3_0

Scores:
3DMark Score: 16222
SM2.0 Score: 6910
HDR/SM3.0 Score: 7720
CPU Score: 3957

System specs:
AMD Phenom II X3 720 @ 3.6 GHz
nVidia GeForce GTX 260s SLI'd (one Core 192, one Core 216)
ASRock K10N780SLIX3 motherboard
Corsair 750W power supply
4 GB DDR2 800 5-5-5-15

Anyone else want to post their full settings and specs for a good comparison?
 
I just tested my system with the 3DMark Vantage demo/trial (with 2x GTX 260 Core 216):

3DMark Score: P19945
CPU Score: 33807
Graphics Score: 17547


As for my 3DMark06 test, this is the last one I recorded through ORB (with Vista 64-bit and one GTX 260 Core 216):

3DMark Score: 15688
SM2.0 Score: 6387
SM3.0 Score: 7180
CPU Score: 4404
 

iode

I have 3DMark Vantage, but I'm having some trouble getting it to work at the moment. I've gotten it to work before; I just can't manage it right now. I'll post my scores ASAP once it's up and running. Anyway, I heard that 3DMark06 has really bad SLI scaling for nVidia's 200-series cards; is that true?
BTW, jerreece, what settings were you running your 3DMarks at?
 


I actually just finished running 3DMark06 with the settings you posted. Here are the results:

3DMark Score: 15776
SM2.0 Score: 6015
HDR/SM3.0 Score: 7637
CPU Score: 4419
 

one-shot

I actually have two GTX 260 Core 216s in SLI; here are my scores.

Full Score: 24502
SM2.0: 9494
SM3.0: 12437
CPU: 6118

Core i7 @ 3.9 GHz
GTX 260 Core 216s, overclocked.

With one GPU and the i7 @ 3.7 GHz, here are my scores:

Full Score: 18751
SM2.0: 7890
SM3.0: 8025
CPU: 5659

Stock CPU and GPU:

Full Score: 15546
SM2.0: 6261
SM3.0: 6783
CPU: 4922


I don't seem to have any issues with scaling. Poor scaling is often largely down to a CPU limitation. Not a bottleneck, a limitation. Even I'm CPU limited @ 3.9 GHz with my two GPUs. Overclock your CPU a little and you should see your score rise; I gained just over 3k from a CPU and minor GPU overclock.
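
If you want to sanity-check scaling from my two runs, compare the GPU sub-scores rather than the totals (a rough sketch; note my SLI run was also at a higher CPU clock, which flatters the ratios a bit):

```python
# GPU sub-score ratios between my SLI and single-GPU runs above.
# Caveat: the SLI run was at 3.9 GHz vs 3.7 GHz for the single run,
# so part of the gain is CPU clock, not just the second card.
sm2_single, sm2_sli = 7890, 9494
sm3_single, sm3_sli = 8025, 12437

print(f"SM2.0 scaling: {sm2_sli / sm2_single - 1:.0%}")  # ~20% (more CPU-limited test)
print(f"SM3.0 scaling: {sm3_sli / sm3_single - 1:.0%}")  # ~55% (closer to expected SLI gain)
```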
 

iode



Hmm, those results just make the inconsistency even more confusing. I don't understand why your SM2.0 and SM3.0 scores are so large. Aren't they CPU-independent GPU tests? Why do they differ so greatly from the other GTX 260 SLI scores here? jerreece's results and mine came out pretty consistent with each other, but yours are sky high with the same GPU configuration. I'm so confused :(