nVidia GeForce GTX 260 SLI

October 9, 2009 8:02:09 PM

This is my motherboard:
http://www.newegg.com/Product/Product.aspx?Item=N82E16813157134

I used to have a single nVidia GeForce GTX 260 Core 192 in this board, and I was able to run Crysis Warhead on maxed out settings at 1440x900 resolution at about 32-38 fps. I recently got another graphics card to run in SLI. This time I got a GTX 260 Core 216. I went into the nVidia control panel and enabled SLI, and I checked GPU-Z and it said I was indeed running in SLI mode. I started up Crysis Warhead again on maxed out settings, same resolution, and played the same part of the game at about 34-45 fps. That said, I was a bit disappointed that I spent an extra $160 on another GTX 260 for such minimal performance gains :sweat:  . Does anyone have any advice or opinions on why this happens (or if it's supposed to happen at all)?

My system:
AMD Phenom II X3 720 @ 3.6 GHz
4 GB G.Skill DDR2 800 5-5-5-15
2x GTX 260 in SLI
750W Corsair power supply
Windows 7 Professional 64-bit

PS. Since my motherboard supports up to 3 graphics cards running in SLI (though they would then be running at x8/x8/x8 instead of x16/x16), do you think it would be to my benefit in the future to get another GTX 260 and run it in triple SLI?

Any help or advice would be greatly appreciated, thanks!!
October 9, 2009 8:14:18 PM

When you had just one card, did you run the game at max also? Sorry, I see you did. Did you change any settings?
October 9, 2009 8:19:11 PM

I have the same motherboard you are using.... I have used this mobo with dual GTS 250's and with a single GTX 260. In my opinion you will be wasting your time if you decide to install a third GPU on that board. What I would do is run 3DMark06 with a single GTX 260 and then do a second test with both GPUs and compare the scores; this way you will know the difference between having 1 or 2 GPUs.....

http://www.futuremark.com/benchmarks/3dmark06/introduct...
October 9, 2009 8:25:20 PM

baddad said:
When you had just one card, did you run the game at max also? Sorry, I see you did. Did you change any settings?

No, no settings were changed. Both trials were done maxed out @ 1440x900 res
October 9, 2009 8:27:05 PM

OvrClkr said:
I have the same motherboard you are using.... I have used this mobo with dual GTS 250's and with a single GTX 260. In my opinion you will be wasting your time if you decide to install a third GPU on that board. What I would do is run 3DMark06 with a single GTX 260 and then do a second test with both GPUs and compare the scores; this way you will know the difference between having 1 or 2 GPUs.....

http://www.futuremark.com/benchmarks/3dmark06/introduct...

Ah okay, I have 3DMark Vantage; it came with my XFX graphics card (along with this neat little doorknob sign). Will that work?
October 9, 2009 8:29:01 PM

When I had dual 250's installed the only difference in FPS that I could see was when I upped the xAA ..... As far as FPS with xAA off, I saw maybe a 10 frame increase.....
October 9, 2009 8:31:51 PM

Just download 3DMark06; it is free and it will help you see the difference between a single GPU and SLI.....
October 10, 2009 5:22:23 PM

I just installed a 2nd GTX 260 Core 216 a couple days ago.

Q6600 @ 3.2GHz
XFX 680i LT SLI
2x MSI GTX 260 Core 216 SLI (655MHz)
4GB Corsair XMS2 DDR2 @ 800MHz
Win 7 RC

I hardly saw any change in 3dMark06 (From 15k to 16k after SLI).

However, I can play Crysis Warhead with all settings @ "Gamer" with a resolution of 1680x1050 (Crysis won't do 1920x1080 for some reason) and 4xAA, in DX10 mode, and was showing a solid 30FPS (Fraps) for a couple hours.

EDIT: BTW, forget triple SLI. Waste of time, waste of money, waste of resources.
October 10, 2009 5:38:32 PM

^ You have something wrong with your card or SLi setup.

You should be able to easily max out Crysis at 1680x1050 at Enthusiast mode with AA turned up, and I've read reviews that said two GTX260s can also do 1920x1080 at Very High (enthusiast) with lowered AA...and get well above 30fps average.


1. Try running the free version of Vantage to see if there is any difference
2. Is SLi enabled in your control panel? And do you have the SLi bridge?
3. Does the 680i LT Sli have enough PCIe bandwidth?
October 10, 2009 5:47:10 PM

Bluescreendeath said:
^ You have something wrong with your card or SLi setup.

You should be able to easily max out Crysis at 1680x1050 at Enthusiast mode with AA turned up, and I've read reviews that said two GTX260s can also do 1920x1080 at Very High (enthusiast) with lowered AA...and get well above 30fps average.


1. Try running the free version of Vantage to see if there is any difference
2. Is SLi enabled in your control panel? And do you have the SLi bridge?
3. Does the 680i LT Sli have enough PCIe bandwidth?


1) Haven't tried Vantage yet. Just 3dMark06 with default settings.

2) SLI is indeed enabled in the nVidia control panel. I've also used nHancer to activate 3dMark06 nVidia profiles. Same results. The SLI bridge is indeed connected between both cards. I know they're both getting used, because they both heat up and both fans spin fast/loud when gaming in Crysis.

3) I believe the 680i LT SLI has 2 x16 PCI-E slots, so I theoretically shouldn't be bottlenecked due to that.

As far as Crysis Warhead goes, the game literally will not let me activate 1920x1080. It gives it to me as an option, but once I hit apply it reverts to 1024x? Very strange. It will let me Apply 1680x1050 settings, so that's where I've been playing it.

I can play Fear 2 @ 60FPS without a hiccup with absolutely everything maxed @ 1920x1080 though.

BTW, I'm using the latest 191.07 nVidia Forceware drivers.
October 10, 2009 5:51:14 PM

I think his mobo is the bottleneck. I have the same mobo as the OP with a single 260, and at 3.7GHz I get over 16k in 3DMark06. Back when I had dual 250's I would get almost 18k at 3.6GHz..... And btw I am using a cheap dual core....
October 10, 2009 6:07:43 PM

Bottleneck? By what? Isn't x16/x16 the best? I went out of my way to buy a board with x16 in SLI config.
October 10, 2009 6:16:43 PM

I am referring to jerreece, he has an old 680i ........ Of course I am not 100% sure it is his mobo or drivers.... But there is definitely something wrong there....
October 10, 2009 6:21:15 PM

^ I have no clue what he's talking about. PCIe x16 is enough.

No, your mobo is not the bottleneck. At such a low resolution it's more CPU dependent than GPU. I have 2 GTX 260s in SLI; at 1680x1050 with Crysis maxed out I get 40fps, and when I upped the resolution to 1920x1080 there was only a 4-5fps drop. SLI isn't supposed to give you twice the power, but rather a 40-75% improvement depending on the game.
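
To put rough numbers on that against the framerates reported at the top of the thread, here's a quick back-of-envelope sketch in Python (illustrative only; the inputs are just the midpoints of the posted FPS ranges):

# Back-of-envelope on the FPS ranges reported above:
# single GTX 260: 32-38 fps, SLI: 34-45 fps (midpoints used).
single_fps = (32 + 38) / 2          # 35.0
sli_fps = (34 + 45) / 2             # 39.5

gain = sli_fps / single_fps - 1
print(f"observed SLI gain: {gain:.0%}")          # ~13%

# What 40-75% scaling would have produced instead:
for scaling in (0.40, 0.75):
    print(f"{scaling:.0%} scaling -> {single_fps * (1 + scaling):.0f} fps")
# 40% -> 49 fps, 75% -> 61 fps: far above what the OP sees,
# which points at a CPU or config limit, not normal SLI scaling.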
October 10, 2009 6:34:51 PM

My 680i LT is certainly not the latest motherboard, let alone the best of its time. However, I'm pretty certain it runs 2 full x16 PCI-E slots during SLI, so theoretically the bandwidth shouldn't be an issue. Unless the 680i chip itself is just too slow for these GPUs or something. Other than that, it really could only be my CPU that could limit these cards.

I'm using nVidia's latest driver release from 5 days ago, so I'd figure it's not a driver issue. I honestly haven't tried Crysis Warhead @ Enthusiast yet; I'll have to do that tonight and see if there's any FPS change.

However, why I get such low 3dMark06 scores is really what puzzles me. I logged into ORB and confirmed that my single GPU score was 15.8K, whereas my last SLI run through 3dMark06 using nVidia profiles was 16.1K. Of course, my 15.8K run was with older drivers, and with Vista 64-bit.

Now I'm using newer drivers, SLI, and Win 7 64-bit, so the 300 to 400 point difference is likely just drivers and OS. I'd imagine 3dMark06 for whatever reason just isn't getting to use both GPUs.

Everything else seems to be using them though as far as I can tell (Fear 2 & Crysis Warhead seem to benefit). And Age of Conan seems to recognize both cards, because there's a severe drop in FPS with that game (which apparently is an issue for all SLI users with AoC).
October 10, 2009 6:39:13 PM

invisik said:
^ I have no clue what he's talking about. PCIe x16 is enough.

No, your mobo is not the bottleneck. At such a low resolution it's more CPU dependent than GPU. I have 2 GTX 260s in SLI; at 1680x1050 with Crysis maxed out I get 40fps, and when I upped the resolution to 1920x1080 there was only a 4-5fps drop. SLI isn't supposed to give you twice the power, but rather a 40-75% improvement depending on the game.


Well, I never said that x16 was not enough; my question is why would I get higher scores using a dual core compared to his quad? I will run an instance of 3DMark06 @ 3.2GHz just to compare scores.
October 10, 2009 6:43:31 PM

Well, the other issue could be the difference between PCI-E 1.0 on my board versus PCI-E 2.0 on newer boards. Though even with any limitation there, I don't see why there'd be no effect at all in 3dMark06.

Anyhow, I'll have to try Vantage trial version tonight and see what it shows.
October 10, 2009 7:07:15 PM

With a single GTX 260 (216):

@ 3.2GHz the score is 14,565

@ 4.0GHz the score is 16,621

Of course, there are 2 major factors here:

I'm using Windows XP Pro and my CPU is a dual core, so all in all I cannot compare my scores to whatever you guys have.
October 10, 2009 8:23:37 PM

Yes, but it does give some sort of insight. Ultimately, the more relevant factors are your specific GPU scores: the SM2 and SM3 scores from 3dMark06. That way we're not necessarily factoring in the CPU 'as much.'
October 10, 2009 10:41:00 PM

Okay, here are my full 3dmark06 settings and results:

Settings:
Sm2.0 graphics tests 2/2
CPU tests 2/2
HDR/SM3.0 Graphics tests 2/2
Feature tests 7/7
Batch Size Tests 6/6
Resolution 1440x900
AntiAliasing 8 Sample
Anisotropic Filtering
HLSL VS Target 3_0
HLSL PS Target 3_0

Scores:
16222 3Dmarks
SM2.0 Score 6910
HDR/SM3.0 Score 7720
CPU Score 3957

System Specs:
AMD Phenom II 720 X3 @ 3.6 GHz
nVidia GeForce GTX 260 SLI'd with a Core 192 and a Core 216
ASRock K10N780SLIX3 Motherboard
Corsair 750W Power Supply
4 GB DDR2 800 5-5-5-15

Anyone else want to post their full settings and specs for a good comparison?
October 11, 2009 4:44:06 AM

I just tested my system with the Demo/Trial 3dMark Vantage. (with 2 x GTX 260 Core 216)

3DMark Score
P19945 3DMarks
CPU Score
33807
Graphics Score
17547


As far as my 3dMark06 test, this is the last one I recorded through ORB (with Vista 64bit & 1 GTX 260 Core 216.)

3DMark Score
15688 3DMarks
SM 2.0 Score
6387
SM 3.0 Score
7180
CPU Score
4404
October 11, 2009 5:12:49 AM

I have 3DMark Vantage, but I'm having some trouble getting it to work at the moment. I have gotten it to work before, but I just can't manage to do it right now. I'll post my scores ASAP once it gets up and running. Anyways, I heard that 3dMark06 has some really bad SLI scaling for Nvidia's 200 series cards, is that true?
BTW, jerreece, what settings were you running your 3DMarks at?
October 11, 2009 5:40:41 AM

iode said:
I have 3DMark Vantage, but I'm having some trouble getting it to work at the moment. I have gotten it to work before, but I just can't manage to do it right now. I'll post my scores ASAP once it gets up and running. Anyways, I heard that 3dMark06 has some really bad SLI scaling for Nvidia's 200 series cards, is that true?
BTW, jerreece, what settings were you running your 3DMarks at?


I actually just got finished running 3dMark06 with the settings you posted. Here are the results:

3DMark Score
15776 3DMarks
SM 2.0 Score
6015
HDR/SM 3.0 Score
7637
CPU Score
4419
October 11, 2009 9:25:08 AM

I actually have 2 GTX 260 Core 216s in SLI. Here are my scores.

Full Score 24502
SM 2.0 9494
SM 3.0 12437
CPU 6118

Core i7 @ 3.9GHz
GTX 260 Core 216 overclocked.

With one GPU and the i7 @ 3.7GHz, here are my scores

Full Score 18751
SM 2.0 7890
SM 3.0 8025
CPU 5659

Stock CPU and GPU

Full Score 15546
SM 2.0 6261
SM 3.0 6783
CPU 4922


I don't seem to have any issues with scaling. Scaling can be heavily impacted by a CPU limitation. Not a bottleneck, a limitation. Even I'm CPU limited @ 3.9GHz with my 2 GPUs. Overclock your CPU a little and you should see your score rise. I gained just over 3k with a CPU and minor GPU overclock.
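
For what it's worth, those scores already put a number on 3DMark06 SLI scaling (a trivial calculation on the figures above; note the CPU also moved from 3.7 to 3.9GHz between runs, so the GPU-only gain is a bit smaller still):

# one-shot's own posted 3DMark06 totals:
single_gpu_37ghz = 18751   # one GPU, i7 @ 3.7GHz
sli_39ghz = 24502          # two GPUs, i7 @ 3.9GHz, minor GPU OC

print(f"SLI gain: {sli_39ghz / single_gpu_37ghz - 1:.0%}")   # ~31%
# Even on a fast i7 the composite 3DMark06 score only moves ~31%,
# because the CPU test and CPU-bound portions don't scale with GPUs.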
October 11, 2009 9:33:57 AM

one-shot said:
I actually have 2 GTX 260 Core 216s in SLI. Here are my scores.

Full Score 24502
SM 2.0 9494
SM 3.0 12437
CPU 6118

Core i7 @ 3.9GHz
GTX 260 Core 216 overclocked.

With one GPU and the i7 @ 3.7GHz, here are my scores

Full Score 18751
SM 2.0 7890
SM 3.0 8025
CPU 5659

Stock CPU and GPU

Full Score 15546
SM 2.0 6261
SM 3.0 6783
CPU 4922


I don't seem to have any issues with scaling. Scaling can be heavily impacted by a CPU limitation. Not a bottleneck, a limitation. Even I'm CPU limited @ 3.9GHz with my 2 GPUs. Overclock your CPU a little and you should see your score rise. I gained just over 3k with a CPU and minor GPU overclock.


Hmm, those results just aggravate the inconsistency problem even more. I don't understand why your SM2.0 and SM3.0 scores are so large. Aren't they CPU-independent GPU tests? Why do they differ so greatly from the other GTX 260 SLI scores here? jerreece's results and mine came out pretty consistent with each other's, but yours are just sky high with the same GPU configuration. I'm so confused :( 
October 11, 2009 5:11:27 PM

I guess we all have to at least have the same OS and a quad (around 3.2GHz) in order to get some good comparisons. Try comparing your scores to others on the website; maybe you will find someone that has more or less your configuration.
October 11, 2009 7:24:40 PM

Keep in mind he's running a new i7 920 with the X58 chipset. He's going to beat us both hands down anyhow. Seeing his scores, ultimately I believe my 680i chipset and older Q6600 are what limit my GPUs. Really can't be any other way around it.
October 11, 2009 8:06:09 PM

My friend has a Q6600 with 2 9600GTs. With his CPU @ stock speeds (2.4GHz), he gets around 11k; when he overclocked his CPU to 2.8GHz, he got ~14.5k. Overclocking makes a huge difference. With my CPU @ 3.7GHz and 2 GPUs mildly overclocked I can get around 22.5k in 3DMark06. Keep in mind, though, that I have the Core 216 variant which allows for a little more performance.
October 11, 2009 8:28:55 PM

On a side note...
I'm finally getting a new video card :love:  for my rig that I built 12 months ago (after waiting for years). I was using a friend's old Radeon X1600; it's the only thing I haven't really replaced.

Specs.
Gigabyte X48-DS4
3.00 gigahertz Intel Core2 Duo Q6600
DELL S2409W [Monitor]
3328 Megabytes Usable Installed Memory
Windows XP

So I see this conversation as relevant. I was planning on getting the GTX 260 for roughly $200 now and hopefully going SLI within 6 months, give or take, whenever I see a reasonable price drop. My real question is: do you guys think it is a good value for the money, or is it the best fit with my system?
I just got RE5 and the X1600 is too old to play it on, and mostly I am just looking forward to spending a lot of close personal time with Dragon Age: Origins.
Any recommendations are helpful; I look to purchase by the end of the month with the top of my scale being $200-250.
October 11, 2009 8:34:12 PM

kamita said:
On a side note...
I'm finally getting a new video card :love:  for my rig that I built 12 months ago (after waiting for years). I was using a friend's old Radeon X1600; it's the only thing I haven't really replaced.

Specs.
Gigabyte X48-DS4
3.00 gigahertz Intel Core2 Duo Q6600
DELL S2409W [Monitor]
3328 Megabytes Usable Installed Memory
Windows XP

So I see this conversation as relevant. I was planning on getting the GTX 260 for roughly $200 now and hopefully going SLI within 6 months, give or take, whenever I see a reasonable price drop. My real question is: do you guys think it is a good value for the money, or is it the best fit with my system?
I just got RE5 and the X1600 is too old to play it on, and mostly I am just looking forward to spending a lot of close personal time with Dragon Age: Origins.
Any recommendations are helpful; I look to purchase by the end of the month with the top of my scale being $200-250.


At $200 the GTX 260 is way overpriced. Get an HD 4890. It's cheaper and is on par with the GTX 275 in most games and the GTX 285 in some games. SLi really isn't cost effective right now, especially with a generation-old card. If you can spend more, there's a better option: the 5870 at $380 (or a 5850 for $260); and if you can wait [who knows how long], whenever Nvidia's 300 series comes out you could get that.
October 11, 2009 9:11:35 PM

You can find the cheapest GTX 260s for around $160, so it really depends on whether you want to pay $200 for the next step up (GTX 275/4890) or $259 for the DX11 5850.
October 11, 2009 9:52:45 PM

If I were to buy now I'd get a 5850, and if I couldn't find any of those I'd get 2 4890s for Crossfire. A GTX 260 Core 216 should be well below $200 now. I would get one GPU and see how it plays all of your games. If you do SLI or Crossfire you need to overclock your CPU to get more out of it. So again, try one card; if it works great, you don't need another.
October 11, 2009 10:30:37 PM

Nice thread hijack there, now who is responding to which poster's questions?
October 12, 2009 7:02:29 AM

Hey bro, it's your CPU that has become the bottleneck now. Try running with the same settings and a higher resolution; I bet you get near the same fps...
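
The logic of that test, as a minimal sketch (the function name and tolerance are illustrative, and the sample numbers borrow invisik's earlier figures):

# Same settings, two resolutions. If the FPS barely drops when the
# pixel count rises, the GPU has headroom and the CPU is the limit.
def looks_cpu_bound(fps_low_res, fps_high_res, tolerance=0.10):
    """True if the FPS drop from low to high resolution is within
    `tolerance` (as a fraction of the low-resolution FPS)."""
    return (fps_low_res - fps_high_res) / fps_low_res <= tolerance

# invisik's numbers: 40 fps at 1680x1050, ~35-36 fps at 1920x1080
print(looks_cpu_bound(40, 36))   # True -> likely CPU limited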
October 12, 2009 11:56:02 AM

harteman said:
Hey bro, it's your CPU that has become the bottleneck now. Try running with the same settings and a higher resolution; I bet you get near the same fps...

So you're saying that the CPU is the reason jerreece and I are getting similar 3DMark numbers while one-shot is getting much higher ones? (720 vs Q6600 vs i7) This is the first time I've encountered CPU bottlenecks; I always thought bottlenecks would come from some other hardware, like the hard drive or something.
Anonymous
October 12, 2009 1:56:56 PM

jerreece said:
Keep in mind he's running a new i7 920 with the X58 chipset. He's going to beat us both hands down anyhow. Seeing his scores, ultimately I believe my 680i chipset, and older Q6600 are what limit my GPUs. Really can't be any other way around it.


A Q6600 at 3.0GHz is plenty for a GTX 260 SLI setup... so don't worry. Maybe an OC to 3.2GHz will help a little, but not too much.
October 12, 2009 3:44:31 PM

@iode

Quote:
System Specs:
AMD Phenom II 720 x3 @ 3.6 ghz
nVidia Geforce GTX 260 SLI'd with a core 192 and core 216
ASRock K10N780SLIX3 Motherboard
Corsair 750W Power Supply
4 GB DDR2 800 5-5-5-15


First off, I see a problem with you pairing a 192 SP card with a 216 SP card, even though it seems to be working OK in SLI. You're really only supposed to pair identical cards, i.e. same clock, same memory, and definitely same SP (stream processor) count.

Also, 3DM06 favors Intel over AMD anyway; you'll get truer results running 3DMark Vantage on either Vista or Win7 as far as an AMD vs Intel comparison goes. When you ask others to post their scores to see if your SLI pairing is a problem, there's no true comparison unless they test on the exact same setup as yours.

You can get an idea as to what's going on if you test 3DM06 on each card individually hardware-wise, meaning only one card physically installed in the machine at a time. That will tell you how much performance difference the 2 cards produce individually.

I believe one card is hurting the other in SLI simply because they're not identical cards, and if you discover a major performance difference between the two cards individually, that would probably indicate the pairing is the problem.
October 12, 2009 4:35:04 PM

iode said:
Okay, here are my full 3dmark06 settings and results:

Settings:
Sm2.0 graphics tests 2/2
CPU tests 2/2
HDR/SM3.0 Graphics tests 2/2
Feature tests 7/7
Batch Size Tests 6/6
Resolution 1440x900
AntiAliasing 8 Sample
Anisotropic Filtering
HLSL VS Target 3_0
HLSL PS Target 3_0

Scores:
16222 3Dmarks
SM2.0 Score 6910
HDR/SM3.0 Score 7720
CPU Score 3957

System Specs:
AMD Phenom II 720 X3 @ 3.6 GHz
nVidia GeForce GTX 260 SLI'd with a Core 192 and a Core 216
ASRock K10N780SLIX3 Motherboard
Corsair 750W Power Supply
4 GB DDR2 800 5-5-5-15

Anyone else want to post their full settings and specs for a good comparison?



I ran the bench with your same settings: 1440x900, Anti-Aliasing 8, Anisotropic Filtering, all tests enabled.

Scores:
19,582 3Dmarks
SM2.0 Score 8102
HDR/SM3.0 Score 8298
CPU Score 6248

Disregard my CPU score; as I already told you, 3DM06 favors Intel anyway. The reason I posted my scores is the SM2 and SM3: even with a lower CPU score, the SM2 and SM3 should be higher because that's graphics dependent. Granted, the CPU does have an effect there, but those tests are more GPU, period.

FYI: when you run 3DM06 you really don't have to run all the feature and batch tests to get a score; just the GPU and CPU tests are required, which takes less time.

I still think if your GPUs were identical your SM2 and SM3 would be higher. Are there any differences in the GPU core clock speed and memory speed on those cards of yours? Just curious.

Sorry, I should have mentioned I'm running 2 MSI GTX 260 216SP cards in SLI.
October 12, 2009 5:32:25 PM

4Ryan6 said:
I ran the bench with your same settings: 1440x900, Anti-Aliasing 8, Anisotropic Filtering, all tests enabled.

Scores:
19,582 3Dmarks
SM2.0 Score 8102
HDR/SM3.0 Score 8298
CPU Score 6248

Disregard my CPU score; as I already told you, 3DM06 favors Intel anyway. The reason I posted my scores is the SM2 and SM3: even with a lower CPU score, the SM2 and SM3 should be higher because that's graphics dependent. Granted, the CPU does have an effect there, but those tests are more GPU, period.

FYI: when you run 3DM06 you really don't have to run all the feature and batch tests to get a score; just the GPU and CPU tests are required, which takes less time.

I still think if your GPUs were identical your SM2 and SM3 would be higher. Are there any differences in the GPU core clock speed and memory speed on those cards of yours? Just curious.

Sorry, I should have mentioned I'm running 2 MSI GTX 260 216SP cards in SLI.

Here is the GPU-Z screenshot of both my GTX 260s:

As you can see, NVIDIA SLI is detected and enabled. The Core 192, with its higher GPU clock, seems to outperform the Core 216 in both pixel fillrate and texture fillrate, as well as having higher memory bandwidth. (Note: I never touched these cards out of the box; apparently, my 192 was factory overclocked.) I read somewhere that Nvidia designed the 192 and the 216 so they can be paired without detriment, that they are essentially identical cards, so I'm not sure if it's the difference between the 216 and the 192 shader cores that's causing possibly lower 3DMark values. I also read somewhere that when two cards are in SLI, their clock, memory, and shader speeds are aligned and made identical. Why isn't that the case here?

PS: I'm running Windows 7 Ultimate Retail (as provided by my university)
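
For reference, the figures GPU-Z reports fall straight out of unit counts and clocks, so the mismatch is easy to reproduce. A rough Python sketch; the 28 ROPs, 448-bit bus, and 64 vs 72 TMU counts are standard GTX 260 specs, while the clocks are the stock/factory-OC pairs quoted elsewhere in this thread, not necessarily these exact cards:

# GPU-Z derives these figures from unit counts and clocks.
def gpu_z_figures(core_mhz, mem_mhz, tmus, rops=28, bus_bits=448):
    pixel = core_mhz * rops / 1000                  # GPixel/s
    texel = core_mhz * tmus / 1000                  # GTexel/s
    bandwidth = mem_mhz * 2 * bus_bits / 8 / 1000   # GB/s (GDDR3 is DDR)
    return pixel, texel, bandwidth

print(gpu_z_figures(626, 1053, tmus=64))  # factory-OC Core 192: ~17.5, 40.1, 117.9
print(gpu_z_figures(576, 999, tmus=72))   # stock Core 216:      ~16.1, 41.5, 111.9
# A factory-overclocked 192 can beat a stock 216 on pixel fillrate and
# memory bandwidth despite having fewer shaders, consistent with the screenshot.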
October 12, 2009 6:25:27 PM

The biggest negative I see in your screenshot is the fact that they're 2 completely different core manufacturing processes, 55nm vs 65nm, plus stream processor count, core clock, memory clock, and shader clock differences.

Have you consulted Nvidia support?

Actually it's amazing they'll even run in SLI; I'd say kudos to Nvidia's new drivers. But what are they actually clocking down to, to be able to run together in the first place?

Since they're 2 completely different GPU manufacturing processes, you cannot even BIOS flash them to the same settings. I'm surprised Nvidia would have allowed 2 different core manufacturing processes to have the same GTX 260 branding, period.

Quote:
I read somewhere that Nvidia designed the 192 and the 216 so they can be paired without detriment, that they are essentially identical cards


Post that article if you will, because in your screenshot they're definitely not 2 identical cards.

Also what are their individual 3DM06 scores?
October 12, 2009 6:50:39 PM

4Ryan6 said:
The biggest negative I see in your screenshot is the fact that they're 2 completely different core manufacturing processes, 55nm vs 65nm, plus stream processor count, core clock, memory clock, and shader clock differences.

Have you consulted Nvidia support?

Actually it's amazing they'll even run in SLI; I'd say kudos to Nvidia's new drivers. But what are they actually clocking down to, to be able to run together in the first place?

Since they're 2 completely different GPU manufacturing processes, you cannot even BIOS flash them to the same settings. I'm surprised Nvidia would have allowed 2 different core manufacturing processes to have the same GTX 260 branding, period.

Quote:
I read somewhere that Nvidia designed the 192 and the 216 so they can be paired without detriment, that they are essentially identical cards


Post that article if you will, because in your screenshot they're definitely not 2 identical cards.

Also what are their individual 3DM06 scores?

No, I've never consulted Nvidia support, because I don't know what's wrong, or if there really is something wrong. I just wanted to know if my results were consistent with the rest of the world's. I don't want to be lacking in performance, especially after spending so much money buying the cards; it feels like a waste.

At your request I dug up the article where I read this: http://www.firingsquad.com/hardware/nvidia_geforce_gtx_260_216shader/page2.asp

They say that "If you’ve already got a GeForce GTX 260 and would like to purchase another for SLI, we can confirm that the new 216 shader GTX 260 boards are 100% compatible with the 192-shader GTX 260, allowing both GPUs to be combined together for SLI. Each board will run with all its shaders enabled, giving you a grand total of 408 shaders for the SLI system."

Is there a way to test individual graphics cards without manually taking them out of my motherboard? I'm currently in college and don't have the tools to perform maintenance on my computer at the moment.
October 12, 2009 7:09:44 PM

Interesting article, Thanks for looking that up!

Quote:
Is there a way to test individual graphics cards without manually taking them out of my motherboard? I'm currently in college and don't have the tools to perform maintenance on my computer at the moment.


Well, you could disable SLI in the Nvidia control panel and swap the monitor connection back and forth.
October 12, 2009 7:14:23 PM

One of my GPUs was factory overclocked even though I ordered the same version as the other one. The faster one just downclocks to the slower card's speeds, but I set them both to run at the overclocked card's speed, which is 626-1350-1053; otherwise they run at 576-1242-999. I would take each card out and manually test each one to try and see if indeed there is a problem. You also have all power connectors plugged into the GPUs, right?
October 12, 2009 9:14:11 PM

Quote:
A Q6600 at 3.0GHz is plenty for a GTX 260 SLI setup... so don't worry. Maybe an OC to 3.2GHz will help a little, but not too much.


I'm actually @ 3.2GHz now, and unfortunately this particular 680i LT board (or CPU) won't let me do anything higher. Haven't been able to get a stable OC higher than that. :( 

4Ryan6 said:
I ran the bench with your same settings: 1440x900, Anti-Aliasing 8, Anisotropic Filtering, all tests enabled.

Scores:
19,582 3Dmarks
SM2.0 Score 8102
HDR/SM3.0 Score 8298
CPU Score 6248

Disregard my CPU score; as I already told you, 3DM06 favors Intel anyway. The reason I posted my scores is the SM2 and SM3: even with a lower CPU score, the SM2 and SM3 should be higher because that's graphics dependent. Granted, the CPU does have an effect there, but those tests are more GPU, period.

FYI: when you run 3DM06 you really don't have to run all the feature and batch tests to get a score; just the GPU and CPU tests are required, which takes less time.

I still think if your GPUs were identical your SM2 and SM3 would be higher. Are there any differences in the GPU core clock speed and memory speed on those cards of yours? Just curious.

Sorry, I should have mentioned I'm running 2 MSI GTX 260 216SP cards in SLI.


I'd be interested in your CPU details, since you're running the same video card setup I am and my scores are far less than yours. Likely you just have a higher CPU clock speed, I'm guessing?
October 12, 2009 9:30:33 PM

jerreece said:
I'm actually @ 3.2GHz now, and unfortunately this particular 680i LT board (or CPU) won't let me do anything higher. Haven't been able to get a stable OC higher than that. :( 



I'd be interested in your CPU details, since you're running the same video card setup I am and my scores are far less than yours. Likely you just have a higher clock speed, I'm guessing?


He's using:

Quote:
Intel Q9550 @ 3.83GHz w/ Xigmatek HDT S1283 Cooler
October 12, 2009 9:41:32 PM

Thanks. :)  I didn't get the chance to check his member config. I'm at work, and being interrupted every 2-3 minutes doesn't help with that sort of thing. :)  The difference between 3.83GHz and 3.2GHz I think is the telling factor between his score and mine.

I think it's somewhat safe to say my SLI config is being limited by my CPU.
October 12, 2009 10:34:35 PM

one-shot said:
One of my GPUs was factory overclocked even though I ordered the same version as the other one. The faster one just downclocks to the slower card's speeds, but I set them both to run at the overclocked card's speed, which is 626-1350-1053; otherwise they run at 576-1242-999. I would take each card out and manually test each one to try and see if indeed there is a problem. You also have all power connectors plugged into the GPUs, right?

Did you look at the screenshots I posted earlier? My two 260's are running at different clock speeds even in SLI. Can anyone explain that anomaly to me?
Yep, I have two 6-pin power connectors plugged into each card.
October 12, 2009 10:36:54 PM

jerreece said:
Thanks. :)  I didn't get the chance to check his member config. I'm at work, and being interrupted every 2-3 minutes doesn't help with that sort of thing. :)  The difference between 3.83GHz and 3.2GHz I think is the telling factor between his score and mine.

I think it's somewhat safe to say my SLI config is being limited by my CPU.

:??:  Do you think mine is as well? I have my CPU clocked at 3.6 GHz, yet you and I get similar numbers.
October 12, 2009 11:12:00 PM

I guess that depends on how well 3dMark06 is capable of using multiple cores. One could theorize that my 4 cores versus your 3 cores could make them equal. But I don't think 3dMark06 was optimized for quads... though I could certainly be wrong.

Although with a 192 core and a 216 core, I don't know whether one card might theoretically force the other to run at the slower card's abilities to make things seamless or not.
October 13, 2009 9:44:31 AM

iode said:
Did you look at the screenshots I posted earlier? My two 260's are running at different clock speeds even in SLI. Can anyone explain that anomaly to me?
Yep, I have two 6-pin power connectors plugged into each card.


Mine also show different speeds in GPU-Z, but one is still downclocked to the slower card's speeds. The same thing applies when using different speed RAM in a system. I use EVGA Precision for my GPUs and I have them set to run at the overclocked card's speed; otherwise they would run at the slowest speed. It sounds like a CPU limitation, but if games play well it should be fine until it becomes a big issue.
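
To spell out what that pairing behavior means in practice, a trivial sketch (the clock triples are the core-shader-memory values quoted above):

# Unless overridden, a mismatched SLI pair effectively runs at the
# slower card's clocks (core-shader-memory, in MHz).
oc_card    = {"core": 626, "shader": 1350, "mem": 1053}  # factory-OC card
stock_card = {"core": 576, "shader": 1242, "mem": 999}   # stock card

effective = {k: min(oc_card[k], stock_card[k]) for k in oc_card}
print(effective)   # {'core': 576, 'shader': 1242, 'mem': 999}
# A tool like EVGA Precision lets you set both cards to the faster values.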