My Gtx 660s in SLI will not scale.
Tags:
- EVGA
- Battlefield
- Nvidia
- 660
- Control Panel
- Gtx
- SLI
- Graphics
- scaling
- issues
Last response: in Graphics & Displays
Relentlesstroll
September 5, 2013 6:41:03 PM
Hey guys. First off, pc specs:
Gigabyte 970A-G46
Amd fx 6200 (oc at 4.1)
8gb corsair vengeance
2x SLI Evga Geforce Gtx 660 superclocked(non ti)
Thermaltake 850w PSU
So the problem I'm having is that I'm experiencing no scaling whatsoever in games such as Battlefield 3. In fact, I think my performance is worse. I've enabled SLI in the Nvidia Control Panel, made sure the SLI bridge is attached properly, turned off vsync, made sure both cards worked before install, tried forcing alternate frame rendering, etc. I've literally tried everything and I am not scaling. Nvidia Control Panel acknowledges both cards and says they're running in SLI. I've seen videos of people getting an average of about 55 fps (which is my average) using one card on BF3. Then when they SLI they get around 110, so I know this game scales on this card. I just can't figure out the problem.
This is so frustrating. Please, any advice helps guys. Thanks in advance.
ur6beersaway
September 5, 2013 6:48:21 PM
This driver was just released 8/30... 326.98 beta: http://www.guru3d.com/files_details/geforce_326_98_open...
Relentlesstroll
September 5, 2013 6:52:16 PM
ur6beersaway said:
This driver was just released 8/30... 326.98 beta: http://www.guru3d.com/files_details/geforce_326_98_open...
I'll give it a whirl and let you know the results.
ur6beersaway
September 5, 2013 6:57:49 PM
expl0itfinder
September 5, 2013 7:13:19 PM
Relentlesstroll
September 5, 2013 7:14:22 PM
Relentlesstroll
September 5, 2013 7:32:04 PM
ur6beersaway said:
This driver was just released 8/30... 326.98 beta: http://www.guru3d.com/files_details/geforce_326_98_open...
Clean-installed it as you suggested. Restarted the PC. Enabled SLI. No change.
ur6beersaway
September 5, 2013 7:50:43 PM
Test OCCT: http://www.ocbase.com/
Select GPU; at @fps set the value to zero (to disable the limit) and run. The fps is displayed top left.
Also http://www.fraps.com/download.php for in-game monitoring.
Relentlesstroll
September 5, 2013 8:00:59 PM
ur6beersaway said:
Test OCCT: http://www.ocbase.com/
Select GPU; at @fps set the value to zero (to disable the limit) and run. The fps is displayed top left.
Also http://www.fraps.com/download.php for in-game monitoring.
195-198 fps consistently for 1 min on windowed. 240 fps solid fullscreen 1 min.
Relentlesstroll
September 5, 2013 8:27:14 PM
Relentlesstroll
September 5, 2013 8:47:55 PM
renz496
Check the frame rate difference to see the scaling. Benchmark programs are always optimized by AMD and Nvidia, so they're a good place to start to check whether SLI or CF is working. If the performance didn't increase even with SLI enabled in the Control Panel, then something must be wrong. Anyway, did you use the latest beta? Have you tried different drivers?
Relentlesstroll
September 5, 2013 9:36:06 PM
renz496 said:
Check the frame rate difference to see the scaling. Benchmark programs are always optimized by AMD and Nvidia, so they're a good place to start to check whether SLI or CF is working. If the performance didn't increase even with SLI enabled in the Control Panel, then something must be wrong. Anyway, did you use the latest beta? Have you tried different drivers?
OK, I just finished running Heaven on the Extreme preset at 1600x900. The only setting I changed between the SLI and non-SLI runs was turning SLI on.
Non-SLI: 32.1 fps avg; score 807; min 7.3 fps; max 79.6 fps
SLI: 56.1 fps avg; score 1413; min 13.6 fps; max 114.5 fps
So it's working... what now?
And yes, I've tried some different drivers.
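For reference, those Heaven numbers can be turned into a scaling figure with simple arithmetic (only the posted averages are used):

```python
# SLI scaling from the Unigine Heaven results posted above.
single_fps = 32.1   # average fps, one GTX 660
sli_fps = 56.1      # average fps, both cards in SLI

speedup = sli_fps / single_fps            # raw frame-rate multiplier
efficiency = (speedup - 1) * 100          # % of the second card's ideal gain realized

print(f"speedup: {speedup:.2f}x")                 # 1.75x
print(f"scaling efficiency: {efficiency:.0f}%")   # 75% of the ideal 2x
```

Anything much above roughly 1.5x is generally considered healthy 2-way SLI, so these numbers support the "hardware is fine" conclusion below.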
renz496
So SLI is indeed working, which means your hardware is good. Personally I don't play BF3, but I've heard there were problems between Nvidia's more recent drivers and BF3. Maybe you can ask other people on the official BF3 forum which driver version actually works with BF3. What other games have you tried where SLI isn't working?
Relentlesstroll
September 5, 2013 10:05:49 PM
renz496 said:
So SLI is indeed working, which means your hardware is good. Personally I don't play BF3, but I've heard there were problems between Nvidia's more recent drivers and BF3. Maybe you can ask other people on the official BF3 forum which driver version actually works with BF3. What other games have you tried where SLI isn't working?
Minecraft, but I heard that is more CPU-dependent (correct me if I'm wrong).
Chivalry: Medieval Warfare.
Skyrim. I did have a frame increase of about 40 fps there.
That's about it. Like I said, I mainly play Battlefield, so that's what I've been trying to improve.
renz496
I looked around and found these:
http://www.minecraftforum.net/topic/1384129-minecraft-a...
http://steamcommunity.com/app/219640/discussions/0/8289...
That 40 fps in Skyrim is already an improvement. Different games will have different scaling.
Relentlesstroll
September 5, 2013 11:20:24 PM
renz496 said:
I looked around and found these:
http://www.minecraftforum.net/topic/1384129-minecraft-a...
http://steamcommunity.com/app/219640/discussions/0/8289...
That 40 fps in Skyrim is already an improvement. Different games will have different scaling.
I ran MSI Afterburner and noticed that my maximum GPU usage is only around 45-50%. Do you think the FX-6200 is bottlenecking the cards?
renz496
Actually, GPU usage depends on the game. For games that are very hungry on the GPU, like the Crysis and Metro series, it is normal for GPU usage to be high, at 99% most of the time. But for games that are quite light on the GPU, utilization can be low. People often take this as poor driver optimization, when in fact "fully optimized = 100% GPU usage" cannot be applied to all games or situations. Also, enabling v-sync can lower GPU usage if your GPU is capable of exceeding 60 fps.
About your processor, I'm not really sure, but I do think there is some kind of bottleneck.
Relentlesstroll
September 6, 2013 11:39:42 AM
renz496 said:
Check the frame rate difference to see the scaling. Benchmark programs are always optimized by AMD and Nvidia, so they're a good place to start to check whether SLI or CF is working. If the performance didn't increase even with SLI enabled in the Control Panel, then something must be wrong. Anyway, did you use the latest beta? Have you tried different drivers?
renz496 said:
Actually, GPU usage depends on the game. For games that are very hungry on the GPU, like the Crysis and Metro series, it is normal for GPU usage to be high, at 99% most of the time. But for games that are quite light on the GPU, utilization can be low. People often take this as poor driver optimization, when in fact "fully optimized = 100% GPU usage" cannot be applied to all games or situations. Also, enabling v-sync can lower GPU usage if your GPU is capable of exceeding 60 fps. About your processor, I'm not really sure, but I do think there is some kind of bottleneck.
Little update: I ran a CPU usage program while playing BF3 and it said I'm not using anywhere close to the limit of my CPU in game. So I don't think the CPU is choking the GPUs.
expl0itfinder
September 6, 2013 1:35:47 PM
CPU usage is not a good indicator either. Since Windows averages across all 6 cores, the cores that are not being utilized make overall CPU usage show up as low. It sounds like a good ole CPU bottleneck to me. Try turning down the mesh quality graphics setting in BF3 (it determines how far away enemies can be spotted, and how many at a time), as it is a huge CPU killer. If the frames jump, then it's a CPU bottleneck. Also, you can enable the SLI indicator. To explain this, I will quote an answer from "Neospiral", a user who helped me in another thread:
"These days nVidia's control panel makes setting up SLI basically a one step process. Seat and adequately power both cards, make sure the SLI bridge is securely connected, and connect your monitor(s). Note that you can plug them into any port on either of the cards. I usually keep all my screens connected to the primary card to keep it simple. To enable SLI, open nVidia control panel. Under Configure SLI, Surround, PhysX, select Maximize 3D performance. Done. The power supply in that rig is plenty and then some for SLI.
First thing to do is run GeForce experience. It will detect the games installed, read your hardware, and determine the optimal settings for your games.
Next, back to nVidia control panel, in the 3D Settings menu up top, check to Enable SLI indicator. This will enable an overlay in games that shows you your rough GPU scaling (i.e., how much of your 2nd GPU is being utilized). The bar should appear mostly full during gameplay. If it doesn't, you might want to fiddle with your CPU's clock speed some and do some research online about CPU/GPU timing in SLI. nVidia's website has a great guide explaining it in easy terms.
Otherwise, that's basically it. The only thing I noticed about your setup there is that you didn't mention your cooling solution. While SLI doesn't directly affect your CPU's temperature, having two GPU's can increase ambient temperature some in most cases. Make sure you've got good air flow in the case and plenty of space around for ambient dissipation and you're in good shape.
Good luck and have fun!"
And here is the entire thread if you want to take a look at it:
http://www.tomshardware.com/answers/id-1723607/sli-tips...
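The per-core point above is easy to see with a little arithmetic: one saturated core (the game's main thread) nearly disappears in the overall average. A minimal sketch; the per-core loads are hypothetical, not measured from this system:

```python
# Why aggregate CPU usage can hide a bottleneck: one pegged core
# (e.g. a game's main thread) averages out across all six cores.
# Per-core loads below are hypothetical, for illustration only.
per_core = [98, 35, 20, 12, 8, 5]   # core 0 is saturated

aggregate = sum(per_core) / len(per_core)
print(f"aggregate usage: {aggregate:.0f}%")   # reads ~30% -- looks fine
print(f"busiest core:    {max(per_core)}%")   # but the main thread is maxed out
```

This is why "nowhere close to the limit of my CPU" can still be a CPU bottleneck: check per-core usage, not the overall figure.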
"These days nVidia's control panel makes setting up SLI basically a one step process. Seat and adequately power both cards, make sure the SLI bridge is securely connected, and connect your monitor(s). Note that you can plug them into any port on either of the cards. I usually keep all my screens connected to the primary card to keep it simple. To enable SLI, open nVidia control panel. Under Configure SLI, Surround, PhysX, select Maximize 3D performance. Done. The power supply in that rig is plenty and then some for SLI.
First thing to do is run GeForce experience. It will detect the games installed, read your hardware, and determine the optimal settings for your games.
Next, back to nVidia control panel, in the 3D Settings menu up top, check to Enable SLI indicator. This will enable an overlay in games that shows you your rough GPU scaling (i.e., how much of your 2nd GPU is being utilized). The bar should appear mostly full during gameplay. If it doesn't, you might want to fiddle with your CPU's clock speed some and do some research online about CPU/GPU timing in SLI. nVidia's website has a great guide explaining it in easy terms.
Otherwise, that's basically it. The only thing I noticed about your setup there is that you didn't mention your cooling solution. While SLI doesn't directly affect your CPU's temperature, having two GPU's can increase ambient temperature some in most cases. Make sure you've got good air flow in the case and plenty of space around for ambient dissipation and you're in good shape.
Good luck and have fun!"
And here is the entire thread if you want to take a look at it:
http://www.tomshardware.com/answers/id-1723607/sli-tips...
Relentlesstroll
September 6, 2013 2:17:49 PM
expl0itfinder said:
CPU usage is not a good indicator either. Since Windows averages across all 6 cores, the cores that are not being utilized make overall CPU usage show up as low. It sounds like a good ole CPU bottleneck to me. Try turning down the mesh quality graphics setting in BF3 (it determines how far away enemies can be spotted, and how many at a time), as it is a huge CPU killer. If the frames jump, then it's a CPU bottleneck. Also, you can enable the SLI indicator. To explain this, I will quote an answer from "Neospiral", a user who helped me in another thread:
"These days nVidia's control panel makes setting up SLI basically a one step process. Seat and adequately power both cards, make sure the SLI bridge is securely connected, and connect your monitor(s). Note that you can plug them into any port on either of the cards. I usually keep all my screens connected to the primary card to keep it simple. To enable SLI, open nVidia control panel. Under Configure SLI, Surround, PhysX, select Maximize 3D performance. Done. The power supply in that rig is plenty and then some for SLI.
First thing to do is run GeForce experience. It will detect the games installed, read your hardware, and determine the optimal settings for your games.
Next, back to nVidia control panel, in the 3D Settings menu up top, check to Enable SLI indicator. This will enable an overlay in games that shows you your rough GPU scaling (i.e., how much of your 2nd GPU is being utilized). The bar should appear mostly full during gameplay. If it doesn't, you might want to fiddle with your CPU's clock speed some and do some research online about CPU/GPU timing in SLI. nVidia's website has a great guide explaining it in easy terms.
Otherwise, that's basically it. The only thing I noticed about your setup there is that you didn't mention your cooling solution. While SLI doesn't directly affect your CPU's temperature, having two GPU's can increase ambient temperature some in most cases. Make sure you've got good air flow in the case and plenty of space around for ambient dissipation and you're in good shape.
Good luck and have fun!"
And here is the entire thread if you want to take a look at it:
http://www.tomshardware.com/answers/id-1723607/sli-tips...
Thanks for all the info! I appreciate it immensely. Just FYI, I'm running 4 120mm fans, one of which is mounted on the side panel, so there's always fresh air blowing on the cards. They're all on a fan controller, which I turn way up when playing demanding games. And my CPU is cooled using a Coolermaster Seidon 120M. (I noticed a 17 degree temp decrease with liquid!!) And I did a MAJOR overhaul of my cable management only an hour ago. So the cooling should be good.
Relentlesstroll
September 6, 2013 3:05:10 PM
expl0itfinder said:
CPU usage is not a good indicator either. Since Windows averages across all 6 cores, the cores that are not being utilized make overall CPU usage show up as low. It sounds like a good ole CPU bottleneck to me. Try turning down the mesh quality graphics setting in BF3 (it determines how far away enemies can be spotted, and how many at a time), as it is a huge CPU killer. If the frames jump, then it's a CPU bottleneck. Also, you can enable the SLI indicator. To explain this, I will quote an answer from "Neospiral", a user who helped me in another thread:
"These days nVidia's control panel makes setting up SLI basically a one step process. Seat and adequately power both cards, make sure the SLI bridge is securely connected, and connect your monitor(s). Note that you can plug them into any port on either of the cards. I usually keep all my screens connected to the primary card to keep it simple. To enable SLI, open nVidia control panel. Under Configure SLI, Surround, PhysX, select Maximize 3D performance. Done. The power supply in that rig is plenty and then some for SLI.
First thing to do is run GeForce experience. It will detect the games installed, read your hardware, and determine the optimal settings for your games.
Next, back to nVidia control panel, in the 3D Settings menu up top, check to Enable SLI indicator. This will enable an overlay in games that shows you your rough GPU scaling (i.e., how much of your 2nd GPU is being utilized). The bar should appear mostly full during gameplay. If it doesn't, you might want to fiddle with your CPU's clock speed some and do some research online about CPU/GPU timing in SLI. nVidia's website has a great guide explaining it in easy terms.
Otherwise, that's basically it. The only thing I noticed about your setup there is that you didn't mention your cooling solution. While SLI doesn't directly affect your CPU's temperature, having two GPU's can increase ambient temperature some in most cases. Make sure you've got good air flow in the case and plenty of space around for ambient dissipation and you're in good shape.
Good luck and have fun!"
And here is the entire thread if you want to take a look at it:
http://www.tomshardware.com/answers/id-1723607/sli-tips...
Sir, I believe you have found the problem. I turned my mesh to high and was getting around 55 avg. Turned it to low, and jumped to 75. So yeah, any recommendations on good CPUs? I was looking at the AMD FX-8350.
expl0itfinder
September 6, 2013 3:48:57 PM
The 8320 has awesome bang for the buck. Also, since you have water cooling, you will likely be able to get it to outperform the 8350 with minimal overclocking effort. Also, did you try turning on the SLI scaling indicator? (Explained in my previous post.) It will let you know for sure if SLI is scaling properly. We do not want to drop cash on a new CPU just to find out it wasn't the issue after all.
Relentlesstroll
September 6, 2013 3:57:45 PM
expl0itfinder said:
The 8320 has awesome bang for the buck. Also, since you have water cooling, you will likely be able to get it to outperform the 8350 with minimal overclocking effort. Also, did you try turning on the SLI scaling indicator? (Explained in my previous post.) It will let you know for sure if SLI is scaling properly. We do not want to drop cash on a new CPU just to find out it wasn't the issue after all.
I just turned it on for BF3. It said SLI with a green box next to it (I assume that means it's good to go). There was also a bar on the left side that was only filled about 1/3 of the way with green, and it was fluctuating. What does this mean?
expl0itfinder
September 6, 2013 4:07:33 PM
expl0itfinder
September 6, 2013 4:09:18 PM
This is a good representation of what happens in a CPU bottleneck:
http://www.geforce.com/sites/default/files-editorial/at...
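That bottleneck can also be sketched numerically: with alternate frame rendering, the frame rate is set by the slowest pipeline stage, so once the CPU can't prepare frames fast enough, the second GPU just sits waiting. A toy model, not how the driver actually schedules frames; all per-frame timings below are hypothetical:

```python
# Toy frame-time model: the pipeline runs at the pace of its slowest stage.
# All per-frame timings are hypothetical, chosen only to illustrate the effect.
def fps(cpu_ms, gpu_ms, num_gpus):
    # With alternate frame rendering, the GPUs split the frames between them.
    effective_gpu_ms = gpu_ms / num_gpus
    return 1000 / max(cpu_ms, effective_gpu_ms)

# GPU-bound scene: a second card nearly doubles the frame rate.
print(fps(cpu_ms=8, gpu_ms=18, num_gpus=1))    # ~55.6 fps
print(fps(cpu_ms=8, gpu_ms=18, num_gpus=2))    # ~111.1 fps

# CPU-bound scene (heavy mesh/spotting work): a second card changes nothing.
print(fps(cpu_ms=18, gpu_ms=18, num_gpus=1))   # ~55.6 fps
print(fps(cpu_ms=18, gpu_ms=18, num_gpus=2))   # ~55.6 fps
```

This matches the thread: Heaven (GPU-bound) scales ~75%, while BF3 with high mesh quality (CPU-bound) shows no gain and only ~45-50% GPU usage.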