SLI problems

January 1, 2012 8:56:27 PM

I'm getting bad scaling in most games after adding a second EVGA GTX 560 Ti Superclocked. In games like COD4, Portal 2, BFBC2, and Mafia II, my performance is great; I get almost perfect scaling in those. However, in games like L4D and BF3, I get very low usage of each card, and my framerate either does not improve (L4D) or drops considerably (BF3). I understand BF3 has had problems with SLI, but L4D? Across the board, however, I'm not getting consistent usage. In COD4, my usage sometimes drops to between 60% and 80%, with my FPS going down to about 190.

I'm running on the latest drivers. Specs below:

AMD Phenom II X4 975 @ 3.6 GHz
EVGA GTX 560 Ti Superclocked x2
8 GB DDR3-1600
750 W PSU
Gigabyte 990FXA-UD3 mobo
1920x1080 resolution

January 2, 2012 2:08:50 AM

At 19x10 rez, I don't see what you're complaining about. 190 FPS! Please...
January 2, 2012 4:09:32 AM

AMD Phenom II X4 975 @ 3.6 GHz, EVGA GTX 560 Ti SLI, at 1920x1080, getting 190 FPS? I'm sorry, but I'm throwing the BS flag on this one.
January 2, 2012 5:52:03 AM

^+1 No tears should be shed here.
January 2, 2012 8:08:51 AM

At those framerates you'll run into other bottlenecks besides the GPU.
January 2, 2012 8:22:41 AM

since your gfx power goes unused on load, the bottleneck might be somewhere else. anyway, did you update all the drivers and game patches?
imo the phenom cpu is the prime suspect for the bottleneck.
next could be the pcie lanes on the second slot... is it x4 or x8?
use a program like msi afterburner/gpu-z for gfx card monitoring and cpu-z/hwmonitor/speccy for cpu and system monitoring. if your cpu keeps hitting 100% while the gfx cards go underused, the cpu might be the bottleneck.
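if you'd rather log it over time than eyeball the overlays, a rough python sketch like this works too (assuming nvidia-smi is on the PATH and the third-party psutil package is installed -- those are just example tools, not the only way to do it):

import subprocess

import psutil  # third-party package: pip install psutil


def gpu_utilisation():
    """Ask nvidia-smi for per-GPU utilisation (one integer percentage per card)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [int(line) for line in out.strip().splitlines()]


while True:
    cpu = psutil.cpu_percent(interval=1.0)  # average CPU load over the last second
    gpus = gpu_utilisation()
    print("cpu %5.1f%%  |  " % cpu
          + "  ".join("gpu%d %3d%%" % (i, u) for i, u in enumerate(gpus)))

if the cpu column keeps pegging near 100% while the gpu columns sag, that points at a cpu bottleneck; if everything sags together, it looks more like a driver or sli-profile issue.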
January 2, 2012 10:26:20 AM

190fps, are you trolling?
January 2, 2012 5:20:38 PM

I'm not trolling. I would have no problem with a steady 190 fps. The problem is, I'm getting 250 fps, and it will randomly drop to 190 and then come right back up again. My problem isn't with the framerate, it's with the cards' usage randomly dropping and picking up again.

And the board is 16x/16x.

The CPU shouldn't be bottlenecking. I rarely get higher than 80% CPU usage in game. I'm pretty sure it's a software problem.
January 2, 2012 6:48:57 PM

Dude, you can't tell the difference between 190 fps and 250 fps. It doesn't matter; as long as you're getting above 60 fps your game is perfectly fine.
January 2, 2012 11:07:43 PM

amuffin said:
Dude, you can't tell the difference between 190 fps and 250 fps. It doesn't matter; as long as you're getting above 60 fps your game is perfectly fine.


Are you dense? I repeat: "In BF3, I get very low usage of each card...and my framerate drops considerably."

I have no problem with 190-250 fps. My problem is with the USAGE. The drop between 250 and 190 shows that there is a problem with the USAGE. This problem with the USAGE is causing problems across the board in all games.
January 3, 2012 12:20:33 AM

Which exact power supply?
January 3, 2012 12:27:03 AM

Bottleneck from the CPU.
OC it or get a better one.

BF3 isn't just a graphics-intensive game, it also uses the CPU a lot, so I think the drop in usage is caused by the CPU.

Cheer up, people are here to help you. It's not helping your case any if you're calling people dense and getting angry at the people helping you.
January 3, 2012 12:55:36 AM

Ferinthul said:
Bottleneck from the CPU.
OC it or get a better one.

BF3 isn't just a graphics-intensive game, it also uses the CPU a lot, so I think the drop in usage is caused by the CPU.

Cheer up, people are here to help you. It's not helping your case any if you're calling people dense and getting angry at the people helping you.


I could see that if I had a problem with one card. The CPU isn't being overworked by the game. I get 40 fps less with SLI enabled than I do with one card. Also, I never get the CPU to full load, so I can't imagine it's a bottleneck.
January 3, 2012 12:56:14 AM

amuffin said:
Which exact power supply?


It's an HEC Zephyr PSU. Not a big brand, but it has solid reviews and is SLI certified.
January 3, 2012 1:13:19 AM

The 560 Ti requires 30 amps to run, and your PSU has exactly 30 amps on its 12 V rails, so it's probably because you are maxing out your PSU.
January 3, 2012 2:07:52 AM

amuffin said:
The 560 Ti requires 30 amps to run, and your PSU has exactly 30 amps on its 12 V rails, so it's probably because you are maxing out your PSU.

No it doesn't.
January 3, 2012 7:25:56 PM

Yeah, it's not a hardware problem. I ran the Unigine Heaven benchmark, and I got a solid 99% usage on each card, with the expected scaling. The system can definitely make use of the cards. It has to be something in the SLI configuration.
January 6, 2012 1:46:43 AM

Just what is the concern you seem to have with this "usage" thing? SLI doesn't scale perfectly with many games, if that is what you are referring to. I for one don't see any issue as long as you are getting good/excellent framerates. If you are experiencing stuttering while still getting good framerates (and blaming it on usage), however, that's sometimes common with 2 cards.
January 6, 2012 1:49:31 AM

amuffin said:
The 560 Ti requires 30 amps to run, and your PSU has exactly 30 amps on its 12 V rails, so it's probably because you are maxing out your PSU.


30 amps x 12 V = 360 W. 360 W can actually feed a stock GTX 580 and a dedicated PhysX card quite nicely.
January 6, 2012 1:57:49 AM

nforce4max said:
30 amps x 12 V = 360 W. 360 W can actually feed a stock GTX 580 and a dedicated PhysX card quite nicely.

With enough left over to boil a kettle and microwave a beef and onion pie! :lol: 
January 6, 2012 2:12:11 AM

Mousemonkey said:
With enough left over to boil a kettle and microwave a beef and onion pie! :lol: 


Na I'll pass, killed a turkey a few hours ago and cooked it. Turned out pretty good ;) 
January 6, 2012 2:15:14 AM

nforce4max said:
Na I'll pass, killed a turkey a few hours ago and cooked it. Turned out pretty good ;) 

Don't get many of them walking around the area I live in.
January 6, 2012 2:27:41 AM

math... i don't like math....
lessee... a stock gtx 560 ti can use roughly 15 amps, higher if overclocked.
sli would use roughly 30 amps at stock, more if overclocked.
i think i saw a guru3d article where a gtx 560 ti used like 196w on load. this one uses over 200w.
i am assuming that amuffin meant the overall ampere requirement. i could be wrong.
@OP: the fps drop seems like a software problem. i read in the following article that bf3 is fps-locked by the frostbite engine at 200 fps.
http://vr-zone.com/articles/amd-radeon-hd-7970-quad-cro...
so the fps drop you are experiencing might be from the fps throttling.
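fwiw, here's that back-of-the-envelope maths as a tiny python snippet (the 170w figure is just a ballpark stock-load number i'm assuming for illustration, not a measurement of anyone's card in this thread):

RAIL_VOLTS = 12.0


def amps_from_watts(watts):
    """watts = volts x amps, so amps drawn from the 12 V rail = watts / 12."""
    return watts / RAIL_VOLTS


card_watts = 170.0  # assumed ballpark load for a single stock gtx 560 ti
print("one card : ~%.1f A" % amps_from_watts(card_watts))        # ~14.2 A
print("sli pair : ~%.1f A" % amps_from_watts(2 * card_watts))    # ~28.3 A
print("a 30 A 12 V rail gives %.0f W to work with" % (30 * RAIL_VOLTS))  # 360 W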
January 6, 2012 2:37:07 AM

de5_Roy said:
math... i don't like math....
lessee... a stock gtx 560 ti can use roughly 15 amps, higher if overclocked.
sli would use roughly 30 amps at stock, more if overclocked.
i think i saw a guru3d article where a gtx 560 ti used like 196w on load. this one uses over 200w.
i am assuming that amuffin meant the overall ampere requirement. i could be wrong.
@OP: the fps drop seems like a software problem. i read in the following article that bf3 is fps-locked by the frostbite engine at 200 fps.
http://vr-zone.com/articles/amd-radeon-hd-7970-quad-cro...
so the fps drop you are experiencing might be from the fps throttling.



How is 188w over 200w? :heink: 
January 6, 2012 2:55:06 AM

ugh.. this is why i don't like math. i always ... well..
okay. how about this review? i am not trying to argue with you, i am just trying to make sense of the numbers. tpu says that the (maximum) power was measured for the card, not calculated like guru3d. however, i'd take real life measurements over benchmarks any day. afaik reviewer sites usually use clean pcs to test hardware.
January 6, 2012 3:00:55 AM

de5_Roy said:
ugh.. this is why i don't like math. i always ... well..
okay. how about this review? i am not trying to argue with you, i am just trying to make sense of the numbers. tpu says that the (maximum) power was measured for the card, not calculated like guru3d. however, i'd take real life measurements over benchmarks any day. afaik reviewer sites usually use clean pcs to test hardware.

The line :-
Quote:
Just like the GTX 580 and GTX 570, the GTX 560 comes with a current limiter system which reduces clocks and performance in case the card senses it is overloaded by stress testing applications. We disabled this feature for our "Maximum"
is enough to throw that result out of the window. I have not disabled that feature on my cards, so they are not using that much power.
January 6, 2012 3:06:36 AM

Mousemonkey said:
The line :-
Quote:
Just like the GTX 580 and GTX 570, the GTX 560 comes with a current limiter system which reduces clocks and performance in case the card senses it is overloaded by stress testing applications. We disabled this feature for our "Maximum"
is enough to throw that result out of the window. I have not disabled that feature on my cards, so they are not using that much power.

ah. makes sense.
i should have noticed that in the review.
thanks mousemonkey. :) 
January 6, 2012 3:14:43 AM

de5_Roy said:
ah. makes sense.
i should have noticed that in the review.
thanks mousemonkey. :) 

No worries, I had another poster throw that exact same link at me earlier in their attempt to prove that a GTX560Ti used 225w! :lol: 
January 6, 2012 6:54:35 AM

Omg, not 30 amps. My bad, guys :) 
January 6, 2012 7:08:44 PM

If you want to increase your GPU usage, upgrade your CPU.