Getting Lower FPS in SLI than I was with Single GPU

Hello,

I recently decided to upgrade my system to SLI as my GPU does struggle with some games at higher resolutions.

Initially the machine had an Nvidia GTX 280 1GB (and has done for 4 years now). I got another from eBay to run SLI, hoping to be able to run games at a higher resolution with an acceptable frame rate.

I read several guides to setting up SLI on various sites, then installed the new card and enabled SLI.

I have since run Racedriver GRID, and it runs painfully slowly. Even on the menu screen there is a lag between key presses and things happening. If I switch the SLI mode to Single GPU (or disable SLI) it runs fine, though with everything at full it pushes the GPU to 90-100%.

In WoW I also don't seem to get any improvement in FPS; in fact, in some cases it seems to be worse.

I am now wondering if I have set something up wrong, and have come for some advice, having spent several days unplugging and replugging things and changing settings.

I'm not sure what you will want to know, so here are some basics:

Asus P5N-T Deluxe Mobo
Intel Core 2 Quad Q6600 @ 3.0GHz
4 x 2GB 800MHz DDR2 RAM
2 x Nvidia GTX 280 1GB
880W Hiper PSU

Dual 24" 1900x1200 Monitors on DVI-D

The GPUs are connected with an SLI bridge.

Thanks in advance if anyone can help me figure this out and improve my performance.
  1. Apparently WOW does not support SLI or CrossFire. I got around 50 FPS on ultra at 1920x1080 with a 6990, which is basically two 6970's in CrossFire. It was terrible performance. I am not familiar with Racedriver GRID but I suspect it is the same issue. Try running 3DMark 11 or Heaven 3.0. These will tell you if SLI is working or not.

    http://www.3dmark.com/3dmark11/download/

    http://unigine.com/products/heaven/
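    If you want a quick way to see whether the second card is actually doing anything in a given game, here is a minimal sketch of my own (not something from this thread) using Python with the pynvml bindings; it assumes a driver and card that expose NVML utilisation counters, and GPU-Z or MSI Afterburner's monitoring graphs will show the same information without any scripting.

```python
# Sample per-GPU load and temperature for ~10 seconds while a game is running.
# Assumes the nvidia-ml-py package (pynvml) and an NVML-capable driver.
import time
import pynvml

pynvml.nvmlInit()
try:
    count = pynvml.nvmlDeviceGetCount()
    for _ in range(10):
        for i in range(count):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            name = pynvml.nvmlDeviceGetName(handle)
            if isinstance(name, bytes):  # older pynvml versions return bytes
                name = name.decode()
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)
            temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            print(f"GPU{i} {name}: {util.gpu:3d}% load, {temp} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

    If the second GPU sits near 0% while the game is running, SLI simply isn't engaging for that title.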
  2. Sounds like your CPU is bottlenecking it massively; try running a single card. Those cards are power hungry and your PSU is also on the weak side.
  3. alrobichaud said:
    Apparently WOW does not support SLI or CrossFire. I got around 50 FPS on ultra at 1920x1080 with a 6990, which is basically two 6970's in CrossFire. It was terrible performance. I am not familiar with Racedriver GRID but I suspect it is the same issue. Try running 3DMark 11 or Heaven 3.0. These will tell you if SLI is working or not.

    http://www.3dmark.com/3dmark11/download/

    http://unigine.com/products/heaven/


    My WoW FPS increased dramatically after SLI, back when I used to play.
  4. Overclock your CPU further, up to 3.6GHz; it will make a good difference. But I'm not sure this will kill the bottleneck.
  5. Helltech said:
    My WoW FPS increased dramatically after SLI, back when I used to play.



    Really? I activated my old account with a one-week free coupon, mainly to do some benchmarking as another forum member was having issues with WOW, and I found that my 6990 was running with both GPUs at around 15%. Another forum member mentioned that SLI and CrossFire support is almost non-existent for WOW. I don't know about SLI from experience, but CrossFire does not work very well.
  6. Helltech said:
    My WoW FPS increased dramatically after SLI, back when I used to play.

    I haven't played past WotLK, but up to that point SLI and CF were not officially supported. As a result, some cards did not get good scaling at all, while others still managed great scaling. Back when I was playing, the 4870x2 and 5870's were getting great performance in CF, while just about every other card was not.

    Unless things have changed, you must just have an Nvidia card that gets good SLI scaling, while many others do not.
  7. alrobichaud said:
    Really? I activated my old account with a one-week free coupon, mainly to do some benchmarking as another forum member was having issues with WOW, and I found that my 6990 was running with both GPUs at around 15%. Another forum member mentioned that SLI and CrossFire support is almost non-existent for WOW. I don't know about SLI from experience, but CrossFire does not work very well.


    I was aware of the CF problems with WoW, but that was a long time ago; I'm surprised they haven't fixed it by now. I recall many upset people with the 5970 and WoW. However, SLI definitely helped me; it went from lagging in raids to smooth gameplay.

    Also, I would like to note that Tom's Hardware has tested SLI and WoW and has seen dramatic improvements at 8xAA. Looks to be working to me.

  8. Hi all,
    Thank you for the replies.

    Quote:
    Try running 3DMark 11 or Heaven 3.0. These will tell you if SLI is working or not.


    I attempted to run 3DMark and was confronted with an error about not having DX11 support on the GTX 280. I also looked at the second option, and that requires DX11 hardware too.

    Quote:
    Sounds like your CPU is bottlenecking it massively; try running a single card. Those cards are power hungry and your PSU is also on the weak side.


    Maybe I am missing something, but the CPU copes fine if I run the game in single GPU mode, so why would there be such a drastic change when SLI is active?

    Also, when I only had the one GPU I could run most games on full settings at full resolution, and when it struggled I would just turn down the resolution a notch and get perfectly good FPS. From what I understand, that means the CPU is fine and the bottleneck was at the GPU? Hence getting a second card for SLI.

    I used a PSU calculator to check that my PSU would cope with SLI before the purchase. Yes, the cards are power hungry, but it should be able to cope, no? (See the rough numbers at the end of this post.)

    Quote:
    Overclock your CPU further, up to 3.6GHz; it will make a good difference. But I'm not sure this will kill the bottleneck.


    I'm wary of pushing the CPU any further. It has been running clocked from 2.4 to 3.0GHz for around 4 years now, and I don't know how much life is left in it, as it has done many 24/7 sessions in its time; it is used for things other than gaming, which I can't afford to lose by melting the CPU or something silly like that.

    Dean
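    P.S. For what it's worth, here is a rough back-of-the-envelope version of that PSU check (illustrative only; the TDP figures below are approximate board-power numbers, not measured draw for this system):

```python
# Rough power-budget sanity check with approximate TDP figures.
GTX_280_TDP_W = 236       # NVIDIA's quoted board power per GTX 280
CPU_TDP_W = 105           # Core 2 Quad Q6600, roughly; an overclock adds more
REST_OF_SYSTEM_W = 120    # generous allowance for board, RAM, drives and fans

total_w = 2 * GTX_280_TDP_W + CPU_TDP_W + REST_OF_SYSTEM_W
print(f"Estimated worst-case draw: ~{total_w} W against an 880 W PSU")
```

    On those numbers the 880W unit should have plenty of headroom, assuming it can actually deliver its rating and has the required PCI-e connectors.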
  9. Are you using the latest driver?
    I think I am, as I had to reinstall Windows last weekend and downloaded the driver straight from the website instead of hunting for the discs.

    The Driver I installed is:
    296.10-desktop-win7-winvista-64bit-international-whql
    http://www.nvidia.co.uk/object/win7-winvista-64bit-296.10-whql-driver-uk.html

    Driver Version: 8.17.12.9610 (from device manager)
  11. Sorry. I guess I, and anyone who read my post, missed the fact that the GTX 280 is a DX10 card. It must be one of the last GTX cards released before DX11. Anyway, you can always try this:


    http://www.3dmark.com/3dmarkvantage/
  12. Helltech said:
    I was aware of the CF problems with WoW, but that was a long time ago; I'm surprised they haven't fixed it by now. I recall many upset people with the 5970 and WoW. However, SLI definitely helped me; it went from lagging in raids to smooth gameplay.

    Also, I would like to note that Tom's Hardware has tested SLI and WoW and has seen dramatic improvements at 8xAA. Looks to be working to me.

    http://media.bestofmicro.com/Q/T/331445/original/wow 2560.png



    Interesting. I wonder what version of CCC they used. It was a few months ago that I tried WOW.
  13. The results from 3D Mark:

    3DMark Score: 14364 3DMarks
    Graphics Score: 17207
    CPU Score: 9604
    Jane Nash: 48.01 FPS
    New Calico: 52.88 FPS
    AI Test: 1294 operations/s
    Physics Test: 13 operations/s

    Graphics Card NVIDIA GeForce GTX 280
    Vendor NVidia Corporation
    # of cards 2
    SLI / CrossFire On
    Memory 1024 MB

    So it seems SLI is working fine and giving me a nice performance boost in the benchmark.

    But how do I transfer this to the real world and my games? Is it a complex setup of specific options for each game to get noticeable benefits? That seems like a lot of hassle.
    Or have I just set my SLI up wrong in some general way?
    Or are we saying the games I have tried so far just don't scale well in SLI?


    On a side note, I was reading that SLI only benefits one monitor? How do I tell which monitor it is? I have been guessing so far and assuming it would be the primary monitor. But you never know; my screens are the same, so they come up with the same name in the control panel.

    Also, one final thing: one of my GPUs is running a lot hotter at idle than the other. Currently:
    GPU1 Temp: 59C
    GPU2 Temp: 70C

    It just seems like a large difference to be caused just by position in the case?


    Dean
  14. You should run the test with one card active and then with both so you can compare. These numbers really don't transfer to the real world; they will just show you whether SLI is an improvement or not. Your score probably won't double with 2 cards vs 1 card, but it should be a lot higher (a quick example of that comparison is below).

    More than likely, the games that you have tried just don't scale well with your GPUs, or maybe it is the drivers... I don't know. Like I said, two or three months ago my 6990 performance was terrible in WOW, and Helltech posted a pic from a review that contradicts what I said. Drivers tend to be a funny thing sometimes.

    The top card will run a bit hotter due to the heat rising from the bottom card. I have also found that even on a dual-GPU card like the 6990, one GPU is always hotter than the other when they really should be the same. My current crossfire 7970's are separated by a full double sized pci-e slot with a 230MM side mount fan blowing down on them both providing lots of cool air, and the top card is always roughly 5 degrees hotter under full load.
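    As a rough illustration of that comparison (my own example: the SLI graphics score is the one reported above, the single-card score is made up purely to show the calculation):

```python
# Compare a single-card run against an SLI run of the same benchmark.
def sli_scaling(single_score: float, sli_score: float) -> float:
    """Return the gain of SLI over one card as a percentage (100% = perfect doubling)."""
    return (sli_score / single_score - 1.0) * 100.0

# 17207 is the SLI graphics score posted above; 10000 is a hypothetical
# single-card score used only for illustration.
print(f"{sli_scaling(10000, 17207):.0f}% gain over a single card")
```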
  15. Should I physically take out one card for the single-card test, or is it enough to disable SLI?

    Also, I read that if one of the cards has a faster clock it should be in the top slot; is that true?
    One is 602MHz and one is 640MHz. Or would it be better to just OC the other to 640? I guess this disables adaptive power-management-type systems that rely on reducing clock speed, though?

    Quote:
    My current crossfire 7970's are separated by a full double sized pci-e slot with a 230MM side mount fan blowing down on them both providing lots of cool air


    This is why I was a bit worried by the difference, as I have the same setup, just with a 200mm fan rather than a 230mm.
    Dean
  16. It doesn't matter which card goes in which slot. You can always reverse them and see if the temperatures are the same; that way you can be sure there is nothing wrong with either card. I suspect the top card will be hotter again. There really is no reason why one GPU on a dual-GPU card should run hotter than the other, but it does, so I would not be too concerned about the top card being hotter in your setup.

    If you want, you can use MSI Afterburner to set both clock speeds to the same thing, either 602 or 640, or just overclock them both a bit if you like. Honestly, I wasn't aware you could SLI two Nvidia cards that were not identical, such as yours with different GPU speeds. If they work like AMD then they will default to the slowest card's speed.

    You can also use Afterburner to set a custom fan profile if you want your cards to run a bit cooler. I find the default settings on any card tend to let the GPU get a bit hotter in favour of a slower and quieter fan. I am not sure how adaptive power management works with those cards, but I can tell you that it still works with AMD cards even if you use Afterburner to set the GPU speed.
  17. Hi, I will have a look at swapping them and checking the temperatures, and the custom fan profile sounds interesting. I am not too bothered about how loud the machine is; it already has 3 x 200mm and 4 x 120mm case fans, plus 3 spinning-disk HDDs, so it's not exactly quiet. I am more bothered about running temps and the longevity of the hardware.
    I might have a look around for a guide to overclocking the cards. At the moment the fans run at about 15%, so I guess there's a bit of headroom before anything melts, though at 100% these cards are very noisy.

    Dean
  18. Overclocking is really straightforward. Increase the GPU and memory clocks, then run a benchmark such as 3DMark Vantage. If your system does not crash, you are probably safe. If you are already getting higher FPS than the refresh rate of your monitor, I wouldn't bother overclocking.

    Is that 15% at load? I find my GPU fans only become audible once they reach 50%, and that is with my case fans turned all the way down. If your fans are only running at 15%, that could be why your GPUs are getting so hot. I have a custom curve set up where at idle the GPU fan sits at 40% and curves upwards until at 70 degrees my fans are at 60%, and at 80 degrees I am at 70% (a sketch of that kind of curve is below). Running two 7970's overclocked, my temps peak at around 70 with my GPU fans at 60%. For everyday use 60% is a bit loud, but if I am playing BF3 I typically have the volume loud enough to drown out the case and GPU fans.

    Keep in mind my 7970's have XFX's DD coolers, which work better than the reference design, and the new 7xxx cards are designed to use less power and run cooler than previous AMD cards. Under full load your GPUs run at 85 degrees according to this Guru3d review. That is typical of previous-generation AMD and Nvidia cards; the new 7xxx and 6xxx series consume less power and produce less heat.

    http://www.guru3d.com/article/geforce-gtx-280-review-test/1
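    For what it's worth, here is a rough sketch of the kind of curve described above (my own illustration, not Afterburner's actual configuration format; the 40C idle point and the 95C/100% endpoint are assumptions):

```python
# Piecewise-linear fan curve: (GPU temperature in C, fan speed in %).
FAN_CURVE = [(40, 40), (70, 60), (80, 70), (95, 100)]

def fan_percent(temp_c: float) -> float:
    """Interpolate the fan speed for a given temperature, clamping at the ends."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    if temp_c >= FAN_CURVE[-1][0]:
        return FAN_CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]

for t in (45, 59, 70, 75, 80):
    print(f"{t} C -> {fan_percent(t):.0f}% fan")
```

    In Afterburner itself you would just drag the equivalent points onto its user-defined fan curve.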
  19. alrobichaud said:
    It doesn't matter which card goes in which slot. You can always reverse them and see if the temperatures are the same; that way you can be sure there is nothing wrong with either card. I suspect the top card will be hotter again. There really is no reason why one GPU on a dual-GPU card should run hotter than the other, but it does, so I would not be too concerned about the top card being hotter in your setup.

    If you want, you can use MSI Afterburner to set both clock speeds to the same thing, either 602 or 640, or just overclock them both a bit if you like. Honestly, I wasn't aware you could SLI two Nvidia cards that were not identical, such as yours with different GPU speeds. If they work like AMD then they will default to the slowest card's speed.

    You can also use Afterburner to set a custom fan profile if you want your cards to run a bit cooler. I find the default settings on any card tend to let the GPU get a bit hotter in favour of a slower and quieter fan. I am not sure how adaptive power management works with those cards, but I can tell you that it still works with AMD cards even if you use Afterburner to set the GPU speed.


    The top card is almost always hotter due to having less air flow.

    From my experience, AMD cards do no such thing. They do not sync up their speeds on their own. You can see it as clear as day if you OC from within CCC: it allows you to adjust each card individually, and if you use GPU-Z, it will show you they are running at different clocks.

    MSI Afterburner, on the other hand, tries to sync up their clocks and does not give you control of each card separately. However, I found my last set of AMD cards still would not sync regardless.
  20. I was making reference to a dual-GPU card always showing one GPU hotter than the other, even though they should really be the same. I have been told by many on this forum that when you CrossFire AMD GPUs, such as a 6950 with a 6970, everything defaults to operating at the slowest speed on a software level. It did make sense, even though GPU-Z shows you the default clock speeds. Otherwise, how could two GPUs with different core clocks, memory clocks and numbers of stream processors work without any problems? It would be nice to see some concrete data on this topic.
  21. I have just been having a play with Crysis to see what that shows in terms of performance gains, and the FPS wasn't changing at all between SLI and Single GPU mode. So I did a bit of searching and came across this:
    http://www.tomshardware.co.uk/charts/gaming-graphics-charts-2008-q3/compare,740.html?prod%5B2060%5D=on&prod%5B2059%5D=on
    If that link works, that is.
    It pretty much seems to show this card in SLI performing at a lower FPS than a single card in almost all test cases, at all resolutions and quality settings...
    How can this be so?
    And what sort of benefit am I going to get from SLI?
  22. alrobichaud said:
    I was making reference to a dual-GPU card always showing one GPU hotter than the other, even though they should really be the same. I have been told by many on this forum that when you CrossFire AMD GPUs, such as a 6950 with a 6970, everything defaults to operating at the slowest speed on a software level. It did make sense, even though GPU-Z shows you the default clock speeds. Otherwise, how could two GPUs with different core clocks, memory clocks and numbers of stream processors work without any problems? It would be nice to see some concrete data on this topic.


    Yes, I've seen people say that a lot, but it isn't true. You can monitor the actual running clocks on the monitor tab in GPU-Z, or even in MSI Afterburner, and the cards will in fact run at different clocks. It works just fine because they use alternate frame rendering: one card renders a frame, then the other card renders the next frame. It's possible this causes frames to be rendered at slightly different speeds, but that is what happens according to the monitoring software I've looked at while testing.
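    If it helps to picture it, here is a toy sketch (my own, heavily simplified) of how AFR hands frames out; each card just takes every other frame, so the clocks never need to match:

```python
# Toy illustration of alternate frame rendering with two GPUs.
def afr_schedule(num_frames: int, num_gpus: int = 2):
    """Yield (frame_index, gpu_index) pairs the way AFR deals frames out."""
    for frame in range(num_frames):
        yield frame, frame % num_gpus

for frame, gpu in afr_schedule(6):
    print(f"frame {frame} -> GPU{gpu}")
```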
  23. dean8020 said:
    I have just been having a play with Crysis to see what that shows in terms of performance gains, and the FPS wasn't changing at all between SLI and Single GPU mode. So I did a bit of searching and came across this:
    http://www.tomshardware.co.uk/charts/gaming-graphics-charts-2008-q3/compare,740.html?prod%5B2060%5D=on&prod%5B2059%5D=on
    If that link works, that is.
    It pretty much seems to show this card in SLI performing at a lower FPS than a single card in almost all test cases, at all resolutions and quality settings...
    How can this be so?
    And what sort of benefit am I going to get from SLI?


    There are a couple of factors to consider here. At lower resolutions, a single card can often max out the performance available from the CPU, and since SLI/CF takes some CPU overhead to run, it can slow things down. You are also looking at a 285, which doesn't scale nearly as well as the newer cards.

    Getting a monster graphics card setup is not always a good idea if you are not using a monitor setup that will need it. Most 1080p monitors are 60Hz, meaning they can only display up to 60 frames per second. Going beyond 60 FPS will not make any visible difference other than screen tearing.

    Let's say you have a 120Hz monitor and can see up to 120 FPS. If you are using a slow CPU, like a low-clocked Phenom II or an older Intel Core 2 Duo, the CPU may not be able to keep up with the GPUs, causing a bottleneck and not allowing the GPUs to show improvement in SLI.
  24. bystander said:
    There are a couple of factors to consider here. At lower resolutions, a single card can often max out the performance available from the CPU, and since SLI/CF takes some CPU overhead to run, it can slow things down. You are also looking at a 285, which doesn't scale nearly as well as the newer cards.

    Getting a monster graphics card setup is not always a good idea if you are not using a monitor setup that will need it. Most 1080p monitors are 60Hz, meaning they can only display up to 60 frames per second. Going beyond 60 FPS will not make any visible difference other than screen tearing.

    Let's say you have a 120Hz monitor and can see up to 120 FPS. If you are using a slow CPU, like a low-clocked Phenom II or an older Intel Core 2 Duo, the CPU may not be able to keep up with the GPUs, causing a bottleneck and not allowing the GPUs to show improvement in SLI.



    The issue I was having was that I was getting around 20-25 FPS at 1920x1200 in some games, so the aim of SLI was to try to bring that a bit closer to the 60Hz refresh. 20 FPS is OK and doesn't really lag, but the moment it drops slightly for any reason you notice it.

    I am planning a big upgrade pretty soon anyway, and will hopefully have an ok amount to spend, but I was hoping that going SLI would mean I didn't need a new GPU straight away and could spend the lot on CPU, mobo and RAM.

    You say it may still be a CPU bottleneck, so there's a chance it would do the trick while I saved up again.

    Also, do you think running two 1920x1200 monitors from it will be affecting anything?
    I won't really need the second monitor for anything in a couple of months' time.
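    For reference, the rough arithmetic behind trying to get from those figures to the 60Hz refresh (simple numbers, nothing measured):

```python
# Frame-time budgets and the speed-up needed to reach 60 FPS from ~20-25 FPS.
def frame_time_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

for fps in (20, 25, 60):
    print(f"{fps:2d} FPS -> {frame_time_ms(fps):4.1f} ms per frame")

# Going from ~20-25 FPS to 60 FPS needs roughly a 2.4-3x speed-up, which is
# more than 2-way SLI can deliver even with very good scaling.
for base in (20, 25):
    print(f"{base} FPS -> 60 FPS needs a {60 / base:.1f}x speed-up")
```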
  25. I don't honestly know what type of performance to expect on the game you are having issues with, but 20-25 FPS does seem lower than it should be.
  26. bystander said:
    Yes, I've seen people say that a lot, but it isn't true. You can monitor the actual running clocks on the monitor tab in GPU-Z, or even in MSI Afterburner, and the cards will in fact run at different clocks. It works just fine because they use alternate frame rendering: one card renders a frame, then the other card renders the next frame. It's possible this causes frames to be rendered at slightly different speeds, but that is what happens according to the monitoring software I've looked at while testing.



    It is quite funny if you do a Google search on that. Every response to the question gives the same answer, that the faster card automatically scales down to the slower card's speed, but not one post that I read on any forum was able to back that claim up with actual facts.
  27. alrobichaud said:
    It is quite funny if you do a Google search on that. Every response to the question gives the same answer, that the faster card automatically scales down to the slower card's speed, but not one post that I read on any forum was able to back that claim up with actual facts.


    It's an urban legend, I guess. Seriously, if you have a crossfire setup, you can test it yourself.

    I've tested it with 6950's, with one that has a higher stock OC. I've tested it with a 6950 and a flashed one with 6970 clocks. The clocks never actually change.
  28. I've done that already with a 6990 and 6970 with the same results as you in gpu-z. I figured the masses had to be correct and somehow the faster card would downclock to the slower card. Must be an urban myth.
  29. alrobichaud said:
    I've done that already with a 6990 and 6970 with the same results as you in gpu-z. I figured the masses had to be correct and somehow the faster card would downclock to the slower card. Must be an urban myth.


    It's good to see someone else noticed what I've seen. I think the masses have a tendency to make assumptions, and aren't always right. Not many really test things for themselves, so it's easy to see false info spread.
  30. Sorry to hijack your thread, Dean. Just read this 5 minutes ago. See, I am not totally nuts.

    http://www.tomshardware.com/forum/353578-33-super-overclock-series-question


    Back to your problem, Dean. The second monitor will not affect your FPS. I am running 4 monitors and I had the same question, so I ran Heaven 2.5 with one monitor and then again with all 4 monitors active, and my results were as close as close can get; less than 1% difference. I don't know if I believe the CPU is bottlenecking you. You have 4 cores running at 3GHz, which should be plenty to keep up with two GTX 280's.

    The link you provided looked really dismal for SLI results, but Guru3d did a 2-way and 3-way SLI review and they provided slightly different results.

    http://www.guru3d.com/article/geforce-gtx-280-sli-triple-review-test/1
  31. This is what I kept thinking about the CPU. I know it's getting on a bit now, but it's not exactly sluggish, and the cores max out at around 50% usage when in a game.
    I think I may just have a combination of cards that don't scale particularly well and games that don't either, meaning I'm not really seeing the improvement yet. My results don't seem to be too far below the benchmarks in my last link for Crysis.

    With the fans set to a custom profile in Afterburner, things are a lot happier in the temperature department (for my liking anyway): 47C and 57C at idle, and under load they both hit around 80C together and stay there.

    I will try installing something a bit newer later and see if I can spot a difference.
  32. I have just had a look at the link you posted too, and that does look much more promising. I guess I have a habit of jumping straight in and turning everything right up, when there may be a better compromise to reach for better gameplay...
  33. Honestly, even the guru3d review was not great for sli compared to newer cards. Heck, even moving a bit forward in time, the gtx 460's scale much better than those 280's. Perhaps it is time for a gpu upgrade. One last thing I can think of is to use afterburner to force both cards to the same gpu clock and try again. Maybe the slight difference is affecting your performance.
  34. alrobichaud said:
    Honestly, even the guru3d review was not great for sli compared to newer cards. Heck, even moving a bit forward in time, the gtx 460's scale much better than those 280's. Perhaps it is time for a gpu upgrade. One last thing I can think of is to use afterburner to force both cards to the same gpu clock and try again. Maybe the slight difference is affecting your performance.


    OK, so say I decide to upgrade the graphics (I know there are lots of threads about this next question already, but I will ask anyway and see what your opinion is):
    Do you reckon it is better to get something like the new GTX 670, or to get two of the GTX 5xx series and SLI those?
    Or would you say a Radeon, and possibly CrossFire, is the way to go?
  35. Since this is a new GPU purchase, you should always get the single best card you can afford. Do not get two lesser-quality cards to SLI or CrossFire, or you will end up with issues just like the one you are having now. As well as SLI and CrossFire work on new cards, you will still find there are some minor issues with drivers and certain games. The GTX 670 looks to be the best buy at the moment; it should max out just about any game at your resolution. Anything more would be overkill. I have two 7970's, but I typically game at 6048x1080 on ultra settings whenever I can.