GTX 580 SLI Question

I recently picked up a second EVGA GTX 580 and here is my current setup...

i7 970 Stock
Asus P6X58D-E
2X EVGA GTX 580 stock clocks
Corsair XMS3 6GB triple channel, CL7 @ 1600MHz
2x Vertex 30GB RAID 0
X-Fi SB Titanium
Corsair TX-750.......will upgrade to Corsair HX-1000
Antec Twelve Hundred Case....

Every game runs perfectly... Crysis is finally playable at 4x AA, Enthusiast settings @ 1080p, and I can leave V-sync on with minimums around 50... Loving it
Also Metro 2033 @ 1080p with 4x AA, DOF enabled, and PhysX gets me around 45 FPS average; DOF off, AAA, and no PhysX gives me 75 FPS average
Heaven 2.5 benchmarks at default settings give me an average of 105 FPS.....

Now note that I am not having issues with my PSU, and I will definitely upgrade to a kilowatt unit anyway; everything is stock... no overclocks on anything

Games run perfectly, obviously, but 3DMark 11 is where I'm having a few issues
I am on the beta 270.51 drivers....
Running the Performance preset works just fine and I get over a 10K score...
but on Extreme... I get 3440....
and while running Extreme, it seems like there is stuttering...
Is this what they call microstuttering? It's not smooth, as the frames seem to jump from 20 to 22, back to 20, then back and forth... every second or so...
so the gameplay just looks unbearable... it lags, then gets smooth, then goes back to lagging...
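As a side note, FPS bouncing between 20 and 22 is only a small change in average frame time, so the visible lag is usually about uneven frame pacing rather than the average rate. A rough sketch of how you could quantify that from a list of per-frame times (the 30% jump threshold is my own arbitrary assumption, not an official definition of microstutter):

```python
# Sketch: estimate stutter from a list of frame times in milliseconds.
# The 30% jump threshold between consecutive frames is an assumption
# for illustration, not an official definition of microstutter.
def stutter_ratio(frame_times_ms, jump=0.30):
    """Fraction of consecutive frame pairs whose times differ by more than `jump`."""
    jumps = 0
    for prev, cur in zip(frame_times_ms, frame_times_ms[1:]):
        if abs(cur - prev) / prev > jump:
            jumps += 1
    return jumps / (len(frame_times_ms) - 1)

smooth = [45, 46, 45, 47, 45, 46]   # even pacing (~22 FPS, feels smooth)
uneven = [30, 70, 30, 70, 30, 70]   # similar average, badly paced
print(stutter_ratio(smooth))  # 0.0
print(stutter_ratio(uneven))  # 1.0
```

Two runs can report nearly the same average FPS while one of them alternates short and long frames, which is exactly the "lags, then smooth, then lags" feel.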

I checked the temps and they were different, as expected; note that both cards are right next to each other, and the SLI bridge doesn't allow for wider spacing, so I would need to purchase a longer one...
The top card reached 88°C, and the bottom one was 81°C under load...

Could the temps be why I'm experiencing these slowdowns? It only happens in the Extreme 3DMark 11 test...
Or could it be my PSU not providing enough power?

Anyone else with the same graphics card experiencing such issues?
  1. Those temperatures are about right, even good, with an SLI setup. Microstuttering is generally associated with ATI/AMD cards, particularly the 5970, and I have never heard of it with Nvidia SLI.

    I believe your problem is basically just an issue with 3DMark 11 itself, and possibly the drivers you are using or the Control Panel settings. Nvidia cards are noticeably slower in 3DMark 11, as opposed to 3DMark Vantage, where they are notably faster. Your PSU wattage is at the low end of the minimum recommendation, but since it's a high-quality unit, I'm leaning more towards the software (3DMark 11) and the drivers. Just keep trying different games and benchmarks to see if the problem occurs in other situations.

    Then be sure to check the Nvidia Control Panel program settings for 3D Mark 11. Try to keep everything at Default/Application Controlled. Make sure you are using "Single Display Performance Mode" and Adaptive Power Management. Finally, some of the more advanced enhancements sometimes cause problems with certain games, prime among these are Ambient Occlusion, Gamma Antialiasing, and Transparency Antialiasing. Try disabling these, especially Ambient Occlusion.

    By the way, I am using the 270.51's and they are working really well with my single card. Good luck, and don't worry about it if it's only happening in 3D Mark 11.
  2. I agree, the temps are good. I've seen microstuttering in 3DMark runs before, but never in-game. Low frame rates will cause the screen to stutter, though, and sometimes turning on V-sync fixes it. I wouldn't worry about it unless you start seeing it in games.
  3. You're all right; it's just major stuttering in 3DMark, which is annoying... None of the games slow down, and thankfully there is no mouse lag with V-sync on in any games so far, like Crysis... Everything is amazing, with no tearing at max graphics....... Might still upgrade the PSU to an HX-1000 and overclock the CPU...
  4. Yeah, I'd be careful, since an i7 and two GTX 580s will consume around 850W running FurMark. Gaming is something like 650-700W.

    May I suggest an alternative PSU? One of the Enermax Revolutions would be awesome. There are 920W and 1020W models; both have four 35A rails and scream awesome.
  5. The Nvidia SLI site shows no PSUs under 900 watts. The TX series being 2nd tier isn't helping.

    You can diagnose GPU/power-related problems with OCCT. Run the CPU/GPU tests, monitoring the results for the first 5 minutes. After that you can find something else to do for the next 55 minutes... OCCT will graph the results over the 60-minute test. Look at the temperature and voltage graphs to see how much they vary from the reference 3.3, 5.0, and 12.0 volts.

    The ATX specification allows 5% deviation.

    Anything with a 9.5 to 10.0 rating on jonnyGURU should eliminate any PSU issues.

    I consider:

    < 3% acceptable for office PCs
    < 2% acceptable for enthusiast PCs
    < 1% acceptable for overclocked enthusiast PCs
  6. Didn't get my new PSU yet, but fixed the issue. Seemed to be related to the 'Nvidia Power Management' set to "Adaptive." I heard this controls fan speeds and after setting it to maximum performance, it fixed the slow downs....
  7. chris13002 said:
    Didn't get my new PSU yet, but fixed the issue. Seemed to be related to the 'Nvidia Power Management' set to "Adaptive." I heard this controls fan speeds and after setting it to maximum performance, it fixed the slow downs....

    Thanks for the note, which prompted me to do some research on the Power Management settings. Just some info for those doing troubleshooting, maybe look at switching to "Prefer Maximum Performance":

    "Power Management Mode: Available only for the GeForce 9 series and above, this feature makes use of these graphics cards' abilities to support different performance levels depending on how much power is required by a 3D application. The available options here are Adaptive and 'Prefer Maximum Performance'. Adaptive is the default, and when chosen the graphics card automatically steps down in clock speed in 3D applications if they are not drawing much GPU power. Adaptive is the recommended setting for all users because it ensures that the GPU steps down its clock speed and hence power usage when it is not required. In 3D gaming the Adaptive setting should not cause any problems, as the GPU will always run at full speed when required without interruption. Typically only very old games and very basic 3D applications may see the graphics card reduce its power, and even then this may be desirable. However if you are concerned that a game is not performing properly, particularly for troubleshooting purposes, then you can change this setting to 'Prefer Maximum Performance' to ensure that the card is always running at maximum clock speed. Remember that this setting only relates to 3D applications and games, not to the Windows Desktop for example."