Trouble with GTX 660 SLI, 4770K, and Tomb Raider

minerva330

Honorable
Dec 27, 2013
449
0
10,960
So I upgraded my system with another GTX 660 and a 4770K (see sig below) and everything has been going great.

I haven't had a lot of time, but I have been playing about an hour or so a night, mostly Batman: AO, and I get about 100 fps. My GPU and CPU temps under load have been in the 60s and mid-40s respectively.

Last night I decided to play Tomb Raider, and while everything was going fine (averaging 100 fps), about 45 minutes in the fps started to stutter. I took a look at my sensors, and while my temps were fine, my total CPU usage (across all threads) was at 100%. I tried to scroll down to look at my GPU, but my PC locked up and I had to do a hard reset.

I ran 20 passes of x264 overnight to completely rule out any CPU issues and it came back clean with no errors. When I get home from work today I will run a GPU stress test (OCCT/3DMark, or more likely just the TR benchmark).
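Since the spike only showed up about 45 minutes in, one option is to log usage for the whole session and look for sustained spikes afterwards rather than watching the sensors live. A minimal sketch, assuming a hardware-monitor log (e.g. MSI Afterburner's logging feature) exported to a simple `time,cpu` CSV — the column layout and sample data here are hypothetical:

```python
import csv
import io

def find_sustained_spikes(rows, threshold=95.0, min_samples=3):
    """Return (start, end) sample indices where CPU usage stays
    at or above `threshold` for at least `min_samples` consecutive rows."""
    spikes, start = [], None
    for i, (_, usage) in enumerate(rows):
        if usage >= threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_samples:
                spikes.append((start, i - 1))
            start = None
    # Handle a spike that runs through the end of the log
    if start is not None and len(rows) - start >= min_samples:
        spikes.append((start, len(rows) - 1))
    return spikes

# Hypothetical log excerpt: one sample per second, CPU usage in percent.
log = io.StringIO(
    "time,cpu\n"
    "0,35.2\n1,38.9\n2,97.1\n3,99.5\n4,100.0\n5,100.0\n6,42.0\n"
)
rows = [(r["time"], float(r["cpu"])) for r in csv.DictReader(log)]
print(find_sustained_spikes(rows))  # one spike: samples 2 through 5
```

That would at least pin down when the spike started relative to what was happening in-game, without having to catch it on screen before a lockup.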

I have never had this type of issue before, so I was hoping someone may have insight into what is going on and how to properly troubleshoot it. To me it sounds like my CPU is bottlenecking my GPUs, but I have a 4770K; granted, I haven't OCed yet, but come on... it's two 660s, and I haven't had any issues with any other games (although I haven't had a chance to play since the incident; I will do so tonight).


I am gaming at 1080p and running Win 7 64-bit.

P.S. A Google search did turn up some issues with CPU usage, SLI, and TR, but they were driver-related and new drivers have been released since then (something about turning down Lara's hair effects). All my settings are automated through GeForce Experience, save the random tweak here and there.
 
Solution
It is correct for the cards to be at 100%. If they were not at 100%, either you can easily run the game or there is a problem. Example: you are playing Skyrim V-synced to 60 fps. The cards are WAY overkill for that game, so it will only use around 40-50% of each card. But if you were playing Crysis across three monitors, the cards would probably not be able to max the game at 60 fps, so you would see 100% GPU usage.
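The rough arithmetic behind that Skyrim example can be sketched like this. It's a toy model, not a real measurement — GPU load roughly tracks how much of each frame interval the card actually needs to spend rendering:

```python
def expected_gpu_usage(uncapped_fps, fps_cap):
    """Rough GPU utilization when a frame cap (e.g. V-sync) limits a card
    that could otherwise render `uncapped_fps` frames per second."""
    return min(1.0, fps_cap / uncapped_fps)

# A card that could do 150 fps, V-synced to 60 fps, sits around 40% load:
print(f"{expected_gpu_usage(150, 60):.0%}")  # 40%
# A card that can only manage 45 fps against a 60 fps target is pegged:
print(f"{expected_gpu_usage(45, 60):.0%}")  # 100%
```

So 100% usage in an uncapped benchmark just means the card is the limiting factor, which is exactly what you want from the most expensive part in the box.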

TressFX is TERRIBLE and even VERY high-end computers cannot run it.
I run 660 Tis in SLI on my old i5 and have no problems, so the CPU is not the issue.

GeForce Experience is junk. I always ditch it and set everything up by hand.

What is your CPU temp under a 100% load burn test like Prime95?

Also, was anything else running? I have had some usage spikes from time to time as well, but they didn't lock the computer up.

What PSU do you have?

Also, TR is not the best coded game by a LONG shot.
 

minerva330



The stress test I use is x264; it's an encoding test. I ran it for 20 passes (basically overnight) and the highest temp my CPU got was 47°C.

Maybe my browser and like 4 or 5 start-up programs, e.g., MSI Afterburner, Trend AV, etc.

My PSU is a Corsair TX850M.

Yeah, I have to put on my big boy pants and start doing everything manually versus relying on GFE (recently converted console gamer). Basically I am running TR on Ultra (vs. Ultimate) with AA set to FXAA.
 

minerva330

Alright, update: I just ran Unigine Heaven at max settings for an hour and then the TR benchmark for another hour... no issues.

My temps for both my GPUs and my CPU were fine. Also, my CPU usage only got to 38.9%, which is more like what I am used to.

However, both of my 660s hit 100% of their power and usage for the duration of both tests, but this is normal... right? (I ask because I am reading a lot of conflicting evidence, but nothing from a reliable source, just forums. Although from what I have read on TH it seems to be perfectly fine and actually preferred, so I am inclined to believe that.)

I think it may have been the graphics settings in TR; previously TressFX was on max and I dropped it down to Normal. It seems as though it really is poorly coded.
 