Hi all, I recently built a new system and have a question about graphics performance. I'm currently playing Rift (an MMORPG), and the game looks beautiful on Ultra settings. I get 40-50 fps in crowded areas like the main city and 60-70 or higher elsewhere in the game.
However, after playing for a few hours, it seems that my fps decreases. I go down to high 20's-30's in the crowded areas and 30's-40's in other areas. After I noticed this, I rebooted my system and I was back up to the great framerates I mentioned above.
Is this due to high CPU/GPU temperatures under load for so long? (I don't have a system monitor to tell me the temps, but I can get one if there's something recommended; I'm hoping my question can be answered generically.) Or is it possible that Rift has a memory leak or something? (I've heard this term related to other games I've played but never quite understood what it means, so I'm just throwing it out there.)
Below are my system specs. Note that I am currently using the stock CPU cooler. When I get brave enough to try OC'ing I will upgrade to something better.
Cooler Master HAF X Case
EVGA Superclocked GTX 570
i5-2500k (stock cooler)
MSI P67A-GD65 (B3)
G.Skill Ripjaws X Series DDR3 1600 (CL8)
Literally just built the system this past weekend and Rift is the only game I have on it right now, so I'm not able to check performance in other games currently.
One other bit of information: I built two identical systems, one for myself and one for my wife. Her monitor is better than mine and she plays at 1920x1080; I play at 1680x1050. We've both experienced the same decrease in fps after a while.
In general can high temps lead to loss of framerate? And does running out of VRAM decrease the framerate over time or would that be overall and constant?
High temps will lead to what's known as thermal throttling. This basically means the hardware (let's say, for example, your GTX 570) will decrease its speed when it reaches a potentially dangerous temperature; it does this to save itself from cooking. Lowering the clock speed and voltage lowers the power it uses, which in turn lowers the heat output.
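To illustrate the idea (this is just a toy model, not how any real driver or firmware is implemented, and all the thresholds here are made-up numbers), throttling boils down to: step the clock down when a danger temperature is crossed, and step it back up once things cool off. Lower clock, lower framerate.

```python
# Toy model of thermal throttling. Thresholds and step sizes are
# invented for illustration; real GPUs use firmware-controlled
# voltage/frequency tables, not a simple loop like this.

THROTTLE_TEMP_C = 97   # hypothetical "too hot" threshold
RECOVER_TEMP_C = 85    # hypothetical "cool enough" threshold
CLOCK_STEP_MHZ = 50
MAX_CLOCK_MHZ = 732    # stock GTX 570 core clock
MIN_CLOCK_MHZ = 405

def adjust_clock(clock_mhz, temp_c):
    """Step the core clock down when too hot, back up once cooled."""
    if temp_c >= THROTTLE_TEMP_C:
        return max(MIN_CLOCK_MHZ, clock_mhz - CLOCK_STEP_MHZ)
    if temp_c <= RECOVER_TEMP_C:
        return min(MAX_CLOCK_MHZ, clock_mhz + CLOCK_STEP_MHZ)
    return clock_mhz  # in-between temps: hold the current clock

# Example: sustained load pushes temps up, and the clock steps down
# in response; once temps recover, the clock climbs back.
clock = MAX_CLOCK_MHZ
for temp in [70, 90, 98, 99, 98, 84, 80]:
    clock = adjust_clock(clock, temp)
    print(temp, clock)
```

The key point for the framerate question: once the card is hot enough to throttle, it stays throttled as long as the load keeps it hot, which matches "fps drops after a few hours, reboot (cooldown) fixes it".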
When a GPU runs out of VRAM, the textures still have to be stored somewhere, so it can spill them to system RAM or the HDD. Both of those options are a lot slower than the RAM on your GTX 570; for example, my GTX 570's memory runs at 4200 MHz (effective), but my system RAM runs at roughly 1300 MHz. This texture swapping may cause slowdowns.
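To put rough numbers on that gap: peak bandwidth is the effective data rate times the bus width. The clocks below are the ones quoted above; the bus widths are my assumptions (320-bit GDDR5 on a reference GTX 570, 2 x 64-bit for dual-channel DDR3), so treat this as a ballpark sketch, not a benchmark.

```python
# Rough peak-bandwidth comparison: GTX 570 GDDR5 vs dual-channel DDR3.
# Clocks are the effective data rates mentioned above; bus widths
# (320-bit and 128-bit) are assumptions for illustration.

def bandwidth_gb_s(effective_mhz, bus_bits):
    """Peak bandwidth in GB/s: transfers per second * bytes per transfer."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

vram_gb_s = bandwidth_gb_s(4200, 320)   # GTX 570 GDDR5
sys_gb_s = bandwidth_gb_s(1300, 128)    # dual-channel DDR3

print(round(vram_gb_s), "GB/s VRAM vs", round(sys_gb_s), "GB/s system RAM")
print("ratio:", round(vram_gb_s / sys_gb_s, 1))  # roughly 8x
```

So even under generous assumptions, spilling textures to system RAM costs you most of your memory bandwidth, and spilling to the HDD is orders of magnitude worse again.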
It's a little frustrating, tbh; I figured I could run pedal-to-the-metal at Ultra with this system. I'm assuming it depends on the game, though, and maybe I have to back off the settings for Rift in particular. Don't get me wrong, it looks outstanding; maybe I just didn't have a realistic sense of what framerates to expect.
I'm going to try playing on lower settings and see if I experience the same degradation over time and also try to monitor my temps.
CPU (Value, Min, Max)
Core 0 - 38 C, 31 C, 52 C
Core 1 - 32 C, 29 C, 50 C
Core 2 - 31 C, 29 C, 51 C
Core 3 - 36 C, 33 C, 52 C
GPU (Value, Min, Max)
44 C, 31 C, 68 C
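For catching a slow temperature creep over a long session, a spot check like the table above isn't enough; logging temps over time is more useful. A small script can do it; this sketch assumes an NVIDIA card with nvidia-smi available on the PATH (it ships with the driver), and any monitoring tool that can dump readings would work just as well.

```python
# Periodically log the GPU temperature using nvidia-smi.
# Assumes nvidia-smi is on the PATH (installed with the NVIDIA driver).
import subprocess
import time

def read_gpu_temp_c(raw=None):
    """Return the GPU temperature in degrees C.

    If `raw` is given, parse it directly (handy for testing);
    otherwise query nvidia-smi for the first GPU.
    """
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader"],
            text=True)
    return int(raw.strip().splitlines()[0])

# Example usage while playing (Ctrl+C to stop); left commented out
# so the snippet doesn't loop forever when pasted:
#
# while True:
#     print(time.strftime("%H:%M:%S"), read_gpu_temp_c(), "C")
#     time.sleep(60)
```

If the logged temps stay in the ranges above (max 52 C CPU, 68 C GPU) even after hours of play, thermal throttling is unlikely to be the cause of the fps decline.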
Interestingly, changing the graphics settings around a lot in game had only slight effects on framerates overall. Kind of strange, but it seems to be common with Rift; as mentioned above, the game isn't optimized very well, and even high-end systems get pretty mediocre framerates.