Bad performance on good hardware - usage problems?

ThatAfricanDude

Reputable
Nov 3, 2014
There are about a million different threads about bad performance and low component usage, but I can't find anything that helps me out, so maybe someone here can.

Specs:

Ryzen 5 1500X (no OC)
EVGA GTX 980 (bought used, +125 MHz OC)
16 GB Corsair RAM (2133 MHz)
750 W XFX 80+ Gold PSU
SSD + HDD

So I don't waste your time:

I live in a hot climate, and even with the minor OC the GPU rarely hits 70°C during games (the CPU is always in the 60°C range; Ryzen stock coolers are lit), so I doubt thermals are the issue.

I have multiple tweaks going in the NVIDIA Control Panel (yes, power management is set to Prefer Maximum Performance, and I have Adaptive VSync on, if that's relevant).

Now my issue: I can easily hit 60 fps, but it's been hit or miss. Sometimes I turn on my PC and get a constant 60 fps; other times it fluctuates between 50 and 60 fps. Note I only play at 1080p.

The thing is, during these dips even lowering graphics settings makes NO DIFFERENCE AT ALL. Seriously, it's weird: Fallout 4 with everything on ultra, 50 fps; same game with everything turned down, still 50 fps.

World of Tanks too: maximum graphics, I get 50 fps on some maps; turn it waaay down, still 50 fps.

Now, what's odd to me is that whenever I check my hardware in Afterburner, temps are way below my set limits, but usage is only hovering around 50% for both the CPU and GPU. Why does this happen? I have low fps, so why don't the components increase their usage for better, more stable fps? Maybe this isn't the actual issue, but I feel it might be related: whenever I watch benchmark videos of even weaker hardware, like a GTX 970, CPU and GPU usage are almost always above 80%, and they achieve the same if not better performance.

I'm not really complaining; my performance is alright. But with VSync turned on (my TV shows a ton of screen tearing without it), the dips are quite annoying, and I also feel I'm not getting the most efficiency out of my hardware.

Does anybody have any idea what's going on? Thanks.
 
Sounds like you have set an fps limit somehow, maybe in RivaTuner if you use Afterburner, maybe in the game profile. That would explain the 50 fps cap. Adaptive sync only works if you have a capable monitor with either G-Sync or FreeSync (VESA Adaptive-Sync is the basis of FreeSync).
 

ThatAfricanDude

The dip is down to 50, and even now, during my last run of Fallout 4, into the 40s in open areas; no stable 60. Alright, so if adaptive doesn't work, what option should I choose? Because without some sort of VSync enabled my games are a tearing mess.


Also, I just tested something: disabling VSync pushed GPU usage up to 80%, but my fps was very unstable, fluctuating from 120 fps down to 48 during my Fallout 4 run. I have already clean-installed the drivers, but if you ask me, NVIDIA is starting to gimp the 900 series, because 980s never used to perform this poorly.

EDIT: This is my first gaming rig, barely 2 months old, and so far everything with NVIDIA has been hit or miss. The only reason I got this card in the first place is the shortage of any other midrange GPU thanks to miners.
 
Depending on what you are doing, especially in a video game with different areas, a lot of movement, shadows, etc., fps will change based on what is being rendered.

Unless you are exactly recreating a scene, comparing one session of a game to another a few hours later won't give you the same numbers...

This makes troubleshooting FPS issues pretty difficult.

I would monitor your temps very closely, not just when the dips happen. If your system is thermal throttling, the CPU or GPU will downclock to compensate for the heat; the heat will drop, and the fps will drop with it. Then, when temps come back down, the CPU or GPU will return to normal clocks, reach the heat limit again, and throttle again.

Start your PC from a fresh boot and right away use a program like HWMonitor to watch those temps. Start a game and keep monitoring; see what max temperature each device reaches and go from there (there's a rough logging sketch at the end of this post if you'd rather have a file to look back through).

For these tests I would also remove your overclocks. Set things back to factory defaults and see if it is more stable. Even a small OC can cause these kinds of issues if something like voltage is off.
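And here's the logging sketch I mentioned: a rough Python script, assuming the NVIDIA driver's nvidia-smi tool is on your PATH (it ships with the driver); the gpu_log.csv file name is just something I picked for this example. It records temperature, current clocks, usage, power draw, and the driver's own throttle-reasons flags once a second:

```python
# Rough logger: polls nvidia-smi once a second and appends rows to a CSV,
# so you can line up temperature/clock drops with the moments fps dips.
# Assumes nvidia-smi is on PATH; "gpu_log.csv" is an arbitrary name.
import subprocess
import time

QUERY = ("timestamp,temperature.gpu,clocks.sm,utilization.gpu,"
         "power.draw,clocks_throttle_reasons.active")

with open("gpu_log.csv", "w") as log:
    log.write(QUERY + "\n")
    while True:
        row = subprocess.run(
            ["nvidia-smi", "--query-gpu=" + QUERY, "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        log.write(row + "\n")
        log.flush()  # keep the file current even if the script dies mid-game
        time.sleep(1)
```

If clocks.sm sags and the throttle-reasons column changes right when the fps drops, throttling is the culprit; if the clocks hold steady through a dip, the bottleneck is somewhere else.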
 
Solution
I think there's some confusion over "Adaptive Sync" and "Adaptive Refresh." Adaptive Sync is essentially Vsync when you're producing as many fps as your monitor's refresh rate, and then uncapped below that. Adaptive Refresh is the name given to technologies like G-Sync and Freesync. If Adaptive Sync is what you were using, and it worked for you, by all means keep it on. But double check all your settings including monitor refresh rate, and the in-game as well as driver settings. A 50 fps cap doesn't sound right.

And when changing graphical settings doesn't affect framerate, it usually means it's not the GPU but some other component that's causing issues. Could be anything from a single faulty RAM chip to a flaky hard drive to some weird system configuration.
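To put rough numbers on the VSync side of this, here's a quick back-of-envelope sketch, assuming a 60 Hz display and textbook double-buffered VSync (real drivers buffer in messier ways, so treat this as an illustration rather than gospel):

```python
import math

# Back-of-envelope frame pacing on an assumed 60 Hz display.
REFRESH_HZ = 60
refresh_interval_ms = 1000 / REFRESH_HZ  # ~16.7 ms per refresh

for render_ms in (14, 18, 20):
    # Plain double-buffered VSync: a finished frame waits for the next
    # screen refresh, so the rate snaps to 60, 30, 20... fps.
    refreshes_waited = math.ceil(render_ms / refresh_interval_ms)
    vsync_fps = REFRESH_HZ / refreshes_waited
    # Adaptive VSync: sync switches off below 60 fps, so you simply get
    # 1000 / render_ms, capped at the refresh rate above it.
    adaptive_fps = min(REFRESH_HZ, 1000 / render_ms)
    print(f"{render_ms} ms/frame -> plain VSync {vsync_fps:.0f} fps, "
          f"Adaptive VSync {adaptive_fps:.1f} fps")
```

Note how an 18-20 ms frame snaps to 30 fps under plain VSync but floats at 50-55 fps with Adaptive VSync, which fits the steady 50 you're seeing, and the 120-48 swings once sync is off entirely.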
 

ThatAfricanDude

Yeah, Ryzen really benefits from higher-speed RAM, but at the time I bought, DDR4 prices were insane and I couldn't afford better. I honestly think it is the RAM; I'll buy a higher-clocked kit when I have the money.


Interestingly, when I tried to play Minecraft my fps sucked, like a 45 fps average, which is really weird; CPU and GPU usage was only in the 10% region. Can someone explain this? Is there any correlation between this and the low fps in other games?