So I'm not new to building PCs. I grew up on them. But at the dawn of the last console generation, with my wallet running thin, I jumped to what I saw (possibly incorrectly) as a cheaper form of gaming. When that generation wound down a few years back, I jumped off the console bandwagon, got back into PC gaming, and haven't looked back... until now... and I don't want to. So here is my setup, and then I'll describe my problem.
Windows 8.1 64-bit
ASRock Extreme4 motherboard
Intel i5-3570K
8 GB RAM @ 1866 MHz
Gigabyte GTX 770 4 GB edition
120 GB SSD
1 TB HDD
1920x1080 monitor
So my issue: I'm well above the system requirements for every game I own except Watch Dogs (but who can play that right now anyway?), and given that, I would expect consistent frame rates. Yet games like Tomb Raider, Crysis 2, Far Cry 3, etc. dip in FPS in moderately demanding scenes and above, whether I use Ultra, High, or Medium settings. The drops are smaller at lower settings, but still noticeable: sometimes from 60 down to 50, or to 40, or even to 30.

Now, I don't mind capping a game at 30 if that's absolutely necessary for a consistent frame rate, but doesn't that kind of defeat the purpose of having a beefier gaming rig? Is it normal for games to do this even when you exceed the recommended requirements? I have cleaned and updated drivers, reinstalled Windows, monitored temps, and so on to see if anything is funky, but to no avail. Any help would be great.

Am I expecting too much? I keep reading about people playing the same games on lesser rigs who claim they hold a consistent 60 FPS, but it always seems I have to sacrifice a lot of graphical fidelity to do so, so much so that I've considered buying one of the newer consoles just to stop worrying about it. Thanks in advance for any help you may have.
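For what it's worth, here's roughly the kind of logging I've been doing to watch temps and clocks while I play, in case anyone wants to see the numbers next to the frame dips. It's just a rough Python sketch, and it assumes nvidia-smi (which ships with the NVIDIA driver) is reachable from the command line; the gpu_log.csv file name is just what I picked.

import csv
import subprocess
import time

# Poll nvidia-smi once a second and append GPU temp, clock speed, load, and
# power draw to a CSV, so frame dips can be lined up with what the card was
# doing at that moment.
QUERY = "timestamp,temperature.gpu,clocks.sm,utilization.gpu,power.draw"

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(QUERY.split(","))
    while True:
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        writer.writerow([field.strip() for field in out.split(",")])
        f.flush()  # keep the file current even if the script is killed mid-game
        time.sleep(1)

If the card is sitting at its full boost clock and reasonable temps while the FPS drops, I figure that at least rules out thermal throttling on my end.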