So I'm not new to building PCs; I grew up on them. But at the dawn of last gen's consoles, and the fall of my wallet, I jumped to what I saw (possibly incorrectly) as a cheaper form of gaming. When that gen ended a few years back, I jumped off the console bandwagon to get back into PC gaming and haven't looked back... until now... and I don't want to. So here is my setup, and then I'll explain my problem.
So my issue is that I'm well above the system requirements for every game except Watch Dogs (but who can play that right now anyway?), so I would expect consistent frame rates. Games like Tomb Raider, Crysis 2, Far Cry 3, etc. dip in fps in moderately demanding scenes and above, whether I use Ultra, High, or Medium settings. There are fewer drops at lower settings, but they're still noticeable: sometimes from 60 down to 50, or to 40, or even to 30. Now, I don't mind capping a game at 30 if that's what it takes to keep a consistent frame rate, but doesn't that kind of defeat the purpose of having a beefier gaming rig? Is it normal for games to do this even when you exceed the recommended requirements?

I have cleaned and updated drivers, reinstalled Windows, monitored temps, etc. to see if anything is funky, but to no avail. Any help would be great. Am I expecting too much? I keep reading about people playing the same games on lesser rigs boasting that they can keep a consistent 60 fps, but it always seems I have to sacrifice a lot of graphical fidelity to do so. So much so that I have considered purchasing the newer consoles to stop worrying about it. Thanks in advance for any help you may have.
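One way to put those dips in perspective is to look at frame times instead of fps. A minimal sketch (plain arithmetic, no game-specific code assumed):

```python
def frame_time_ms(fps):
    """Convert a frame rate to the per-frame time budget in milliseconds."""
    return 1000.0 / fps

# A drop from 60 to 50 fps means each frame took only ~3.3 ms longer,
# so a small amount of extra work in a heavy scene is enough to cause it.
for fps in (60, 50, 40, 30):
    print(f"{fps} fps = {frame_time_ms(fps):.1f} ms per frame")
```

In other words, a dip from 60 to 50 represents only about 3 ms of extra work per frame, which is why demanding scenes can still cause visible drops on rigs well above the recommended specs.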
It happens in all games across all platforms that use polygons or floating-point math for rendering. On consoles it's less noticeable for a simple reason: they tend to have a much lower polygon count, so they tend not to have as many rat's nests, and console games tend to have a shallower draw distance, which limits polygonal overlapping. Basically, the larger the draw distance, the more chance there is of the game engine throwing up an error.
Rat's nests are render errors created either by lazy model making or by the game engine not converting the 3D model correctly to the game's needed format.
There are other reasons too, but generally it's the above that throws up the largest number of problems.
PhysX has its own set of issues, as it adds another level of math on top of the polygon math. This can result in errors being created where there were none before. See http://www.youtube.com/watch?v=LvuM3OwYr40 for instance.
OK, so I pretty much need to stop overthinking it. LOL. Another quick question, however: how well do you think my setup will perform for next year's new releases, such as Witcher 3, Rise of the Tomb Raider, The Division, etc.?
Tomb Raider it will max out pretty easily, or it should, as you have a 4 GB buffer on the GPU, which is big enough to handle the highest-quality textures.
The GPU itself is pretty meaty, so there should be no issues with that either.
Witcher, however, likes to push the boundary with its PC version, so only the very top-tier cards will be able to max it out, and even then they're gonna struggle if the last game is anything to go by. But you will still be able to run it at near max without issue.
The Division is gonna max out pretty much as easily as Tomb Raider, as it too is on console.
(The above is an educated guess based on previous iterations of both games.)
If I max out Tomb Raider with TressFX, it definitely won't stay stable. If I turn TressFX off, it still has unstable spots. But if I turn the draw distance down, it tends to stay at a flat 60 fps. I'm guessing this is normal and I should stop overthinking it?

Also, what do you think about a 770 SLI configuration for upcoming games like the ones I listed?
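For what it's worth, capping a game at a fixed rate (as mentioned earlier) usually just means sleeping away whatever is left of each frame's time budget. A rough sketch, with `render_frame` standing in as a placeholder for the real per-frame work:

```python
import time

def run_capped(render_frame, target_fps=30, frames=5):
    """Call render_frame() at most target_fps times per second."""
    budget = 1.0 / target_fps            # seconds allowed per frame
    times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                   # the actual per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle away the spare time
        times.append(time.perf_counter() - start)
    return times
```

Because the spare time is simply discarded, a capped frame rate stays flat as long as no single frame ever exceeds the budget, which is exactly why a 30 fps cap feels more consistent than an uncapped 40-60.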
A 770 SLI setup will be an impressive performer in the latest games, assuming they support SLI configs.
TressFX cripples Nvidia GPUs in that game because of their lower-than-AMD performance on OpenGL or CL (sorry, I forget which is used for TressFX).
I know Tomb Raider didn't perform particularly well on the Nvidia cards of the time, and I'm not sure they ever managed to fix it properly... hopefully it will be a better effort this time round.