CPU bottleneck in TC The Division at 3440x1440?

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
I've just got my 34" ultrawide G-Sync monitor and a 1080 Ti FTW3, and I'm having some trouble with Tom Clancy's The Division: frequent fps drops detracting from the smooth 90 fps gameplay I'm looking for.

I'm uncertain whether it's a CPU or GPU bottleneck, but the average CPU usage of 85% is a lot higher at 3440x1440/90 fps than it was at lower resolution and 60 fps. I have the 4690K at 4.3 GHz...

I've dropped the settings in The Division all the way back to medium but the judders continue, so I wonder if this points more to a CPU bottleneck?
 
Solution
Yeah, at this point more threads, not a platform upgrade to the same number of threads, is usually the answer. I'm so glad this worked out well for you.

atomicWAR

Glorious
Ambassador
Nah, your CPU is more than fine; it's your GPU that's having trouble here. You're gaming at close to 4K resolution, where all GPUs have trouble maintaining 60 fps or better. It's only because you're not quite at 4K that you're getting frame rates as high as you are. If you really question whether your CPU is the issue, you can check very easily: turn your resolution and in-game settings down as low as they go. Whatever frame rate you get there is roughly what your CPU can deliver at any resolution, assuming your GPU is strong enough.
 

atomicWAR

Glorious
Ambassador
For the record, I game at 4K with two GTX 1080s, and even when SLI is supported they have trouble maintaining 60 fps at the highest settings with max filtering and AA. I'm also using a much older CPU. At higher resolutions it's more about the GPU than the CPU. As I stated above, you can test a CPU's max render rate easily (or close to it). I'm using a 55" UHDTV, so 60 Hz/fps is all I need to maintain, but my CPU is actually good for up to 100 Hz/fps in most modern games. The sad truth is that Intel's new CPUs didn't bring much performance to the table. I'm glad AMD put the pressure on with Ryzen and Threadripper. Hopefully we'll start to see Intel step up their game.
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
Hmm yeah, The Division is known to be pretty demanding. I wonder if it's just a particular graphics setting that it's not happy with at 3440x1440? Maybe 90 fps is too ambitious?

So I would change to the lowest resolution and graphics settings and try to obtain 90 fps from there? What I'm experiencing is drops from 90 to 70 fps resulting in judders...
 

atomicWAR

Glorious
Ambassador
If you were shooting for greater than 100 fps, I'd think two more cores or a newer architecture might help some, but even then you're talking single-digit percentage gains in my experience. That might be enough to get you there, but it might not. It's a tough spot: 4C/8T is increasingly just barely enough in gaming, and Intel lingered at the same core counts for so long.
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
True, yeah, I think you're right about it being a GPU limitation here. I didn't have the CPU showing in the OSD so I wasn't sure what it was doing, but now I see it's quite normal, averaging 70-80%; it's not pegged at 100% or anything.

I turned the refresh rate down to 80 Hz and that seems to have made a big difference; just that 10 fps lower, I'm holding 80 fps just fine now, so I guess 90 fps was a little unstable... It could also be a graphics setting that's tanking the fps. I'm currently moving up slowly from medium; it's the demanding settings you have to look out for. So I don't know whether 90 fps was simply too demanding or it was a particular graphics setting, but now that I've dropped back to 80 fps I seem to be holding that quite nicely.

I was thinking about trying to get hold of a 4790K; do you think it could help with some games on my setup? What Intel architecture do you think would be worth waiting for as a significant upgrade from Haswell?

Edit: Originally I used GeForce Experience's optimized settings, which were a mix of high and ultra at 90 fps. I'd say it's a combination of those graphics settings and the refresh rate that needs tweaking to find a stable fps...

Does GeForce Experience optimize for high refresh rates if it detects that you have one?
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810


Didn't know that, but how is that an issue?
 

atomicWAR

Glorious
Ambassador


If the primary core is at 100%, then you're CPU bottlenecked. The game won't disperse the rest of the load evenly across the other cores to make up for it. Some games are poorly threaded; others, like BF1, are well threaded and will take every core/thread you have.
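To illustrate why this matters, here is a small sketch (with made-up numbers for a hypothetical 4-core CPU) of how a pegged primary core can hide behind a harmless-looking average reading in an OSD:

```python
# Hypothetical per-core usage snapshot (percent); the numbers are invented
# to show the effect, not measured from The Division.
core_usage = [100, 45, 35, 40]

average = sum(core_usage) / len(core_usage)
busiest = max(core_usage)

print(f"average CPU usage: {average:.0f}%")  # prints 55% -- looks fine
print(f"busiest core:      {busiest}%")      # prints 100% -- bottlenecked
```

The average reads 55%, which looks healthy, yet core 0 is saturated and the game's main thread can't go any faster. That's why checking per-core usage (not just the overall figure) is the better bottleneck test.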
 

atomicWAR

Glorious
Ambassador
No, it uses more than one thread just fine. Sorry I wasn't clear in my last post: The Division uses 8+ threads no problem. I was saying that some games don't thread well. I think Carnaxus was describing an issue affecting some users where you need to set process affinity in Task Manager to make sure your CPU never hits 100%. It costs you some frames but can help with stuttering.

https://www.youtube.com/watch?v=6agsI3syryM

That said, unless you have stuttering you shouldn't need to worry. The Division will eat every thread you have just fine.
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810


Right I see.

Do you mean they don't thread well on 4-core CPUs, or on 4-core/4+-thread CPUs? I know games taking advantage of 4+ threads is quite recent, and I imagine the games only using 1 core would mostly be older ones?

My 4690K often hits 100% in The Division but it doesn't cause stutters; I guess it's still fast enough that even under 100% load it can keep up with the calculations. Watch Dogs 2 is another story, where 100% usage occasionally does cause stutters.

Do you think it would be worth getting hold of a 4790K if I don't plan on a platform upgrade for some time yet?
 

Carnaxus

Reputable
Apr 18, 2017
1,431
3
5,665
Actually, I did in fact mean that The Division inexplicably launches locked to a single core. I don't know what idiot coding Ubisoft has crammed into the game, but I've had to go into Task Manager and use Set Affinity to enable all cores every single time I launch the game, ever since the Underground update.

Tl;dr: The Division can and will happily use as many cores as you have...provided you tell it to do so, manually, every time you launch it.
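For anyone curious what Task Manager's Set Affinity dialog actually changes: it's a bitmask with one bit per logical processor. A minimal sketch of how that mask is built from core indices (plain arithmetic, not a Windows API call):

```python
def affinity_mask(cores):
    """Build an affinity bitmask: bit N set means the process
    may run on logical processor N."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

print(hex(affinity_mask([0])))       # 0x1  -- stuck on core 0 (the bug described above)
print(hex(affinity_mask(range(8))))  # 0xff -- all 8 cores enabled
```

On Windows, the `start /affinity` command accepts this mask in hex (e.g. `start /affinity FF thedivision.exe`), which could be scripted to avoid the manual Task Manager step; the exact game executable name here is an assumption.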
 

atomicWAR

Glorious
Ambassador


While it doesn't scale as well as, say, BF1, it shouldn't be locked to one core. The largest benefit is going from dual core to quad core when gaming at 1080p-1440p; at 4K, however, a dual core is enough, since the GPU is the overtaxed component at such a high resolution. So I disagree that it runs on a single core. You clearly get a benefit for up to 4 cores... after which point it actually loses performance, according to Tom's, TechSpot, and most other sites I checked.

http://www.tomshardware.com/reviews/multi-core-cpu-scaling-directx-11,4768-7.html
http://www.techspot.com/review/1148-tom-clancys-the-division-benchmarks/page5.html


 

Carnaxus

Reputable
Apr 18, 2017
1,431
3
5,665
Every time I launch The Division, even though Uplay and Steam are both set to use all 8 of my cores, if I go to Task Manager and open the "Set Affinity" dialog on thedivision.exe, it's set to use only one core. I can select all the other cores and hit OK, and it'll stick until the next time I run the game.

I'm not saying it can't use multiple cores. I'm saying that for some dumb reason it launches on only a single core, and you have to manually tell it to use the rest of your cores or you'll have terrible performance.
 

atomicWAR

Glorious
Ambassador
Well, you're using an AMD FX CPU. While not everyone has the issue, some users like yourself (almost always on AMD FX chips, from what I've read) have the CPU affinity issue you speak of; I actually posted a link about it. Point being, he's on Intel, so it's less likely to be an issue, but it's easy to check: just look at Task Manager and the core affinity to see if only one core is being used. Regardless of that bug, the game will use 4 cores great, and more than that too, but at the cost of a frame-rate decrease. I'm not sure why more cores lower the frame rate; it's just what I've been reading.
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
Yeah, I checked, and the affinity for The Division is already set to use all processors on my system, so I guess this is an AMD issue as you suggest...

More cores lowering frame rates? Where are you reading that?
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
Hey, do you think there's any chance that a 4690K might be a bottleneck at 3440x1440, especially at 90-110 fps? There are a lot of extra objects to render at that resolution and FOV, and I know higher frame rates tax the CPU slightly more too.

I'm just trying to work out The Division; it appears to be a highly taxing game at this resolution... I'm going to run some tests at the medium and low presets to see if the frame drops still occur. I guess if there are still frame drops on low settings, it could be a CPU restriction...

Some behavior seems a bit odd since moving to 3440x1440, like very brief, sharp frame drops when transitioning or loading new areas, as though the GPU or CPU can't load them quickly enough...

On my mix of medium-high last night I had a pretty stable 85 fps until I got into a large firefight with 4 other players, where frames were constantly jumping around 70-85 fps. I guess this is where G-Sync is meant to play its part, adjusting to the frame rate... If I didn't have the monitor's fps counter turned on I might not have noticed the frame drops, although it did feel a little jumpy; but this is meant to be the purpose of G-Sync, keeping things smooth when frame rates vary...
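The frame-time arithmetic helps explain why a 90-to-70 fps swing still feels jumpy even with G-Sync removing the tearing (simple division, not measured data):

```python
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

for fps in (90, 80, 70):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")

# A sudden 90 -> 70 fps drop stretches every frame by about 3.2 ms,
# and it's that step change in pacing that reads as judder.
print(f"jump: {frame_time_ms(70) - frame_time_ms(90):.1f} ms")
```

So 90 fps is roughly 11.1 ms per frame and 70 fps roughly 14.3 ms; G-Sync matches the refresh to whatever the GPU delivers, but it can't hide the change in frame pacing itself.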

I'm thinking of picking up Ghost Recon Wildlands, as it's currently on sale on Uplay... but I'm a little worried about playing it at 3440x1440, as it's meant to be one of the most demanding titles currently out. Do you think it will be possible to get it running over 60 fps?
 

atomicWAR

Glorious
Ambassador
Games like Ghost Recon Wildlands, BF1 and Watch Dogs 2 will all bottleneck an i5 4690K. They take every thread you have, and you only have 4. I've seen complaints in the forums about all those games when it comes to i5 CPUs. So yeah, at 90-110 fps you will likely be hurting, but you never know. With 4C/4T you'll be missing 10+ fps compared to 4C/8T, and roughly 25+ fps vs 6C/6T, while 6C/12T seems somewhat slower than 6C/6T but is still faster than 4C/8T. You can see more on the core scaling here.

http://www.dsogaming.com/pc-performance-analyses/tom-clancys-ghost-recon-wildlands-pc-performance-analysis/2/
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
Was The Division meant to be one of those as well? Interestingly, I'm having no trouble running BF1 at 90 fps on ultra at 3440x1440... I had trouble with Watch Dogs 2 on my old monitor/GPU, but I found a graphics combination that worked quite well. It's both CPU and GPU intensive, though I think that game has some optimization issues too...

Well yeah, I'll be looking into a higher core/thread build later down the road; that's why I asked which CPU/platform you thought would be worth waiting for as an upgrade from Haswell...

I might see if I can get hold of a 4790K in the meantime to help with those games...
 

atomicWAR

Glorious
Ambassador
Sorry, I'm not seeing where you asked about the best CPU platform, only that you were thinking about getting the i7 4790K, which would buy you some time before needing a full platform upgrade. With the core war started, and getting ready to go into full swing with the release of Intel's X299 and AMD's X399, things will only escalate, though those platforms aren't as gamer-centric as the mainstream tends to be. Going with the 4790K gives you some time to see how things pan out, and it's a fairly cheap upgrade compared to a full platform switch. My guess is 6C/12T to 8C/16T will be the new mainstream and likely what devs focus their attention on. Once those threads are properly utilized in games, HEDT platforms may become advantageous to gamers who stream, much like the 6C/12T to 8C/16T market has been for the last few years while 4C CPUs were plenty for the just-gaming crowd. Time will tell.

Right now my money is on AMD, as their platform is cheaper, and as time has passed many of the games that didn't do well at launch have been patched, so AMD's Ryzen is much closer to Intel's Kaby Lake than it was before. Not to mention, when gaming at higher resolutions the gap is close to non-existent; 1080p just needed some work, and devs were fairly quick to do it. If you're a big overclocker an argument can be made for Intel, but honestly, when it comes down to bang for buck, AMD is king of the hill at the moment, and from the looks of Threadripper, into the future as well. So what if your overclocks top out around 4-4.1 GHz; unless you're dead set on hitting 5 GHz on an Intel chip, AMD just seems like the smarter choice. I know I'm thinking long and hard about my next build. I was going to wait for PCIe 4.0 to hit, but AMD says they're waiting until 2020, and Intel may or may not launch it next year (I'm guessing not, but we'll see). I'm not sure I want to pay the Intel premium for something as small as PCIe 4.0 when historically it makes very little difference in gaming; even today, PCIe 2.0 holds its own against 3.0 minus a very small fps difference.

Point being, with an i7 4790K you can at least wait until AMD's second-gen Ryzen/Threadripper next year, which will likely overclock much better, or wait to see how Intel's Coffee Lake CPUs play out at the end of this year (if rumors are to be believed) or early next year. Honestly, unless you're dying for 4 more threads right now, you could just wait; none of this is that far off, or with the 4790K you may be able to wait for 2 years or more. It's a tough call. I'm in the same boat: my i7 3930K performs very well at 4.2 GHz for 4K gaming (within 5% of Kaby Lake at 4K). I could probably get away with waiting until 2018 or even 2019, but I get an itch to build a new rig every 3-4 years, and I'm past the 4-year mark now. Sadly, Intel resting on their laurels really made upgrading almost pointless for a lot of years. The game is only now changing.
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
Hmm yeah, I think when the 6C/12T and 8C/16T CPUs become well and truly mainstream, with devs optimizing for them, is when I'll think about building again; do you think that's still a year or so away? I need to have another look at Intel's and AMD's roadmaps. It sounds as though the upcoming mainstream CPU lines from Intel and AMD won't be far off X299/X399 levels of performance anyway.

If Coffee Lake comes at the end of the year it's probably too soon to be mainstream, but I'm sure devs will start working with it straight away... Maybe the Ice Lake or Tiger Lake line would be worth waiting for instead?


Edit: I'm a little lost; are Skylake-X and Kaby Lake-X the new CPUs for X299?
 

atomicWAR

Glorious
Ambassador
Yeah, X299 is Skylake-X and Kaby Lake-X. As for Coffee Lake coming out this year (it was originally due out next year), the rumor is that Intel panicked at the launch of Ryzen. The fact that it performed far better than Intel had hoped, and undercut both their mainstream and HEDT CPUs in everything except gaming, forced Intel into a corner. Intel decided at the very last second to offer 12-18 core parts on the HEDT platform instead of the 4-10 cores originally planned. Word on the street is that this even took motherboard makers by surprise, and it's one of the reasons we're seeing a staggered launch for their HEDT platform this time, with only some of the lower core counts coming first, followed by the higher core counts in the following months. Anyway, this is where the Coffee Lake rumor also plays in: Intel wants a 6C/12T part at the high end of mainstream to replace the 4C/8T parts sooner rather than later, to better fend off AMD. The question is: is the rumor true? I have no idea; I just know it has been floated around a lot of tech sites as a maybe (sometime between August and October, depending on the site). As for it being too soon for a new mainstream, I agree it is. If the rumor does turn out to be true, though, AMD really did put Intel into a panicked state, as they haven't done anything like this in years. In fact, the last time I saw them pull a move like this was when the P4 got floored by AMD's x64 parts. Intel lost the IPC crown by a lot and, to make matters worse, AMD had dual-core parts; thus the Pentium D (a cut-and-paste job of two P4 cores on the same die) was born. Regardless, the next few months through next year will be very interesting in the CPU space.