Do you think a Coffee Lake i7 will show any improvement over Kaby Lake in 1440p gaming with a 1080 or 1080 Ti?

David_24

Distinguished
Aug 26, 2015
329
1
18,795
I'm saving up to build a computer. I should have enough for it by Feb or March. Then I'll get a Volta or Navi card for it, probably a 1080 equivalent, not a 1080 Ti.
Assuming it's a 6-core with an 11 percent single-threaded performance increase, what effect, if any, would that have on 1440p gaming? 5 or 10 frames? And what about further out, when games start using more cores and threads and make better use of the processor?
Does the 1080 have any room to push more frames at 1440p with a better CPU?

Because I'm going to get a 27-inch 1440p 144Hz monitor with FreeSync or G-Sync.
Do you think I'll ever get to a point where I see a reason to upgrade my GPU with a CPU like that, before I just do a whole new build?
 

Supahos

Expert
Ambassador
I'd be surprised by an 11% gain in most cases. It might happen in some specific area, but it's unlikely in general usage. Most of that gain is a higher single-core turbo boost, which is easily overcome by simply OCing the 7700K to 4.7GHz. Intel (and everyone else) are notorious for cherry-picking the best-case scenario. Sounds like a 4-5% IPC gain with clock speed making up the rest to me.

In most games the extra threads will mean little if anything. In the future? Maybe, but most people are still running dual cores, and quad cores aren't a giant market share; for a game designer to make a game run poorly on a solid 8-threaded CPU would be suicide for sales.
 

atomicWAR

Glorious
Ambassador
Unless it's a game you're getting bottlenecked in, say BF1 on 64-player servers... probably not much. I'd wager 5-15 FPS depending on the title. If you are hitting any bottlenecking, though, the difference could be larger, or at the very least smoother gameplay, which can be an even better indicator of the game experience than frame rate. Remember a time before FCAT? I would swear my Nvidia machines were smoother than my AMD ones despite similar frame rates. Boy, would the AMD fanboys hate on me for saying I found the AMD GPU experience sub-par at best back then. Then FCAT came out, and oh wait, we can measure what players like me were seeing? AMD scrambled to fix their drivers. And it also became clear Nvidia had been aware of the frame latency issue long before and had their own tool for measuring it. Point being, numbers are great, but you also can't remove the subjective experience from a build.
 

InvalidError

Titan
Moderator

Then there is no point worrying about Coffee Lake now. By that time, AMD may be about to launch Ryzen Refresh/+/whatever and turn the tables on bang-per-buck again, hopefully addressing most of first-gen Ryzen's issues along the way. If all goes well on Intel's side, Cannon Lake may also be only a few months away at that point.

Worry about what to buy a few weeks before the fact, not half a year out, when current data may very well be obsolete.
 

atomicWAR

Glorious
Ambassador


Most likely in the next 2-4 years. Past that I am less sure. Folks were saying the same thing about quad-core/quad-thread i5 CPUs even as recently as a year and a half ago (some might argue less). Surely consoles using 8 threads played a huge role in that, but I believe AMD really forced Intel's hand. But you could well be right. With 6C/6T i5s hitting, devs will likely aim for the 6-to-8-core / 6-to-16-thread crowd, leaving those with newer 4C/8T machines decent gaming longevity, since 4 hyperthreads would slightly under-perform the threads of 2 physical cores. It will certainly be interesting to watch how well games thread from this point forward over the next 4-6 years.
 


I wouldn't assume an 11% IPC increase. Intel increased the number of threads by 50%, and their own slides don't claim more than a 33% increase in multi-threaded performance. Expect the chips to perform about the same as Kaby at similar clock speeds (single-thread); however, Coffee will probably struggle to reach the same clock speeds as Kaby.

The actual IPC increase will probably be non-existent (just like it's been since Broadwell; the only real increases in performance have come from higher clock speeds and better performance from higher-clocked RAM, with almost no actual IPC increase beyond that).

As for your question, Coffee will benefit the few games that are fully threaded. However, almost all online games are nearly completely single-threaded (or close enough), which includes competitive shooters, which I assume is why you're asking, seeing as nothing else really benefits from a 144Hz monitor. For those, a high-clocked Kaby will probably be the best choice, as I highly doubt Coffee will reach the same clock speeds, so for your needs you'll see almost no benefit from waiting.
 

InvalidError

Titan
Moderator

2C4T/4C4T CPUs have been mainstream for most of the past 10 years and it is still only a tiny minority of games (a single-digit handful of titles out of hundreds released yearly) that make significant use of more than four threads. I suspect it'll be a very long time (5+ years) before this trend extends to 4C8T/6C6T and beyond to a significant degree.
 

David_24

Distinguished
Aug 26, 2015
329
1
18,795

I have a dream in my head that encourages me to save money every month. I look online at PCPartPicker and call it my sanctuary while I shop around Newegg to feel good. Knowing what my options could be is fun. And I can always change my build before I buy.
 


Parallel coding is hard. Very hard. I was a good enough coder that I worked on a game back in '07-'09 and ended up the lead guy on the coding team by the end (probably 40% of the scripts were written by me). Yet I couldn't manage anything at all with parallel coding, and that was just 2 cores. It's too much to juggle; the difficulty of coding seems to go up by the square of the number of threads involved. I've noticed most games are 1 thread with a bunch of maintenance tasks offloaded onto other cores, which seems to be how they try to deal with multi-threaded game coding, and which of course isn't really parallel coding. Worse, some things have to happen in a certain order, so parallel coding doesn't even work for them.
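
To make that concrete, here's a minimal C++ sketch (my own illustration, not code from that project) of the pattern I mean: one main thread that owns the strictly ordered simulation, with order-independent maintenance work farmed out to whatever core is free.

```cpp
#include <cstdio>
#include <future>

// Must run in order: frame N depends on frame N-1, so this stays on
// the main thread.
void update_simulation(int frame) {
    std::printf("simulating frame %d\n", frame);
}

// Order-independent "maintenance" work; safe to push to any free core.
void mix_audio(int frame) {
    std::printf("mixing audio for frame %d\n", frame);
}

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        // Offload the side task; the OS may run it on another core.
        auto audio = std::async(std::launch::async, mix_audio, frame);

        // The strictly ordered simulation never leaves this thread.
        update_simulation(frame);

        audio.wait(); // re-join before the next frame's state is touched
    }
}
```

The simulation has to stay on one thread because each frame depends on the last; only the side work parallelizes, which is why the extra cores end up lightly loaded.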

I suspect games will remain mostly 1-3 core monsters forever.
 

David_24

Distinguished
Aug 26, 2015
329
1
18,795


Not trying to doubt you, but when I look at a lot of first-person shooters, 8 threads beat out 4 cores by 15-25 frames.
 
At a 3.7-3.8 GHz base clock, I'd predict a performance decrease (8700K vs. 7700K) at stock clocks... perhaps some overclocking to 4.3 GHz might surpass the 7700K. (I'm assuming the 150+ watt full-load power requirements are what drove the need for the Z370 series....)

Update: just saw a slide on YouTube/Gamers Nexus that showed some 8700K specs: 3.7 GHz base, but a 4.3 GHz turbo for 6 cores, and a 4.6 GHz turbo for a single core? It indeed just might pass the 7700K in gaming after all! Good times a-comin'!
 

David_24

Distinguished
Aug 26, 2015
329
1
18,795


Cool story. I really want to get into game design and some other content creation, but I can't seem to get myself to start beyond the 10-15 hours of YouTube I've watched on it. I'll be designing my own games from home using free software and Unity, so I probably won't need anything past a Ryzen 1600 anyway.
It's just that when I looked, at 1440p the 1600 is almost the better value compared to the 7700K unless you're a frame junkie.
Obviously I'll look again in Feb/March when I'm saved up. But even then I'll probably end up buying a 1600 refresh or something if I can't push myself into an 8700K.
 


Check the clock speeds. If you're talking about an i7-7700K, those CPUs regularly hit the highest clock speeds in the Intel lineup. If it's a 5GHz CPU, of COURSE it will outperform a lower-clocked 4-core. That's not the number of threads; that's straight-up clock speed.
 
1) 11% ??
I saw the claim, but I highly doubt this applies in an apples to apples comparison of performance at the same frequency. It's probably more applicable to how TURBO BOOST works in the default settings, but likely an OVERCLOCK would even things out more (on both systems).

2) GAMING FPS boost?
As said above, first you need a CPU bottleneck which I can tell you isn't that often with a good CPU. Any SMALL gains are insignificant compared to things like the PRICE of the system.

3) FUTURE and multi-threading?
For most future games, a good 4C/8T like the i7-7700K will be plenty regardless of the GPU. Games will use more CPU processing, but this will be offset by:

a) more EFFICIENT coding (draw calls in DX12/APIs use fewer CPU cycles), and
b) better THREADING using more of the CPU cores.

Not saying a 6-core won't have its place, but unless you edit/convert video, multi-task, or play games that are specifically very CPU demanding and benefit from such a CPU, it won't matter.

4) GTX1080 more frames?
Not really, because again most games aren't CPU bottlenecked and should be less so in the future.

5) Upgrade GPU?
Sure, games will get more demanding. Even now I have to tweak some games (GTX1080) to maintain a mostly solid 60FPS experience. Games will continue to push graphics though I doubt I'll upgrade for another three years.

*Again, not really a CPU issue.

6) Freesync/GSync
Asynchronous monitors are awesome. This also helps games run smoother if they stutter due to a CPU or GPU bottleneck. (CPU bottlenecks are still common and can often cause stutters, but this is mostly a CODING issue, as raising the FPS by 10% hardly matters when it drops really low for a short time.)

It's much easier to TWEAK GAME SETTINGS when you don't have to worry as much about a specific FPS cap (like maintaining 60FPS on a 60Hz monitor with VSYNC ON... drops cause stutter if you don't enable Adaptive VSYNC... but then that causes VSYNC to turn OFF, which causes screen tear... sigh! So GSYNC/FREESYNC rocks).

7) FREESYNC issues:
GSync is superior overall, though Freesync 2/HDR is coming (or here) and tightens up the standard. Freesync has a few issues right now:

a) excessive blur/ghosting on some monitors due to OVERDRIVE issues (pixels change at different rates, so they can't always hit the correct color)

- not saying GSYNC is perfect, but the point of the MODULE (which will improve with GSYNC 2) is to design around issues like this at a hardware level. (There's no GSYNC monitor I'm willing to buy yet... waiting for an HDR version for much less than they cost now.)

b) the Freesync RANGE can be narrow, and the info may be very hard to find (e.g. 40Hz to 60Hz, or on some 144Hz monitors the range is only 48Hz to 90Hz, which means in the latter case going over 90FPS causes screen tear or stuttering depending on whether VSYNC is ON or OFF).

c) LFC (Low Framerate Compensation)
- a non-issue for GSYNC, but since Freesync monitors lack a hardware MODULE to stay in the smooth gaming range, the drivers (software) must resend frames to stay in sync. For example, if the range is 30Hz to 75Hz and you drop to 28FPS, then each frame is sent TWICE so the monitor updates at 56Hz, thus within the range. Only 28 new frames, but you are in the smooth, tear-free zone at least.

- you need the max/min ratio to be at least 2.5x (75/30), so if the range is 48Hz to 90Hz there is no LFC support; dropping below 48FPS means you are using VSYNC OFF or ON (but not Freesync), thus stuttering if VSYNC is ON or screen tearing if it's OFF.
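
To illustrate the math, here's a toy C++ sketch of my own (not an official AMD formula; the 30-75Hz range, 28FPS dip, and 2.5x ratio rule are just the examples above):

```cpp
#include <cstdio>

int main() {
    const double min_hz = 30.0, max_hz = 75.0;   // example Freesync range
    const bool lfc = (max_hz / min_hz) >= 2.5;   // the max/min ratio rule

    const double fps = 28.0;   // game dips below the Freesync floor
    double refresh = fps;
    if (lfc) {
        // Resend each frame until the effective refresh is back in range:
        // 28 -> 56 here, i.e. every frame is shown twice.
        while (refresh < min_hz) refresh += fps;
    }
    std::printf("LFC %s: %.0f FPS displayed at %.0f Hz\n",
                lfc ? "active" : "unavailable", fps, refresh);
}
```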

*There are reasonably priced, good Freesync monitors, but you have to do more research. (Though if I had the above 48-to-90 monitor, I'd probably set an FPS cap of perhaps 80FPS or less depending on how fast the game is, then tweak settings so I rarely drop below 48FPS... like all PC gaming, you get the best results if you understand your hardware.)

**I suspect GSYNC and FREESYNC pricing will become nearly identical for similar-quality products, since Freesync 2 will cost manufacturers more R&D time and thus the price goes up. They can't do some half-assed implementation just to check the Freesync box. GSYNC is also a known module, so it's probably going to be EASIER as time progresses to add GSYNC to newer monitors than to create their own Freesync 2 version.

We'll see.

Final note: the XBOX ONE X supports Freesync 2/HDR, though no HDTVs exist that support it yet. I personally like the idea of an HDTV or MONITOR that can run both a gaming PC (AMD GPU) and the new XBOX console in Freesync mode.
 
Solution

David_24

Distinguished
Aug 26, 2015
329
1
18,795
The Xbox One X is a lopsided joke. If you don't have a 4K TV, the value is just terrible.

I guess I still picture myself getting the 1600 refresh then. I really wanted a reason to buy the next i7, but I don't think the content creation I'll be doing is that demanding, and gaming is the main focus of the machine I'm buying. Still, it will be fun to see the actual benchmarks on the 1080 Ti when it comes out.
 

atomicWAR

Glorious
Ambassador
@ingtar33 and invaliderror

Not sure I agree with your assessment outright, but I certainly know where you're coming from on core usage in games. I have been answering threads for over a year now on i5 (4C4T) CPUs with equivalent OCs as their i7 replacements, where 4 more threads were the difference between a stuttering mess and smooth gameplay. Sometimes as much as 15+ FPS, again at the same clocks. In others it was nearly the same frame rate, but a smoother experience. Now, that one has been a long time coming in my opinion, so it is less clear what going beyond 8 threads will mean in gaming. Likely things will be fine for another couple of years plus, as I stated earlier. Maybe even another 5 or 6 years. But I don't believe things will hold much longer. When I built my i7-3930K rig, everyone and their grandma said anything more than 4 cores is overkill. I hedged my bets otherwise... plus I needed the extra PCIe lanes, so I was stuck regardless. When I first finished my build, most games strongly stressed a single core and used the remaining 11 threads lightly.

Fast forward a few years and now practically every new game I buy (AAA-wise) uses all 12 threads evenly. No one thread is stressed too hard. So while in the past I would agree that 1-3 threads was the direction of things (usually more like 1), I don't see it as a handful of games either (BF1, Watch Dogs 2, GTA V, The Division, Rise of the Tomb Raider, The Witcher 3, to name a few). I won't say it is every game, but enough that I wouldn't be happy with fewer CPU resources. Point being, if they can code now for 12-16 threads fairly evenly, my guess is games will continue to use more of these resources. Yeah, with a 12T CPU I'm usually looking at 30-50% utilization, whereas an 8T CPU can still handle that load but with higher usage per thread (say 75-80% or more). And as stated, 4C/4T CPUs have been bottlenecking in some games for a while now, so they can't keep up any longer. And yes, there are plenty of dual cores out there (hyper-threaded or not) in use for gaming; a quick look at Steam's hardware survey reveals as much. But again, I answer plenty of threads where folks are looking for confirmation that 100% CPU utilization is a CPU bottleneck on said CPUs. Granted, there will always be gamers using less than they should/could for gaming, and they may well be in the majority for all time.

I try to build a rig for what I know my usage needs are, so I don't end up one of the OPs on Tom's wondering why my FPS is less than stellar. I give my advice the same way. While I get that budget dictates all in a build... if a builder can afford better, I surely won't suggest a CPU that may be sub-par if there is no other reason, again excepting a budgetary restraint. Back to the point: right now no one knows for sure what direction games will go, not even me, as much as I might like to believe otherwise. We only have hints from the past and present. The past says a low number of threads will be used. The present says as many as 16+ threads can be used, BUT 8 threads is still enough (4C/8T). This is why I say it is an interesting time to watch how devs handle threading in games going forward. I am hedging my bets on 6-12T being the new go-to for gaming over the next 6 years, maybe less but unlikely. I could well be wrong. Time will tell.

@invaliderror: I agree with 99.9% of your posts. You are generally a wonderful mod/poster, and I am not 100% against yours now; I am just not sure you're right either. No question the past is on your side, though.
 
(BF1, Watch Dogs 2, GTA V, The Division, Rise of the Tomb Raider, The Witcher 3, to name a few).

BF1 - a ~3-core game; however, in multiplayer it's still functionally a single-core title (as in there is 1 main thread during multiplayer)
Watch Dogs 2 - a 2-core game
GTA V - a 3-core-plus game (this one offloads a lot of other tasks, but it is still at its heart a 3-core title; that said, fast RAM, lots of VRAM, and lots of L3 cache will significantly improve performance in GTA V, and to a point more cores can help, because it does offload a lot of nonsense onto free cores if they're available, but you still need 3 really fast cores to make this game hum)
The Division - a badly coded 2-core game (still, having 4 FAST cores is probably ideal)
RotTR - almost a fully threaded title; however, there are definitely 3 main "threads"
TW3 - a 3-core game

OP was talking about FPS games, and the only one you listed was BF1, which when playing multiplayer is still limited by the "main" core's demands. Sure, it will use 3 cores heavily, and it will even share out tasks to most cores you've got, but in multiplayer 1 thread will do the lion's share of the lifting, like with most MMOs and FPSs.

NOW there is a major exception to this rule: if you're streaming, the more cores the better. Also, cache size can play a role; in some titles lots of fast cache will make a difference where core speed or core count will not. Remember, i7 CPUs don't just have more threads than i5s, they also sport more cache, which DOES make a difference in a number of titles; even some of the ones you listed will show improvement just from the bigger cache.
 

atomicWAR

Glorious
Ambassador
^ Not exactly my experience with those games.

BF1 in multiplayer uses 8+ cores, especially on 64-player servers. Though one core is slightly favored (especially in single player), so I'll give you that, but even then having only 4 cores can hurt your frame rate in multiplayer. So jumping higher than 4 cores/4 threads is still best IMHO. Asking for more than 8T in that game for most players is a bit much. Even then, I have seen some new 4C/8T i7 users also complain of 64-player bottlenecking issues. So we'll have to agree to disagree there as a whole.

Updated
GTA V: I have had zero issue getting that game to run on 12 threads, but I am reloading it now to retest, to be fair; it's been a year since I played it last. *So I retested and found 4 threads were perfectly uniform in preference, while the remaining 8 threads had only slightly less preference. So close I would argue the difference is moot, or more accurately a 10% or less difference on the remaining 8 threads: preferred cores near 50-60%, while less-stressed cores sat at 45-55%. That's so close I have a hard time believing the extra cores/threads aren't worth it. So, as said, my experience does not match yours here either.

The Division... I actually don't own that one, but I troubleshot it pretty extensively last month for another OP. Like you, I thought it was a badly threaded 2-core title ideal for 4C. After many tweaks, OS reloads, etc., the OP went with a 4C/8T i7. Only then was the gameplay smooth. So we'll call this one less clear since I don't own it. Not a win either way.

RotTR threads perfectly fine on my rig, evenly all the way to twelve. I won't say there may not be preferred threads, but if so it hasn't been plainly clear on my system, even after just rechecking it before posting. So again I have to disagree for the most part.

TW3: again, not my personal experience with 3 cores. More like 6 cores used well, and the extra ones less so. So again I have to disagree, but it's not a true 8T game, so you're not totally wrong there since I did say 8T.

Looking at our build differences makes me wonder if there is an optimization issue between our CPU platforms, but I haven't seen anything like this posted yet. To be honest, I am now curious to collaborate with someone like yourself who has an FX build to see what we come up with: FX vs. an older 6C/12T Intel i7. Regardless, those weren't the only games I had in mind, just what I listed. If you really like, I can list more, but at this point something tells me you'll just shoot holes in anything I post, say Ghost Recon Wildlands. However, I don't think it is malicious; the contrary, actually. You seem to be someone like myself who posts info they have produced themselves, or when not able to, from a source they trust, self-produced always being first choice.

There will always be camps on what is next and what is not. I am not discounting your view on core/thread count, as it may well be accurate long term. I just happen to believe things are going a different direction from what I have observed first hand, just like you. And again, this is why I say it is an interesting time to watch what devs do in threading for games. And one point neither of us disagrees on: when game streaming, more cores/threads is certainly better. As for cores/threads vs. cache: that may be true in some titles, but as I posted, in some of the upgrades with OC'd i7s replacing OC'd i5s, both thread count and cache changed, so I can't rule it out. I'll own what I can't prove. That said, your position is tough to prove as well. Finding a CPU with the same architecture but different thread counts (4 vs. 8) and the same cache at the same clock speed (OC is OK, but even then)? Can't say I know of a CPU off the top of my head that would fit that bill. So you are in the same situation as me. Deadlocked.

Despite not seeing eye to eye, I do enjoy a good conversation on gaming, where it has been and where it is going. Seriously, don't take me as a troll or some newb that doesn't know his stuff trying to cling to a lost argument. Most of what I posted has been tested by myself, not pulled from some random web page or review. Not to imply your info is; again, quite the opposite. With your coding chops you clearly know more than most users. I am just not certain what held true in '07-'09 will stay that way indefinitely. Guess we'll find out!

 

InvalidError

Titan
Moderator

If you are seeing 30-50% "fairly evenly" in Task Manager, keep in mind that the "evenness" can be caused by Windows rotating the same 2-3 threads responsible for most CPU usage across CPU cores, not the software having that many simultaneously active threads generating "even" CPU usage. Use Process Explorer to spy on the game's thread table and what you'll see in most cases is one thread using 100% of its CPU's time, a 2nd thread using something like half as much, a 3rd thread using even less and then a bunch of miscellaneous other threads totaling another 10-25% of one core for a combined total of 2-3 cores worth of CPU usage.
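
If you want to check this yourself without Process Explorer, something like this rough Win32 sketch (my own illustration, with a placeholder PID, not Process Explorer's actual code) dumps each thread's accumulated CPU time; a dominant main thread stands out immediately:

```cpp
#include <windows.h>
#include <tlhelp32.h>
#include <cstdio>

int main() {
    DWORD target_pid = 1234;  // placeholder: substitute the game's real PID
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
    if (snap == INVALID_HANDLE_VALUE) return 1;
    THREADENTRY32 te;
    te.dwSize = sizeof(te);
    for (BOOL ok = Thread32First(snap, &te); ok; ok = Thread32Next(snap, &te)) {
        if (te.th32OwnerProcessID != target_pid) continue;
        HANDLE h = OpenThread(THREAD_QUERY_LIMITED_INFORMATION, FALSE,
                              te.th32ThreadID);
        if (!h) continue;
        FILETIME created, exited, kernel, user;
        if (GetThreadTimes(h, &created, &exited, &kernel, &user)) {
            ULARGE_INTEGER k, u;
            k.LowPart = kernel.dwLowDateTime; k.HighPart = kernel.dwHighDateTime;
            u.LowPart = user.dwLowDateTime;   u.HighPart = user.dwHighDateTime;
            // GetThreadTimes reports 100ns units; a dominant "main" thread
            // will show far more accumulated CPU time than the rest.
            std::printf("thread %5lu: %.2f s of CPU time\n",
                        te.th32ThreadID, (k.QuadPart + u.QuadPart) / 1e7);
        }
        CloseHandle(h);
    }
    CloseHandle(snap);
    return 0;
}
```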

Another problem with CPU usage as a measure of "progress" is that you cannot tell whether threads with high CPU usage are doing actual useful work or are spending a large chunk of their time busy-waiting on interlocked-exchange thread synchronization. Game programmers often use that when they need to wait on other threads but don't want to incur the partial/full context-switch overhead of mutexes, semaphores, message queues, etc., and possibly lose the remainder of the thread's time slice in the process, since every system call carries the possibility of the OS scheduling something else on the core that initiated the syscall, something you don't want to occur too often within performance-critical threads.
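
For illustration, here's a minimal spinlock built on an atomic exchange (my own sketch of the busy-wait pattern, not any particular engine's code). A thread stuck waiting for the lock pegs its core at 100% while contributing no useful work, which is exactly what inflates the CPU-usage numbers:

```cpp
#include <atomic>
#include <thread>

std::atomic<bool> locked{false};

void spin_lock() {
    // Busy-wait: no syscall, no context switch, but the core stays pegged
    // at 100% the entire time the lock is contended.
    while (locked.exchange(true, std::memory_order_acquire)) {
        // spin
    }
}

void spin_unlock() {
    locked.store(false, std::memory_order_release);
}

int main() {
    int shared = 0;
    auto worker = [&] {
        for (int i = 0; i < 100000; ++i) {
            spin_lock();   // contended waits here still count as "CPU usage"
            ++shared;      // the actual useful work is tiny
            spin_unlock();
        }
    };
    std::thread a(worker), b(worker);
    a.join(); b.join();
    return shared == 200000 ? 0 : 1;
}
```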
 

atomicWAR

Glorious
Ambassador
^I am aware of how Windows shuffles threads from core to core. I re-checked GTA V with Task Manager, MSI Afterburner, and Process Explorer. While the results had some variance, MSI Afterburner and Task Manager agreed on 4 primary threads and 8 threads at approx. 10% less usage. Process Explorer gave me 6 primary threads and 6 threads running approx. 10-15% less. So the number of threads went up, while usage on the non-primary threads went down by up to another 5% in the worst case. I only tested GTA V since it is getting late and time for sleep has nearly come. I will recheck some of the others tomorrow. For me, the case that only 1-3 threads are used is still not made. I can only go by the numbers my system is producing.
 

InvalidError

Titan
Moderator

There are tens of thousands of games on the market. The dozen or so that are commonly used in CPU reviews are not representative of the market as a whole as they are picked specifically to emphasize CPU performance. If you pick samples from an already biased group, you end up with similarly biased results.
 

Supahos

Expert
Ambassador
Yep, you could pick a set of 10 games to show a 7600K is better than a 7700K, or the opposite of that, or even make an 1800X look like the king, or make it look like an i3 was just as good as all of them. None of the newer CPUs is better at everything. In some games a 6900K beats a 6850K, which beats a 7700K... so there's no "best CPU" without qualifications.
 

KirbysHammer

Reputable
Jun 21, 2016
401
1
4,865


Most late-2016 and 2017 games can utilize more than 4 threads effectively these days. Those games will run fine on lower-core models, but performance will take a hit if single-core performance doesn't make up the difference. Consoles are pushing this forward thanks to the Xbox One's and PS4's 8-core processors.