Are 4 threads truly "enough" for a gaming PC these days?

mason-the-deathbat

Commendable
Nov 1, 2016
68
0
1,640
First off, I'd like to start with a caveat that it depends on what you play. If what you're into are games like Call of Duty, Counter-Strike, Rocket League and the like, then yes, 4 threads are more than enough.

Over the past year I've seen an uptick in CPU usage in modern titles: Gears of War 4 showing a higher framerate with an i7 + GTX 1080 than with an i5; Battlefield 1 using all the CPU it can get (although it does run fine on i5s and i3s); Deus Ex: Mankind Divided not working well on quad cores unless they're heavily overclocked, and even then it'll push you to 100% usage with a fast enough GPU (or low enough settings); Watch Dogs 2 being heavily threaded and running terribly unless you have an i7 (although I think that's an optimization issue); and more games taking the CPU-usage crown Crysis 3 held in 2013. You can overclock your i5 to "mask" the need for more threads for now (except in WD2, where OC does next to nothing), but what happens when you can no longer do that? Are 4-core, 4-thread CPUs truly enough if you want to play the highest-end titles?
 
Solution


Yes. For the last 5 years it's been the same. Every year there are a handful of titles that actually take advantage of more than 4 cores, but even with an Intel i5 CPU running a game that performs better on an i7, the i5 still performs perfectly well. It's still hard to recommend an i7 on most builds unless the hyperthreading will absolutely be used by graphics programs, or there is extra money to spend.

*edit* To use your example, Deus Ex: Mankind Divided, the same holds true. It really depends on the GPU in most cases.

That chart is from Techspot, btw. I should really credit the site.

[Image: CPU_01.png - Techspot CPU benchmark chart]


Yes, it's a high-end GPU, but even the aged Sandy Bridge i5 does well.
 

mason-the-deathbat

I think the fact that, say, a 3570K bottlenecks anything past a GTX 1060 in Crysis 3, and in Deus Ex can drop you below 60 at times during actual gameplay rather than the built-in benchmark (my own i5 is proof of this) unless you heavily overclock, shows the strength of having more threads, since the non-K 3770 will run perfectly fine at stock. I'd recommend a Ryzen 5 over an i7 on a budget; the i7 is faster, but the Ryzen 5 will run just fine.
 

mason-the-deathbat



Crysis 3 bench here, lol. And yeah, it honestly depends on where you're at in the game. The first mission in Dubai, the breach in the NSN, etc. are all GPU bound in my experience, and even I get about 80+ FPS on High. But as soon as Prague hits, I'm down in the mid-40s, GPU usage drops to about 85%, etc.
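As a rough rule of thumb (my own heuristic, not any official tool, and the 95% cutoff is just an assumption): when GPU usage falls well below full load at the same time the framerate sags, the CPU is usually the limiter. A minimal sketch:

```python
def likely_bottleneck(gpu_util_pct, fps, fps_target=60):
    """Rough heuristic: a GPU sitting well under full load while fps
    sags usually means the CPU (or an engine thread) is the limiter."""
    if gpu_util_pct >= 95:
        return "gpu"      # GPU pegged: you're GPU-bound
    if fps < fps_target:
        return "cpu"      # GPU idling while fps is low: CPU-bound
    return "neither"      # target fps met; no meaningful bottleneck

# The Prague example above: ~85% GPU usage, mid-40s fps
print(likely_bottleneck(85, 45))   # -> cpu
```

It's crude (background tasks, VRAM limits and frame caps can all confuse it), but it matches what I see in Prague: GPU usage sliding to ~85% while fps drops is the classic CPU-bound signature.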
 


I honestly don't watch YouTube reviews. I can assimilate the information faster by reading than by watching someone, so that's what I prefer.

Back to your original question, though: yes, 4 cores are *enough* for gaming right now. You can get better performance in some titles (and apparently in other titles on various maps) with a CPU that has more threads. However, with a good enough GPU, 4 cores are plenty. This holds true for 80% of gamers' builds. No, not 4K resolutions or multi-GPU setups, but for the average PC gamer up to medium-high-range builds, 4 cores do just fine.
 

mason-the-deathbat



Yeah, the GTX 1070 was at about 70% usage the entire time in that benchmark. And yeah, a lot of people would be fine with an i5 and an overclock. But how much longer will that be viable if more games pull a Watch Dogs 2 and just utterly refuse to let i5s hit a stable 60? lol
 


No, non-HT four-core CPUs are on the way out for gaming and are already way behind for general tasks like photo editing and video production. Just like nobody buys a dual core for gaming today, no one should be buying a non-HT quad core today and expecting it to last very long. More and more games are starting to leave i5-class CPUs behind. There are a handful of old games still used for GPU testing that show better results with a high-clocked i5, but those old game engines typically respond to IPC and clock speed much more than thread count. With game developers already putting out games that show solid performance gains beyond 4 cores, why would anyone looking at a new system even waste time on a non-HT quad CPU? Just buy an i7 or Ryzen and never look back.
 

mason-the-deathbat



Yeah, although I think even COD isn't that heavy on your CPU at that framerate. I can run MW1 at about 250-300 FPS in the campaign, and that's where my CPU starts to hit near 100% load. But that's only after going way above 200 FPS.
 

mason-the-deathbat



Absolutely.
 


Just updated my post with links that might help
 

mason-the-deathbat



Yeah, those games definitely benefit from the extra threads. And GTA V stutters on i5s after 160 FPS while i7s are fine.
 


"after 160 FPS" Who cares? Who notices? Just because you can benchmark a difference, doesn't mean anyone will notice a difference.
 


I'd like to toss in a couple of more appropriate caveats that answer the question:

1) Is the ONLY thing you're doing PLAYING the game?
2) Are your 'expectations' for playing the game the SAME as how it's sold to you in advertising?

<1> If you're not recording / broadcasting / streaming to Twitch / streaming music / running multiple Discord / TeamSpeak channels, etc. etc. etc., then yeah, for the most part an i5 does the job.

<2> If your expectation is that with 8-16GB of RAM, a normal 1TB 7200RPM drive and at least a 1050 Ti, any game plays decently near or around the magic 60FPS number on a single normal 1080p 24" or so display, all things being equal (adequate Internet, no viruses, cleaned-out PC, etc.), then yes.

BUTTTTT... as the song goes, "Everyone wants to be a rockstar!" Which means they want triple 40" 4K displays doing 100FPS MINIMUM while streaming to YouTube, Twitch, and any other venue, WHILE hosting Discord, TeamSpeak, Overwolf, etc., WHILE streaming music from the Internet, with emails, Twitter feeds, etc. ALL live with no 'delays', all at the same time, just like they 'see' on YouTube.

That's the problem: the large majority of posters pose the same question with the same 'expectations', then run into reality. So it all depends on what 'management of expectations' is needed to resolve it and customize the solution to the budget for a reasonable actual result.
 
There are so many factors that it's difficult to say. It's even more important that people buy the hardware they specifically need rather than relying on an overly vague and often flawed one-size-fits-all approach.

What game is being played? What resolution? What refresh rate? All those things factor in. "Well, if you use an i5 with anything over a 1060 it's going to bottleneck." Which game specifically? And by "bottleneck", are we talking fps dips into the 40s, or framerate drops from 80fps to 65fps? It will matter on a 144Hz screen, but what about a 60Hz panel? Suddenly the 'bottleneck' is the monitor, not the CPU.

The same applies when people suggest a GTX 1080 is being held back. Held back at what resolution: 4K? 1440p? 1080p? If someone's using a 1080 on a 1080p display, it's probably going to be held back regardless of which CPU. Some could argue it's the CPU not pushing enough frames, but it's likely just as much a case of an overkill GPU on too small a screen resolution.

If a particular CPU isn't working for the games you play then sure, upgrade. However, you can't assume that everyone who 'games' is playing your games. 'Gamer', 'gaming PC' etc. are too loose to be meaningful. Solitaire isn't Witcher 3 isn't CS:GO isn't GTA V. It's also difficult to discern much from some benchmarks, like the one posted by the OP regarding Crysis 3.

Yes, it showed how HT or extra threads can be used, but it was also an odd comparison. Why all three CPUs at 4GHz? That's not stock speed, nor is it a 'normal' OC speed. The R7 1700 was OC'd to 4GHz (stock turbo is only up to 3.7GHz), the i5 was OC'd to 4GHz (from a max turbo of 3.8GHz), and the i7 6700K was gimped at 4GHz since its max turbo is 4.2GHz. Are we testing stock, are we testing OC, or are we testing "kind of OC this and not that"? Same with the generations. Are i5s still as capable? Let's have a look by testing an i5 from 2012 against an i7 from 2015 and an AMD CPU from 2017. That sounds legit. Why not test with an i5 7600K or i7 7700K if we're comparing against the latest gen?

Basically, all this said was that even with all bets off and a 'test' done with the i5 in the worst light (the oldest, slowest, and with the fewest threads), a 5-year-old i5 doesn't do half bad. The point of benchmarks is to remove variables, not to skew the conditions and monkey around with variables to get different outcomes. This is why people often don't take YouTube vids seriously.

There are different ways to test, but it's important to mention what's being tested and why the test is done the way it is. If OC is entering the mix, put a real OC on the various CPUs: put the i7 up to around 4.7GHz, or the i5 up to around 4.4 or 4.8GHz depending on which i5 is picked and how many gens back the tester goes. At least try to compare apples to apples, a same-gen i5/i7 with either both on DDR4 or both on DDR3. RAM can have an impact; the motherboard can have an impact. Other benchmark sites that used a newer i5 like the 7600K found that in Crysis 3, paired with a GTX Titan (not a low-end GPU), it was capable of averaging nearly 100fps, far more than needed for a 60Hz panel and a far cry from the 55-60fps seen from the 3570K.

http://www.eurogamer.net/articles/digitalfoundry-2017-amd-ryzen-7-1800x-review
 

mason-the-deathbat



Oh I'm SURE people won't notice frametimes going from like 4ms to like 100ms constantly.
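For anyone following along, frametime is just the inverse of the instantaneous framerate (frametime_ms = 1000 / fps), so the swing described here is roughly 250fps collapsing to 10fps for a frame. A quick sketch (the specific fps values are only illustrative):

```python
def frametime_ms(fps):
    """Convert an instantaneous framerate (fps) to frametime in ms."""
    return 1000.0 / fps

print(frametime_ms(250))  # -> 4.0   (a smooth 250fps frame)
print(frametime_ms(10))   # -> 100.0 (a single 100ms hitch)
```

That's why a hitch shows up as a stutter even when the *average* fps counter still looks fine: one 100ms frame among 4ms frames barely moves the average but is very visible.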
 

mason-the-deathbat



To be fair, it ain't like IPC has improved all that much since Ivy Bridge: about 15%. Also have to factor in that Kaby is basically a refresh of Skylake with higher clocks lol.
 


Overall IPC hasn't improved a ton, no. However, there are very real fps differences between 4th and 7th gen, much less 3rd and 7th gen: a combination of things, from IPC to faster clocks, DDR4, etc.

My comments were regarding the testing method, which is all over the place. Let enough variables change and it's no longer apples to apples; it's apples to starfruit. Ryzen is obviously shown as recent because it is; it's brand new. Why not use recent i5s and i7s? In many games newer i5s have an advantage over older i5s; that's how tech works as it improves.

How does it shake out? In Watch Dogs 2 at 1080p, the 3570K averaged 64.5fps and the 7600K (stock) averaged 84.3fps. That's a 30% jump in average fps in that game. For minimums, the 1% lows improved 29% on the 7600K, and the extreme dips (.1% lows) improved over 39%.

In BF1, the averages gave the 7600K a 13.2% lead, the 1% minimums a 25% increase (20fps), and the .1% lows over 25% more performance.

In Total War: Warhammer, the 7600K had a 66% lead in average fps, a 75% lead in 1% minimums, and a 28.5% advantage in .1% lows.

This isn't anything to do with amd vs intel, it's strictly modern vs aging intel.
http://www.gamersnexus.net/hwreviews/2792-intel-i5-7600k-review-ft-2500k-3570k-more/page-3

Whether "IPC hasn't improved a lot" holds really depends on the game. Gains are bigger in some games, not so big in others, but many of those gains are well over a measly 15%. In just those three games, the average fps increase was 36.4%, the 1% lows improved 43%, and the extreme dips at .1% improved 30.8%.
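The per-game figures quoted above do average out as stated; here's a quick sanity check on the arithmetic (numbers copied from the GamersNexus results cited, rounded as in the text):

```python
def pct_gain(old, new):
    """Percent improvement of `new` over `old`."""
    return (new - old) / old * 100

# 7600K vs 3570K average fps in Watch Dogs 2: 64.5 -> 84.3
print(round(pct_gain(64.5, 84.3), 1))  # -> 30.7, i.e. the ~30% jump quoted

# Averaging the three per-game gains (WD2, BF1, TW: Warhammer)
avg_fps_gains = [30, 13.2, 66]    # average-fps improvements, %
one_pct_gains = [29, 25, 75]      # 1% low improvements, %
pt1_pct_gains = [39, 25, 28.5]    # .1% low improvements, %
for gains in (avg_fps_gains, one_pct_gains, pt1_pct_gains):
    print(round(sum(gains) / len(gains), 1))
# -> 36.4, 43.0, 30.8
```

So the "36.4% / 43% / 30.8%" summary is just the mean of the three per-game deltas, not a weighted benchmark result.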

That's why it's important to compare the most recent Intel if pitting it against the most recent AMD. Would it be fair to compare a 7600K against an FX 6300? Not all of that advantage is due to IPC, but IPC isn't the only factor when it comes to performance.

The same goes for any benchmark or comparison: for it to be meaningful, the playing field needs to be kept as level as possible. It's probable that the reviewer didn't have access to the latest i5s or i7s; that's fine, not everyone does. But then it's difficult to perform a legit review. Same as it would be hard for me to run a cooking comparison and substitute bananas for flour because, well, I didn't have flour on hand.
 
Solution


I would like to think this is why Intel is releasing a mainstream 6-core CPU with Coffee Lake later this year.