This YouTuber is claiming an i5 4690K isn't a good match for a GTX 1070. Why is he right or wrong?


Kenneth Barker

Reputable
Aug 17, 2015
Yes, the i5 4690K will be fine for the 1070. The 1070 is essentially a 980 Ti.

The 1080 and the soon-to-be-announced "high end" cards are the ones pushing into bottleneck territory.

The primary reason is that Vulkan and DX12 bring multi-core and multi-thread support to new highs. In the near future the i5s will be a ways behind the multi-threaded CPUs out there because of that.

 
That guy does not know sh*t.
The i5 4690K and the i5 6600K are only about 5-10% apart, which is negligible in games.
Games tend to rely more on the GPU anyway.

HT in games is a pure gamble; games today are optimized for an i5. An i7 is just a waste of money for a pure gaming rig.
Unless he is a druid or a prophet who can see the future... he is stupid.

Vulkan and DX12 are also trying to reduce how much GPUs depend on the CPU.
 

BruhLovin

Commendable
May 1, 2016

Tell me about it. I cringe when I see how people insist the Skylake 6600K is 10x better in every aspect than the 4690K :p
 
The only place it may make a difference is when you're pushing super high refresh rates in older games. Typically, once you're pushing 100+ fps you start to get CPU-bound. Games like CS:GO, where for some reason people demand 300 fps on 60 Hz monitors.
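To put rough numbers on that, here's a minimal sketch of per-frame time budgets (just arithmetic, not a benchmark):

```python
# Rough frame-time budgets: the higher the target fps, the less time
# the CPU has per frame for game logic, draw-call submission, etc.
for target_fps in (60, 100, 144, 300):
    budget_ms = 1000.0 / target_fps
    print(f"{target_fps:>3} fps -> {budget_ms:5.2f} ms per frame")

# 60 fps leaves ~16.7 ms, 300 fps only ~3.3 ms -- which is why games
# like CS:GO become CPU-bound long before the GPU runs out of headroom.
```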
 

Jan_26

Commendable
Jun 30, 2016
This is a simple question that doesn't have a simple answer. I'll try to make it simple and we can go to deeper details if you want.

So... every game does a lot of calculations. Part of that is disk I/O, memory management, caching and pre-fetching textures; part is audio (or positional audio); part is physics; part is the game engine itself (much more in an MMO, due to the more complex tasks it has to handle); then comes AI, and at the end of this list comes the graphics engine.
With every generation of graphics API (be it Glide in the far past, OpenGL, DirectX, Vulkan, etc.), more of the CPU side of the equation can be moved off the CPU and computed on the GPU, but it can hardly be all of it. So, with all those non-graphics parts already hammering your CPU (and some other components), we come to graphics. It eats a lot of resources even though the majority of the work is done on the GPU. Now realize that for the CPU there is no difference between playing at 1024x768 or 3840x2160 - it is the same scene graph that needs to be processed and rendered, the same volume of triangles that needs to be transferred between CPU and GPU... and here it comes: either your GPU can process that volume faster than the CPU can feed it, or it can't. If your GPU is starved for work, either the graphics engine doesn't need more or your CPU has reached the limit of what it can provide in the given time. That is where you get a bottleneck on the GPU or the CPU.
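Here is a toy sketch of that feeding argument, assuming a fixed CPU cost per frame and a GPU cost that scales with pixel count (both numbers are made up, purely for illustration):

```python
# Toy model of the "who is the bottleneck" argument above.
# CPU work (scene graph, draw-call submission) is roughly resolution-independent;
# GPU work scales with the number of pixels shaded. Numbers are hypothetical.

CPU_MS_PER_FRAME = 8.0          # assumed CPU cost to prepare one frame
GPU_MS_PER_MEGAPIXEL = 4.0      # assumed GPU cost per million pixels

def frame_time_ms(width: int, height: int) -> float:
    gpu_ms = (width * height / 1e6) * GPU_MS_PER_MEGAPIXEL
    # The frame is only done when the slower of the two is done.
    return max(CPU_MS_PER_FRAME, gpu_ms)

for w, h in ((1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)):
    ms = frame_time_ms(w, h)
    gpu_ms = (w * h / 1e6) * GPU_MS_PER_MEGAPIXEL
    limiter = "CPU" if CPU_MS_PER_FRAME >= gpu_ms else "GPU"
    print(f"{w}x{h}: {1000/ms:6.1f} fps ({limiter}-bound)")
```

With these made-up costs, the low resolutions come out CPU-bound and the high ones GPU-bound, which is the whole point: raising the resolution loads the GPU, not the CPU.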
In the past, games were strictly single-threaded. Well, maybe you had audio processing running in a separate thread, but nothing worth noting. That was because neither the APIs nor the chips themselves could work with concurrency effectively (or at all). With the newest APIs and GPU chips these limits are gradually being lifted, and then you can get more work done on the CPU side. But it puts more stress on the game development process, since proper multithreading is not trivial to get right (risk of deadlocks, starvation, issues with synchronizing data between threads, etc. - it can cause some decent headaches).
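As a minimal illustration of the synchronization cost mentioned above - a hypothetical simulation thread and render thread sharing one piece of state through a lock, not how any real engine is structured:

```python
# Minimal sketch: without the lock, the renderer could read a half-updated
# position (a data race); with it, the two threads serialize around the
# shared state, which is exactly the overhead and headache being described.
import threading

state_lock = threading.Lock()
player_pos = [0.0, 0.0]

def simulate(steps: int) -> None:
    global player_pos
    for _ in range(steps):
        with state_lock:                 # writer holds the lock while updating
            player_pos = [player_pos[0] + 0.1, player_pos[1] + 0.1]

def render(frames: int) -> None:
    for _ in range(frames):
        with state_lock:                 # reader takes a consistent snapshot
            snapshot = tuple(player_pos)
        # ... build draw calls from `snapshot` here ...

sim = threading.Thread(target=simulate, args=(10_000,))
ren = threading.Thread(target=render, args=(10_000,))
sim.start(); ren.start()
sim.join(); ren.join()
print("final position:", player_pos)
```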
And here we get to the last part of the equation... After all those hardware and API improvements, it comes down to the game developer and how he utilizes the given resource budget. It can be done well, and then the majority of bottlenecks really will lie on the slowest piece of silicon, but it can just as easily be an issue of developers not making optimal choices... or not making optimal choices for your particular setup (imagine a "strong cpu" and "weak gpu" and a game designed to compute as little as possible on the CPU and as much as possible on the GPU... that won't really give you optimal results).
In the end, we get to i5 vs i7. The difference between the two is mostly the presence or absence of Hyper-Threading (in Intel's implementation this means two logical CPUs sharing one physical core). With HT turned off, you are pretty much (not entirely, but close enough) at i7 = i5. With HT turned on, you can obtain some extra computing power, but those additional logical CPUs have to share the physical silicon. That sharing can be done quite effectively (by using different execution paths in the core for different things, so the core isn't competing with itself for its own resources), but there is still some cost, resulting in slightly lower single-core performance. If your game is (almost) single-threaded, your performance can even end up lower than on an i5 or an i7 without HT. But with every CPU generation HT works better than it used to. Further, modern OSes utilize physical cores first, and only when they run out of available computing power do they start scheduling onto the "secondary" logical CPUs... Modern games still mostly use a single thread "with sauce", so i5 vs i7... nothing gained. The very newest games, or games that need significantly more CPU work (MMOs - we can get to the reason later if you want), can already utilize more cores, but it hardly gets above 4. In the future we can expect utilization to gradually increase, but it won't be fast - developers need to design their games so the majority of gamers can actually play them well, and these days the majority of gaming PCs are 4-core machines.
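If you want to see the physical-vs-logical split that HT creates on your own machine, here's a quick sketch, assuming the third-party psutil package is installed:

```python
# Quick look at the physical-core vs logical-CPU split from Hyper-Threading.
# Assumes the third-party `psutil` package is installed (pip install psutil);
# os.cpu_count() on its own only reports logical CPUs.
import os
import psutil

logical = os.cpu_count()                      # logical CPUs the OS schedules on
physical = psutil.cpu_count(logical=False)    # actual physical cores

print(f"logical CPUs  : {logical}")
print(f"physical cores: {physical}")
# An i5-4690K reports 4 logical on 4 physical; an i7-4790K reports 8 logical
# on 4 physical -- the i7's extra 4 are HT siblings sharing silicon.
```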
I can't imagine a common gaming scenario where your 4690K wouldn't be able to feed a GTX 1080, let alone a GTX 1070, so unless you do some CPU-heavy computing on the side, I'd go for the i5 without worries.
 

David_24

Distinguished
Aug 26, 2015


Thanks for the time.
They keep saying on their channel that you can tell if there's a bottleneck by ... such and such, but I don't trust their method.
Can you give me a couple of one-liners I can respond to their videos with that point out how misleading they're being?
https://www.youtube.com/watch?v=nW3dNjGaaBg
 

Mike3k24

Respectable
Apr 21, 2016
The thing is, the guy is right. He is showing you raw evidence that the i5s will hold these high-end GPUs back, but you guys refuse to accept the proof. Now, I'm not saying the i5 would be a huge bottleneck - it'll only limit performance by a little bit.
 
The guy was cherry-picking high-CPU-utilization games, for one thing. Not sure what his point was - that an i7 was faster at The Witcher overall? We knew that before the 1070 even came into play. It doesn't apply to every game, since games are just programs. Saying a CPU is bad for Handbrake doesn't mean it's bad for Photoshop; they're different programs, just like The Witcher 3 is a different program than CoD. When games are more computationally complex they'll require more CPU horsepower; when they're more graphically immersive and the visual quality is set high (max vs. medium graphics), the GPU needs to work harder.

If there's a performance difference between two CPUs - say the i5 4690K and i7 4790K on The Witcher 3 while using a 970 - why would anyone expect the ratio of GPU to CPU utilization to be any different at the same resolution with a 1070? The differences are negligible, since most people game on 60 Hz monitors, and when both CPUs are averaging over 60 fps it's not much of an issue. When talking about 60 Hz vs. 144 Hz monitors, then yes, the difference between 65 fps and 80 fps will matter.

The reason for going with a bigger GPU isn't to push the current fps higher at a lower resolution, it's to push the pixels faster at a higher resolution. Apparently this YouTube reviewer doesn't get that. It's long been known that the 970 was more or less the sweet spot for 1080p; a 980 would do a bit better, but at significantly more money.
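Just to put the "push more pixels" point in raw numbers (simple arithmetic, not a benchmark):

```python
# Pixels per frame by resolution, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px/1e6:4.1f} MPix per frame ({px/base:.2f}x 1080p)")
# 1440p is ~1.78x the pixels of 1080p and 4K is 4x -- which is where a
# 1070/1080 earns its keep, not in pushing 1080p past what a 970 already does.
```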

Enter the 1070 and 1080, both supposed to be faster than a 980 Ti. Now go back to before the 1070/1080: how many sane people recommended a 980 Ti or SLI'd Titans for 1080p? Not many - they were overkill and couldn't really stretch their legs. Move into 1440p and 4K territory, and now the bigger GPUs make sense.

He flat-out showed the significant improvement of the 1070 over the 970 even with an i5, so where's his point about these high-end cards being pointless with 'lowly' i5s? Didn't his results at one point show a 25% increase? This is why it's difficult to rely on YouTube: they don't pre-screen people before they upload videos. Anyone and everyone is free to, and if they talk in circles long enough, eventually some people will buy their story.

If you're playing games that you know are heavily CPU-bound, or one of the rare ones that makes use of HT, then by all means get an i7. If you're using a 144 Hz monitor, definitely. On a 60 Hz monitor it won't make as much difference. It's just like some games a while back, such as Fallout: benchmarks would show how much faster an i7 was than an i5, but both pushed over 60 fps. Which, coincidentally, was important because 60 fps was a built-in limit in the game, and game timing matters on titles like that so certain parts of the game don't glitch. Going over 60 fps is pretty well a moot point on a title like that.

It goes back to the way it's been for quite some time: if in doubt, reduce the visual settings in a game at the resolution/refresh rate you play at. If going from max visuals to min causes your fps to jump up, then the GPU is what's holding you back. If there's no change in fps, then the CPU is holding things back. Not many newer games push 120-144 fps outside of CS:GO and a few others, even with an i7 and a 980 Ti - unless perhaps you turn visuals down to squeeze out a few more fps.
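That settings test can be written down as a tiny rule of thumb, something like this (the 10% threshold is an arbitrary assumption, not a standard):

```python
# The "drop settings and watch fps" test from above as a tiny helper.
def likely_bottleneck(fps_max_settings: float, fps_min_settings: float) -> str:
    gain = (fps_min_settings - fps_max_settings) / fps_max_settings
    return "GPU-bound" if gain > 0.10 else "CPU-bound"

print(likely_bottleneck(55, 95))   # big jump when settings drop -> GPU-bound
print(likely_bottleneck(70, 74))   # barely moves -> CPU-bound
```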

Then it comes down to which type of gamer you are: do you need a lot more than 60 fps on a 1080p 144 Hz monitor? Or are you looking at 60 Hz monitors at 1440p, with overall better visuals and smoother graphical detail thanks to the increased resolution? In those cases an i5 with a 1070 or 1080 makes perfect sense. Hitting over 100 fps with the graphics turned up at 1440p or 4K is more or less a pipe dream outside of maybe one or two rare combinations of game and hardware.
 
Solution