i5 4690+GTX 1080ti+4k Gsync bottleneck?

David Taquet

Hi,

I am giving my PC a refresh, taking parts from a different one.

I am planning on adding a GTX 1080 Ti to my existing Core i5 4690 (16GB RAM) and hope to play at 4K at 60Hz (with some settings on medium) on a G-Sync monitor (probably the Acer XB281HK, since it is the only one in my country under $700).

I have the option of taking a Core i7 7700 + mobo (8GB RAM) and putting it in my old PC case, but before doing that I wanted to know whether I could expect significant bottlenecking with the 4690.

Also, does 8GB vs. 16GB of RAM have an effect on 4K gaming?

Thanks.

 
Your existing 4690 likely wouldn't be an issue for 60Hz. Increasing the resolution doesn't typically increase the load on the CPU, mainly just the graphics card. Aside from hyperthreading for heavily-threaded workloads, a 7700 should only be a little faster than a 4690, so I would hold off on upgrading the CPU, mobo and RAM until you start to see them affecting performance, at which point there will likely be better CPU options available. The 16GB could help avoid performance issues in some newer and upcoming games, so it might be best to stick with that for a relatively high-end system.
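To put that in rough terms, here's a quick sketch of the usual way to think about a bottleneck, with made-up numbers rather than benchmarks: the frame rate you actually get is roughly capped by whichever of the CPU or GPU is slower, and raising the resolution only pulls the GPU side down.

```python
# Rough mental model of a CPU/GPU bottleneck (all numbers below are hypothetical).
def delivered_fps(cpu_limited_fps, gpu_limited_fps):
    """The delivered frame rate is capped by the slower of the two components."""
    return min(cpu_limited_fps, gpu_limited_fps)

cpu_fps = 90          # what an i5-4690-class CPU might sustain in a demanding game (assumed)
gpu_fps_1080p = 140   # GPU-limited rate for a 1080 Ti at 1080p (assumed)
gpu_fps_4k = 55       # GPU-limited rate for the same card at 4K (assumed)

print(delivered_fps(cpu_fps, gpu_fps_1080p))  # 90 -> the CPU is the limit at 1080p
print(delivered_fps(cpu_fps, gpu_fps_4k))     # 55 -> the GPU is the limit at 4K
```

With numbers like these, swapping in a faster CPU would raise the first figure but not the second, which is why the CPU matters less as the resolution goes up.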
 

David Taquet

Thanks.

After I posted this thread, I actually found this video:

https://www.youtube.com/watch?v=daiF6lYguN4

It compares performance on two CPUs (i7 4790k and i5 750) with the same GPU, and it seems the i5 is slightly faster in most games at 4K. Not sure what to make of it...



 
4K 'gaming' monitors are a scam; just get a regular high-quality IPS 4K monitor from the likes of LG or Dell. These 'gaming' monitors often have TN panels with G-Sync just slapped on them.
I'd recommend upgrading to an i7 4790/4770 (K or non-K depending on whether you have a Z-series board) with the price difference instead.
Total budget for this upgrade, including the GPU?
 

David Taquet

Oh, hi! We talked yesterday on the displays forum about 4K monitors. Still not sure about the display (I'll go to the big store next month). Just enjoying the anticipation, and asking questions in case I decide to go 4K.

As for my budget, the PC is already ordered and will arrive in a few days. So the big question is: is it worth transferring the i7 7700 CPU and mobo into my old case, or will the current i5 4690 be nearly as good at UHD?



 


Considering his testing method, that's probably within the margin of error, so I wouldn't say the overclocked i5-750 system was necessarily "faster" at 4K than the 4790k at stock clocks; it just performed very similarly. Any differences that did exist might have come down to RAM timings, motherboard performance or some other factor. The takeaway from that video is that at 4K, even a 1080 Ti can't manage high enough frame rates in most modern AAA games to run into much of a CPU bottleneck on either of those systems. Either CPU should be fast enough to finish its per-frame work in any of those games at least 60 times per second. It's when you start dealing with screens that can display higher frame rates that those differences become more notable.
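To put the "60 times per second" point in concrete terms, here's the frame-time arithmetic (just math, not measurements):

```python
# Per-frame time budget at different refresh rates. The CPU has to finish its
# per-frame work (game logic, physics, draw calls) inside this window to keep
# the display fed with new frames.
for refresh_hz in (60, 144):
    budget_ms = 1000 / refresh_hz
    print(f"{refresh_hz} Hz -> {budget_ms:.1f} ms per frame")
# Output:
# 60 Hz -> 16.7 ms per frame
# 144 Hz -> 6.9 ms per frame
```

At 60Hz both CPUs have a comfortable 16.7ms per frame; it's the much tighter 6.9ms budget of a 144Hz panel where extra per-core speed and threads start to show.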

For your system, the same is likely to hold true. Whether you're using the i5-4690 or the i7-7700 probably won't make much of a performance impact in the vast majority of current games. As long as they can handle at least 60fps, it shouldn't matter much if one is technically a bit faster than the other. I would hardly say the i5 will struggle with a 1080 Ti when it only needs to push 60fps. Now, if you were looking at a monitor with a more moderate resolution and a higher refresh rate, like a 1440p 144Hz screen, then the 7700 would likely manage higher frame rates. Its per-core performance should only be around 15% faster though, so the differences probably wouldn't be huge even then.

The i7 does have hyper-threading though, which could help with performance in games that fully utilize more than 4 threads. The vast majority of games do not, and you can probably count those that do on one hand. That could change in the future though, since both AMD and Intel will soon be offering "mainstream" CPUs with at least 6 cores, so developers might be more likely to make greater use of them. In that case, combined with its slightly higher per-core performance, the i7 might have more of an advantage in future games.

Now, it sounds like you already own both systems, and if that's the case, it would make sense to pair the i7-7700 with the 1080 Ti. It might not make much of a difference in terms of performance now, but in a year or two it could in newer, more demanding games. You did mention the i7 system having less RAM though, so you would probably want to upgrade that to 16GB at some point. Much like the advantages of having more than four threads, having more than 8GB of RAM probably won't make a big difference in most existing games, though some are starting to hit that limit. Unlike CPUs though, upgrading RAM is usually relatively cheap, particularly if the motherboard has some empty RAM slots available.

Back on the topic of the monitor, if you're not entirely set on a 4K screen, it might also be worth looking into 1440p 144Hz screens. Since they require the card to render far fewer pixels per frame than 4K, the 1080 Ti should be able to run most games at max settings at around 100+ fps (and in that case, the i7-7700 might make even more sense). The 1080 Ti will also be able to maintain 60+ fps at max settings for longer in future games, whereas at 4K you'll probably need to turn down the graphics options in many games released a year or two from now to stay around 60fps.
 

David Taquet

Thanks for the well-written and easy-to-understand answer.

Concerning the screen, I don't have any experience with either resolution or refresh rate in gaming, as I currently own a 1200p 60Hz monitor. However, I saw a 4K YouTube video of Andromeda on a 4K TV and was blown away.

I plan on going to a large PC store, a 4-5 hour drive from my house, later next month. Then I will be able to see for myself. I guess I am just enjoying the anticipation.

I am just worried the difference from 1200p to 1440p would not be very noticeable at 27-28 inches. Also, since I have been quite satisfied with 60fps, I don't know if I would feel the extra Hz on a 1440p 144Hz screen.

May I ask what monitor you use?


 

David Taquet

But isn't that video at 1080p, where the CPU does make a difference?

As you get into higher resolutions, the GPU becomes more important than the CPU.


 

The video appears to support my point exactly. Both of those CPUs were able to push well in excess of 60fps in every game tested there, with the only exception being Grand Theft Auto 5, which is notoriously CPU-bound. These games were also running at 1080p, for the very reason that it prevents the graphics card from becoming the bottleneck and limiting maximum performance, to better show the potential differences between the CPUs for their review. At 4K, the graphics card would have become the bottleneck rather than the CPU, and frame rates would have been nearly identical between those processors in the games they tested.

Increasing the resolution is not going to increase the load on the processor, but it will greatly increase the load on the graphics card. The video linked earlier was perhaps a better example of this, since it compared different resolutions and showed what tends to happen when the graphics card's performance becomes the limiting factor. At 4K, the system with the slower, non-hyperthreaded i5-750 performed quite similarly to the faster, hyperthreaded i7-4790k. Even with that older i5 overclocked, the i7 was still technically capable of offering more performance, but the frame rates were effectively capped by the graphics card, which has to render four times as many pixels as at 1080p.

That's why you don't typically see 4K resolution used in CPU reviews: there would be little difference between most modern processors at that resolution with the games and graphics cards available today. Here's an example of the kind of CPU performance graphs you get at 4K...

https://www.kitguru.net/components/cpu/luke-hill/amd-ryzen-7-1800x-cpu-review/11/

And again, I agree that the slightly faster i7 with more threads can potentially make a difference at 4K; it's just that in the majority of games available today, it won't. There are a handful of outliers that could perform better on the i7, and maybe the added threads could reduce the occasional frame spike, but it's not likely to make a big difference, and the i5 will hardly "struggle" to run today's games at 4K, at least no more than the 1080 Ti might "struggle" to maintain 60fps at high settings in some games at that resolution. Perhaps by next year there will be more games that show notably improved performance on the i7 at 60Hz, but currently developers tend to design games to run well on 4-thread CPUs, since that's what most people have. If it were a matter of buying the i7-7700 to replace the i5-4690, I would say it might not be worth the added cost, but since you already have both systems, using the i7 would be a reasonable choice, particularly if you can easily add another 8GB of memory.


If that's a 1920x1200 screen, a 2560x1440 screen would still offer 60% more total pixels, since it's a slightly wider aspect ratio, so there's a greater increase in horizontal resolution than vertical. It might be worth seeing the screens in person to get a better idea though. Ultra-wide 21:9 screens are also an option now, and you can get those in up to 3440x1440, which is more than double the pixels of a 1920x1200 screen. That's still quite a bit less than 4K at 3840x2160, though the lower resolution should allow for better performance at max settings compared to 4K. Not all games natively support 21:9 though, so you might end up with black bars on the sides in some (particularly older) titles. Different people prefer different things in terms of resolution, refresh rate and graphics settings though, so opinions on what matters most can vary.
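For reference, the pixel counts behind those comparisons are simple arithmetic:

```python
# Total pixels per frame for the resolutions discussed above,
# compared against a 1920x1200 baseline.
resolutions = {
    "1920x1200 (16:10)": 1920 * 1200,   # 2,304,000
    "2560x1440 (16:9)":  2560 * 1440,   # 3,686,400
    "3440x1440 (21:9)":  3440 * 1440,   # 4,953,600
    "3840x2160 (4K)":    3840 * 2160,   # 8,294,400
}

base = resolutions["1920x1200 (16:10)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the baseline)")
```

That works out to 1.60x for 2560x1440 (the "60% more"), 2.15x for 3440x1440 (the "more than double") and 3.60x for 4K, which is roughly how much more work the graphics card has to do per frame at each step.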

As for the system I currently use for gaming, it doesn't have the power to push 1440p at decent framerates, so I am holding off on getting a 1440p or higher monitor, at least until I build a new system. I am overdue for a new build, but probably won't put one together until this fall or winter. As for the kind of monitor I'm considering, I tend to be a bit picky about those things, and not many fit what I'm looking for. Ideally, it would be a 1440p screen with a higher refresh rate and adaptive sync, and either an IPS or VA panel. My current monitor does 75Hz, and while I more often game at 60Hz, I like having the option for higher refresh rates in fast-paced games. I find the 144Hz quantum dot VA panels from Samsung to be interesting, due in part to their high static contrast ratios and good color reproduction, but they only come in 1080p for now. And while they perform better than other VA panels, minimizing many of the issues often inherent with that panel type, there are still some things that could be improved with them. I'm kind of hoping it won't be long before Samsung comes out with their second-generation of 144Hz quantum dot panels, perhaps in higher resolutions, but there's no telling whether that might happen anytime soon.
 
Solution