How much of a difference does clock speed make?

simonz93

Distinguished
For example, between an i7-6700 @ 3.4 GHz and an i7-6700K @ 4.4 GHz?

I have the former, but I'm getting very bad FPS (20-30) @ 1080p in some CPU-heavy games (Total War), while other PCs with the same specs (980 Ti and 16GB RAM) but higher-clocked CPUs are getting 40 FPS at 4K.

Is FPS really supposed to increase that much from 1.0 GHz more? I suspect there's something wrong with my CPU or RAM, so I'm trying to narrow down the problem.

Thanks.
 
ingtar33

4.4 / 3.4 = 1.294, i.e. 29.4% faster (assuming perfect scaling and a bottleneck caused 100% by the CPU). So if you're getting 30 FPS @ 3.4 GHz, you'd probably get ~38 FPS @ 4.4 GHz.

Understand that 4K, 1080p, and even 720p at the same graphics settings all load the CPU about the same; the only variation is whether the GPU is the bottleneck. So if people are getting 40 FPS at 4K with your setup, they've probably overclocked that CPU to a moderate 4.4-4.5 GHz to manage it.
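For what it's worth, here's that back-of-the-envelope estimate spelled out as a tiny Python sketch. It's purely illustrative: it assumes a 100% CPU-bound game with perfect clock-for-clock scaling, which no real title achieves.

[code]
# Best-case FPS estimate from a clock bump alone (illustrative assumption:
# fully CPU-bound game, perfectly linear scaling with core clock).

def ideal_fps(fps_now, clock_now_ghz, clock_new_ghz):
    """Upper bound on FPS if performance scaled linearly with clock speed."""
    return fps_now * (clock_new_ghz / clock_now_ghz)

print(f"{ideal_fps(30, 3.4, 4.4):.1f}")  # 38.8 -- the ~38 FPS ceiling quoted above
[/code]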
 

DonQuixoteMC

Distinguished
From the Intel ARK:
i7-6700: "Max Turbo Frequency.....4 GHz"

We're looking at a difference of 400-500 MHz between the overclocked CPU and the stock boost clock of your CPU. In other words, a clock-speed difference that small is not even close to explaining someone getting nearly twice your frame rate. Overclocking has never been a game changer performance-wise. While it does help, there has to be something else at play here.
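To put rough numbers on that (my own arithmetic; the 4.4 GHz figure is an assumption about what the overclocked systems are running):

[code]
# Relative clock advantage of an assumed ~4.4 GHz overclock over the
# i7-6700's stock 4.0 GHz single-core turbo.
oc_ghz = 4.4
turbo_ghz = 4.0
print(f"clock advantage: {oc_ghz / turbo_ghz - 1:.0%}")  # 10% -- nowhere near a ~2x FPS gap
[/code]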

How clean (software and hardware[dusty] :p) do you keep your system? Any temperature problems?
 

simonz93

Distinguished


GPU temp under load is around 75°C, CPU in the high 60s. I have five 140mm Corsair fans running at 80% speed.

Max core clock at 1034 MHz, max memory clock at 3506 MHz.

Sorry, I don't know which section of MSI Afterburner shows the GPU and CPU clock speeds.
 
I see nothing odd there at all. CPU utilization is good, CPU temp is good, GPU temp and utilization are good, GPU clocks are good, RAM usage is OK (not sure how much you have). The page file usage in the first image looks wrong, though; it's way too high.

What are your power settings in Windows? Make sure it's set to maximum performance. The symptoms don't point to it specifically, but it could be a factor.
 

simonz93

Distinguished


OK, thanks, that's good to hear. But then I have no idea why my PC isn't performing well :/
What is this page file usage, though? Is it causing a performance problem because it's too high?
Yeah, the power setting is at max performance.
 


ingtar33

I will have to strongly disagree. CPUs aren't a zero-sum game.

First of all, Intel's Turbo Boost does not work the way you're suggesting: it's not an overclock, and the "max" turbo only applies when one core is taking all the load; it's a one-core turbo.

As to what I mean about a zero-sum game: there are a number of things a game loads onto the CPU, but ONLY so many. Call that number X. So X items need to be handled by the CPU. If the CPU is able to handle Y items and Y < X, the game will be CPU-bound; if Y > X, the game will NOT be CPU-bound. Think of it as a wall to jump over. Some games are so badly coded that they'll always ask for more no matter how fast the CPU is; those are the CPU-bound games people use as benchmarking titles for new CPUs, and in those titles you are 100% correct. Overclocking won't help you much there, though if you think about it, the difference between a choppy 48 FPS and a smooth 60 FPS is only 25%, which is the same as clocking a 4 GHz chip up to 5 GHz. (Because these games are used for benching CPUs, you can be forgiven for thinking overclocking isn't much of a solution, as overclocks there usually translate directly into FPS gains.)

For games that aren't truly CPU-bound, there is a point past which more CPU power doesn't matter anymore. MOST games are like this. This is why a 4.7 GHz Piledriver can produce 60 FPS in pretty much 95% of games out there. There is a point where more CPU power will NOT translate into more FPS; however, if you're under that point, the game may play quite poorly (usually these titles have some pretty critical, game-crippling issues if the CPU isn't good enough).

Since draw calls and other CPU work don't map one-to-one onto displayed FPS, we can't simply say, "Well, you're getting 50% less FPS than your friends with the same system, so it can't be resolved with overclocking." It may be that the 20% overclocks on those friends' PCs are enough to overcome the number of draw calls the game throws at the CPU, which results in the game performing DRASTICALLY better than a 20% overclock would suggest. Those same friends might hold an overclocking competition, further increase their clock speeds, and see ZERO FPS improvement, because their game was GPU-bound all along (the CPU was already clocked past the requirement).
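Here's a toy model of that wall, with made-up numbers purely to illustrate the idea (these are not measurements of any real game):

[code]
# FPS is capped by whichever side can't keep up: the GPU's render rate, or
# how many frames' worth of CPU work (draw calls etc.) the CPU can prepare.

def effective_fps(gpu_fps, cpu_work_per_frame, cpu_capacity):
    """cpu_capacity = work units the CPU handles per second (scales with clock)."""
    cpu_fps_cap = cpu_capacity / cpu_work_per_frame
    return min(gpu_fps, cpu_fps_cap)

# Under the wall, a 20% clock bump buys a full 20% more FPS...
print(effective_fps(90, 2.0, 100))  # 50.0 (CPU-bound)
print(effective_fps(90, 2.0, 120))  # 60.0 (still CPU-bound)
# ...but once over it, further overclocking buys nothing:
print(effective_fps(90, 2.0, 200))  # 90   (GPU-bound)
print(effective_fps(90, 2.0, 300))  # 90   (GPU-bound; extra clock wasted)
[/code]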


NOW ON TO THE SPECIFIC ISSUE
He needs an overclocked CPU. I know this is the problem because Total War titles are UNIVERSALLY CPU-bound. Total War: Rome can actually tax an 8-core Haswell-E. Those games are rarely taxing on GPUs; they almost always beat the junk out of CPUs, and they tend to be single-core-heavy titles as well (meaning they play like junk on AMD CPUs).
 

DonQuixoteMC

Distinguished


[strike]Thanks for the clarification. I honestly never knew there was a difference between Intel Turbo and an overclock. I always assumed it was a power-saving feature that scaled the core clock (on all cores) up to the limit as the load demanded and heat allowed.

In that case, overclocking across all cores might just give you the extra performance you need to put you over the edge.[/strike]
 

DonQuixoteMC

Distinguished
@ingtar33:

I'm not convinced by your conclusion.

First of all, this
4.4 / 3.4 = 1.294, i.e. 29.4% faster (assuming perfect scaling and a bottleneck caused 100% by the CPU). So if you're getting 30 FPS @ 3.4 GHz, you'd probably get ~38 FPS @ 4.4 GHz.
seems like a useless abstraction. You recognized this yourself when you threw in the caveat "assuming perfect scaling...", so I'm not sure this mathematical comparison can stand as a useful metric.

Second,
it's not an overclock, and the "max" turbo only applies when one core is taking all the load; it's a one-core turbo.
An Intel Turbo Boost ceiling of 4.0 GHz, while not equivalent to a 4.0 GHz overclock, is closer than you seem to suggest. simonz93's PC does not have heating problems, and he has just done a clean install. He's provided the ideal environment for Turbo Boost technology to perform as close to a manual overclock as possible.

Third,
The assumption that an overclocked CPU would perform better in Total War doesn't seem to be supported by any benchmark.
Anandtech Total War CPU Bench
PClab Total War CPU Bench
Units are FPS.

Perhaps I'm misunderstanding some critical element, but I honestly don't see a CPU upgrade/overclock as a solution for simonz93's FPS woes. I think Total War is just a mess of a game that no one gets >40 FPS in.


@simonz93:
What FPS do you get in similarly CPU bound titles? Could you give us a broader picture of your system performance with fps from other games?
Also, could you refer us to the guy who gets 40FPS at 4K? Where are you getting this data? I'm not doubting it, I'm just curious if further investigation could reveal an underlying solution.
 

simonz93

Distinguished


I read that on the Steam Total War forum; sorry, I forget which thread it was exactly. But when performance comes up, people generally report better performance than mine with slightly worse or identical builds, and the same performance with worse builds.

The thing is that my monitor only has a 60 Hz refresh rate, so although I can get 60 FPS in quite a few games, I can't tell what the maximum is, or whether others can get significantly better FPS, like 120+, where I only get 70s. And since I only game @ 1080p, I have no idea what the FPS will be like @ 4K; a 980 Ti is supposed to manage 30-ish FPS at 4K in most AAA games, but I don't think mine can do that.


Here are the games where I cannot get 60 FPS (all at ultra settings):
- GTA 5 with ReShade: avg 40-50 (but even vanilla I can't get a stable 60 FPS)
- Assassin's Creed Syndicate: avg 50-ish
- Fallout 4 with graphics and visual mods as well as ENB: avg 40-50
- Hitman 2016: usually 60 FPS, but can dip as low as 20-30 FPS
- All Total War games in big battles: the worst, Attila, can dip to single digits; the most recent, Warhammer, can dip to the high 20s.

Games where I can mostly play at 60 FPS, but not stably, with dips:
- Rise of the Tomb Raider
- Just Cause 3

All the other games stay pretty much locked at 60 FPS @ 1080p, some with very minor and infrequent dips.

Appreciate your comments :)
 

DonQuixoteMC

Distinguished
Alright, sorry for the delay. It's been a busy weekend.

I just have a few questions, are you talking about GTA V Online?
You shouldn't be getting 40-50 FPS on Ultra in this game at 1080p. Anandtech got an average of 55 FPS and a minimum of 38.5 with what sounds like similar settings at 1440p. (I can't seem to use BBCode to embed in-text images, so here are the links to the GTA V benchmarks I looked at, with a quote explaining their graphics settings.) You might also want to look at the full review from each, to further compare results.

Anandtech GTA V - 1440p, 4xMSAA, Very High
Full Review
On a quick note about settings, as Grand Theft Auto V doesn't have pre-defined settings tiers, I want to quickly note what settings we're using. For "Very High" quality we have all of the primary graphics settings turned up to their highest setting, with the exception of grass, which is at its own very high setting. Meanwhile 4x MSAA is enabled for direct views and reflections. This setting also involves turning on some of the advanced rendering features - the game's long shadows, high resolution shadows, and high definition flight streaming - but not increasing the view distance any further.

Guru3D also had a benchmark which suggested you should be getting more FPS from your GPU, though admittedly they used slightly lower settings.
Now, I'm fairly certain both websites used the built-in GTA V benchmark, so if you could run that benchmark yourself, mimicking the previous reviews' settings, and post the results, it'd be interesting to see.
Something to keep in mind: both reviews used a more powerful CPU, but regardless, the FPS difference shouldn't be this great.


I think we should also start by having you run a GPU benchmark, to help isolate the source of your performance issues.
Try installing the Unigine Valley benchmark: http://www.guru3d.com/files-details/unigine-valley-benchmark-download.html (download mirror at bottom of page) and let us know what your score is. This should directly measure GPU performance, while minimizing the impact of the CPU. You should score around 5900 on the Ultra Preset at 1080p with Antialiasing disabled.



 
Solution

simonz93

Distinguished


Thanks! LOL, mine was at max power save. Will this help, and will it really increase the GPU temp? Mine is already quite high, in the high 70s (75-80ish), under load.
 

simonz93

Distinguished


Nah, I only play offline. But I do have some nice ReShade and ENB presets, which might have decreased the FPS.

I will try the test, thanks
 

DonQuixoteMC

Distinguished


Haha that couldn't have been helping things!
It shouldn't make too much of a difference; low 80s isn't abnormal for GPUs. In the low 90s you might want to worry, as you'll start to see thermal throttling. You shouldn't run into anything at the temperatures you're at.

Yeah, I know it's a pain sometimes to enable/disable mods, but if you get the chance, trying without the ReShade/ENB might give slightly more accurate numbers. Probably not worth the trouble though :p
 

simonz93

Distinguished


Hey! I noticed that after changing the GPU setting to prefer max performance, the GPU temp stays at 62°C long after quitting the game (not doing anything except browsing the internet), and right after startup it rises to 62°C as well (again doing nothing but browsing).

Normally it goes back to the mid-30s a while after gaming and sits in the low 30s at startup. The moment I switched to "adaptive", the temp dropped to 37°C and stabilized.

This is scaring me LOL. Is that normal?
 

DonQuixoteMC

Distinguished
Well... "Prefer maximum performance" will always boost idle clock speed under use (outside of games), for instance when handling webpages' hardware-accelerated content. Though I'm surprised your temperatures are that high regardless.

Do you have multiple monitors, by chance?

Have you ever set up a fan profile for your GPU?

EDIT: "idle clock speed under use" what a redundant sentence I just typed. You know what I mean :p
 

simonz93

Distinguished


Nah, I just have one 1080p monitor. I haven't set up a profile for my GPU; last time I tried, RivaTuner (or Afterburner) said something like "this will void your warranty", so I didn't do it LOL.