What will I get with Overclocking?

gamer1411

Reputable
Jun 16, 2014
For example, if I overclock my i5-4670K CPU, what benefit will I get? How far can I overclock it with a Hyper 212 EVO cooler while keeping it safe, so it will still last for years?
Will my computer boot faster, will it work faster, or what? Will I see more FPS? (The GPU is a 780 Ti.)

Also, a second question about this:
The i5-4670K at 3.4GHz can fully support a GTX 780 Ti, but if new graphics cards from Nvidia come out (the 8xx series) and they are stronger, and the stock i5-4670K is a bottleneck for them, will it stop being a bottleneck if I clock it to 4.5GHz?

I hope you understand the second question; if not, please tell me and I will try to explain more.
 

leo2kp

Distinguished
Everything should be a little faster, even gaming. Yes, overclocking can help avoid CPU bottlenecks when using faster GPUs. You need to find out how far you can push the CPU with that cooler. You need to learn how to overclock with your board and CPU, and learn how to stress-test your components (memtest86+, Prime95, 3DMark, temperature monitors, etc.).
 

The point of overclocking is to gain higher performance. That does not mean it will always yield higher performance; more on this later. Another bonus you get is more heat (if you count that as a bonus).



Depends. CPUs are not created equal. Some are better overclockers than others. I would start out at 4.3GHz; Haswell does run a bit hotter. If that works out, try 4.4GHz. I would say the limit would probably be 4.5GHz on the 212 EVO.



As long as you spend time testing and making sure the system is stable at the overclock, it should be safe. I haven't had a case where a CPU died from overclocking yet; my current one has been running at 4.7GHz for about 2 years and 7 months now.



Not really. If you want responsiveness, buy an SSD.



Maybe. Only if the CPU is somehow causing some bottleneck. If not, overclocking it won't help much.



Only a second question, huh? =/



Depends on the game. A CPU can bottleneck a GPU, yes. But a game that does not make use of the full capability of the GPU is also bottlenecking it. So the answer is "it depends". The 8xx series isn't out yet, so there's no way to know.
 
Solution
It depends on what is limiting performance in each application. In my rig....

CPU is at 31% OC (4.6 GHz)
GPU is at 26% OC (Memory about 21%)
Memory is 50% overclocked (2400)

Overclocking a GFX card by x% can get ya an fps increase of about 50-60% of x.... so a 25% OC might net a 12.5-15% increase in fps.

Overclocking a CPU by x% will get ya an fps increase of about 25% of x.... so a 30% OC might net a 7.5% increase in fps.

Overclocking memory by x% can show an fps increase of between 0 and 20% of x.... so a 50% increase can yield nothing (for example in Crysis) or 10-11% in games such as F1 and STALKER.
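
To put those rules of thumb in concrete terms, here is a minimal Python sketch of the same arithmetic; the scaling fractions are just the rough ranges quoted above, not measured constants.

```python
# Rough FPS-gain estimates from overclocking, using the rules of thumb above:
# a GPU core OC returns ~50-60% of the OC percentage, a CPU OC ~25%, a RAM OC 0-20%.

def estimated_fps_gain(oc_percent, scale_low, scale_high):
    """Return the (low, high) estimated FPS gain in percent for a given overclock."""
    return oc_percent * scale_low, oc_percent * scale_high

examples = [
    ("GPU core +25%", 25, 0.50, 0.60),  # roughly 12.5-15% more fps
    ("CPU +30%",      30, 0.25, 0.25),  # roughly 7.5% more fps
    ("RAM +50%",      50, 0.00, 0.20),  # anywhere from nothing to ~10% more fps
]

for label, oc, lo, hi in examples:
    low, high = estimated_fps_gain(oc, lo, hi)
    print(f"{label}: roughly {low:.1f}-{high:.1f}% higher fps")
```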
 

gamer1411

Reputable
Jun 16, 2014


2400MHz RAM is the same as 1866MHz memory. It is only empty stories that it is better. Please google it and see. ;)
 
I don't have to google it, you can see the proof right here on THG..... not to mention the benchmarks I have run myself.

[Attached chart: image006.png]


Care to explain the above?

The impact gets greater on minimum frame rates, and even more so in SLI / CF.



 

gamer1411

Reputable
Jun 16, 2014


https://www.youtube.com/watch?v=9fTJPHDW9kk

I didn't say 1600MHz vs 2400MHz, I said 1866MHz vs 2400.
There is maybe a little bit of difference, but it is not worth the money, and you will get no more than 1 FPS in gaming.
 
I'm sorry, but that 1866 argument doesn't work. Look at the THG chart: various speeds BESIDES 1600 are depicted, and all show improvement with speed.... the same is true of lower CAS. Same here:

[Attached chart: IGP%20Results.png]


I have seen his videos before, and while he acts a bit dorky in each one, this one was silly. He starts off with the statement that sure, there's a difference if you're going to run memory benchmarks, but what about "things you actually do?" He then proceeds to run a series of GFX card benchmarks. He then moves on to test a whopping 3 games known to be GFX limited to "prove" that 2400 does nothing.

Where are any games known to run into CPU or memory limitations? Where is F1? Where are the three STALKERs? Where's the testing in SLI? Where's the testing of minimum frame rates?

Where's AutoCAD? Where's Premiere?

This is not "new stuff"

http://www.anandtech.com/show/2792/12

22.3% (SLI) increase in minimum frame rates w/ C6 instead of C8 in Far Cry 2
18% (single card) / 5% (SLI) increase in minimum frame rates w/ C6 instead of C8 in Dawn of War
15% (single card) / 5% (SLI) increase in minimum frame rates w/ C6 instead of C8 in World in Conflict

Also see http://www.bit-tech.net/hardware/memory/2011/01/11/the-best-memory-for-sandy-bridge/1

http://www.anandtech.com/show/6372/memory-performance-16gb-ddr31333-to-ddr32400-on-ivy-bridge-igp-with-gskill/14

CAS and speed both play into how fast memory actually responds. True latency is the CAS cycles divided by the real memory clock, which is half the DDR data rate:

[CAS × 2000 / Data rate (MT/s)] = X ns

[9 × 2000 / 1866] = 9.65 nanoseconds
[10 × 2000 / 2400] = 8.33 nanoseconds
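
For anyone who wants to check the arithmetic, here is a minimal Python sketch of that latency calculation, assuming DDR3 (where the real memory clock is half the data rate):

```python
# True CAS latency in nanoseconds: CL cycles divided by the real memory clock.
# For DDR3 the clock is half the data rate (e.g. DDR3-1866 runs a 933 MHz clock).

def cas_latency_ns(cl_cycles, data_rate_mts):
    clock_mhz = data_rate_mts / 2           # DDR transfers twice per clock
    return cl_cycles / clock_mhz * 1000     # cycles / MHz -> nanoseconds

for label, cl, rate in [("DDR3-1866 CL9", 9, 1866), ("DDR3-2400 CL10", 10, 2400)]:
    print(f"{label}: {cas_latency_ns(cl, rate):.2f} ns")
# DDR3-1866 CL9  -> 9.65 ns
# DDR3-2400 CL10 -> 8.33 ns  (the faster kit still has the lower absolute latency)
```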

THG presented a balanced approach to the issue, showing two games: one had an 11% improvement from 1600 to the top end, and the other, Metro Last Light, showed none. If we are going to accept that using 3 games is sufficient to "prove a point", then I can prove two directly opposing hypotheses.

a) I will choose F1 and the STALKER series, where you will see very significant differences, as the games are known to be significantly bandwidth limited.

b) I will choose Metro Last Light and the Crysis series, each of which is known to be GFX limited.

So how is it that it is so easy to "prove" two opposing viewpoints as being correct?

The truth of the matter is exactly what I said:

Overclocking memory by x% can show an fps increase of between 0 and 20% of x.... so a 50% increase can yield nothing (for example in Crysis) or 10-11% in games such as F1 and STALKER.

The video "evidence" proves that the first half of my statement was 100% correct..... the second half was not addressed. Whether he just didn't know any better or intentionally chose to "hide" these facts, I have no way of knowing.
 

gamer1411

Reputable
Jun 16, 2014


Added as solution for hard work
 


Money is an individual decision..... this "not worth it" mindset is a leftover from reviewers who reviewed memory when it first came out, when the differences were $100 or more. I'd agree that at that point many will find the investment hard to justify. Then, as production lines improve their yields over time and the price drops to small differences or even $0, we see these same articles quoted several times a week here on THG, where peeps claim it's not worth it even when the price difference is $0! I had the same "discussion" last week despite the fact that the 2133 modules were cheaper than the 1600. Any performance difference becomes logical once you get to a certain price point. The fact that this argument persists even when the price difference is $0 should be the indicator that something is wrong with this mindset.

The 2nd set of false logic goes..... "well, it's $25 more for 2400, and that's a 15% increase in RAM cost for an average 2-5% increase in performance". But it's not your RAM that goes faster, it's your whole system that goes faster. So:

On a $2,000 box, is it worth an increase in system cost of 1.25% for an average performance increase of 2-5%?
On a $1,750 box, is it worth an increase in system cost of 1.43% for an average performance increase of 2-5%?
On a $1,500 box, is it worth an increase in system cost of 1.67% for an average performance increase of 2-5%?
On a $1,250 box, is it worth an increase in system cost of 2.00% for an average performance increase of 2-5%?
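
As a quick check on that framing, here is a small Python sketch that reproduces the percentages above; the $25 premium and the 2-5% gain are the figures assumed in this post.

```python
# Frame the RAM premium as a share of total system cost rather than of RAM cost.
# The $25 premium and the 2-5% average gain are the figures assumed in the post.

ram_premium = 25.0                           # dollars
perf_gain_low, perf_gain_high = 2.0, 5.0     # average performance increase, percent

for system_cost in (2000, 1750, 1500, 1250):
    cost_increase = ram_premium / system_cost * 100
    print(f"${system_cost} box: +{cost_increase:.2f}% system cost "
          f"for a {perf_gain_low:.0f}-{perf_gain_high:.0f}% performance gain")
```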

The 3rd problem I have is that the instances where RAM has historically had the most effect (minimum frame rates and multiple cards) are not looked at in most reviews.

In the great majority of the builds I have done this year between $1,600 and $8,800, the user chose 2400. Below that, it's usually 2133, except for a few occasions where there was no 2133 at an attractive price at the time. Now even 2666 is starting to be worth considering, with just a $5 premium over 2400.

It was commonly accepted that the 4770K isn't worth it over the 4670K for gamers because that 3-6% performance difference isn't worth the $100 (42%) CPU cost difference.... well, what about the fact that it is a 5% cost increase on a $2,000 box? That's certainly way more of a % reward than ya get on video card upgrades. I see the same advice repeated for the 4x90K series without a thought to the fact that we have a whole new ballgame here. Personally, looking at the 4x90K series, my view is that the logic doesn't hold. With a 4.4GHz Turbo frequency for the 4790K versus 3.9GHz for the 4690K, that's a 13% speed increase, versus the 3% we saw on the 4x70K, for a similar 42% increase in CPU cost. But is this not less of an "automatic decision" when system performance goes up 13% for a 5% system cost increase on a $2,000 box.... or even 10% on a $1,000 box? Spending 5% to get 13% isn't logical? Spending 10% to get 13% isn't logical? Let's try the least logical "worth it" argument: GFX cards.

- The 780 Ti is $200 more (42%) than the 780 and gives 14% more performance. Is it worth it?

- The 780 Ti is $180 more (36%) than the 290X and gives 8% more performance at stock speeds. Is it worth it? I guess so: nVidia has sold almost as many 780 Tis (0.43% market share) as all R9 200 series cards (0.44% market share) combined.

- The 780 is $150 (45%) more than the 770 for just a 15% increase in performance. Is it worth it?

Peeps make these decisions every day without question.... looking only at component cost, the jump from a 770 to a 780 or from a 780 to a 780 Ti has a 1:3 return on investment and it goes by w/o question, but buying RAM, which has the same 1:3 ROI, is automatically "not worth it"? Can't have it both ways. And when looking at it as a % of system cost:

- On a $2,000 box, the 780 Ti nets a 14% performance increase over the 780 for a 10% increase in system cost (7:5 ROI)... the RAM upgrade is better than 2:1.
- On a $1,750 box, the 780 nets a 15% performance increase over the 770 for a 9% increase in cost (5:3); again, the RAM is better than 2:1.

Getting back to the 4x90s.... the universal wisdom is that the 13% performance increase of the 4790K is not worth it at $100. To my mind, if you can justify a 780 over a 770 or a 780 Ti over a 780, it is most certainly "worth it", as they all have the same 1:3 ROI, looking again only at component cost.
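
Since the same cost-to-gain ratio keeps coming up, here is a rough Python sketch tabulating it from the percentages quoted above (the RAM line is taken at the top of the 2-5% range):

```python
# Component-cost increase vs. performance gain for the upgrades discussed above,
# using the percentages quoted in the post (RAM at the top of its 2-5% range).

upgrades = [
    ("GTX 770 -> GTX 780",     45, 15),  # % extra component cost, % extra performance
    ("GTX 780 -> GTX 780 Ti",  42, 14),
    ("i5-4690K -> i7-4790K",   42, 13),
    ("DDR3-1866 -> DDR3-2400", 15,  5),
]

for label, cost_pct, perf_pct in upgrades:
    ratio = cost_pct / perf_pct
    print(f"{label}: {cost_pct}% more cost for {perf_pct}% more performance "
          f"(about {ratio:.1f}:1 cost-to-gain, i.e. roughly the same ~1:3 return)")
```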

But what if the pricing structure of CPUs changed over time like RAM does? What if the difference dropped to $50? What if the difference dropped to $25? Regardless of where the price difference tips the scales for you, you have to admit that a 2-year-old review looking at price differences of $100, with the reviewer saying "it's not worth it", presents a different scenario when that price drops to $20..... and then further to $0. No one is saying the performance increases aren't small..... however, when the price increases are even smaller, or even $0, if you're still under budget it should be the proverbial "no brainer".

I also don't understand why many reviewers will do in-depth reviews of GFX cards, taking them apart to inform their readers as to what manufacturer (Hynix, Samsung, Elpida, etc.) and type of memory is used, and then, when looking at RAM modules, won't bother to look beyond the logo and actually peek at who manufactured the stuff. Why, when testing GFX cards, will some show us minimum frame rates and performance under SLI, but then only look at averages when looking at RAM?