Solved

Overclock non k processor

Last response: in CPUs
September 22, 2012 5:48:21 PM

I bought the Intel Core i5 3550 around 3 months ago. Today I noticed there was another version, the Intel Core i5 3570, which is 100 MHz faster, and I regretted getting the 3550. Is it possible to overclock the 3550 an extra 0.1 GHz even though it doesn't have the K?


September 22, 2012 5:51:20 PM

You regret getting your processor because the 3570 is 0.1 GHz faster?

Trust me when I say that 0.1 GHz translates to roughly a 3% difference in applications and games. You can overclock your CPU, but overclocking a non-K CPU is harder and carries more risk of damage than overclocking a K CPU.
September 22, 2012 5:59:09 PM

Lol at the first sentence the above user posted.

@OP, I don't think you can overclock non-K processors.

Best solution

September 22, 2012 6:06:32 PM

You can, by changing the BCLK, but it's a lot riskier than changing the multiplier, which you can't do on a non-K CPU.
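The relationship can be sketched in a few lines of Python. The stock multiplier of 33 is the i5 3550's actual value; the 103 MHz figure is an assumption, since Ivy Bridge BCLK typically tolerates only a few percent over the stock 100 MHz:

```python
# Core clock = BCLK x multiplier. The multiplier is locked on a non-K
# chip, so BCLK is the only lever left, and other platform clocks are
# derived from BCLK too, which is why pushing it far is risky.
stock_multiplier = 33          # i5 3550 stock multiplier (3.3 GHz)

for bclk_mhz in (100, 103):    # ~3% bump; an assumed, commonly cited limit
    core_ghz = bclk_mhz * stock_multiplier / 1000
    print(f"BCLK {bclk_mhz} MHz -> {core_ghz:.2f} GHz")
```

That small a BCLK change is why a non-K overclock yields only about 100 MHz this way.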
September 22, 2012 6:15:17 PM

obsama1 said:
You regret getting your processor because the 3570 is 0.1 GHz faster?

Trust me when I say that 0.1 GHz translates to roughly a 3% difference in applications and games. You can overclock your CPU, but overclocking a non-K CPU is harder and carries more risk of damage than overclocking a K CPU.

100 MHz on a GPU makes quite a difference, so I thought the same would hold for a CPU. Thanks though.
September 22, 2012 6:16:09 PM

Best answer selected by stevems.
September 22, 2012 7:32:25 PM

Should be the same as SB (Sandy Bridge): you can raise it up to 4 extra bins, which is 0.4 GHz, aside from the BCLK.
September 22, 2012 7:38:48 PM

k1114 said:
Should be the same as SB (Sandy Bridge): you can raise it up to 4 extra bins, which is 0.4 GHz, aside from the BCLK.

Yep.
September 22, 2012 10:14:19 PM

Correct me if I'm wrong, but the reason 0.1 GHz seems like a lot on a GPU and not on a CPU is sheer relativity. 0.1 GHz out of 3.3 GHz is only about 3% more cycles per second, while 100 MHz out of a 1,024 MHz GPU is a full 9.8% more. Simply put, because GPUs generally run at slower clocks than CPUs, each MHz of overclock makes a bigger difference on a GPU than it would on a much faster-running CPU.

To use (slightly) more complicated math, counting megacycles per minute:

CPU

3,300 MHz × 60 seconds = 198,000 megacycles per minute

3,400 MHz × 60 seconds = 204,000 megacycles per minute

GPU

1,024 MHz × 60 seconds = 61,440 megacycles per minute

1,124 MHz × 60 seconds = 67,440 megacycles per minute

So with the CPU you get 6,000 more megacycles per minute, and with the GPU you also get 6,000 more, so in absolute terms the overclock is exactly the same. But the GPU will seem much faster, an almost 10% increase, simply because 6,000 is a lot to add to 61,440, while it isn't much to add to 198,000.
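The relativity argument above boils down to one division, sketched here in Python using the post's own clock figures (the 1,024 MHz GPU clock is just the example value from the post, not any particular card):

```python
# Relative speed-up from the same 100 MHz overclock on a CPU vs. a GPU:
# the absolute gain is identical, the fraction of the base clock is not.
def relative_gain(base_mhz, oc_mhz):
    """Fractional clock-speed increase from an overclock."""
    return oc_mhz / base_mhz

cpu = relative_gain(3300, 100)   # i5 3550 base clock, 3.3 GHz
gpu = relative_gain(1024, 100)   # example GPU clock from the post

print(f"CPU: +{cpu:.1%}")
print(f"GPU: +{gpu:.1%}")
```

This prints about +3.0% for the CPU and +9.8% for the GPU, matching the numbers above.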

That ends my simpleton discussion of relativity.

Now, I also know there are more complicated concepts at play, which I have questions about. For instance, how each core splits up the work of a CPU. In a four-core processor, is the 3.4 GHz divided among the cores, so each runs at 850 MHz? Or does it mean that each core runs at 3.4 GHz? If each core runs at 850 MHz, then is a 100 MHz overclock applied to each core, for 100 extra MHz apiece, or split among them, for 25 MHz apiece? Would a dual core at 3.4 GHz then have each core running at 1,700 MHz, or would each still run at 3.4?

Does anyone have answers to these questions? I'm sure I could do a Google search, and I will, but I'm interested in the unique understanding the experts here have.
September 22, 2012 10:45:27 PM

4 bins above max turbo plus a ~3% adjustment on the base clock (given that you have a motherboard that supports overclocking: Z77, Z68, P67) should put you above 4.2 GHz. Very respectable. Just make sure your cooling and power supply can handle the increased heat and load. Good luck with your OC :)
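The arithmetic behind that ceiling can be sketched directly. The 3.7 GHz max turbo for the i5 3550 and the ~3% BCLK headroom are assumptions carried over from the thread:

```python
# Rough non-K ceiling: up to 4 extra turbo bins (+100 MHz each)
# on top of max turbo, then a small BCLK bump on top of that.
max_turbo_ghz = 3.7    # assumed i5 3550 single-core max turbo
extra_bins = 4         # extra multiplier bins allowed on non-K chips
bclk_factor = 1.03     # ~3% base-clock increase

ceiling_ghz = (max_turbo_ghz + extra_bins * 0.1) * bclk_factor
print(f"~{ceiling_ghz:.2f} GHz")
```

That works out to roughly 4.22 GHz, hence "above 4.2 GHz" with a supporting board.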
September 22, 2012 11:09:47 PM

WARNING: WALL OF TEXT

"Now, I also know there are more complicated concepts at play, which I have questions about. For instance, how each core splits up the work of a CPU. In a four core processor, is the 3.4 Ghz divided among each core? So each runs at 850 Mhz? Or does it mean that each core runs at 3.4 Ghz? If each core runs at 850 Mhz, then is a 100 Mhz OC applied to each core, for 100 extra Mhz a piece, or split among each of them, for 25 Mhz a piece? Would a dual core at 3.4 then have each core running at 1600 Mhz, or would each run at 3.4 still?"

First, I can't find anything that matches your question on Google. Essentially: is it better to put a 100 MHz overclock on a CPU or a GPU, and why?

To partially answer myself, and your query: each core runs at 3.4 GHz if the CPU is 3.4 GHz. So if you overclock a 3.3 GHz quad-core processor by 100 MHz, you are running each core at 3.4 GHz. Each core does its job about 3% faster; each one runs 100 extra megacycles per second, or 6,000 per minute, so across four cores we actually get 24,000 extra megacycles per minute. The original speed was also much higher than I calculated above: instead of 3,300 MHz, we have a total of 13,200 MHz across four cores, which over 60 seconds is 792,000 megacycles per minute.

24,000 / 792,000 ≈ 3%

We have the ability to run 4 jobs at once, each at 3% more cycles per unit of time than before.

In a dual core, a 100 MHz overclock is only 2 jobs at 3% better performance apiece. At 3.3 GHz, we have 3,300 × 60 × 2 = 396,000 megacycles per minute. With 100 MHz extra per core, we get 6,000 extra megacycles per minute per core, a total of 12,000. And 12,000 / 396,000 is exactly the same 3% relative increase.

Second, I want to qualify my answer. These percentages are all RELATIVE to your perception of the previous performance of your system. A 100 MHz overclock on a dual-core processor is mathematically the same, in relative terms, as a 100 MHz overclock on a quad core: at 3.3 GHz, either one gains about 3%, so you will see a similar RELATIVE performance increase. BUT THIS DOES NOT MEAN THAT A QUAD CORE AND A DUAL CORE AT THE SAME CLOCK SPEED ARE EQUAL. The quad core does much more work at the same 3.3 GHz: 792,000 megacycles per minute as opposed to 396,000 for the dual core. A 100 MHz overclock puts you at the exact same 3.03% RELATIVE improvement on both CPUs, but 3% of 792,000 is more than 3% of 396,000. So any overclock on a quad core makes a larger objective difference than the same overclock on a dual core, while the subjective difference is the same.
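The dual-versus-quad comparison above can be checked in a few lines of Python, using the same 3.3 → 3.4 GHz figures:

```python
# Same 100 MHz overclock on a dual vs. quad core at a 3.3 GHz base:
# the relative gain is identical, the absolute gain is not.
def megacycles_per_minute(clock_mhz, cores):
    """Total megacycles per minute across all cores."""
    return clock_mhz * 60 * cores

for cores in (2, 4):
    base = megacycles_per_minute(3300, cores)
    oc = megacycles_per_minute(3400, cores)
    gain = oc - base
    print(f"{cores} cores: +{gain} Mcycles/min (+{gain / base:.1%})")
```

Both lines show a +3.0% relative gain, but the quad core adds 24,000 megacycles per minute against the dual core's 12,000.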

A GPU is a totally different concept. It has parallel cores, which sounds like a CPU to me, but I know much less about how a GPU works. What we do know is that we can measure and adjust the memory clock and core clock, whichever way the internals theoretically work. If we assume it behaves like a single-core CPU, then an overclock on a GPU won't make nearly as large an objective jump in total clock cycles as the same overclock on a multi-core CPU. But considering my earlier point, that most GPUs run at slower clocks than CPUs, you would notice a much larger relative difference from 100 MHz.

Considering we aren't computers, I doubt the objective difference matters nearly as much as the subjective difference. So it may be more worthwhile to boost your GPU, because it is a roughly 10% relative increase compared to a roughly 3% relative increase on the CPU, despite the fact that a 100 MHz core overclock actually adds fewer total clock cycles per minute on the GPU than on the CPU.
September 22, 2012 11:17:48 PM

Lol, someone was bored. Seriously though, thanks for the read, very informative :) I just generally tend to go with "as fast as they'll both run while providing stability and longevity"; it's easier to construe :p