Processors: 3.6 GHz sweet spot

eklipz330

Distinguished
I've heard plenty of times that 3.6 GHz is the sweet spot for most LGA775 processors, and that overclocking past that provides little to no benefit and might cause extra heat if you turn up the voltage. Doesn't this alone make LGA775 quad cores worth it [aside from the fact they cost more]? Does anyone have a real-life example of going from 3.6 to, let's say, 4.0 on a Q9550/Q9650 and reaping a significant benefit from it?

This has been running through my mind for a while. It pretty much says that efficiency is everything, and that a Core i7 at 3.6 GHz is better than a C2Q at 4.0 GHz simply because there is no benefit past that point... can anyone help me out on this?
 

hundredislandsboy

Distinguished
Never heard of a CPU sweet spot per se, unless you're specifically referring to frame rates on particular video cards.

Yes, older GPUs will bottleneck newer, more powerful processors, so going from 3.6 GHz to 4 or even 5 GHz will not improve frame rates.

Yes, higher speeds, especially from overclocking, entail higher voltages, resulting in more heat.

Most PC enthusiasts don't look for a sweet spot but want to see the CPU go as fast as possible.

Unless the program (e.g. video encoding) is written to use as many cores as possible, there is no difference between an i7, a quad core, or a dual core. There are lots of gaming benchmarks that show an E8600 at 3.33 GHz outperforming quad cores at 2.6, 2.8, and 3 GHz, and even outperforming the entry-level i7s.
 

mi1ez

Splendid
I would imagine any single-threaded apps would love to go past 3.6 GHz. It also depends on whether you're raising the multiplier. My PC is much quicker at 3.2 with a multiplier of 8 (1600 FSB) than at 9 (1422 FSB), although I can't remember the memory settings.
 
Here's how it works:

If your game is limited by the graphics chip you will see no benefit from overclocking the CPU.

If your CPU is the limiting factor, then overclocking it will help your game. In fact, if you can overclock it by 10% and your GPU is still not the limiting factor, you'll get on average 10% more FPS in your game.
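As a rough sketch of that arithmetic (a hypothetical Python example with made-up numbers, assuming the game is fully CPU-bound and FPS scales linearly with clock speed):

def estimated_fps(current_fps, current_clock_ghz, new_clock_ghz, gpu_fps_ceiling):
    # FPS scales with clock while CPU-bound, but can never exceed what the GPU can draw.
    cpu_limited_fps = current_fps * (new_clock_ghz / current_clock_ghz)
    return min(cpu_limited_fps, gpu_fps_ceiling)

# Example: 50 FPS at 3.6 GHz, and the GPU could render up to 70 FPS.
# A 10% overclock to about 3.96 GHz then gives roughly 10% more frames.
print(estimated_fps(50, 3.6, 3.96, 70))   # ~55.0

Once the projected CPU-limited FPS passes the GPU ceiling, extra clock stops showing up in the frame counter, which is exactly the "no sweet spot, just whichever part limits you" point.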

There's no "sweet spot"; either your CPU is the limiting factor or it's your graphics solution. (All other issues such as RAM sorted out.)

Choosing which parts to buy can be tricky, as it depends on how you use your computer, and even between games there's a difference.

Some people get a specific CPU and try to get a graphics card that maxes it out.

Many people don't want SLI or Crossfire, so they buy a high-end quad-core and a good graphics card, keep it for two years, then replace it with a newer graphics card.

You wouldn't put a GTX 275 with an Atom CPU. Don't expect good Crysis gaming while using an HD 3200 with a Core i7 965.

Personally, I'd like to see charts that specify which graphics card will max out which CPU (and for Crossfire/SLI), averaged over a collection of five good games. Unfortunately, we just seem to get a recent graphics card paired with the latest CPU, so it takes a bit of reasoning to figure out the best upgrade path.

TEST:
You can open the Task Manager with CTRL-ALT-DEL. Leave it open and run a game for 5 minutes. Close the game and look at the Task Manager. If you see at least one core maxed out and many others fairly high, your CPU is maxed out. If your CPU is running under 50%, you can get a much better graphics card or add a second one if possible.

You should also be running FRAPS to view your FPS, because you could be synced to 60 FPS, in which case neither your CPU nor your graphics card may be stressed to the max.

If you see it fluctuating below 60 FPS, then either your graphics card or your CPU is being maxed out.

You need to show all the threads, not "one graph for all CPUs". "25%" might seem like your CPU is fine, but if you look at the thread graphs and see four threads, with one right at 100% and three unused, then your game isn't multithreaded and is simply maxing out that single core of your CPU.
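A minimal sketch of the same per-core check in Python (assumes the third-party psutil package is installed; run it while the game is active in another window):

import psutil

# Sample per-core utilisation over a few seconds while the game runs.
per_core = psutil.cpu_percent(interval=5, percpu=True)
print("Per-core load:", per_core)

# One core pegged near 100% while the overall average stays low suggests a
# single-threaded game that is CPU-limited even though "total CPU" looks fine.
if max(per_core) > 90 and sum(per_core) / len(per_core) < 50:
    print("Looks like a single-threaded CPU bottleneck.")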

Run this on your high-end games and see where you stand. Neverwinter Nights 2 doesn't support multi-threading on the CPU, which is fairly unusual at its level of graphics quality, so it's easy to be limited by your CPU there. (Also why a 3 GHz dual-core would blow away a 2.1 GHz quad here.) Whatever.

Hope this was enlightening.

VSync:
Not all games are struggling to reach 60 FPS. I installed an older Conan game and FRAPS showed over 300 FPS. My monitor can only show 60 of those, so I get screen tearing, waste power, and have higher fan noise. Without VSync running, your system runs the game at maximum until either your GPU or your CPU is stressed.

I needed to install ATI Tray Tools on Vista to get VSync working reliably (still. WTF?). This tool can also force VSync on for games that never supported it. I made a custom profile in ATI Tray Tools for that Conan game with "force VSync", and when I start the game it runs solid at 60 FPS, my CPU stays under 20%, and my computer's fan noise doesn't noticeably change.

I wish I didn't have to keep messing around to fix things that should just work. Why is some relative nobody providing free fixes for something Microsoft, AMD, and NVIDIA haven't fixed?
 

hundredislandsboy

Distinguished
I've tested at a few speeds, and I'll say this: games are still the same regardless of CPU clock at 2.4 GHz and above; they don't play any differently. ....
--------------------------------------------------------------------------------------------------------
Try Crysis, World in Conflict, or Company of Heroes at 1680 x 1050 with 2x AA if you want to see the difference.
 

B-Unit

Distinguished
OMG So much fail...

First, you don't identify a CPU bottleneck by looking at Task Manager. Run your game with full eye candy at your highest supported resolution and note your FPS. Now drop your resolution to 1024x768, or even 800x600 if you can, and note your FPS again. Did it go up? If not, you are CPU-bound rather than GPU-bound. This will vary by game, obviously.
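A minimal sketch of the logic behind that test (hypothetical Python with made-up FPS numbers; the 10% threshold is just an assumption):

def bottleneck(fps_high_res, fps_low_res, threshold=0.10):
    # If dropping the resolution barely changes the frame rate, the CPU is the limit;
    # if FPS jumps noticeably, the GPU was the limit at the higher resolution.
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return "GPU-bound at high res" if gain > threshold else "CPU-bound"

print(bottleneck(45, 95))   # big jump at low res -> the GPU was the limit
print(bottleneck(45, 48))   # barely moved -> the CPU is the limit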

So, to answer the OP's question, the 'sweet spot' is going to vary by game and video card. TBH, your 'sweet spot' will vary by CPU too; I'd say the true 'sweet spot' is however high you can OC without increasing voltage. Other definitions may be used, though.
 

eklipz330

Distinguished
I think you guys got the wrong idea, and I should have been more precise.

I've read in a recent Tom's article that going over 3.6 GHz at 16x10 resolutions doesn't reap many benefits in today's games. Now, a 45nm C2D is about $180 while a 45nm C2Q is about $260. Many people say there's no point in going to a C2Q because multithreaded apps are few. But since a 45nm C2Q can hit 4.0 GHz easily, wouldn't it make a better buy than a C2D that can hit 4.5, since the jump from 3.6 to 4.5 GHz won't reap many benefits?
 

eklipz330

Distinguished
Yeah, but around what clock would the C2D or C2Q have to reach [I'm talking on average] before clocking any higher is pretty pointless? [i.e., OC'ing another 300 MHz only brings you 1/2 FPS and 10°C more heat]

I'm asking this because I want to know whether the Q9550 is better than the Q9650; I heard the Maximus Formula SE has FSB caps for the 45nm quads =[