Okay, I've come into about $80, which is enough for me to replace my e2180, which, admittedly, overclocks pretty well.
Would it be worthwhile to upgrade from an e2180 overclocked to 3.33 GHz to an e5200, which looks to clock between 3.6-4 GHz? (Cooling is a Monsoon II TEC Cooler, so high voltages and such aren't a problem.)
At the same speed, the e5200 should be about 10-15% faster because of the architecture updates and the extra cache, right?
So would it be worthwhile to go for that?
And $180 for the Q6600 or Q8200 is a LOT more than I'm willing to spend. (I won't have much more money to put towards this for about six to eight months, either.)
The rest of my computer is...
4GB DDR2 800
ATI HD 4850
22" Acer Monitor - 1680x1050 resolution.
Well, the main reason I'm wondering is the cache. A friend and I have identical systems except for the processor and heatsink (even the OS is the same, installed from the same day: the Windows 7 beta); his CPU is an e4400 @ 3 GHz.
The difference is that his system runs our favorite games with MUCH higher minimum frame rates at the exact same settings. In Fallout 3, for example, his system dips to about 35 fps, while mine dips to 3-5 at times. Maximum frame rates are about the same, but because of the low dips, my average frame rate takes a massive hit, too.
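To show why those dips wreck the average even when peak fps matches, here's a quick sketch. The frame counts and rates below are invented, not measured from my runs: the point is just that average fps over a run is total frames divided by total wall-clock time, so slow frames are weighted by how long they take to render.

```python
def avg_fps(frame_counts_at_fps):
    """Average fps = total frames rendered / total wall-clock time spent.

    frame_counts_at_fps: list of (frame_count, fps) pairs, where each group
    of frame_count frames rendered at the given instantaneous fps.
    """
    total_frames = sum(n for n, _ in frame_counts_at_fps)
    total_time = sum(n / fps for n, fps in frame_counts_at_fps)  # seconds
    return total_frames / total_time

# Hypothetical 100-frame runs, 10% of frames spent in a dip (numbers invented):
mine = avg_fps([(90, 60.0), (10, 4.0)])     # my dips: ~4 fps  -> avg 25.0
friend = avg_fps([(90, 60.0), (10, 35.0)])  # his dips: ~35 fps -> avg 56.0
```

Even though both runs spend 90% of their frames at an identical 60 fps, the ten 4-fps frames eat 2.5 seconds of wall time and drag my average down to 25, while his stays at 56.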
This WAS happening before I switched to Vista or 7, so the only causes I can think of are a hard drive issue or the difference in cache. (I even tried a clean install of Vista with nothing installed but the game and drivers, and it still did it. I also tried installing on a secondary hard drive of mine, with no difference.)
It's not just Fallout 3, though. It's Red Alert 3, Far Cry 2, Crysis Warhead, and more.