Intel Core i3-3220 vs AMD Phenom II X4 965 BE

zbozk

Oct 30, 2012
i3-3220 Ivy Bridge vs Phenom II X4 965 BE

Which is better for gaming?
Which is better for video editing and rendering (I use Sony Vegas Pro and After Effects)?
Which is better for both?
Which is better overall?

The i3-3220 costs about $6 more than the Phenom. The Phenom II X4 has 4 true cores, which could be handy for Sony Vegas and After Effects, but will I see any difference from the i3-3220, which has 2 cores and 4 threads? The Phenom II X4 has no L3 Cache and a very old architecture, but then again, its 4 cores weigh heavily on my decision-making.

 
For your uses, the 965 is the best choice.

The i3 is better for gaming, but the 965 is better for video editing. Neither is particularly great for both, really.

Overall, for your needs, the 965 makes the most sense.

You can just overclock the 965 to nearly match (or even exceed, in some situations) the i3 in gaming, but you also have to factor in the cost of an aftermarket cooler if you do that.
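
If you want to put rough numbers on that, here's a quick back-of-envelope sketch (Python). The stock clocks are the real ones; the ~25% per-clock advantage for Ivy Bridge is just an assumed figure for illustration, not a measured result:

```python
# Back-of-envelope estimate, not a benchmark: how far the 965 would need to
# be overclocked to match the i3-3220 per core. The IPC figure is assumed.
I3_CLOCK_GHZ = 3.3    # i3-3220 stock clock
PII_CLOCK_GHZ = 3.4   # Phenom II X4 965 BE stock clock
IPC_ADVANTAGE = 1.25  # assumed Ivy Bridge per-clock advantage over K10

# Per-core performance scales roughly with clock * IPC, so the 965 needs:
target_ghz = I3_CLOCK_GHZ * IPC_ADVANTAGE
print(f"target clock: {target_ghz:.2f} GHz "
      f"(a {(target_ghz / PII_CLOCK_GHZ - 1) * 100:.0f}% overclock)")
```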

And the 965 does have L3 Cache. 6MB, to be exact.
 

zbozk

Oct 30, 2012
If I don't overclock the Phenom 965, would there be a noticeable difference in gaming performance between the i3-3220 and the Phenom?

What about video editing: would the i3's 4 threads be on par with the Phenom's 4 cores?
 
"Noticable" difference? Eh, not really, but it's certainly a measurable one. It could be noticeable if you were at 30 FPS minimum with the 3220 because that would mean something like a 20 FPS (or a little less) minimum with the 965, but with a good enough GPU, that won't really be much of an issue.

As far as video editing goes, actual cores are king. The 2 HT "cores" on the i3 are each only about 1/3rd as powerful as a real core.
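
Here's a minimal sketch of that reasoning as a throughput estimate; the 1/3 figure is the rough estimate above, and the per-core speed ratio is an assumption for illustration only:

```python
# Rough throughput model for a well-threaded render, using the ~1/3 estimate
# above for what an HT sibling adds. The per-core speed ratio is assumed.
HT_YIELD = 1 / 3
i3_effective_cores = 2 * (1 + HT_YIELD)  # 2 real cores + 2 HT "cores" ~= 2.67
pii_effective_cores = 4.0                # 4 real cores

PER_CORE_RATIO = 1.25  # assumed i3 per-core speed advantage (illustrative)
ratio = pii_effective_cores / (i3_effective_cores * PER_CORE_RATIO)
print(f"965 vs i3 render throughput: {ratio:.2f}x")  # ~1.20x, 965's favor
```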

Here's a good comparison of the two, though: http://www.anandtech.com/bench/Product/102?vs=677

Single and lightly threaded benches go to the i3, but the heavily threaded benches go to the 965.
 
The PII should be fine at Ultra settings in Crysis 2, since you're gonna run into a GPU bottleneck at that point. It won't be "great", but it'll do.

High resolutions and maxed-out (or nearly maxed) quality settings are always harder on the GPU than on the CPU, so almost any modern CPU will do reasonably well under those conditions. It's when you're forced to lower the resolution and quality settings because of an inadequate GPU (or a monitor that isn't high-res) that a slower CPU hurts you.
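
A toy model makes this concrete: CPU and GPU work on a frame overlap, so the frame rate is set by whichever side takes longer. All the millisecond timings here are made up for illustration:

```python
# Toy frame-pacing model: CPU and GPU work overlap, so the frame rate is set
# by whichever side takes longer. All millisecond figures are made up.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# High res / Ultra: the GPU is the long pole, so a slower CPU barely matters.
print(fps(cpu_ms=12.0, gpu_ms=25.0))  # 40.0 -- GPU-bound
print(fps(cpu_ms=18.0, gpu_ms=25.0))  # 40.0 -- same FPS with a slower CPU

# Low res / low settings: GPU work shrinks and the CPU becomes the limit.
print(fps(cpu_ms=12.0, gpu_ms=8.0))   # ~83.3
print(fps(cpu_ms=18.0, gpu_ms=8.0))   # ~55.6 -- now the slower CPU shows
```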