AMD has the clock rate on its side. But Intel's Ivy Bridge architecture boasts superior IPC throughput. We pit the 4.2 GHz FX-4170 against Intel's new 3.3 GHz Core i3-3220 in an effort to determine which CPU is the better buy for $125.
Intel's Ivy Bridge architecture does give us improved graphics performance. However, aside from a slight boost to instructions-per-cycle (IPC) throughput, the new design offers little more to enthusiasts shopping for a stronger processor. As a result, the new 3.3 GHz Core i3-3220 is fairly similar to the older Core i3-2120, which also ran at 3.3 GHz.
The situation is a little different in AMD's mainstream line-up. The company recently introduced its FX-4170, which features a base clock 600 MHz higher and a Turbo Core frequency up to 500 MHz higher than the FX-4100's. Granted, you step up to a 125 W thermal ceiling when you embrace AMD's dual-module solution, and you pay $10 more. But we expect enthusiasts chasing performance to gladly make those compromises if a better experience is available. So, is it?

New action in the budget-oriented processor space makes for an interesting re-match. Ever since introducing its Sandy Bridge design, Intel has enjoyed a quantifiable advantage in gaming; its Pentium G860 beat AMD's Phenom II X4 955 and FX-8120 in our suite using high-end graphics (Picking A Sub-$200 Gaming CPU: FX, An APU, Or A Pentium?). Conversely, we've shown that an FX-4100 can keep pace with Intel's Core i3-2100 when you use a less expensive GPU, which becomes the bottleneck in games (AMD FX Vs. Intel Core i3: Exploring Game Performance With Cheap GPUs).
So, with the introduction of the Core i3-3220, we thought it'd be prudent to revisit both companies' $125 offerings to see how they do in terms of gaming and productivity.
| | AMD FX-4170 | Intel Core i3-3220 |
|---|---|---|
| Architecture: | Bulldozer | Ivy Bridge |
| Manufacturing Process: | 32 nm | 22 nm |
| Cores (Threads): | 4 (4) | 2 (4) |
| Base Clock Rate (Maximum Turbo): | 4.2 (4.3) GHz | 3.3 GHz |
| Processor Interface: | Socket AM3+ | LGA 1155 |
| L3 Cache: | 8 MB | 3 MB |
| Thermal Envelope: | 125 W | 55 W |
| Online Price: | $120 (Newegg) | $130 (Newegg) |
The match-up we have is both interesting and asymmetrical. Both processors are capable of executing four threads concurrently. Intel achieves this with two physical cores equipped with its Hyper-Threading technology to exploit underutilized resources, while AMD's FX-4170 employs two Bulldozer modules, each sporting a pair of integer cores, a shared floating-point unit, and a number of other shared resources. The FX-4170 includes 8 MB of shared L3 cache, while the Core i3 has 3 MB. AMD's FX-4170 operates at a base clock rate 900 MHz higher than Intel's offering, and it can accelerate to a full 1 GHz higher under the influence of Turbo Core. The Core i3-3220 doesn't benefit from Intel's Turbo Boost technology at all; instead, it relies on an architecture able to execute more instructions per cycle than AMD's. The entire FX family comes with an unlocked ratio multiplier, useful for overclocking, while none of Intel's Core i3s accommodate meaningful overclocking.
If you consider the specifications on their own, the Core i3-3220 looks completely outclassed. But because the Ivy Bridge design enjoys far higher IPC than AMD's best effort, each core is that much more effective, despite a substantial frequency deficit. There's also a colossal disparity in the power these two chips dissipate: the FX-4170 has a 125 W TDP, while the Core i3-3220, manufactured at 22 nm, carries a 55 W ceiling, less than half that of the FX.
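The trade-off above can be sketched with a back-of-the-envelope calculation: per-thread throughput scales roughly with IPC times clock rate. The relative IPC figures in this sketch are hypothetical placeholders chosen purely for illustration; they are not measured values from our benchmarks.

```python
# Rough sketch: per-thread throughput ~ IPC x clock.
# The IPC numbers below are hypothetical, for illustration only.
def relative_throughput(ipc, clock_ghz):
    """Unitless figure of merit: instructions per cycle times GHz."""
    return ipc * clock_ghz

fx_4170 = relative_throughput(ipc=1.0, clock_ghz=4.2)  # Bulldozer as baseline
i3_3220 = relative_throughput(ipc=1.4, clock_ghz=3.3)  # assume ~40% higher IPC

print(fx_4170)  # 4.2
print(i3_3220)  # ~4.62
```

Under that assumption, a 40% IPC advantage more than offsets a 900 MHz clock deficit, which is why raw frequency alone doesn't settle this match-up.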
- AMD FX-4170 Vs. Core i3-3220: A Fair Fight?
- Test System And Benchmarks
- Benchmark Results: Synthetics
- Benchmark Results: Audio And Video Conversion
- Benchmark Results: Content Creation And Productivity
- Benchmark Results: Battlefield 3 And DiRT Showdown
- Benchmark Results: Metro 2033, The Elder Scrolls V: Skyrim, And StarCraft II
- A Close Race Today, But Tomorrow Shows More Promise For AMD
Hope Piledriver is all that it promises and more.
"...but Tomorrow Shows More Promise for AMD".
Tomorrow...as in ... Oct 16, 2012 or is it only figurative?
At 3.3 GHz, the 6100 doesn't fare well. It's easily out-gamed by the FX-4170, and only gets a bit of a break in highly threaded apps.
In my opinion, if you are on a budget, the FX-4170 can be a decent CPU. It is only $10 cheaper than the i3, which isn't much, but it might give you a slightly bigger budget for graphics. It's not a complete wash either: the power consumption might be high, but it's nothing a desktop can't handle. It would cost more for people who pay a lot for power, but generally in North America, power is pretty cheap.
"At 3.3 GHz, the 6100 doesn't fare well. It's easily out-gamed by the FX-4170, and only gets a bit of a break in highly threaded apps."
Does that mean the FX-4170 and 6100 really share the same MSRP? Because I was thinking that it may just be a Newegg (shop-specific) price thing (possibly a sale/discount).
I don't think it would've been a bad idea to have it around if that were the case (though I don't mean to impose more work on you guys). It would've been interesting to see what relative application performance score it got compared to the i5, as well as how much slower it is than the FX-4170. (I imagine not by much, since a lot of the games were GPU-limited already.)
NO fanboi here....
My current AMD powered laptop plays Skyrim reasonably, something that would have easily required another $100 to do with an Intel laptop at the time of upgrade.
Next time? Whatever has the best numbers will be what I buy... (I always wait to see if a chipset or CPU/GPU is plagued with problems before I buy - usually 3-6 months after they hit market)
Brand loyalty is something companies try to instill in consumers, and nowadays it has no place in a consumer's choice of hardware or software.
Buy what will do the things you NEED to do... forget about joining the war one way or the other.
If more people would buy what is on top instead of supporting only one company, more companies would have to innovate and improve their products more substantially.
A: 10 frames is significant. That's 16.6% of 60 fps.
B: This is about benchmark performance, not whether you'll notice the drop in frames or not.
However, I will note that if your display is 60 Hz and the game you're playing uses vsync, then that small five-frame difference between the game running at 55 fps and the display's 60 Hz turns into a huge drop, because the frame rate falls to 30 fps to stay in sync with the display.
Not something you'd like with Australian electricity prices.