We're already intimately familiar with the Bulldozer and Ivy Bridge architectures, so nothing we saw today is particularly surprising. Single-threaded applications are going to hum on Intel's chip, while applications able to tax the FX's four integer clusters treat its two Bulldozer modules more like a quad-core processor (though not quite, as the Core i5's stellar performance demonstrates).
The real questions, then, are these: what does a dramatically higher clock rate do for AMD's offering? Specifically, what does the bump up to a 125 W TDP do for power consumption? And how does Intel compare, given its dual-core implementation?

Despite proof that some of the games we tested do take advantage of quad-core CPUs, the dual-core Core i3-3220 takes the lead in this discipline, mostly as a result of Skyrim. The FX-4170, on the other hand, serves up better application performance, and by a larger margin. When you consider the way people use their PCs, we're inclined to put more value on the larger productivity win favoring AMD, particularly since the apps where an FX excels are threaded. Those are the workloads that require more processing power.
If we were to make our judgment on performance alone, AMD's FX-4170 would have the edge.
But there's another side to this story. It starts with power consumption, and ends with efficiency.

At idle, the FX-4170-based machine uses almost 20 W more than the Core i3-3220-based box. Under load, that gap grows to a staggering 103 W. The dual-module FX almost doubles the consumption of a quad-core Core i5, in fact.
Yes, the load comes from a largely synthetic Prime95 run, and yes, it's unlikely you'll ever see such nasty power numbers on a day-to-day basis. Nevertheless, two times the power consumption really puts AMD's small performance advantage into context. Efficient, this CPU is not. Whether or not that matters to you is a personal decision.
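The efficiency trade-off can be sketched numerically. This is a minimal illustration, not measured data: the performance scores are hypothetical placeholders, and only the ~103 W load-power gap reflects the figure cited above.

```python
# Illustrative performance-per-watt comparison under load.
# The ~103 W power gap mirrors the article's measurement; the
# performance scores are hypothetical placeholders, not benchmarks.
def perf_per_watt(score: float, watts: float) -> float:
    """Higher is better: work done per watt consumed."""
    return score / watts

i3_score, i3_watts = 100.0, 100.0   # hypothetical baseline
fx_score, fx_watts = 110.0, 203.0   # ~10% faster, ~103 W more under load

print(perf_per_watt(i3_score, i3_watts))  # 1.0
print(perf_per_watt(fx_score, fx_watts))  # ~0.54
```

Even granting the FX a healthy performance lead, doubling the draw under load roughly halves its efficiency relative to the i3 in this toy comparison.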
What we're really looking forward to, though, is Vishera. Intel put its cards on the table earlier this year with its Ivy Bridge architecture, and Haswell-based chips won't show up until the second quarter of next year. More immediately, we're expecting FX CPUs based on AMD's Piledriver architecture this month. We've seen evidence that Piledriver may add up to 15% more performance in the same thermal envelope as Bulldozer. If that holds true, then a processor with two Piledriver modules in the same $125 range should help the company claw back some of the performance/watt deficit it currently suffers.
You can bet we'll revisit this topic when those chips start showing up.
- AMD FX-4170 Vs. Core i3-3220: A Fair Fight?
- Test System And Benchmarks
- Benchmark Results: Synthetics
- Benchmark Results: Audio And Video Conversion
- Benchmark Results: Content Creation And Productivity
- Benchmark Results: Battlefield 3 And DiRT Showdown
- Benchmark Results: Metro 2033, The Elder Scrolls V: Skyrim, And StarCraft II
- A Close Race Today, But Tomorrow Shows More Promise For AMD
Hope Piledriver is all that it promises and more.
"...but Tomorrow Shows More Promise for AMD".
Tomorrow...as in ... Oct 16, 2012 or is it only figurative?
At 3.3 GHz, the 6100 doesn't fare well. It's easily out-gamed by the FX-4170, and only gets a bit of a break in highly threaded apps.
In my opinion, if you're on a budget, the FX-4170 can be a decent CPU. It's $10 cheaper than the i3; that isn't much, but it might give you a slightly bigger budget for graphics. It's not a complete wash, either. The power consumption might be high, but it's nothing a desktop can't handle. It would cost more for people who pay more for power, but generally, in North America, power is pretty cheap.
"At 3.3 GHz, the 6100 doesn't fare well. It's easily out-gamed by the FX-4170, and only gets a bit of a break in highly threaded apps."
Does that mean the FX-4170 and 6100 really share the same MSRP? Because I was thinking that it may just be a Newegg (shop-specific) price thing (possibly a sale/discount).
I don't think it would've been a bad idea to have it around if that were the case (though I don't mean to impose more work on you guys). It would've been interesting to see what relative application performance score it would've gotten compared to the i5, as well as how much less it performs compared to the FX-4170. (I imagine not by much, since a lot of the games were GPU-limited already.)
NO fanboi here....
My current AMD powered laptop plays Skyrim reasonably, something that would have easily required another $100 to do with an Intel laptop at the time of upgrade.
Next time? Whatever has the best numbers will be what I buy... (I always wait to see if a chipset or CPU/GPU is plagued with problems before I buy - usually 3-6 months after they hit market)
Brand loyalty is something companies try to instill in consumers, and nowadays it has no place in a consumer's choice of hardware or software.
Buy what will do the things you NEED to do... forget about joining the war one way or the other.
If more people would buy what is on top instead of supporting only one company, more companies would have to innovate and improve their products more substantially.
A: 10 frames is significant. It's 16.6%.
B: This is about benchmark performance, not whether you'll notice the drop in frames or not.
However, I'll note that if your display is 60 Hz and the game you're playing has vsync enabled, then that small five-frame difference between the game running at 55 FPS and the display's 60 Hz refresh actually becomes a HUGE frame drop, because the frame rate falls to 30 FPS to stay in sync with the display.
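The vsync quantization described above can be sketched as follows. With plain double-buffered vsync, a finished frame is held until the next display refresh, so the effective rate snaps to an integer divisor of the refresh rate (60, 30, 20, 15...). This is a simplified model that ignores triple buffering and adaptive-sync displays:

```python
import math

def vsynced_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    """Effective frame rate with double-buffered vsync.

    A frame that takes slightly longer than one refresh interval to
    render occupies two intervals, so 55 fps of raw rendering speed
    is displayed at 30 fps on a 60 Hz panel.
    """
    frame_time = 1.0 / render_fps                        # seconds per rendered frame
    refresh_period = 1.0 / refresh_hz                    # seconds per display refresh
    intervals = math.ceil(frame_time / refresh_period)   # refreshes each frame occupies
    return refresh_hz / intervals

print(vsynced_fps(55))  # 30.0: misses the 16.7 ms deadline, waits a full refresh
print(vsynced_fps(60))  # 60.0: exactly one frame per refresh
```

This is why a benchmark gap that looks small on paper can matter in practice: dipping from 60 to 55 FPS of rendering throughput halves the displayed frame rate under strict vsync.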
Not something you'd like with Australian electricity prices.