Today is AMD’s big chance to prove the value of its FX-8350 in a gaming environment, particularly with a price tag far lower than the competition from Intel. You see, it might appear that Intel has an advantage in our test because we picked its highest-end Ivy Bridge-based chip (the same opportunity given to AMD, by the way). But the price difference between the two doesn't escape us. Intel will need to justify its higher price in relation to the FX-8350.
But first, a look at power consumption and efficiency.

The FX-8350's stock power consumption doesn't look too terrible compared to Intel's, even though it's indeed higher. But we don't get the whole story from this chart, either. We didn't see AMD's chip running at its rated 4 GHz when it was under duress at stock settings. Rather, it dropped both its multiplier and voltage level under an eight-thread Prime95 workload to stay within its rated power envelope. Throttling artificially curbs the CPU's power consumption, and the big increases we see when the Vishera-based processor is overclocked come from fixed multiplier and voltage settings.
At the same time, games don't really utilize the FX-8350's ability to handle eight threads concurrently, and consequently never seem to trigger the same throttling mechanism. Also interesting is that the FX-8350, at its stock voltage setting, often exceeds the 1.35 V we set manually for overclocking. That explains why system power consumption doesn't change much between the stock and overclocked GPU load tests.

As mentioned, the stock FX-8350 doesn't throttle at all during gaming, since most titles aren't able to fully tax the chip. In fact, games actually enjoy a benefit from Turbo Core technology, which takes the CPU to 4.2 GHz. AMD’s biggest problem in the performance chart, then, is that Intel walks away with a noticeably higher average.

Using the average power consumption and average performance of all four configurations as the baseline for our efficiency chart, AMD's FX-8350 generates around two-thirds as much performance per watt as Intel's Core i7-3770K. If you'd like to run these calculations yourself, note that we zeroed out the baseline by subtracting one (100%) from the charted values.
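For readers who want to reproduce the math, here is a minimal sketch of the calculation described above. The performance and power figures in it are hypothetical placeholders, not our measured results.

```python
# A minimal sketch of the efficiency math described above; the numbers
# here are hypothetical placeholders, not measured results.

# (average performance, average system power in watts) per configuration
configs = {
    "FX-8350 stock":  (100.0, 300.0),
    "FX-8350 OC":     (110.0, 380.0),
    "i7-3770K stock": (120.0, 250.0),
    "i7-3770K OC":    (130.0, 290.0),
}

# Performance per watt for each configuration
ppw = {name: perf / power for name, (perf, power) in configs.items()}

# The baseline is the average performance per watt of all four setups
baseline = sum(ppw.values()) / len(ppw)

# Zero out the baseline: subtract one (100%) so the chart centers on 0%
for name, value in ppw.items():
    print(f"{name}: {value / baseline - 1:+.1%}")
```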
- Chasing Bottlenecks To Eyefinity (But Not Beyond)
- Test Settings And Benchmarks
- Results: 3DMark, Aliens Vs. Predator, And Metro 2033
- Metro 2033, Second By Second
- Results: Battlefield 3, F1 2012, And Skyrim
- Battlefield 3, Frame By Frame
- Skyrim, Frame By Frame
- Power And Efficiency
- Can AMD's FX Keep Up With Its Radeon HD 7970?
I disagree. What's needed is an even stronger push on developers to use more than four cores effectively, not 100% load on one core and 10% on the other five.
I thought more cores were for multitasking, as in having multiple programs running simultaneously. It would suck to turn on BF3 and have everything else running on my PC simply shut down because the CPU is at 100% utilization. How would I be able to play BF3 while streaming or playing HD content on the TV that's hooked up to the same computer?
Single-core performance... look up some other benchmarks, like the ones where they use iTunes to encode things, or when WinZip went from single-core to multi-core; they show the GREAT difference more cores can make to performance.
The problem is that few games and few programs really scale. Sure, pro applications almost always take advantage of whatever you put in them, but consumer software is a different story.
More cores can offer more multitasking, but they also allow the load to be shifted from one core to all four cores for more overall performance, when the software is properly coded.
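The "properly coded" point can be illustrated with a minimal sketch. The busy_work function below is a hypothetical stand-in for a real CPU-bound task (encoding, compression, and so on), not code from any benchmark used here; it simply shows the same chunks of work run serially on one core and then spread across all cores.

```python
# A minimal sketch of spreading a CPU-bound workload across all cores,
# using Python's standard multiprocessing module. busy_work is a
# hypothetical stand-in for a real task.
import multiprocessing as mp

def busy_work(n):
    # Stand-in CPU-bound task: sum of squares up to n
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [5_000_000] * mp.cpu_count()  # one chunk per core

    # Serial: everything runs on one core, one chunk after another
    serial = [busy_work(n) for n in chunks]

    # Parallel: the same chunks spread across all available cores
    with mp.Pool() as pool:
        parallel = pool.map(busy_work, chunks)

    assert serial == parallel  # same results, but the parallel run scales
```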
Would have liked to see how a 3570K does against the FX-8350 running the same CrossFire setup. IMO, the price/performance would have tipped further in favor of Intel in configs like this.
Lastly, would have liked some newer games like Sleeping Dogs, Far Cry 3, and Max Payne 3 in the benchmarks instead of the old BF3 single-player. I hear BF3's single-player doesn't stress CPUs that much; maybe BF3 skewed the benchmarks in favor of AMD as much as Skyrim favored Intel.
Why not just use two computers?
If you could buy $4 RAM instead of $40 RAM, but the $4 RAM made your system 50% slower, would you buy it? No, because it would make your $1000 PC perform like a $500 PC.
You can only do per-component value when you're only comparing one component. In this case, the graphics cards and CPUs were being tested as a pairing (just like the title says).
Again, I enjoyed reading the article. Get ready for b!tching by fanboys... Tom.
No need. My sister's FX-8350 kicks my 3570K's ass at 4.2 GHz consistently in most benchmarks. We both run GTX 480s.
What world do you live in? I paid 200 euros for my i5-3570K, while my sister's 8350 cost ~160 and gets better performance.
Why not use an AMD FX-6x00? They are cheaper, almost 60€ less in my country. You have compared the AMD FX-4x00 already, but I don't see any review or article using an FX-6x00, and I think it's the sweet spot for an all-in-one PC (gaming and work, with at least 8 GB of RAM).
Sorry for my English.
Your processor is only as good as the programming that supports it, and Intel pays developers to use code that favors its architecture and is missing on AMD's.
Can you run the whole test again with a $200 Intel quad-core, and ditch the old DX9 game engines, too?