This brings us to the fun part of the review: the FX’s gaming performance. Typically, when we review a CPU, we use a high-end GPU to avoid bottlenecking the host processor. That makes sense from a theoretical standpoint, but it isn't always practical. At what point does graphics hold back a CPU, anyway?
For this reason, when there are large differences between CPUs, we test each game twice: once with a high-end graphics card (in this case, AMD’s Radeon R9 295X2) and once with a more mainstream graphics card better balanced to match the FX. AMD's Radeon R9 270X or 285 are good matches.
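The logic behind the two-card methodology can be sketched in a few lines: if swapping in a much faster graphics card barely improves the frame rate, the CPU (not the GPU) is the limiting factor. The function and the numbers below are purely illustrative, not measurements from our test bench.

```python
def bottleneck(fps_fast_gpu, fps_slow_gpu, threshold=0.05):
    """Guess whether the CPU or the GPU limits frame rate.

    If a much faster GPU barely changes the result, the host
    processor (or the game engine) is the limiting component.
    """
    gain = (fps_fast_gpu - fps_slow_gpu) / fps_slow_gpu
    return "CPU-bound" if gain < threshold else "GPU-bound"

# Hypothetical results for an FX-class CPU with a high-end card
# (R9 295X2) versus a mainstream card (R9 285):
print(bottleneck(62.0, 60.0))  # tiny gain from the big card -> CPU-bound
print(bottleneck(95.0, 60.0))  # large gain from the big card -> GPU-bound
```

In practice the threshold is a judgment call, and frame-time consistency matters as much as average FPS, but this is the core of why we benchmark each game twice.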
When Does The Graphics Card Become The Limiting Factor?
DiRT 3's ability to scale makes it a good example. If you pair a fast CPU with a high-end graphics card, this title really flies. But if you handicap your platform with a lower-end GPU, the bottleneck becomes obvious.

Now, this isn't to say that matching a Radeon R9 270X to a Core i7-4790K is a good idea. However, if you allow graphics to become your limiting factor, the impact of CPU performance becomes less obvious. If anything, let our exploration serve to better inform you how component choice can dramatically alter the outcome of benchmark results.

We perform the same exercise in Battlefield 4, even though we measured a performance difference of only 30 percent with the high-end graphics card. First, let’s take a look at the original test:

In single-player mode, a Radeon R9 285 is enough to essentially level the playing field. It's our limiting component at Ultra detail settings and a 1920x1080 resolution. Of course, most folks still looking at Battlefield 4 are involved in the more CPU-taxing multi-player component. Unfortunately, that's difficult to benchmark reliably.

BioShock Infinite doesn’t have the reputation of being hard on hardware, but that doesn’t mean that the combination of a high-end graphics card and AMD's FX-8370E makes sense.

Once again, capping performance with a more mainstream GPU masks the potential of our various host processors. In the real world, pairing a Radeon R9 285 with an FX CPU makes sense. Substituting in a Core i7 won't yield dramatically better frame rates until you also step up to a much faster graphics configuration.

Gaming at 3840x2160 Resolution
It’s common knowledge that massive resolutions almost always lead to GPU bottlenecks. Consider it an exaggeration of what we just saw. Even at maxed-out settings, there’s almost no difference between the CPUs in spite of the relatively high frame rates. It basically doesn’t matter what CPU you pick because the graphics card creates the performance ceiling.

To put it nicely, the FX-8370E is a true middle-of-the-road CPU. Using it only makes sense as long as the graphics card you choose comes from a similar performance segment.
Depending on the game in question, AMD’s new processor has the potential to keep you happy around the AMD Radeon R9 270X/285 or Nvidia GeForce GTX 760 or 660 Ti level.
A higher- or even high-end graphics card doesn’t make sense, as pairing it with AMD's FX-8370E simply limits the card's potential.
If you presuppose that your sample is tainted, why bother doing the testing and the article in the first place? Perhaps this is a case where you should purchase the product off the shelf in order to better serve your readers.
I think we all get it: Vishera isn't exactly wonderful in single-core operations, but:
A) I have yet to see any software that requires A LOT of single-core power. It's 2014; if something is still single-core, it probably doesn't need all that power, or it's old enough that even Vishera is good at it.
B) You are comparing a 2012 architecture to a 4790K. It's like comparing a Pentium 4 to a Pentium G3258.
8150, 8320, 8320E, 8350, 8370E.
That would demonstrate the improvements of Vishera over Bulldozer, as well as any improvements offered by binning.
1) Almost every vendor does this: CPUs, graphics cards, etc.
2) The chip they received is exactly what you get when you buy it off the shelf; however, every CPU, GPU, etc. varies by a small amount. The vendors simply make sure that review sites get the top end of that group. In all honesty, we are probably talking 3% more performance than the majority, at most.
My 8320 will happily run at 3.5-3.6 GHz @ 1.15 V as long as Turbo Core is disabled.
I will probably get the 8320E for my office computer during Black Friday. $140 is the price right now but I prefer $125 or less for an AMD CPU.
Far too many people forget the whole cost of overclocking a chip. Sure, an 83XX at 4.5 GHz can slightly beat a stock i5, but at what cost? The 6300 is a far more compelling CPU for tweakers. If you're lucky on a few sales, you can get the chip, cooler, and motherboard for the same $200. And as pointed out here, unless you're pairing it with a top-shelf GPU, you won't see any gaming benefit from a pricier platform.
This is AMD's latest offering. The Haswell refresh is Intel's latest offering. Whatever the products' pedigrees, why shouldn't the two latest SKUs be compared?
AMD is embarrassing itself with these "new" releases. It is quite sad. I wonder how many more years they will milk "Piledriver"?
Agreed. This CPU needs a new (limited selection of) motherboards to operate, which is a minus point.
Anyway, we need to keep advocating well-balanced builds more often.
I see lots of people waste money on one overpowered part only to be limited by the other parts in their system
(the true potential of the system is nowhere to be seen).
Agreed. Too many people, including some I personally know, will throw a high-end K chip in their rig, match it with a $120 GPU while not wanting to overclock said CPU, and then get mad because they can't max out new titles. Recently, a friend's brand-new i7 rig was outrun by my overclocked FX rig in a bet on the Metro: Last Light benchmark, thanks to his GTX 650 versus my heavily overclocked R9 280X.
However, it seems that AMD won't be shipping any new CPU architectures until 2016. I'm doubtful that AMD will manage to push clock speeds much further in the near future, though 5 GHz is possible; a 200 W part will turn your PC into a space heater.
For the 2016 build, there's a chance that AMD may be revamping the CPU drastically, but there's also the chance that AMD will just give up. The third alternative is that they will release a CPU update for game consoles.
I'm also doubtful about the hybrid x86/ARM chip they want to make. In theory, it's sound, but I'm thinking of the complications from programming the thing, plus the potential for bugs.