Though we usually talk about average frames per second, milliseconds per frame is an even more important measure of playability, because individual frames that take a long time to render are jarring. In theory, a 91 FPS average could consist of ninety 10-ms frames plus a single 100-ms frame, and that one long frame is what ruins your experience.
Such spikes can happen on a single-GPU card. However, the complexities of synchronizing multiple GPUs make them more common in CrossFire and SLI configurations. We covered this micro-stutter effect in Micro-Stuttering And GPU Scaling In CrossFire And SLI, and plan to cover the phenomenon in more depth over the next couple of months.
Since an evenly-spread 20 FPS rate would consist of twenty 50-ms frames each second, we're using 50 ms as the cut-off for actual playability in today's analysis. Many gamers get annoyed by far shorter frame intervals (say, 30 ms), but those are more likely to simply bug you than to get you killed.
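To make the arithmetic concrete: frame time in milliseconds is just 1,000 divided by the instantaneous frame rate, so an even 20 FPS works out to 50 ms per frame. The minimal Python sketch below (illustrative only, not part of our test tooling) shows how a single 100-ms hitch can hide inside a healthy-looking average:

```python
# Frame times in milliseconds: ninety smooth 10-ms frames plus one 100-ms hitch.
frame_times_ms = [10.0] * 90 + [100.0]

total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds   # 91 frames / 1.0 s = 91 FPS

# Flag frames above our 50-ms playability cut-off (an even 20 FPS pace).
hitches = [t for t in frame_times_ms if t > 50.0]

print(f"Average: {average_fps:.0f} FPS")     # Average: 91 FPS
print(f"Frames over 50 ms: {len(hitches)}")  # Frames over 50 ms: 1
```

The average looks excellent, but the one frame that takes ten times as long as its neighbors is exactly the kind of stutter a plain FPS number conceals.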


The performance of our Radeon HD 7970s in CrossFire appears fairly similar on our AMD- and Intel-based platforms when we run at 1920x1080. Our FX-8350-based system encounters a couple of higher spikes, but the worst of them only reaches 40 ms.
It's worth noting that we're using Fraps to take these measurements (currently the only solution, short of capturing the output with a PCI Express-based frame grabber). Consequently, we're not representing the entire rendering pipeline. After comparing our recorded results to actual gameplay, however, we're confident that the most egregious performance interruptions are being illustrated. Moreover, we're not comparing SLI to CrossFire, so the frame-time spikes are truly attributable to each platform.
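For readers who want to reproduce this kind of analysis, Fraps' frametimes log records one row per frame with a cumulative timestamp in milliseconds, so per-frame intervals come from differencing successive rows. The sketch below is one plausible way to post-process such a log; the file name is hypothetical, and the 50-ms threshold is the cut-off discussed above:

```python
import csv

def load_frame_intervals(path):
    """Read a Fraps-style frametimes CSV and return per-frame intervals in ms."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        stamps = [float(row[1]) for row in reader if row]
    # Per-frame interval = difference between successive cumulative timestamps.
    return [b - a for a, b in zip(stamps, stamps[1:])]

intervals = load_frame_intervals("fraps_frametimes.csv")  # hypothetical file name
spikes = [(i, t) for i, t in enumerate(intervals) if t > 50.0]
print(f"{len(spikes)} frames exceeded 50 ms")
```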


At 4800x900, frame times simultaneously appear more spread out (the bulk of the graph is wider) and less spiky (the largest excursions are smaller). Both platforms seldom cross the 30-ms barrier, and the AMD-based machine only spikes to 40 ms once.
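Those are two different questions about the same distribution: percentiles describe how wide the bulk is, while the maximum captures the worst-case spikes. A short sketch (illustrative sample data, not our measured results) of how we separate the two:

```python
import statistics

def summarize(intervals_ms):
    """Split 'how wide is the bulk' (median, p95) from 'how bad are the spikes' (max)."""
    ordered = sorted(intervals_ms)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]  # simple nearest-rank 95th percentile
    return {
        "median_ms": statistics.median(ordered),
        "p95_ms": p95,
        "worst_ms": ordered[-1],
    }

# A platform can post a higher median and 95th percentile (wider bulk) while
# still showing a lower maximum (smaller spikes), as we see at 4800x900.
sample = [14, 15, 16, 17, 18, 19, 20, 22, 25, 40]
print(summarize(sample))
```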


You'll probably want to stop at 4800x900 or dial the detail settings back to the Medium preset if frame times of 30 ms and greater bother you. Ultra-quality details at this super-high resolution appear barely playable.
- Chasing Bottlenecks To Eyefinity (But Not Beyond)
- Test Settings And Benchmarks
- Results: 3DMark, Aliens Vs. Predator, And Metro 2033
- Metro 2033, Second By Second
- Results: Battlefield 3, F1 2012, And Skyrim
- Battlefield 3, Frame By Frame
- Skyrim, Frame By Frame
- Power And Efficiency
- Can AMD's FX Keep Up With Its Radeon HD 7970?
I disagree. What's needed is an even stronger push on developers to use more than four cores effectively, not 100% load on one core and 10% on the other five.
I thought more cores were for multi-tasking, as in having multiple programs running simultaneously. It would suck to turn on BF3 and have everything else running on my PC simply shut down because the CPU is at 100% utilization. How would I be able to play BF3 while streaming/playing some HD content on the TV that's hooked up to the same computer?
Single-core performance... look up some other benchmarks, like the ones where they use iTunes to encode things, or when WinZip went from single-core to multi-core; they show the GREAT difference more cores can make to performance.
The problem is that few games and few programs really scale. Sure, pro applications almost always take advantage of whatever you put in them, but consumer software is a different story.
More cores can offer more multitasking, but they also allow the load to be spread from one core across all four cores for more overall performance when the software is properly coded.
Woulda liked to see how a 3570K does against the FX-8350 running the same CFX setup. IMO, the price/perf woulda tipped further in favor of Intel in configs like this.
Lastly, woulda liked some newer games like Sleeping Dogs, Far Cry 3, and Max Payne 3 in the benches instead of the ol' BF3 single-player. I hear BF3 SP doesn't stress CPUs that much; maybe BF3 skewed the benches in favor of AMD as much as Skyrim favored Intel.
Why not just use two computers?
If you could buy $4 RAM instead of $40 RAM, but the $4 RAM made your system 50% slower, would you buy it? No, because it would make your $1000 PC perform like a $500 PC.
You can only do per-component value when you're only comparing one component. In this case, the graphics cards and CPUs were being tested as a pairing (just like the title says).
Again, I enjoyed reading the article. Get ready for b!tching by fanboys... Tom.
No need. My sister's FX-8350 consistently kicks my 3570K's ass at 4.2 GHz in most benchmarks. We both run GTX 480s.
What world do you live in? I paid 200 euros for my i5-3570K, while my sister's 8350 cost ~160 and gets better performance.
Why not use an AMD FX-6x00? They are cheaper, almost 60€ less in my country. You have compared the AMD FX-4x00 already, but I don't see any review or article using an FX-6x00, and I think it's the sweet spot for an all-in-one PC (gaming and work, with at least 8 GB of RAM).
Sorry for my English.
Your processor is only as good as the programming that supports it, and Intel pays developers to use code that favors its architecture and is missing on AMD's.
Can you run the whole test again with a $200 Intel quad-core, and ditch the old DX9 game engines, too?