Gaming Shoot-Out: 18 CPUs And APUs Under $200, Benchmarked

Now that Piledriver-based CPUs and APUs are widely available (and the FX-8350 is selling for less than $200), it's a great time to compare value-oriented chips in our favorite titles. We're also breaking out a test that conveys the latency between frames.

At least on the desktop, dual-core processors rarely did much to bolster performance when they were first introduced. Most mainstream apps simply hadn't been optimized for multiple cores; that sort of parallelism lived principally in the server and workstation space, where multi-socket motherboards full of single-core chips cranked on complex problems in parallel. But games were almost exclusively written to run on one core.

Programming with threading in mind isn't easy, and it took developers years to adapt to a world where CPUs seemed destined to improve performance through parallelism rather than the 10 GHz clock rates Intel had foreshadowed back in 2000. Slowly, though, the applications most able to benefit from multiple cores working in concert have been rewritten to exploit modern hardware.

Want proof? Just have a look at our benchmark suite. Only two pieces of software in it remain single-threaded: Lame and iTunes. Everything else, to one degree or another, is threaded. Content creation, compression, and even productivity apps tax the highest-end four- and six-core CPUs.

Games, on the other hand, have taken longer to "get there." With a primary emphasis on graphics performance, it's not surprising that single-threaded engines still exist. However, spawning additional threads and utilizing a greater number of cores allows ISVs to implement better artificial intelligence or add more rigid bodies that can be affected by physics.
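
To make that concrete, here's a minimal sketch of the pattern (hypothetical code, not lifted from any shipping engine): each frame, the main loop forks AI and physics onto their own worker threads, then joins them before the frame is submitted.

    #include <thread>

    // Hypothetical stand-ins for real engine systems.
    void updateAI()      { /* pathfinding, decision-making ... */ }
    void updatePhysics() { /* rigid-body integration, collisions ... */ }

    int main() {
        for (int frame = 0; frame < 3; ++frame) {
            // Fork: AI and physics each get their own core while the
            // main thread stays free to prepare render commands.
            std::thread ai(updateAI);
            std::thread physics(updatePhysics);

            // ... main thread builds the render queue here ...

            // Join: both systems must finish before the frame ships.
            ai.join();
            physics.join();
        }
        return 0;
    }

Every join is a synchronization point, which is part of why threading is hard; a quad-core chip simply lets more of those systems run simultaneously instead of taking turns on one core.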

Increasingly, then, we're seeing more examples of games exhibiting better performance on a quad-core processor. They're still the exception, though, rather than the rule. And that's why the great single-threaded performance of Intel's Sandy Bridge architecture (and later Ivy Bridge) dominated most of our processor-bound game testing. Back in the day, dual-core Pentiums went heads-up against quad-core CPUs from AMD and came out in the lead.

It's now clear that gunning for higher and higher clock rates is not the direction AMD and Intel are going. They're both building desktop-oriented CPUs with as many as four modules (in AMD's case) or six cores (in Intel's). In turn, game developers continue getting better about utilizing available on-die resources. We're clearly at a point where you need at least a dual-core CPU to enjoy today's hottest titles, if for no other reason than sticking with a single-core chip would put you about eight years back in processor technology. But is there a reason to skip over the dual-core models and jump right into the world of gaming on a quad-core CPU?

That's what we're hoping to answer today, and we have a new tool to help us.

276 comments
  • e56imfg
    Nice roundup. It's good to finally see some light shone on the new FX chips.
  • esrever
    Wow. Frame latencies are completely different from the results on The Tech Report. Weird.
  • ingtar33
    so... the amd chips test as good as the intel chips (sometimes better) in your latency test, yet your conclusion is yet again based on average FPS?

    what is the point of running the latency tests if you're not going to use it in your conclusion?
  • shikamaru31789
    I was hanging around on the site hoping this would finally get posted today. Looks like I got lucky. I'm definitely happy that newer titles are using more threads, which finally puts AMD back in the running, in the budget range at least. Even APUs look like a better buy now; I can't wait to see some Richland and Kaveri APU tests. If one of them has a built-in 7750, you could have a nice budget system, especially if you paired it with a discrete GPU for CrossFire.
  • hero1
    ingtar33: so... the amd chips test as good as the intel chips (sometimes better) in your latency test, yet your conclusion is yet again based on average FPS? what is the point of running the latency tests if you're not going to use it in your conclusion?


    Nice observation. I was wondering the same thing. It's time you provided a conclusion based on what you intended to test, not otherwise. You could state the FPS part after the fact.
  • Anik8
    I like this review. It's been a while, and at last we get to see some nicely rounded-up benchmarks from Tom's. I wish the GPU or game-specific benchmarks would be conducted in a similar fashion, instead of putting too much stress on bandwidth, AA, or settings that favor one particular company.
  • cleeve
    ingtar33: so... the amd chips test as good as the intel chips (sometimes better) in your latency test, yet your conclusion is yet again based on average FPS? what is the point of running the latency tests if you're not going to use it in your conclusion?


    We absolutely did take latency into account in our conclusion.
    I think the problem is that you totally misunderstand the point of measuring latency, and the impact of the results. Please read page 2, and the commentary next to the charts.

    To summarize, latency is only relevant if it's significant enough to notice. If it's not significant (and really, it wasn't in any of the tests we ran, except maybe in some dual-core examples), then, obviously, the frame rate is the relevant measurement.

    *IF* the latency *WAS* horrible, say, with a high-FPS CPU, then latency would be taken into account in the recommendations. But the latencies were very small, so they don't really factor in much. Any CPU that could handle at least four threads did great; the latencies are so small that they simply don't matter.
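
    If it helps, here's a toy sketch of the idea with made-up frame times (this isn't our actual test harness): the average FPS can look respectable while a high-percentile frame time exposes the spikes you'd actually feel.

        #include <algorithm>
        #include <cstdio>
        #include <vector>

        int main() {
            // Made-up frame times in milliseconds; two spikes in an
            // otherwise smooth run.
            std::vector<double> ms = {16.7, 16.9, 16.5, 48.0, 16.6,
                                      16.8, 16.7, 45.0, 16.6, 16.7};

            double total = 0.0;
            for (double t : ms) total += t;
            double avgFps = 1000.0 * ms.size() / total;

            // The 95th-percentile frame time shows how bad the worst
            // frames get, which an average hides.
            std::sort(ms.begin(), ms.end());
            double p95 = ms[static_cast<size_t>(0.95 * (ms.size() - 1))];

            std::printf("average: %.1f FPS, 95th-percentile frame: %.1f ms\n",
                        avgFps, p95);
            return 0;
        }

    Those numbers work out to roughly 44 FPS average with worst frames near 45 ms, almost triple a typical frame. That's the size of gap that has to show up before latency trumps frame rate, and it didn't in these tests.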
  • cleeve
    esrever: Wow. Frame latencies are completely different from the results on The Tech Report. Weird.


    Not really. We just report them a little differently in an attempt to distill the result. Read page 2.
  • cleeve
    Anik8: ...I wish the GPU or game-specific benchmarks would be conducted in a similar fashion, instead of putting too much stress on bandwidth, AA, or settings that favor one particular company.


    I'm not sure what you're referring to. When we test games, we use a number of different settings and resolutions.
  • znakist
    Well, it's good to see AMD return to the game. I'm an Intel fan, but with the recent update to the FX lineup I have more options. Good work, AMD.
  • wh3resmycar
    hey don.. where's dota 2? :(
  • Nintendo Maniac 64
    Hey Tom's, I think I may have found a bug with the new layout. Even though this article is stated to be "IN REVIEWS", it in fact doesn't appear on the "all reviews" page:
    http://www.tomshardware.com/articles/?articleType=review
  • cleeve
    wh3resmycar said:
    hey don.. where's dota 2? :(


    It's coming. Lots of other stuff to do, but it's coming. :)
  • kalliman
    Okay, but the Gigabyte 990FXA-UD5 costs $164, while the Gigabyte Z77X-UP7 costs $399. RAM timings and speed also favor Intel... This benchmark is completely inaccurate. Also, the 8350 is cheaper than the i5s.

    Exactly these benchmarks, or rather "benchmarks" like this one, mislead most of Intel's fanboys.
  • Nintendo Maniac 64
    Who would have thought? AMD's "MOAR CORES!" is actually paying off. :P
  • LORD_ORION
    BigMack70: You know what I think this article shows more than anything? How freaking awesome the Phenom II x4 / x6 chips were for low-midrange builds for their time.


    Except they weren't cheap for their time.
    e.g., the 945 was $280 for its first year? :\

    Spend that now and what do you get?
  • wh3resmycar
    can't move my wallet without dota 2 numbers. just can't.

    but then again, what's making AMD hard to swallow is the abysmal TDP ratings of their APUs. hopefully you guys can explicitly explain how an A8-5500 manages 65 W while an A8-5600K pulls 100 W with just a 300 MHz difference?

    or, with power constraints, what would be more effective: an Ivy Bridge Celeron + 6670, or an A6/A8 APU? apart from the usual load/idle, what about posting-in-a-forum power consumption?

    i would love to post my questions on the forums, but i'm pretty sure the thread'll just be ravaged by fanboys or wouldn't get a pertinent answer.


    thanks!!!
  • rdlazar
    Where would the Athlon II x4 750K be on that list?
  • The-Darkening
    Aren't the Intel G860's 13-13-13 latencies a bit too relaxed? I'm sure 10-10-10 would be possible. Any explanation for this? Just curious.
  • cangelini
    Nintendo Maniac 64: Hey Tom's, I think I may have found a bug with the new layout. Even though this article is stated to be "IN REVIEWS", it in fact doesn't appear on the "all reviews" page: http://www.tomshardware.com/articl [...] ype=review

    Because there are so many products, this was defined as a round-up. Unfortunately, it doesn't look like it's currently possible to use both Reviews and Round-ups as filters at the same time. So, they don't show up together. I'm going to pass this feedback back to France to see if Round-ups can be folded into Reviews.
  • The_Trutherizer
    How is it possible for the minimum to be higher than the average? Or am I missing something?
  • DarkSable
    BigMack70 said:
    You know what I think this article shows more than anything? How freaking awesome the Phenom II x4 / x6 chips were for low-midrange builds for their time.


    The Phenom II X4 Black Edition, from everything this test is saying, is STILL the best budget CPU by far.

    At least that's what it looks like to me. Too bad they didn't test the new Ivy Bridge Pentiums.
  • The_Trutherizer
    oh... as compared to the G860... -_- It is a confusing graph. It would be nice to have a non-relative metric.
  • ingtar33
    cleeve: We absolutely did take latency into account in our conclusion. I think the problem is that you totally misunderstand the point of measuring latency, and the impact of the results. Please read page 2, and the commentary next to the charts. To summarize, latency is only relevant if it's significant enough to notice. If it's not significant (and really, it wasn't in any of the tests we ran, except maybe in some dual-core examples), then, obviously, the frame rate is the relevant measurement. *IF* the latency *WAS* horrible, say, with a high-FPS CPU, then latency would be taken into account in the recommendations. But the latencies were very small, so they don't really factor in much. Any CPU that could handle at least four threads did great; the latencies are so small that they simply don't matter.


    very well.

    I don't envy you guys. CPUs these days are basically indistinguishable from each other on the high end. While benches can show minor differences, generally speaking no human can tell the difference between an i5-3570K, an i7-3770K, or an FX-6300/8350. The frame rates, user experience, and the rest are pretty much identical. The chips are close enough that you can find benches that favor the FX over the i series... not many... and of course there will be many that favor the i series. But the point is there is little performance difference...

    Even the A10-5800K... unless you're gaming in HD, is basically indistinguishable from an i7-3770K with a discrete GPU... granted, no one who can afford an i7 would play games at anything less than HD, but the point stands... the user experience is basically identical at 720p and lower.

    So I can't imagine it's easy to do your job here, as you're being asked to judge something you can't judge with your eyes... and left to the small percentages and milliseconds of difference between CPUs...