
Results: Gaming

AMD A10-7800 APU Review: Kaveri Hits the Efficiency Sweet Spot

We want to test AMD's APU as it sits on the motherboard. I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X, even in CrossFire. Such a configuration makes little sense from the cost and technical angles. Yes, AMD officially recommends it and yes, we tried it out. But the much faster discrete card is a mismatch for the on-die engine. We even experienced detrimental effects like stuttering from this odd couple. If you want to go the add-in route, take a look at the Athlon X4 750K (or Pentium G3258) instead.

Let's instead stick to a setup better suited to the strengths of an APU: a more highly integrated all-in-one system with as few external components as possible.

Metro: Last Light

Both upper-end APUs will run at 1080p in this game, but they barely manage to average over 30 FPS. You're better off at 720p, where even the A10-7700K generates playable performance.

Battlefield 4

Battlefield 4 is another AAA title that can't be ignored. In single-player mode, however, where the graphics subsystem is emphasized, the APUs have a tough time maintaining adequate frame rates. You really need to run this one at 720p. Fortunately, the game looks better at that resolution than the compromise sounds.

BioShock Infinite

Since the third game in the BioShock franchise is a thinly disguised console port, its frame rates are adequate to good at 1080p.

The A10-7800 lets you play older games at medium quality settings and newer titles at lower detail without discrete graphics. If you run into frame rate issues, consider lower resolutions. In other words, a Kaveri-based APU by itself is suitable for low-end gaming PCs. Since AMD's processors also drive both of the latest game consoles, we don't expect this situation to change any time soon. We didn't cover Mantle here because most games don't use it.

Comments
  • 2 Hide
    tiger15 , August 19, 2014 1:49 AM
    You are stressing power efficiency.
    What about comparing those numbers with other offerings? (Intel?)
  • 8 Hide
    Memnarchon , August 19, 2014 2:09 AM
    Quote:
    Just to wonder if Microsoft or Sony were to put this chip in their next gaming consoles and give those gamers a fighting chance.


    Maybe the new consoles lack CPU power (even if they are 8-core, the 1.6GHz/1.75GHz clocks cripple them), but their GPU part is far more powerful than existing APUs.
    The PS4's GPU has cores like the 7870 and the XBOX1 has cores like the 7790; in other words, more powerful than the 512-core R7 in today's best APU, the A10-7850K.
  • 10 Hide
    damric , August 19, 2014 3:36 AM
    Why is this so late to market?
  • 5 Hide
    Cryio , August 19, 2014 3:46 AM
    Wait. You can now CrossFire A10 7850 with GPUs other than the 240 and 250X?

    I have a friend with a 7850K and a 260X and he's dying to know if he can CrossFire.

    "I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?
  • 6 Hide
    gadgety , August 19, 2014 4:33 AM
    The A8-7600 seems to be the efficiency sweet spot in the Kaveri lineup, especially at 45W. I tried to compare the A10-7800 with the A8-7600, although as far as I can tell just about ALL your tests seem to be done at different settings (e.g. BioShock Infinite is run at the Medium Quality preset rather than the lowest settings as in the test of the A10-7800), so the comparison isn't straightforward. The A8-7600 is within 91-94% of the A10-7850K. One item which is comparable is video encoding in Handbrake, where the A8-7600 is at 92.8% of the 7850K, whereas the A10-7800 is at 95.7% of the 7850K. Price-wise you'd pay a 63% premium for the A10-7800 over the A8-7600 to get an extremely minute performance advantage, around 3% or so.
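
For reference, gadgety's arithmetic checks out; a minimal sketch in Python (the Handbrake percentages come from the comment above, while the two street prices are illustrative assumptions rather than figures quoted in the review):

        # Handbrake throughput relative to the A10-7850K, from the comment above
        a8_7600_rel  = 0.928   # A8-7600
        a10_7800_rel = 0.957   # A10-7800

        perf_gain = a10_7800_rel / a8_7600_rel - 1
        print(f"A10-7800 performance advantage: {perf_gain:.1%}")   # ~3.1%

        # Illustrative street prices (hypothetical, not quoted anywhere in the review)
        price_a8_7600, price_a10_7800 = 95.0, 155.0
        premium = price_a10_7800 / price_a8_7600 - 1
        print(f"A10-7800 price premium: {premium:.1%}")             # ~63%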
  • -4 Hide
    Drejeck , August 19, 2014 4:37 AM
    Quote:
    Quote:
    Just to wonder if Microsoft or Sony were to put this chip in their next gaming consoles and give those gamers a fighting chance.


    Maybe the new consoles lack CPU power (even if they are 8-core, the 1.6GHz/1.75GHz clocks cripple them), but their GPU part is far more powerful than existing APUs.
    The PS4's GPU has cores like the 7870 and the XBOX1 has cores like the 7790; in other words, more powerful than the 512-core R7 in today's best APU, the A10-7850K.

    Not accurate.
    The PS4 GPU is a crippled and downclocked 7850 (disabling cores improves redundancy and means fewer dead chips).
    The XB1 GPU is a crippled and downclocked R7 260X (as above) and, like the 7790, should have AMD TrueAudio onboard, but they could have changed that. This actually means that CPU-intensive and low-resolution games are going to suck, because the 8 cores are just Jaguar netbook processors.
    The reality is that the PS4 is almost CPU limited already and the XB1 is more balanced. Now that we've finished speaking of "sufficient" platforms, let's talk about the fact that a CPU from AMD and the word "efficient" appear in the same sentence.
  • -4 Hide
    Memnarchon , August 19, 2014 4:50 AM
    Quote:

    Not accurate.
    The PS4 GPU is a crippled and downclocked 7850 (disabling cores improves redundancy and means fewer dead chips).
    I think you need to do a little more research; from "Reverse engineered PS4 APU reveals the console's real CPU and GPU specs": "Die size on the chip is 328 mm sq, and the GPU actually contains 20 compute units — not the 18 that are specified. This is likely a yield-boosting measure, but it also means AMD implemented a full HD 7870 in silicon."
    Quote:
    The XB1 GPU is a crippled and downclocked R7 260X (as above) and, like the 7790, should have AMD TrueAudio onboard, but they could have changed that. This actually means that CPU-intensive and low-resolution games are going to suck, because the 8 cores are just Jaguar netbook processors.
    The reality is that the PS4 is almost CPU limited already and the XB1 is more balanced. Now that we've finished speaking of "sufficient" platforms, let's talk about the fact that a CPU from AMD and the word "efficient" appear in the same sentence.

    The PS4 will be CPU limited? Since they write the code/API for hardware that will remain the same for 7-8 years, such a thing as being CPU limited, especially on a console that runs the majority of games at 1080p, does not exist...
    PS: I agree with the downclocked part, since they need to save as much power as they can...
  • 5 Hide
    blubbey , August 19, 2014 5:24 AM
    Quote:
    Quote:
    Quote:
    Just to wonder if Microsoft or Sony were to put this chip in their next gaming consoles and give those gamers a fighting chance.


    Maybe the new consoles lack CPU power (even if they are 8-core, the 1.6GHz/1.75GHz clocks cripple them), but their GPU part is far more powerful than existing APUs.
    The PS4's GPU has cores like the 7870 and the XBOX1 has cores like the 7790; in other words, more powerful than the 512-core R7 in today's best APU, the A10-7850K.

    Not accurate.
    The PS4 GPU is a crippled and downclocked 7850 (disabling cores improves redundancy and means fewer dead chips).
    The XB1 GPU is a crippled and downclocked R7 260X (as above) and, like the 7790, should have AMD TrueAudio onboard, but they could have changed that. This actually means that CPU-intensive and low-resolution games are going to suck, because the 8 cores are just Jaguar netbook processors.
    The reality is that the PS4 is almost CPU limited already and the XB1 is more balanced. Now that we've finished speaking of "sufficient" platforms, let's talk about the fact that a CPU from AMD and the word "efficient" appear in the same sentence.


    The PS4 is 1152:72:32 at 800MHz; the 7850 is 1024:64:32 at 900MHz or so (860MHz at release?). It is not a "crippled 7850"; the 7850 is a crippled Pitcairn (20 CUs is the full-fat 7870, the PS4 has 18, the 7850 has 16 CUs). "CPU limited" is very PC-oriented thinking; things like offloading compute to the GPU will help. No, I'm not saying their CPUs are "good", but they will find ways of offloading that work onto the GPU.
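
The shaders:TMUs:ROPs shorthand blubbey uses maps directly onto peak compute throughput. A minimal sketch, assuming the clocks quoted in the comment and the usual 2 FLOPs per shader per clock for GCN parts:

        # Peak single-precision throughput for a GCN GPU: shaders x 2 FLOPs x clock
        FLOPS_PER_SHADER_PER_CLOCK = 2

        def gflops(shaders, clock_mhz):
            return shaders * FLOPS_PER_SHADER_PER_CLOCK * clock_mhz / 1000.0

        print(f"PS4 GPU: {gflops(1152, 800):.0f} GFLOPS")   # ~1843
        print(f"HD 7850: {gflops(1024, 860):.0f} GFLOPS")   # ~1761

On paper, the PS4's wider GPU ends up slightly ahead despite its lower clock, which supports the point that it is not simply a cut-down 7850.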
  • 5 Hide
    silverblue , August 19, 2014 5:34 AM
    Quote:
    The A8-7600 seems to be the efficiency sweet spot in the Kaveri lineup, especially at 45W. I tried to compare the A10-7800 with the A8-7600, although as far as I can tell just about ALL your tests seem to be done at different settings (e.g. BioShock Infinite is run at the Medium Quality preset rather than the lowest settings as in the test of the A10-7800), so the comparison isn't straightforward. The A8-7600 is within 91-94% of the A10-7850K. One item which is comparable is video encoding in Handbrake, where the A8-7600 is at 92.8% of the 7850K, whereas the A10-7800 is at 95.7% of the 7850K. Price-wise you'd pay a 63% premium for the A10-7800 over the A8-7600 to get an extremely minute performance advantage, around 3% or so.


    Yes, but the A8-7600 has a 384-shader GPU. I suppose it depends on whether you want to use the GPU or not.
  • 6 Hide
    curtisgolen , August 19, 2014 7:54 AM

    "I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?

    This needs to be explained more... There are a lot of people who would love to use a 260X, let alone a 265X.
  • 1 Hide
    IInuyasha74 , August 19, 2014 8:25 AM
    Quote:

    "I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?

    This needs to be explained more... There are a lot of people who would love to use a 260X, let alone a 265X.



    Yes, chances are you can CrossFire them without having a crash or something, but it's a terrible idea to do it. Instead of increasing your performance in games, your FPS would drop by more than half and your power consumption would increase greatly. It's never a good idea to CrossFire with the integrated graphics; it just doesn't go well.
  • 2 Hide
    atminside , August 19, 2014 9:30 AM
    I like that AMD is working hard on getting better at efficiency, but I am really disappointed that AMD has no future plans for an AM3+ or AM4 derivative for high-end CPUs. I love my Phenom II X4 955, old as it is, but with Intel being so expensive and AMD not having delivered a better platform to justify an upgrade, I have been stuck with my 790XTA-UD4 mobo and 955. Hope AMD will come around and make plans for a new high-end or performance CPU, not just APUs.
  • 1 Hide
    ingtar33 , August 19, 2014 9:55 AM
    Anyone else notice that Kaveri was within 5% of the IPC of Sandy Bridge at similar clock speeds...

    How did we miss this when Kaveri came out?
  • 1 Hide
    LionD , August 19, 2014 10:04 AM
    I agree with gadgety - the REAL sweet spot is the A8-7600. In all the tests I could find, it shows 90%+ of the A10-7800's performance - and I mean GPU tests, never mind the CPU. So the ridiculous cost of the A10-7800 just makes no sense.
  • 2 Hide
    LionD , August 19, 2014 10:12 AM
    Quote:

    Yes, but the A8-7600 has a 384-shader GPU. I suppose it depends on whether you want to use the GPU or not.

    Still, the non-synthetic GPU-related tests (gaming, OpenCL) show little difference between the A10-7800 and the A8-7600. In most cases the gap falls within 10% and NEVER reaches the theoretical 25% - not even 20%.
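
The "theoretical 25%" depends on which part is the baseline; a quick sketch, assuming both GPUs run at the same nominal 720MHz that AMD lists for the A10-7800 and A8-7600:

        a10_7800_shaders = 512
        a8_7600_shaders  = 384   # same nominal GPU clock on both parts

        deficit = 1 - a8_7600_shaders / a10_7800_shaders   # A8-7600 relative to A10-7800
        surplus = a10_7800_shaders / a8_7600_shaders - 1   # A10-7800 relative to A8-7600
        print(f"A8-7600 shader deficit : {deficit:.0%}")   # 25%
        print(f"A10-7800 shader surplus: {surplus:.0%}")   # 33%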
  • 1 Hide
    falchard , August 19, 2014 10:26 AM
    AMD will probably go with a new naming convention if it makes a new desktop mobo socket. More than likely, the days of not getting graphics on the chip are over. I think AMD is going to always have an APU from this day forward. So it probably means you should wait for the next APU architecture and not invest in this one. As we all know, these APUs are pretty much dual cores posing as almost four cores. Most applications will treat them as dual cores. The next APU architecture will have complete cores and be more worthwhile to invest in.
    The mobo I am looking at next is probably going to be a server-based board. When the AMD Socket G34 was released, there were a few desktop variants.
  • 1 Hide
    smoohta , August 19, 2014 10:28 AM
    I agree with tiger15 - these power consumption/efficiency graphs are interesting but utterly useless without comparing them to other offerings.

    Also, I was wondering whether you could expand on the HSA benchmark - it sounds very interesting, but you offer no information on what it actually does (except that it was originally provided by AMD)...
    For example: how much data does this benchmark actually use?
    Did you try increasing/decreasing the amount of data to see where HSA starts being effective?

    Also, in HSA, comparing between the processors by percentage seems pretty misleading (and is not the way it is done in other benchmarks)... Is it possible to add absolute measurements here?
  • 0 Hide
    icemunk , August 19, 2014 10:33 AM
    This makes me want a little FM2 system even more... just a low-power-consumption device that can play some pretty decent games at reasonable frame rates. The biggest downfall for me right now is the lack of cheap FM2 mITX boards available. The cheapest I've seen are sitting in the $130-160 range, which is far too expensive. If I could have a cheap little $50-60 mITX mobo along with this APU, in a little mini-ITX case at a reasonable price ($200-300), I would buy one today. I refuse to pay $150 for a mITX board, though.
  • 1 Hide
    curtisgolen , August 19, 2014 10:35 AM
    Quote:
    Quote:

    "I see no point in buying a processor that emphasizes on-die graphics and then adding a Radeon R7 265X. Yes, AMD officially recommends it and yes, we tried it out." Can I take this as a yes ?

    This needs to be explained more... There are a lot of people who would love to use a 260X, let alone a 265X.



    Yes, chances are you can CrossFire them without having a crash or something, but it's a terrible idea to do it. Instead of increasing your performance in games, your FPS would drop by more than half and your power consumption would increase greatly. It's never a good idea to CrossFire with the integrated graphics; it just doesn't go well.


    That is not true according to AMD. I could not find it on AMD's website, but read this: http://wccftech.com/amd-kaveri-dual-graphics-works-ddr3-memory-based-radeon-r7-gpus/