AMD Ryzen 5 2400G Review: Zen, Meet Vega


SoC & Chipset Connectivity

As we've discussed, Raven Ridge gets eight PCIe 3.0 lanes for add-in graphics, rather than Zeppelin's 16, along with four general-purpose lanes. This isn't a deal-breaker, though. Modern graphics cards (even the high-end ones) don't fully utilize wide PCIe links. Moreover, these processors include capable on-die graphics.
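For a rough sense of what those lane counts mean, PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, which works out to roughly 0.985 GB/s of one-way bandwidth per lane. A back-of-the-envelope sketch (our illustration, not a benchmark):

```python
def pcie3_bandwidth_gbs(lanes: int) -> float:
    """Theoretical one-way PCIe 3.0 link bandwidth in GB/s.

    8 GT/s per lane, 128b/130b encoding, 8 bits per byte.
    """
    per_lane = 8e9 * (128 / 130) / 8 / 1e9  # ~0.985 GB/s per lane
    return lanes * per_lane

print(f"x8:  {pcie3_bandwidth_gbs(8):.1f} GB/s")   # ~7.9 GB/s
print(f"x16: {pcie3_bandwidth_gbs(16):.1f} GB/s")  # ~15.8 GB/s
```

Even ~7.9 GB/s is rarely a bottleneck for a single graphics card, which is why the x8 link isn't a deal-breaker here.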

The processor also has its own USB and SATA controllers, which complement the I/O you get from an X370, B350, or A320 chipset.

| | USB 3.2 Gen2 | USB 3.1 Gen1 | USB 2.0 | PCIe Gen3 (Gfx) | PCIe Gen2 (General Use) | SATA | SATA Express |
|---|---|---|---|---|---|---|---|
| Raven Ridge | 4 | 1 | 1 | 8 lanes | 4 lanes | 2 | - |
| X370 Chipset Provides | 2 | 6 | 6 | - | 8 | 4 | 2 (or 4 more SATA) |
| B350 Chipset Provides | 2 | 2 | 6 | - | 6 | 2 | 2 (or 4 more SATA) |
| A320 Chipset Provides | 1 | 2 | 6 | - | - | 2 | - |


Raven Ridge supports FreeSync with compatible displays and motherboards. It also supports HDCP 1.4/2.2 for streaming 4K+HDR content. AMD plans to ship a production-class PlayReady 3.0 DRM graphics driver in Q3, which you'll need to stream 4K content from Netflix. Wireless display via Miracast is also supported.

The processors sport a wide range of hardware-accelerated video encode and decode features that execute on Vega's Video Core Next (VCN) silicon. Naturally, the most important comparison is to Intel's UHD Graphics 630, which has a broader range of hardware-accelerated video encode capabilities, such as MPEG-2, VP8, and VP9 8-bit. AMD does support VP9 10-bit decode, which Intel has yet to offer.

The Vega Graphics Engine

Chris Angelini covered the Vega architecture in-depth in our AMD Radeon Vega RX 64 8GB Review, so check that story out for more detail on Vega itself.

Ryzen 5 2400G features a Vega-based graphics engine with 11 Compute Units, while the lower-end Ryzen 3 2200G includes eight CUs. The 2400G wields 44 texture units (four per CU), 704 Stream processors, and 16 ROPs. That's an impressive list of resources crammed next to a quad-core CPU. But it pales in comparison to Radeon RX Vega 64's 4096 Stream processors.

AMD uses the same Raven Ridge die for its mobile and desktop products. As such, Ryzen 5 2400G looks a lot like the Ryzen Mobile 7 2700U, though the 2400G features an extra CU. It also has a lower maximum graphics frequency of 1250 MHz compared to the 2700U's 1300 MHz.

Ryzen 3 2200G and the Ryzen Mobile 5 2500U both have eight CUs, and they share the same 1100 MHz peak graphics clock rate.

Of course, comparisons to Intel's Kaby Lake-G family are inevitable. Those Intel models come with two flavors of Radeon RX Vega graphics: 100W processors featuring "GH" graphics and 65W models with "GL" graphics.

The GH implementation sports 24 CUs and 1536 Stream processors. It features a base clock of 1063 MHz that accelerates up to 1190 MHz, plus 4GB of HBM2 (4-hi stack) directly attached via Intel's EMIB technology. Single-precision performance tops out at 3.7 TFLOPS, compared to the 2400G's 1.76 TFLOPS.

Taking a step down, the GL graphics engine features 20 CUs. Lower base/boost frequencies of 931 and 1011 MHz, respectively, further differentiate the two configurations. Intel does maintain 4GB of HBM2. But peak compute performance falls to 2.6 TFLOPS compared to the 2200G's 1.126 TFLOPS. 
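The peak-compute figures above follow from simple arithmetic: each Vega CU carries 64 Stream processors, and each shader retires two FLOPs per clock via fused multiply-add. A quick sketch using the clocks quoted in the text:

```python
def stream_processors(cus: int) -> int:
    """64 Stream processors per Vega Compute Unit."""
    return cus * 64

def peak_tflops(shaders: int, clock_mhz: int) -> float:
    """Single-precision peak: 2 FLOPs per shader per clock (FMA)."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(peak_tflops(stream_processors(11), 1250))  # Ryzen 5 2400G -> 1.76
print(peak_tflops(stream_processors(24), 1190))  # Vega GH       -> ~3.7
print(peak_tflops(stream_processors(20), 1011))  # Vega GL       -> ~2.6
print(peak_tflops(stream_processors(8),  1100))  # Ryzen 3 2200G -> ~1.13
```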

Aside from their brawnier allocation of CUs, Intel's Kaby Lake-G models also benefit from HBM2 and its massive throughput advantage, whereas Raven Ridge is fed by much slower DDR4 system memory. Overclocking will help augment AMD's stock graphics performance, but Intel enjoys a clear leg up in frame rate comparisons.
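That throughput gap is easy to quantify. A sketch, assuming dual-channel DDR4-2933 (Raven Ridge's officially supported data rate) against Kaby Lake-G's 1024-bit HBM2 stack at 1.6 GT/s:

```python
def ddr4_dual_channel_gbs(mts: int) -> float:
    # 2 channels x 8 bytes per transfer x mega-transfers per second
    return 2 * 8 * mts * 1e6 / 1e9

def hbm2_gbs(bus_width_bits: int, gts: float) -> float:
    # bus width in bytes x giga-transfers per second
    return (bus_width_bits / 8) * gts

print(ddr4_dual_channel_gbs(2933))  # ~46.9 GB/s, shared with the CPU cores
print(hbm2_gbs(1024, 1.6))          # 204.8 GB/s, dedicated to graphics
```

Roughly a 4x advantage for the HBM2 configuration, before you account for the fact that Raven Ridge's graphics engine also shares that DDR4 bandwidth with the CPU cores.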

As an aside, AMD announced the Radeon Vega Mobile at this year's CES. It features HBM2 and the same 1.7mm Z-height as Intel's Kaby Lake-G processors. With Kaby Lake-G going into Intel's NUC form factor, there's a chance we could see AMD take a similar path to the desktop. That would give those CUs a lot more bandwidth to work with, if the company could incorporate the solution into a high-end processor. This also raises questions of whether Intel would make EMIB available to AMD.



Paul Alcorn
Managing Editor: News and Emerging Tech

Paul Alcorn is the Managing Editor: News and Emerging Tech for Tom's Hardware US. He also writes news and reviews on CPUs, storage, and enterprise hardware.

  • InvalidError
    Looking at Zeppelin and Raven dies side by side, proportionally, Raven seems to be spending a whole lot more die area on glue logic than Zeppelin did. Since the IGP takes the place of the second CCX, I seriously doubt its presence has anything to do with the removal of 8x PCIe lanes. Since PCIe x8 vs x16 still makes very little difference on modern GPUs where you're CPU-bound long before PCIe bandwidth becomes a significant concern, AMD likely figured that nearly nobody is going to pair a sufficiently powerful GPU with a 2200G/2400G for PCIe x8 to matter.
  • Olle P
    1. Why did you use 32GB RAM for the Coffee Lake CPUs instead of the very same RAM used for the other CPUs?

    2. In the memory access tests I fail to see the relevance of comparing to higher-tier Ryzen/Threadripper. Would rather see comparison to the four-core Ryzens.

    3. Why not also test overclocking with the Stealth cooler? (Works okay for Ryzen 3!)

    4. Your comments about Coffee Lake on the last page:
    "Their locked multipliers ... hurt their value proposition...
    ... a half-hearted attempt to court power users with an unlocked K-series Core i3, ... it requires a Z-series chipset..."
    As of right now all Coffee Lake CPUs require a Z-series chipset, so that's not an added cost for overclocking. I'd say a locked multiplier combined with the demand for a costly motherboard is even worse. (This is supposed to change soon though.)
  • AgentLozen
    Tom's must think highly of this APU to give it the Editor's Choice award. It seems to be your best bet for an extremely limited budget.

    I totally understand if you only have a few hundred dollars to build your PC with and you desperately want to get in on some master race action. That's the situation where the 2400G shines brightest. But the benchmarks show that games typically don't run well on this chip. They DO work under the right circumstances, but GTAV isn't as fun to play at low settings.

    Buying a pre-built PC from a boutique with a GeForce 1050Ti in it will make your experience noticeably better if you can swing the price.
  • akamateau
    What most writers and critics of integrated graphics processors such as AMD's APU or Intel iGP all seem to forget, is not EVERYONE in the world has a disposable or discretionary income equal to that of the United States, Europe, Japan etc. Not everyone can afford bleeding edge gaming PC's or laptops. Food, housing and clothing must come first for 80% of the population of the world.

    An APU can grant anyone who can afford at least a decent basic APU the enjoyment of playing most computer games. The visual quality of these games may not be up to the arrogantly high standards of most western gamers, but then again these same folks who are happy to have an APU also can not barely afford a 750p crt monitor much less a 4k flat screen.

    This simple idea is huge not only for the laptop and pc market but especially game developers who can only expect to see an expansion of their Total Addressable Market. And that is good for everybody as broader markets help reduce the cost of development.

    This in fact was the whole point behind AMD's release of Mantle and Microsoft and The Kronos Group's release of DX12 and Vulkan respectively.

    Today's AMD APU has all of the power of a GPU Add In Board of not more than a several years back.
  • Dark Lord of Tech
    Graphics still too weak , a card is still needed.
  • Blas
    "Meanwhile, every AMD CPU is overclockable on every Socket AM4-equipped motherboard" (in the last page)
    That is not correct, afaik, not for A320 chipsets. It is for B350 and X370, though.
  • salgado18
    "with a GeForce 1050Ti in it will make your experience noticeably better if you can swing the price."
    "a card is still needed"

    You do realize that these CPUs have an integrated graphics chip as strong as a GT 1030, right? And that you are comparing a ~$90 GPU to a ~$220 GPU?

    If you can swing the price, grab a GTX 1080ti already, and let us mITX/poor/HTPC builders enjoy Witcher 3 in 1080p for a fraction of the price ;)
  • InvalidError
    20700012 said:
    but then again these same folks who are happy to have an APU also can not barely afford a 750p crt monitor much less a 4k flat screen.
    When 1080p displays are available for as little as $80, there isn't much point in talking about 720p displays. I'm not even sure I can still buy one of those even if I wanted to unless I shopped used. (But then I could also shop for used 1080p displays and likely find one for less than $50.)

    The price of 4k TVs is coming down nicely, I periodically see some 40+" models with HDR listed for as little as $300, cheaper than most monitors beyond 1080p.

    20700022 said:
    Graphics still too weak , a card is still needed.
    Depends for who, not everyone is hell-bent on playing everything at 4k Ultra 120fps 0.1% lows. Once the early firmware/driver bugs get sorted out, it'll be good enough for people who aren't interested in shelling out ~$200 for a 1050/1050Ti alone or $300+ for anything beyond that. If your CPU+GPU budget is only $200, that only buys you a $100 CPU and GT1030 which is worse than Vega 11 stock.

    If my current PC had a catastrophic failure and I had to rebuild in a pinch, I'd probably go with the 2400G instead of paying a grossly inflated price for a 1050 or better.
  • Istarion
    People come here expecting to find an overclockeable 4 core with a 1080-like performance for 160$. And a good cooler. I'd love to be so optimistic :D

    Summarizing: we are saving around 50-100$ for the same low-end performance. That's 25% to 40% cheaper. What are we complaining about?!? I'd be partying right now if that happened in high-end too!!! 300$ for a 1080...

    All those comments saying "too weak", or "isn't fun to play at low settings", seriously, travel around the globe or just open your mind, there's poor people in 90% of the world, do you think they'll buy a frakking 1080 and a 8700k?!?

    And there's even non-poor people that doesn't care about good graphics! Go figure!
    Otherwise, why there are pixel graphics games all over the place? Or unoptimized/breaking early access games??

    I have a high-end pc and still lower fps to minimum for competitive play, so I won't see any difference between a 1080Ti vs a 1070 (250 vs 170fps, who's gonna see that, my cat?!? No 'cause my monitor is not fast enough!).
  • rush21hit
    As a cyber cafe owner, I would love to replace my old A5400s to the lower R3.

    Except that the DDR4 sticks went crazy expensive over here. FML