AMD Ryzen 5 2400G Review: Zen, Meet Vega

14nm+, Precision Boost 2 & Power Management

14nm+ & Precision Boost 2

According to AMD, its 14nm+ process is denser and more power-efficient than the 14nm node it was using previously. However, the company isn't sharing much beyond those claims. To be clear, this is not the GlobalFoundries 12nm LP process that AMD will transition to in April when the Zen+ processors are expected to launch. That new process will provide even more of a performance boost over the current 14nm+ LPP FinFET.

We do know that 14nm+ enables higher frequencies at a given voltage, which AMD turns into higher base and boost clocks. The company also improved its Precision Boost 2 feature, which is comparable to Intel's multi-core Turbo Boost technology.

Precision Boost 2 is a DVFS (Dynamic Voltage Frequency Scaling) implementation designed to improve performance in multi-threaded workloads. AMD's current-gen Ryzen processors only offer dual-core or all-core boost frequencies. But the Precision Boost 2 algorithms operate on anywhere from one to eight active threads. This should help Ryzen 5 2400G capitalize on the architecture's already-strong threaded performance. AMD can also now control the frequency and voltage of each core independently (in the past, Ryzen processors could only adjust each CCX as an entire unit).

This technology should help when relatively light threads keep other cores active. These lighter threads don't utilize a given core fully, but because the core is working on something, it can still cause the processor to drop from its dual-core turbo setting into a slower all-core frequency. Game engines are notorious for this type of behavior, often running several helper threads (such as audio) on different cores.

AMD doesn't share a list of specific multi-core Precision Boost bins because the algorithm is truly opportunistic and will boost to different frequencies based upon temperature, current, and load. That isn't too surprising—Intel also stopped sharing its multi-core Turbo Boost ratios for similar reasons.
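To make "opportunistic" concrete, here's a toy model — our own sketch, not AMD's actual algorithm; every frequency, limit, and scaling curve below is an invented placeholder — of how a boost decision might combine active-core count with temperature and current telemetry rather than fixed bins:

```python
# Toy model of an opportunistic boost decision in the spirit of
# Precision Boost 2. All numbers are invented for illustration.

BASE_MHZ = 3600       # hypothetical guaranteed base clock
MAX_BOOST_MHZ = 3900  # hypothetical single-core boost ceiling

def boost_frequency(active_cores: int, temp_c: float, current_a: float,
                    temp_limit_c: float = 95.0,
                    current_limit_a: float = 100.0) -> int:
    """Pick a per-core target frequency from live telemetry.

    Instead of fixed dual-core/all-core bins, the target scales smoothly
    with the number of active cores (1-8), then gets derated by whichever
    physical limit (thermal or electrical) has the least headroom.
    """
    # Smoothly lower the ceiling as more cores become active.
    load_scale = 1.0 - 0.6 * (min(active_cores, 8) - 1) / 7
    target = BASE_MHZ + (MAX_BOOST_MHZ - BASE_MHZ) * load_scale

    # Opportunistic part: boost freely while below both limits; throttle
    # proportionally once either limit is exceeded.
    headroom = min(temp_limit_c / max(temp_c, 1e-9),
                   current_limit_a / max(current_a, 1e-9),
                   1.0)
    return int(target * headroom)
```

With these invented limits, a single lightly loaded core lands at the full ceiling, eight active cores settle at an intermediate frequency rather than a single all-core bin, and crossing the thermal limit pulls the target down regardless of load.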

Precision Boost 2 is intricately woven into AMD's SenseMI suite. For instance, Pure Power uses an array of 1,000 sensors to monitor those critical parameters, enabling real-time adjustments. This information flows through the Infinity Fabric, a coherent control and data interface that services six clients in the SoC: the multimedia engines, display engine, DDR4 memory controllers, I/O and system hub, host processing cores, and graphics engine. AMD split the Infinity Fabric into separate control and data planes to optimize performance and granularity (1ms intervals) for the real-time telemetry data.

Power Enhancements

As with any product destined for mobile applications, power is key. Raven Ridge-based SoCs have the ability to shut down different blocks in order to curb consumption. The SoC also uses internal and external (on the motherboard) voltage regulators that communicate with each other, but operate independently. This allows the processor to deactivate a regulator when it isn't needed, dropping the chip into a lower power state.

Intel's Kaby Lake and AMD's Bristol Ridge processors feature two power rails, one dedicated to the CPU and another dedicated to the GPU. Raven Ridge employs a single rail for both regions to enable power sharing. This allows the SoC to dedicate more current to regions that are experiencing heavier load, purportedly boosting performance.
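The benefit of a shared rail can be sketched with a simple budget model — our own illustration, not AMD's actual power-management logic, and the wattages are made-up example figures:

```python
# Toy model of single-rail power sharing between the CPU and GPU
# regions. The 45W budget used in the tests below is an invented figure.

def share_power(package_budget_w: float, cpu_demand_w: float,
                gpu_demand_w: float) -> tuple[float, float]:
    """Split one shared power budget between two regions.

    With separate rails (the Kaby Lake / Bristol Ridge arrangement),
    each region is capped by its own rail even when the other is idle.
    A single shared rail lets the busier region borrow headroom the
    idle region isn't using.
    """
    total_demand = cpu_demand_w + gpu_demand_w
    if total_demand <= package_budget_w:
        # Under budget: both regions get exactly what they ask for.
        return cpu_demand_w, gpu_demand_w
    # Over budget: scale both allocations by the same factor so the
    # package stays within its limit.
    scale = package_budget_w / total_demand
    return cpu_demand_w * scale, gpu_demand_w * scale
```

A GPU-heavy game could then draw, say, 35W of a hypothetical 45W budget while the CPU sips 10W — something a fixed per-rail cap would forbid.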

Shutting off areas of the chip, or power gating, requires a fast resumption time (gate exit). Simply put, if you put a core to sleep, you want it to quickly resume activity when it's called upon. AMD implemented faster resumption times to allow power gating without negatively affecting the user experience.
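The tradeoff is easy to see in a toy break-even check — again our own sketch with invented latencies, since AMD doesn't publish Raven Ridge's actual entry/exit costs:

```python
# Hypothetical power-gating break-even check. The microsecond costs
# below are invented placeholders, not measured Raven Ridge figures.

GATE_ENTRY_US = 20.0  # assumed cost to save state and cut power
GATE_EXIT_US = 30.0   # assumed cost to restore power and resume

def should_gate(predicted_idle_us: float) -> bool:
    """Gate a core only when the idle window outweighs the transition cost.

    Shrinking the exit latency lowers this break-even point, so shorter
    idle periods become worth gating -- which is why faster resumption
    lets the SoC gate aggressively without hurting responsiveness.
    """
    return predicted_idle_us > GATE_ENTRY_US + GATE_EXIT_US
```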

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

  • InvalidError
    Looking at Zeppelin and Raven dies side by side, proportionally, Raven seems to be spending a whole lot more die area on glue logic than Zeppelin did. Since the IGP takes the place of the second CCX, I seriously doubt its presence has anything to do with the removal of 8x PCIe lanes. Since PCIe x8 vs x16 still makes very little difference on modern GPUs where you're CPU-bound long before PCIe bandwidth becomes a significant concern, AMD likely figured that nearly nobody is going to pair a sufficiently powerful GPU with a 2200G/2400G for PCIe x8 to matter.
  • Olle P
    1. Why did you use 32GB RAM for the Coffee Lake CPUs instead of the very same RAM used for the other CPUs?

    2. In the memory access tests I fail to see the relevance of comparing to higher tier Ryzen/ThreadRipper. Would rather see comparison to the four core Ryzens.

    3. Why not also test overclocking with the Stealth cooler? (Works okay for Ryzen 3!)

    4. Your comments about Coffee Lake on the last page:
    "Their locked multipliers ... hurt their value proposition...
    ... a half-hearted attempt to court power users with an unlocked K-series Core i3, ... it requires a Z-series chipset..."

    As of right now all Coffee Lake CPUs require a Z-series chipset, so that's not an added cost for overclocking. I'd say a locked multiplier combined with the demand for a costly motherboard is even worse. (This is supposed to change soon though.)
  • AgentLozen
    Tom's must think highly of this APU to give it the Editor's Choice award. It seems to be your best bet for an extremely limited budget.

    I totally understand if you only have a few hundred dollars to build your PC with and you desperately want to get in on some master race action. That's the situation where the 2400G shines brightest. But the benchmarks show that games typically don't run well on this chip. They DO work under the right circumstances, but GTAV isn't as fun to play at low settings.

    Buying a pre-built PC from a boutique with a GeForce 1050Ti in it will make your experience noticeably better if you can swing the price.
  • akamateau
    What most writers and critics of integrated graphics processors such as AMD's APU or Intel iGP all seem to forget, is not EVERYONE in the world has a disposable or discretionary income equal to that of the United States, Europe, Japan etc. Not everyone can afford bleeding edge gaming PC's or laptops. Food, housing and clothing must come first for 80% of the population of the world.

    An APU can grant anyone who can afford at least a decent basic APU the enjoyment of playing most computer games. The visual quality of these games may not be up to the arrogantly high standards of most western gamers, but then again these same folks who are happy to have an APU also can not barely afford a 750p crt monitor much less a 4k flat screen.

    This simple idea is huge not only for the laptop and pc market but especially game developers who can only expect to see an expansion of their Total Addressable Market. And that is good for everybody as broader markets help reduce the cost of development.

    This in fact was the whole point behind AMD's release of Mantle and Microsoft and The Kronos Group's release of DX12 and Vulkan respectively.

    Today's AMD APU has all of the power of a GPU Add In Board of not more than a several years back.
  • Dark Lord of Tech
    Graphics still too weak , a card is still needed.
  • Blas
    "Meanwhile, every AMD CPU is overclockable on every Socket AM4-equipped motherboard" (in the last page)
    That is not correct, afaik, not for A320 chipsets. It is for B350 and X370, though.
  • salgado18
    "with a GeForce 1050Ti in it will make your experience noticeably better if you can swing the price."
    "a card is still needed"

    You do realize that these CPUs have an integrated graphics chip as strong as a GT 1030, right? And that you are comparing a ~$90 GPU to a ~$220 GPU?

    If you can swing the price, grab a GTX 1080ti already, and let us mITX/poor/HTPC builders enjoy Witcher 3 in 1080p for a fraction of the price ;)
  • InvalidError
    2003862 said:
    but then again these same folks who are happy to have an APU also can not barely afford a 750p crt monitor much less a 4k flat screen.

    When 1080p displays are available for as little as $80, there isn't much point in talking about 720p displays. I'm not even sure I can still buy one of those even if I wanted to unless I shopped used. (But then I could also shop for used 1080p displays and likely find one for less than $50.)

    The price of 4k TVs is coming down nicely, I periodically see some 40+" models with HDR listed for as little as $300, cheaper than most monitors beyond 1080p.

    276663 said:
    Graphics still too weak , a card is still needed.

    Depends for who, not everyone is hell-bent on playing everything at 4k Ultra 120fps 0.1% lows. Once the early firmware/driver bugs get sorted out, it'll be good enough for people who aren't interested in shelling out ~$200 for a 1050/1050Ti alone or $300+ for anything beyond that. If your CPU+GPU budget is only $200, that only buys you a $100 CPU and GT1030 which is worse than Vega 11 stock.

    If my current PC had a catastrophic failure and I had to rebuild in a pinch, I'd probably go with the 2400G instead of paying a grossly inflated price for a 1050 or better.
  • Istarion
    People come here expecting to find an overclockable 4-core with 1080-like performance for $160. And a good cooler. I'd love to be so optimistic :D

    Summarizing: we are saving around 50-100$ for the same low-end performance. That's 25% to 40% cheaper. What are we complaining about?!? I'd be partying right now if that happened in high-end too!!! 300$ for a 1080...

    All those comments saying "too weak", or "isn't fun to play at low settings", seriously, travel around the globe or just open your mind, there's poor people in 90% of the world, do you think they'll buy a frakking 1080 and a 8700k?!?

    And there's even non-poor people that doesn't care about good graphics! Go figure!
    Otherwise, why there are pixel graphics games all over the place? Or unoptimized/breaking early access games??

    I have a high-end pc and still lower fps to minimum for competitive play, so I won't see any difference between a 1080Ti vs a 1070 (250 vs 170fps, who's gonna see that, my cat?!? No 'cause my monitor is not fast enough!).
  • rush21hit
    As a cyber cafe owner, I would love to replace my old A5400s to the lower R3.

    Except that the DDR4 sticks went crazy expensive over here. FML
  • InvalidError
    120657 said:
    And there's even non-poor people that doesn't care about good graphics! Go figure! Otherwise, why there are pixel graphics games all over the place? Or unoptimized/breaking early access games??

    Most of my all-time favorite games (ex.: Portal 1&2) aren't particularly graphics-intensive. Better graphics doesn't automatically equate to more fun. Much of the time, it feels like a lot of the extra bling in modern games is there to distract from how unoriginal they otherwise are overall and the extra graphics budget would have been better spent in narration, game level and mechanics design.
  • btmedic04
    Awesome! I'd love to see a comparison between this APU and what $170 will get you in cpu and gpu to see how much wider the performance delta is. My only complaint is that we had to wait a year to play with these bad boys
  • PaulAlcorn
    362640 said:
    1. Why did you use 32GB RAM for the Coffee Lake CPUs instead of the very same RAM used for the other CPUs? 2. In the memory access tests I fail to see the relevance of comparing to higher tier Ryzen/ThreadRipper. Would rather see comparison to the four core Ryzens. 3. Why not also test overclocking with the Stealth cooler? (Works okay for Ryzen 3!) 4. Your comments about Coffee Lake on the last page: "Their locked multipliers ... hurt their value proposition...
    ... a half-hearted attempt to court power users with an unlocked K-series Core i3, ... it requires a Z-series chipset..."
    As of right now all Coffee Lake CPUs require a Z-series chipset, so that's not an added cost for overclocking. I'd say a locked multiplier combined with the demand for a costly motherboard is even worse. (This is supposed to change soon though.)


    1.) That is a mislink to the wrong product, we are fixing that, good eye! We used the same capacity for all systems.

    2.) We do run into a bit of time pressure on these projects (to put it extremely lightly), especially when we are testing with new images, GPUs, and resolutions. Late FW and drivers also hurt. We have the 2200G review coming later this week and it has the 1300X in the test pool. I'll throw the memory measurements in for good measure.

    3.) Thermal generation is a bit different with this chip--like Intel's Kaby Lake-G, we can overrun coolers at stock settings by hitting both units hard at once. We've got an article coming tomorrow that goes in depth on some of the more extreme scenarios where this can happen with the stock cooler - even at stock settings.
  • PaulAlcorn
    127850 said:
    "Meanwhile, every AMD CPU is overclockable on every Socket AM4-equipped motherboard" (in the last page) That is not correct, afaik, not for A320 chipsets. It is for B350 and X370, though.


    Oops, thx, fixed!
  • FD2Raptor
    2003862 said:
    What most writers and critics of integrated graphics processors such as AMD's APU or Intel iGP all seem to forget, is not EVERYONE in the world has a disposable or discretionary income equal to that of the United States, Europe, Japan etc. Not everyone can afford bleeding edge gaming PC's or laptops. Food, housing and clothing must come first for 80% of the population of the world. An APU can grant anyone who can afford at least a decent basic APU the enjoyment of playing most computer games. The visual quality of these games may not be up to the arrogantly high standards of most western gamers, but then again these same folks who are happy to have an APU also can not barely afford a 750p crt monitor much less a 4k flat screen. This simple idea is huge not only for the laptop and pc market but especially game developers who can only expect to see an expansion of their Total Addressable Market. And that is good for everybody as broader markets help reduce the cost of development. This in fact was the whole point behind AMD's release of Mantle and Microsoft and The Kronos Group's release of DX12 and Vulkan respectively. Today's AMD APU has all of the power of a GPU Add In Board of not more than a several years back.


    Several years back?
    Considering this is a best-case scenario with a pair of 3200MHz CL14 G.Skill FlareX sticks, running off an NVMe drive.

    Try allocating 2GB to this iGPU in a true budget scenario fitting your "barely afford a 750p crt monitor" narrative, and you'll find titles struggling with just 6GB of usable RAM; bare-bones 2133/2400 memory would choke the Vega iGPU (which is already known to favor high memory bandwidth, and god help them if they try a 1x8GB DIMM because it's cheaper than a 2x4GB kit), plus stuttering from having to fall back to virtual memory on a 7200rpm HDD.
  • King_V
    What I do like as a side benefit to this article is that we can really see how far short the integrated Intel 630 Graphics performs compared to the GT 1030. The hierarchy chart, putting them only 2 tiers apart, wouldn't suggest such a vast playable-vs-unplayable difference.
  • Malik 722
    what makes apu different than cpu.
  • Giroro
    "e outlined the four-core CCXes with green boxes. Similar to what you've seen from AMD's Zeppelin die, the center of a Raven Ridge CCX contains vertical rows of L3 cache. Of course, a Zeppelin CCX has four rows of L3 cache units in the center, which add up to 8MB. The Raven Ridge die only sports four rows, giving us 4MB."

    Cache description is off. Zeppelin has 8x1MB blocks of L3 and 4 x L3 ctl blocks. Raven Ridge apparently has 4 x 1MB blocks and 2 x L3 ctl blocks.
  • chfireball
    drooling at the low power consumption! off grid/campers 100 watts for the entire system + 40 for the monitor.
  • znd125
    Great review. Thank you for putting everything into context by comparing these with Ryzen mobile and Kaby Lake G
  • Kennyy Evony
    Every release AMD has done in the past 3 years has been a letdown, even with massive support from all the fans that have bitten the bullet one way or another. I would really like to see some comparisons in real world testing before trusting any opinion toms writers are dishing out.
  • InvalidError
    1736052 said:
    I would really like to see some comparisons in real world testing before trusting any opinion toms writers are dishing out.

    Application and game benchmarks are pretty much as close to "real world" as you are going to get if you want repeatable results within 2-3% instead of random crapshoots with +/-20% error margins.
  • none12345
    Man these chips look great. The 2200 is an ideal chip for one of the computers i need to upgrade.

    I wish there was a 2300 tho. I'd rather have 4c/8t with the lower-end GPU. It's not going to be a gaming machine.
  • 1_rick
    "it can still cause the processor to drop from its dual-core turbo setting into a slower all-core frequency."

    That's not actually how it works. You can see this yourself with any tool that lets you monitor core speed. Just like with Intel CPUs, it'll run as many cores as fast as it can. Typically this means with Zen 1 you'll get, in heavy workloads, 2 cores at the turbo speed and all the others at the regular speed (e.g., 2 @ 4.0 and 6 @ 3.6 on an 1800X), although it might be 4.1 and 3.7 if XFR is active.