Tom's Hardware Verdict
The Ryzen 5 2400G redefines our expectations for integrated graphics. It represents a great deal for budget gaming rig builders, and the ability to purchase a single chip without the added expense of a GPU adds to the value. You can tune the CPU, memory, and Vega graphics to boost performance, and compatibility with the existing 300-series motherboard ecosystem is a plus, but you’ll need to make sure the BIOS is compatible.
For
+ Solid 720p gaming performance
+ Passable 1080p gaming in some titles with low settings

Against
- Only eight lanes for PCIe slots
- Need to ensure motherboard BIOS compatibility
- Requires a better heatsink for overclocking
Gamers on a budget know that there aren't many options for affordable platforms capable of passable performance, especially with mainstream graphics cards flying off shelves and landing in cryptocurrency mining rigs. AMD aims to give those folks an all-in-one solution with a fresh wave of what the company once called Accelerated Processing Units. Although it's shying away from the APU name these days, the new Raven Ridge chips combine host processing, graphics, memory control, and fixed-function accelerators, just like their predecessors. The flagship Ryzen 5 2400G comes with four SMT-enabled Zen cores and 11 Radeon Vega Compute Units that deliver up to 1.76 TFLOPS. According to AMD, that should be fast enough to run some AAA games at 1080p with low-quality detail settings.
The Raven Ridge family follows last year's Summit Ridge debut, which introduced AMD's Zen architecture in CPU form, without integrated graphics. The 4.8-billion-transistor Zeppelin die allowed AMD to cram eight cores, lots of cache, and plenty of PCIe connectivity into a Socket AM4 interface. But it was only an option if you were pairing it with a discrete GPU. Obviously, that left out the masses content with integrated graphics. Before now, those folks could only choose between Intel's modern Core processors or the aging Bristol Ridge APUs, with their Excavator cores and GCN 3.0-based graphics.
Clearly, AMD's Zen design needed a companion, and the Vega graphics architecture was a logical choice for modernizing the company's portfolio. Though enthusiasts have mixed feelings about Radeon RX Vega 64 and 56 cards, we'll soon see that the graphics architecture works particularly well in an integrated package. As proof, even Intel is leaning on Vega graphics for its Kaby Lake-G processors.
Raven Ridge couldn't hit the market at a more interesting time. We're weathering the worst GPU shortage ever as cryptocurrency miners snatch up discrete cards in bulk to fuel their bullish outlooks on Ethereum and other altcoins. So, PC gamers may be willing to consider less expensive hardware to tide them over until add-in boards become more affordable. And those who consider Raven Ridge for its value may stay for some fun, because we’re finding that these processors are great for tuners and enthusiasts alike.
Climbing Raven Ridge
At least to start, Raven Ridge is available in two SKUs. Again, the flagship Ryzen 5 2400G boasts four Zen cores with simultaneous multi-threading and 11 CUs, yielding 704 Stream processors. It should be priced around $170.
There's also a Ryzen 3 2200G that comes with four physical cores (without SMT) and eight CUs (512 Stream processors) for a mere $100. AMD positions this processor for the eSports crowd interested in 720p gaming.
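AMD's 1.76 TFLOPS figure follows directly from the shader count and peak graphics clock: each Stream processor can retire two FP32 operations per cycle via a fused multiply-add. The snippet below is just illustrative back-of-the-envelope math, not anything from AMD's tooling; the 1250/1100 MHz peak clocks are taken from the spec table further down.

```python
# Peak FP32 throughput estimate for the two Raven Ridge SKUs.
# TFLOPS = stream processors * 2 FLOPs per clock (FMA) * clock (MHz) / 1e6
def peak_tflops(stream_processors: int, clock_mhz: int) -> float:
    return stream_processors * 2 * clock_mhz / 1e6

print(f"Ryzen 5 2400G: {peak_tflops(704, 1250):.2f} TFLOPS")  # 1.76, matching AMD's claim
print(f"Ryzen 3 2200G: {peak_tflops(512, 1100):.2f} TFLOPS")  # 1.13
```

By the same math, the cheaper 2200G gives up roughly a third of the flagship's peak graphics throughput.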
Both Raven Ridge models make good on AMD's promise to support the AM4 platform until 2020; they drop into standard Socket AM4 interfaces on motherboards with display outputs. Of course, existing boards need a firmware update to recognize the new models, while newer platforms will include a "Ryzen Desktop 2000 Ready" badge signaling drop-in compatibility.
AMD continues with its basic value proposition of offering unlocked ratio multipliers on all of its processors. And now you can optimize the on-die graphics, too. A refined memory controller officially supports DDR4-2933 (up from DDR4-2666) for dual-channel kits, and also touts improved memory overclocking capabilities. That's an important improvement for extracting maximum performance from an SoC heavily dependent on available bandwidth.
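To see why the bump from DDR4-2666 to DDR4-2933 matters for a bandwidth-starved iGPU, the theoretical peak is easy to work out: each 64-bit DDR4 channel moves 8 bytes per transfer, and Raven Ridge runs two channels. A quick sketch of that arithmetic:

```python
# Theoretical peak memory bandwidth for a dual-channel DDR4 setup.
# GB/s = transfers per second * 8 bytes per 64-bit channel * channels / 1e9
def ddr4_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 1e6 * 8 * channels / 1e9

print(f"DDR4-2933 dual-channel: {ddr4_bandwidth_gbs(2933):.1f} GB/s")  # 46.9
print(f"DDR4-2666 dual-channel: {ddr4_bandwidth_gbs(2666):.1f} GB/s")  # 42.7
```

That's roughly a 10% bandwidth uplift on paper, which the integrated Vega engine, unlike a discrete card with its own GDDR5, has to share with the CPU cores.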
Interestingly, these new processors will replace the existing Ryzen 5 1400 and Ryzen 3 1200 models. Many of the notable differences between those older CPUs and the new ones are tied to a single four-core CCX (Core Complex) design and AMD's 14nm+ process. The outgoing Ryzen models employed two CCXes, leaving no room on the die for a graphics engine.
| | Ryzen 5 2400G | Ryzen 5 1400 | Ryzen 3 2200G | Ryzen 3 1200 |
| --- | --- | --- | --- | --- |
| CPU Cores / Threads | 4 / 8 | 4 / 8 | 4 / 4 | 4 / 4 |
| CPU Base/Boost Frequency (GHz) | 3.6 / 3.9 | 3.2 / 3.4 | 3.5 / 3.7 | 3.1 / 3.2 |
| iGPU Compute Units | 11 (704 ALUs) | - | 8 (512 ALUs) | - |
| iGPU Clock (MHz) | up to 1250 | - | up to 1100 | - |
| Memory Support | up to DDR4-2933 | up to DDR4-2666 | up to DDR4-2933 | up to DDR4-2666 |
| PCIe 3.0 Lanes | 8 | 16 | 8 | 16 |
The move to a single CCX eliminates the need for communication between distant groups of cores, so memory and cache access latency is more consistent than we've seen from other Ryzen models. Each CCX normally carries 8MB of L3 cache, though; AMD took the redesign a step further and halved that figure, so the Raven Ridge chips come with only 4MB of L3. Fortunately, gaming tends to prefer lower memory latency over higher capacity. We'll explore this in more depth through our benchmarks.
AMD also tells us that its 14nm+ manufacturing process is more efficient than what came before, facilitating higher operating frequencies. Sure enough, both new Ryzen chips enjoy a 400 MHz base clock rate improvement over Ryzen 5 1400 and Ryzen 3 1200. Moreover, those older CPUs utilized a dual-core Precision Boost feature. But now the company is using a more sophisticated multi-core Precision Boost 2 algorithm that can accelerate by up to 500 MHz.
PCI Express 3.0 connectivity is still available through the Raven Ridge processors. You get four lanes dedicated to the chipset and four more that work well for connecting PCIe-based storage. An additional eight lanes are available for attaching discrete graphics, though that's unfortunately a step back from Summit Ridge-based CPUs with 16 extra lanes. Then again, we don't expect anyone to run a multi-GPU config on an entry-level platform.
Then there's the issue of pricing. Ryzen 5 2400G features the same number of CPU threads and cores at the same price as Ryzen 5 1400, but now it also includes integrated graphics. The same applies to Ryzen 3 2200G versus Ryzen 3 1200, though in that case, you'll actually pay $10 less for Raven Ridge. This puts Ryzen 3 2200G up against some of Intel's Pentium processors. Both AMD models include a bundled 65W cooler, too.
| Memory Configuration | Supported Speed |
| --- | --- |
| 2 DIMMs - Single Rank | up to DDR4-2933 |
| 4 DIMMs - Single Rank | up to DDR4-2133 |
| 2 DIMMs - Dual Rank | up to DDR4-2667 |
| 4 DIMMs - Dual Rank | up to DDR4-1866 |
Ryzen 5 2400G and Ryzen 3 2200G are rated at 65W, just like Ryzen 5 1400 and Ryzen 3 1200. That means swapping out one CCX for a handful of Compute Units ends up being a wash for power. AMD points out that all AM4 motherboards support 95W as a basic requirement, even in the mini-ITX form factor. This leaves plenty of headroom for overclocking. We're also expecting 400-series motherboards to surface in April, along with Zen+ CPUs. Those boards will be less expensive than what we have now, and we anticipate that they'll incorporate lower power consumption, better multi-hub USB throughput, improved power delivery, and memory layout optimizations. All of the existing Ryzen models will drop right in.
As mentioned, AMD doesn't want to call its Raven Ridge chips APUs, perhaps in an effort to shed preconceived notions of lackluster performance from the previous-gen implementations. To AMD's credit, Raven Ridge is an entirely new beast. But the company now wants us to call its flagship the AMD Ryzen 5 2400G with Radeon Vega Graphics. No matter what you call it, though, the 2400G is a powerful chip for $170.
Paul Alcorn is the Managing Editor: News and Emerging Tech for Tom's Hardware US. He also writes news and reviews on CPUs, storage, and enterprise hardware.
Looking at Zeppelin and Raven dies side by side, proportionally, Raven seems to be spending a whole lot more die area on glue logic than Zeppelin did. Since the IGP takes the place of the second CCX, I seriously doubt its presence has anything to do with the removal of 8x PCIe lanes. Since PCIe x8 vs x16 still makes very little difference on modern GPUs where you're CPU-bound long before PCIe bandwidth becomes a significant concern, AMD likely figured that nearly nobody is going to pair a sufficiently powerful GPU with a 2200G/2400G for PCIe x8 to matter.
1. Why did you use 32GB RAM for the Coffee Lake CPUs instead of the very same RAM used for the other CPUs?
2. In the memory access tests I fail to see the relevance of comparing to higher-tier Ryzen/Threadripper. I would rather see a comparison to the four-core Ryzens.
3. Why not also test overclocking with the Stealth cooler? (Works okay for Ryzen 3!)
4. Your comments about Coffee Lake on the last page:
"Their locked multipliers ... hurt their value proposition...
"... a half-hearted attempt to court power users with an unlocked K-series Core i3, ... it requires a Z-series chipset..." As of right now, all Coffee Lake CPUs require a Z-series chipset, so that's not an added cost for overclocking. I'd say a locked multiplier combined with the demand for a costly motherboard is even worse. (This is supposed to change soon, though.)
Tom's must think highly of this APU to give it the Editor's Choice award. It seems to be your best bet for an extremely limited budget.
I totally understand if you only have a few hundred dollars to build your PC with and you desperately want to get in on some master race action. That's the situation where the 2400G shines brightest. But the benchmarks show that games typically don't run well on this chip. They DO work under the right circumstances, but GTAV isn't as fun to play at low settings.
Buying a pre-built PC from a boutique with a GeForce 1050Ti in it will make your experience noticeably better if you can swing the price.
What most writers and critics of integrated graphics processors such as AMD's APUs or Intel's iGPs seem to forget is that not EVERYONE in the world has a disposable or discretionary income equal to that of the United States, Europe, Japan, etc. Not everyone can afford bleeding-edge gaming PCs or laptops. Food, housing, and clothing must come first for 80% of the world's population.
An APU can grant anyone who can afford at least a decent basic APU the enjoyment of playing most computer games. The visual quality of these games may not be up to the arrogantly high standards of most western gamers, but then again, these same folks who are happy to have an APU can barely afford a 750p CRT monitor, much less a 4K flat screen.
This simple idea is huge not only for the laptop and pc market but especially game developers who can only expect to see an expansion of their Total Addressable Market. And that is good for everybody as broader markets help reduce the cost of development.
This in fact was the whole point behind AMD's release of Mantle, and Microsoft's and The Khronos Group's releases of DX12 and Vulkan, respectively.
Today's AMD APU has all of the power of a GPU add-in board from just a few years back.
Graphics still too weak, a card is still needed.
"Meanwhile, every AMD CPU is overclockable on every Socket AM4-equipped motherboard" (in the last page)Reply
That is not correct, afaik, not for A320 chipsets. It is for B350 and X370, though.
"with a GeForce 1050Ti in it will make your experience noticeably better if you can swing the price."Reply
"a card is still needed"
You do realize that these CPUs have an integrated graphics chip as strong as a GT 1030, right? And that you are comparing a ~$90 GPU to a ~$220 GPU?
If you can swing the price, grab a GTX 1080ti already, and let us mITX/poor/HTPC builders enjoy Witcher 3 in 1080p for a fraction of the price ;)
20700012 said: "but then again these same folks who are happy to have an APU also can not barely afford a 750p crt monitor much less a 4k flat screen."
When 1080p displays are available for as little as $80, there isn't much point in talking about 720p displays. I'm not even sure I can still buy one of those even if I wanted to unless I shopped used. (But then I could also shop for used 1080p displays and likely find one for less than $50.)
The price of 4k TVs is coming down nicely, I periodically see some 40+" models with HDR listed for as little as $300, cheaper than most monitors beyond 1080p.
20700022 said: "Graphics still too weak , a card is still needed."
Depends for whom; not everyone is hell-bent on playing everything at 4K Ultra 120fps with good 0.1% lows. Once the early firmware/driver bugs get sorted out, it'll be good enough for people who aren't interested in shelling out ~$200 for a 1050/1050Ti alone or $300+ for anything beyond that. If your CPU+GPU budget is only $200, that only buys you a $100 CPU and a GT 1030, which is worse than Vega 11 at stock.
If my current PC had a catastrophic failure and I had to rebuild in a pinch, I'd probably go with the 2400G instead of paying a grossly inflated price for a 1050 or better.
People come here expecting to find an overclockable 4-core with 1080-like performance for $160. And a good cooler. I'd love to be so optimistic :D
Summarizing: we are saving around $50-100 for the same low-end performance. That's 25% to 40% cheaper. What are we complaining about?!? I'd be partying right now if that happened in the high end too!!! $300 for a 1080...
All those comments saying "too weak" or "isn't fun to play at low settings": seriously, travel around the globe or just open your mind. There are poor people in 90% of the world; do you think they'll buy a frakking 1080 and an 8700K?!?
And there are even non-poor people who don't care about good graphics! Go figure!
Otherwise, why are there pixel-graphics games all over the place? Or unoptimized/broken early-access games??
I have a high-end PC and still lower settings to minimum for competitive play, so I won't see any difference between a 1080Ti and a 1070 (250 vs 170fps, who's gonna see that, my cat?!? Not even, 'cause my monitor isn't fast enough!).
As a cyber cafe owner, I would love to replace my old A5400s with the lower-end R3.
Except that the DDR4 sticks went crazy expensive over here. FML