AMD Ryzen 7 5700G Test Setup
AMD seeded us with review samples, but we posted an early review by sourcing the Ryzen 7 5700G inside an HP Pavilion TP01-2066 system so we could strip out the chip and get to testing. The system's configuration, however, sent us down another rabbit hole. The TP01 is an office machine, and its motherboard's two DIMM slots and heatsink-less five-phase power delivery subsystem tell us it isn't designed with gaming performance in mind. That doesn't excuse the memory configuration, though.
We purchased this system for $700 through the Office Depot website, but it's only available with 16GB of memory. The system shipped with a single 16GB Kingston memory DIMM, and Office Depot doesn't give you an option to order the system with two 8GB DIMMs, or even two 16GB DIMMs.
For those not in the know, populating a single channel of a dual-channel memory controller halves the potential memory bandwidth, which is particularly brutal for APUs because the integrated graphics engine is heavily reliant upon system memory.
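The arithmetic behind that claim is simple: peak DDR4 bandwidth is the transfer rate multiplied by the 8-byte (64-bit) channel width, multiplied by the number of populated channels. A back-of-the-envelope sketch (these are theoretical peaks, not measured figures from our testing):

```python
def ddr4_bandwidth_gbs(mt_per_s: int, channels: int, bus_bytes: int = 8) -> float:
    """Peak theoretical bandwidth in GB/s: transfers/s x channel width x channels."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

# DDR4-3200 with one DIMM populated vs. both channels populated:
single = ddr4_bandwidth_gbs(3200, channels=1)  # 25.6 GB/s
dual = ddr4_bandwidth_gbs(3200, channels=2)    # 51.2 GB/s
```

Real-world throughput lands below these ceilings, but the 2x ratio holds: a single DIMM leaves half of the iGPU's potential memory bandwidth on the table.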
Yes, we know that it's a common practice for some OEM systems to ship with only a single DIMM. But no, retailers shouldn't ship APU-powered systems without populating the available memory channels (or at least offer an option to do so). To highlight the performance impact, we tested in both a single-DIMM and dual-DIMM configuration in the HP system.
Integrated graphics performance typically isn't impacted by latency as much as by sheer bandwidth, but adding slight insult to an already grievous injury, the single DIMM matches the 5700G's stock DDR4-3200 interface but uses unalterable JEDEC timings of 22-22-22-52. Again, this is common with OEM systems, but disappointing. As we'll show in great detail below, the single-DIMM config destroys gaming performance.
Most enthusiasts will seek out this chip for the integrated graphics engine, so we tested the integrated graphics in both single- and dual-DIMM configurations in the OEM system. For comparison, we also dropped the chip into a proper motherboard, an ASRock Taichi X570, for iGPU testing on an enthusiast-class board to reflect maxed-out potential. We've also tested the chip on an ASUS Strix B550-E, but the differences between the Taichi and Strix were negligible.
AMD says the Ryzen 7 5700G slots in as what we would traditionally think of as a non-X CPU. Additionally, some enthusiasts will seek out the chip to game with the iGPU until they can find a discrete GPU after the shortage recedes. As such, we also paired the chip with the Gigabyte GeForce RTX 3090 Eagle we use for our standard gaming suite. We didn't bother testing the OEM system in our standard discrete GPU gaming test suite, because, as shown in the picture above, the 3090 won't fit into the HP system anyway. We also ran our standard application suite with both the OEM and enthusiast motherboards, so we have plenty of test results to chew over.
AMD Ryzen 7 5700G Overclocking and Thermals
| CPU Cores | Radeon RX Vega Graphics | Memory | Infinity Fabric (FCLK) |
Ryzen 7 5700G Overclock | Precision Boost Overdrive (PBO) | 2400 MHz | DDR4-4000 | 2000 MHz |
We found the 5700G's integrated RX Vega graphics engine to be an easy overclocker, jumping up to 2.4 GHz (a 400 MHz improvement over stock settings) with the SoC dialed in at 1.35V (this power domain feeds both the iGPU and SoC). Higher settings introduced artifacting, though, and we didn't attempt to add in too much additional voltage to the graphics due to our memory and core overclocks.
We dialed in an all-core 4.5 GHz CPU overclock, but the overclocked CPU cores introduced instability when we mixed in iGPU and memory overclocking. APUs are one of the trickiest types of chips to overclock, at least in terms of balancing the different units. Because gaming performance scales far better with iGPU and memory throughput than core clocks, we compromised and toggled AMD's auto-overclocking Precision Boost Overdrive (PBO) for the CPU cores. PBO alleviated the lion's share of our difficulties balancing the CPU, iGPU and memory clocks, but further tuning might have yielded a better all-core overclock. The silicon lottery always comes into play, so your mileage might vary.
We've heard that good Cezanne chips support a fabric clock (FCLK) up to 2400 MHz, but again, it's a balancing act. We settled for a 2000 MHz FCLK and dialed in an easy DDR4-4000 with a 1:1 FCLK/memory ratio. This 'coupled mode' is the sweet spot for memory latency on AMD's Zen 2 and 3 platforms, but dialing in a higher FCLK can unlock higher coupled frequencies. We're aiming to represent a basic overclock, and a DDR4-4000 kit is already very pricey for a build with an APU, so we stopped there. You should shoot for a DDR4-3200 kit with this type of chip, provided you can nab one at a decent price.
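For readers keeping score, 'coupled mode' simply means the fabric clock matches the memory clock, and because DDR4 transfers data twice per clock, a DDR4-4000 kit runs a 2000 MHz memory clock. A quick illustrative helper (simplified; real firmware exposes this as FCLK/UCLK/MCLK ratio settings):

```python
def coupled_fclk_mhz(ddr4_rating: int) -> int:
    """DDR4 is double data rate, so MCLK = rating / 2.
    A 1:1 'coupled mode' sets FCLK equal to MCLK."""
    return ddr4_rating // 2

# DDR4-4000 couples with a 2000 MHz FCLK; DDR4-3200 with 1600 MHz.
print(coupled_fclk_mhz(4000))  # 2000
```

Pushing memory faster than the FCLK can sustain forces a 2:1 ratio, which adds latency and usually costs more performance than the extra bandwidth returns.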
We recorded a maximum of 76C and an average of 69.7C during an extended period of multi-threaded stress tests while a simultaneous instance of Furmark hammered the overclocked iGPU. However, this is with the Corsair H115i's fans cranking away at full speed. Most enthusiasts won't have such a powerful cooling solution for this class of chip, so you'll have to adjust your expectations accordingly. It should go without saying that you shouldn't expect to pull off such vigorous overclocks with the bundled cooler. As always, you should view our overclocking results as an exhibition.
We didn't have to adjust the VDDP voltage to ensure stability, but otherwise, our overclocking efforts were very similar to what we've achieved with the Ryzen 7 4750G in the past, highlighting the similarities between the two SoCs.
AMD Ryzen 7 5700G Power Consumption and Efficiency
It's no secret that Intel has dialed up the power with Rocket Lake to compete with AMD's vastly more efficient chips. As such, there are no real surprises here — the Intel chips draw more power in every measurement.
AMD's Zen 3 models are the most power-efficient desktop PC chips we've ever tested, and the Ryzen 7 5700G brings that same level of efficiency to the APU lineup. The y-cruncher multi-threaded benchmark hammers the chip with a threaded AVX-heavy workload. The eight-core 5700G draws five fewer watts than its eight-core predecessor, the 4750G, and only three more watts than the quad-core 3400G. That's impressive given the massive performance improvements we logged in this benchmark during our application testing.
The 5700G drew more power during our HandBrake x265 and Blender workloads than the previous-gen 4750G. Still, as shown in the renders-per-day-per-watt chart above and the efficiency charts below, the chip delivers significantly more performance per watt. That's a win in any book.
Here we take a slightly different look at power consumption by calculating the cumulative amount of energy required to perform Blender and x265 HandBrake workloads, respectively. We plot this 'task energy' value in Kilojoules on the left side of the chart.
These workloads consist of a fixed amount of work, so we can plot the task energy against the time required to finish the job (bottom axis), thus generating a really useful power chart.
Bear in mind that faster compute times, and lower task energy requirements, are ideal. That means processors that fall the closest to the bottom left corner of the chart are best.
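The task-energy figure itself is just average package power multiplied by the time needed to complete the fixed workload. A minimal sketch with hypothetical numbers (not measurements from our charts):

```python
def task_energy_kj(avg_watts: float, seconds: float) -> float:
    """Cumulative energy for a fixed workload: power (W) x time (s), in kilojoules."""
    return avg_watts * seconds / 1000.0

# Hypothetical example: a chip averaging 120 W that needs 300 s consumes 36 kJ,
# while a 150 W chip that finishes in 200 s consumes only 30 kJ -- the faster,
# higher-power chip still lands closer to the chart's bottom-left corner.
print(task_energy_kj(120, 300))  # 36.0
print(task_energy_kj(150, 200))  # 30.0
```

This is why peak power draw alone can mislead: a chip that draws more watts but finishes much sooner can still require less total energy for the job.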
AMD Socket AM4 (B550, OEM) | Ryzen 7 5700G, Ryzen 5 5600G |
| ASUS ROG Strix B550-E, HP Pavilion TP01-2066 |
| 2x 8GB Trident Z Royal DDR4-3600 @ 3200, Kingston DDR4-3200 |
Intel Socket 1200 (Z590) | Core i5-11600K, Core i7-11700K |
| ASUS Maximus XIII Hero |
| 2x 8GB Trident Z Royal DDR4-3600 - 10th-Gen: Stock: DDR4-2933, OC: DDR4-4000; 11th-Gen varies, outlined above (Gear 1) |
AMD Socket AM4 (X570) | AMD Ryzen 7 5800X, Ryzen 5 5600X |
| MSI MEG X570 Godlike |
| 2x 8GB Trident Z Royal DDR4-3600 - Stock: DDR4-3200, OC: DDR4-4000, DDR4-3600 |
All Systems | Gigabyte GeForce RTX 3090 Eagle - Gaming and ProViz applications |
| Nvidia GeForce RTX 2080 Ti FE - Application tests |
| 2TB Intel DC4510 SSD |
| EVGA Supernova 1600 T2, 1600W |
| Open Benchtable |
| Windows 10 Pro version 2004 (build 19041.450) |
Cooling | Corsair H115i, Custom loop |
Paul Alcorn is the Managing Editor: News and Emerging Tech for Tom's Hardware US. He also writes news and reviews on CPUs, storage, and enterprise hardware.
lazyabum: Great. Integrated GPUs are being pushed for high-end graphics over the lack of discrete GPUs.
hotaru251: AMD had the chance to crush Intel's low end if they just had a more modern APU inside... let Vega die already. It's 2021.
AlexWolfheart: Clearly the guy that wrote this piece and declared it "the fastest integrated graphics ever" has rather limited knowledge.
AMD (and Intel, working together) has an old integrated Vega that's close in performance to a GTX 1060, basically obliterating the 5700G: the Intel Core i7-8809G with the Radeon RX Vega M GH iGPU (found inside the Intel Hades Canyon NUC).
It may not be a user-replaceable APU, but it IS an iGPU that can easily demolish this one in terms of performance. But then again, the clickbait title wouldn't be as good as this, now, would it? :)
Jim90:
hotaru251 said: AMD had the chance to crush Intel's low end if they just had a more modern APU inside... let Vega die already. It's 2021.
But they ARE crushing the iGPU competition... and they're showing they don't need higher than this for that.
Once the competition catches up then, and only then, will they integrate their other choices. From a business point of view this is eminently reasonable, as much as we'd all like integrated RDNA2/3 now.
AlexWolfheart:
Jim90 said: But they ARE crushing the iGPU competition... and they're showing they don't need higher than this for that. Once the competition catches up then, and only then, will they integrate their other choices. From a business point of view this is eminently reasonable, as much as we'd all like integrated RDNA2/3 now.
So, in short, they're greedy, not offering their best because of weaker competition.
If you were hoping to defend AMD, this argument did worse...
King_V:
AlexWolfheart said: Clearly the guy that wrote this piece and declared it "the fastest integrated graphics ever" has rather limited knowledge. AMD (and Intel, working together) has an old integrated Vega that's close in performance to a GTX 1060.
Where did you get the performance numbers that back this? I haven't been able to find any solid reviews on this.
(not looking for YouTube videos, but article reviews with charts and such)
fball922:
AlexWolfheart said: So, in short, they're greedy, not offering their best because of weaker competition. If you were hoping to defend AMD, this argument did worse...
Companies exist to make money, and to make as much money (legally) while expending the smallest amount of money possible, so I guess that is greedy? Doing what they are supposed to do? This is literally what every market does. If you were trying to shame AMD for not putting in an iGPU that makes you feel warm fuzzies, this argument did worse... Of course, if they had gone the route of upgrading the iGPU, then raised the price of the APU, we would probably hear moaning and groaning about how they shouldn't raise the price because it's so greedy of them.
AlexWolfheart:
King_V said: Where did you get the performance numbers that back this? I haven't been able to find any solid reviews on this. (not looking for YouTube videos, but article reviews with charts and such)
Have you tried... Google?
I've got the performance numbers both from Google and from actually owning a NUC8i7HVK as my bedroom media PC, but here, I'll post links for you, since it's so hard to find info:
https://www.tomshardware.com/reviews/intel-hades-canyon-nuc-vr,5536.html
https://www.gamersnexus.net/hwreviews/3282-hades-canyon-review-intel-amd-pressure-nvidia-nuc8i7hvk
https://www.notebookcheck.net/Intel-Hades-Canyon-NUC8i7HVK-i7-8809G-Radeon-RX-Vega-M-GH-Mini-PC-Review.290800.0.html
https://www.anandtech.com/show/12572/the-intel-hades-canyon-nuc8i7hvk-review-kaby-lakeg-benchmarked
As I said before: yes, it is not a user-replaceable APU, but it is an integrated GPU from over two years ago that still mops the floor with all other iGPUs.
AlexWolfheart:
fball922 said: Companies exist to make money, and to make as much money (legally) while expending the smallest amount of money possible, so I guess that is greedy? Doing what they are supposed to do? This is literally what every market does.
Given AMD's very low market share in terms of GPUs, but decently growing CPU market share, going all in and releasing a CPU with a very powerful iGPU would be a good marketing decision, to make them sell even more and make fewer people buy Nvidia.
AMD could, in theory, with RDNA2 integrated, like the Xbox Series consoles and PS5, make customers not need to buy an RTX 3060 (Ti or standard), and thus keep more market share for themselves ;)
greenreaper: With RDNA2 coming in 2022, and prices still abnormally high, it's hard to recommend this as an APU unless you need it now or don't care about the new architecture's boosts in power/perf, or having some level of hardware raytracing support.
To me, having feature parity with the latest consoles is important, because that'll help as many games work on it as possible, even if they don't work as fast. The big question for me is whether RDNA2 will work as well without the large cache in its discrete editions.