AMD Ryzen 7 5800X3D Power Consumption, Efficiency, and Thermals
AMD's Ryzen chips continue to post excellent power and efficiency metrics. Here we can see that the 5800X3D's position further down the voltage/frequency curve yields excellent results in our HandBrake renders-per-watt efficiency metric.
Here we take a slightly different look at power consumption by calculating the cumulative energy required to perform the x264 and x265 HandBrake workloads. We plot this 'task energy' value in kilojoules on the left side of the chart.
These workloads consist of a fixed amount of work, so we can plot the task energy against the time required to finish the job (bottom axis), generating a genuinely useful power chart.
Bear in mind that faster compute times and lower task energy requirements are ideal, so processors that fall closest to the bottom-left corner of the chart are the best. As you can see, the Ryzen 7 5800X3D features a nice blend of power and performance.
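The task-energy calculation described above is straightforward to reproduce. A minimal sketch, using illustrative placeholder figures rather than our measured data:

```python
def task_energy_kj(avg_power_w: float, time_s: float) -> float:
    """Cumulative energy for a fixed-work task: watts x seconds, in kilojoules."""
    return avg_power_w * time_s / 1000.0

# Hypothetical example (not measured results): a chip averaging 110 W that
# finishes the encode in 300 seconds consumes 110 * 300 / 1000 = 33 kJ.
print(task_energy_kj(110.0, 300.0))  # 33.0
```

A chip can land in the bottom-left of the chart either by drawing less power or by finishing the fixed workload sooner; both shrink the watts-times-seconds product.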
Test Setup and Overclocking
As mentioned, the Ryzen 7 5800X3D doesn't support overclocking via the CPU multiplier, so you can't change the core clocks via that method. You also cannot adjust the power limits (PPT, TDC, EDC) or CPU voltage. Additionally, the chip doesn't support the auto-overclocking Precision Boost Overdrive (PBO) feature, and you can't undervolt or underclock.
The 5800X3D fully supports memory and Infinity Fabric overclocking, but as with most Ryzen chips, we were only able to reach DDR4-3800 with the fabric dialed in at 1900 MHz. This setting allows us to run the memory in the desired low-latency 'coupled' (1:1 ratio) mode. You can push the memory higher in uncoupled mode, but that reduces gaming performance.
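The 1:1 relationship is easy to check: in coupled mode the Infinity Fabric clock (FCLK) matches the memory clock, and because DDR memory transfers twice per clock, the memory clock is half the rated transfer rate. A quick sketch:

```python
def fclk_for_coupled_mhz(ddr_transfer_rate: int) -> int:
    """FCLK (MHz) needed to run a given DDR transfer rate (MT/s) in 1:1 coupled mode.

    DDR transfers data on both clock edges, so the memory clock -- and the
    matching fabric clock in coupled mode -- is half the transfer rate.
    """
    return ddr_transfer_rate // 2

print(fclk_for_coupled_mhz(3800))  # 1900 -> the fabric ceiling we typically hit
print(fclk_for_coupled_mhz(3200))  # 1600 -> the stock DDR4-3200 configuration
```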
There have been reports of successful BCLK overclocks with early samples, which eke out a few hundred extra megahertz. We'll follow up with additional testing as time permits, but be cautious about overclocking benchmarks you might see in the wild: manipulating the BCLK has been shown in the past to inflate benchmark scores on AMD chips, though there are ways to account for that.
We test Intel processors with the power limits fully removed for our standard measurements, so the 12900K and 12900KS are running beyond Intel's 'recommended' power settings, but remain within warranty. We haven't yet fully overclocked the 12900KS, so we're substituting our overclocked 12900K configuration in its place. From what we've seen, 12900KS silicon often clocks similarly to its non-S counterpart, but we'll update if we see a meaningful difference.
Aside from a few errant programs for Intel, the overall trends for both AMD and Intel should be similar with Windows 10 and 11. As such, we're sticking with Windows 11 benchmarks in this article. We also stuck with DDR4 for this round of Alder Lake testing, as overall performance trends are generally comparable between DDR4 and DDR5. We have a deeper dive into what that looks like in our initial 12900K review.
We tested the Ryzen 7 5800X3D in two configurations:
- Ryzen 7 5800X3D: Corsair H115i 280mm water cooler, default power limits, DDR4-3200 in Coupled mode
- Ryzen 7 5800X3D DDR4-3800: Corsair H115i 280mm water cooler, default power limits, DDR4-3800 in Coupled mode
**Intel Socket 1700 DDR4 (Z690)**
- CPUs: Core i9-12900KS, Core i9-12900K, Core i7-12700K
- Motherboard: MSI Z690A WiFi DDR4
- RAM: 2x 8GB Trident Z Royal DDR4-3600 - Stock: DDR4-3200 14-14-14-36 / OC: DDR4-3800 - All Gear 1

**AMD Socket AM4 (X570)**
- CPUs: AMD Ryzen 7 5800X3D, Ryzen 9 5900X, Ryzen 7 5800X, Ryzen 7 5700X
- Motherboard: ASUS ROG Crosshair VIII Dark Hero
- RAM: 2x 8GB Trident Z Royal DDR4-3600 - Stock: DDR4-3200 14-14-14-36 / OC/PBO: DDR4-3800

**All systems**
- GPU: Gigabyte GeForce RTX 3090 Eagle (gaming and ProViz applications); Nvidia GeForce RTX 2080 Ti FE (application tests)
- Storage, power, cooling, OS: 2TB Sabrent Rocket 4 Plus; Silverstone ST1100-TI; Corsair H115i AIO; Arctic MX-4 TIM; Open Benchtable; Windows 11 Pro
Paul Alcorn is the Managing Editor: News and Emerging Tech for Tom's Hardware US. He also writes news and reviews on CPUs, storage, and enterprise hardware.
Depending on what applications you're using, there are potentially HUGE performance gains, even in uncommon workloads.
Level1Techs & Hardware UnBoxed have shown that the 5800X3D is a good value compared to the 12900KS or 12900K.
Tom Sunday said:
> The 5800X3D on the surface looks good. Not the $449 price tag, to be sure, as for many of us the ongoing dilemma at the gas pumps leaves little or no cash available for higher-end PC fares. Besides, AMD is a latecomer to the party with a practically outdated and niche CPU, especially with an all-new CPU and hardware generation sitting virtually on our doorstep! At this point in time I would think that many will 'hold and fold' until better economic times are in sight. At the latest local computer show the 5800X3D came up in discussion and it was said: "Looks like a very nice chip, but at this late time it's not a good investment!"
I see this the other way: it's a fantastic upgrade for those on AM4 who may still be on Zen+ or Zen 2, and it will prolong the life of those systems a few more years, leaving time for DDR5 pricing to come down before it's time to upgrade.
Clearly, something is wrong with the 12900KS sample used (or the setup) if it can't be overclocked at all, and especially if it is not faster than an overclocked 12700K.
Also, how could any conclusion be made without including 12900k/ks + DDR5 tests?
For example, from a TechSpot review, 12900k FarCry 6 performance was the following (no overclocking):
157 frames/sec - 12900k DDR4-3200
170 frames/sec - 12900k DDR5-6400
Details can be found in the "Gaming Benchmarks" section here:
Even after benchmarking the 12900k/ks with DDR5, the 5800X3D might still be ahead in the geometric mean. But since DDR5 prices are dropping I think most people buying a 12900k/ks may end up using higher-end motherboards with DDR5 to squeeze out every last drop of performance. So, can you add some Alder Lake + DDR5 results, please? (thanks!)
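For reference, the geometric mean mentioned above is the standard way per-game results are averaged across a benchmark suite, since it weights relative (percentage) differences equally regardless of each game's absolute frame rate. A minimal sketch with made-up numbers, not measured results:

```python
import math

def geomean(values: list[float]) -> float:
    """Geometric mean: the n-th root of the product of n values."""
    return math.prod(values) ** (1.0 / len(values))

# Hypothetical per-game average frame rates (fps) for one CPU configuration.
fps = [120.0, 95.0, 160.0]
print(round(geomean(fps), 1))  # 122.2
```

Unlike the arithmetic mean, one very high-fps game can't dominate the average, which is why review suites typically report the geometric mean.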
Makaveli said:
> I see this the other way its a fantastic upgrade for those on AM4 that may still be on Zen+ or Zen 2 and will prolong the life of those systems a few more years. Leaving time for pricing to go down on DDR5 when its time to upgrade.

This is me. I just upgraded from a 3700X to a 5800X3D. I'm going to get a couple more years out of my B450 Tomahawk Max and 2x16GB of DDR4. This is paired with a 3080 and a 1440p 240Hz monitor. By the time I need to upgrade the CPU, DDR5 will hopefully be more mature, cheaper, and actually bring beneficial improvements for games.
The only thing I have noticed is that my RAM seems to run hotter on the 5800X3D than on the 3700X. It now runs at about 45-49°C on stock XMP, versus 40-43°C previously.
While it may be the best in gaming among AMD's current offerings, other reviews, such as TechPowerUp's, which use an RTX 3080, show the 5800X3D leading the 5900X by only 7.4% in gaming at 1920x1080 on average. Assuming you aren't using a ~$1500 3090 but a ~$900 3080, are you really telling us that you should buy the 5800X3D instead of spending, currently, $80 more on the 5950X for twice the number of cores and a much better all-around system?
Alvar Miles Udell said:
> While it may be the best in gaming for AMD's current offerings, other reviews, such as Techpowerup's, which use an RTX 3080, show the 5800X3D to lead by only 7.4% vs the 5900X in gaming at 1920x1080 on average. Assuming you aren't using a ~$1500 3090 but a ~$900 3080, are you really telling us that you should buy the 5800X3D instead of spending, currently, $80 more on the 5950X, for twice the number of cores and a much better all around system?

The 5950X is outperformed by the 5900X for gaming, so if gaming is the main concern then it makes sense to compare to a 5800X or 5900X.
sizzling said:
> The 5950X is outperformed by the 5900X for gaming, so if gaming is the main concern then it makes sense to compare to a 5800X or 5900X.
True, but not in applications, which is half of this test, and the 5950X beats the 5900X quite handily due to having more cores. And since they compared it against an Intel processor with 16 cores, the 12900K, as well as the 12900KS variant, then they should have included a 16 core AMD processor as well for good measure, even though the 12900K and KS are quite a bit faster anyway.
Alvar Miles Udell said:
> True, but not in applications, which is half of this test, and the 5950X beats the 5900X quite handily due to having more cores. And since they compared it against an Intel processor with 16 cores, the 12900K, as well as the 12900KS variant, then they should have included a 16 core AMD processor as well for good measure, even though the 12900K and KS are quite a bit faster anyway.

True, but this review is about the 5800X3D, which is being pushed as a gaming CPU, nothing more. Therefore it's reasonable to compare on that basis. If you are not after a purely gaming CPU, the 5800X3D probably does not make sense.
The biggest winners with the 5800X3D are AM4 owners: the people who actually trusted AMD, and they have fully delivered, I'd say. I personally didn't go with the 5800X3D because the 5900X dropped to under $400, and that's just way too good as an upgrade (I got it for £370). I've gone through three Zen generations (2700X, 3800XT, and now the 5900X), and while I still think Intel's 9th gen is good, I can't help but feel kind of sorry for them. Almost the same for 10th gen owners, but the 10700K is still a great CPU in my eyes, and let's not talk about 11th gen.
Also, this thing is still 8 cores and 16 threads; it's not like it suddenly got degraded to a 4-core, 8-thread CPU. I'm sure it should be fairly similar to the 11700K, or at least the 10700K, and you wouldn't call those slouches, no? Perspective is as common as common sense, innit?
The biggest losers are people like me who bought a high-end X370 motherboard trusting AMD to support all AM4 processors on all AM4 motherboards like they said at the beginning, then bought an X570 motherboard after they said there would be no Zen 3 support on X370 boards, only for them to then change their minds and actually support Zen 3 on X370 motherboards...