AMD Backs Off RDNA 3 Efficiency Comparison to RTX 4090

AMD RX 7000 Series Reference Card
(Image credit: AMD)

As noticed by ComputerBase, AMD reportedly removed performance-per-watt graphs comparing its recent RX 7900 XTX and 7900 XT graphics cards to Nvidia's RTX 4090 from its launch presentations.

AMD's footnotes from the November 15th performance presentation indicate there was supposed to be a performance-per-watt slide comparing the power efficiency of the RX 7900 XTX and the RTX 4090, but AMD never presented it. The omission is made clear by the "RX-841" footnote, which has no corresponding slide in the deck.


The performance-per-watt slide compared the RX 7900 XTX to Nvidia's very power-hungry GeForce RTX 4090 based on publicly disclosed peak power consumption values (TBP/TGP). AMD accidentally left this footnote in its presentation, so the decision to exclude the slide was apparently made at the last minute.

We don't have any confirmation as to why AMD decided to remove this slide, and there could be several reasons for the omission. First, AMD has said that the RX 7900 XTX competes with the RTX 4080, so a comparison to the RTX 4090 might not have made sense. Additionally, while Nvidia's 40-series cards consume prodigious amounts of power, the new architecture is more power efficient than its predecessors, and, during gaming, the 40-series cards often draw far less than their full TGP.

For instance, our review of the RTX 4080 found the card's performance per watt to be excellent. It outperformed Nvidia's power-sucking RTX 3090 Ti while drawing far less power during gaming than its 320W maximum TGP allows. In fact, according to our tests, the RTX 4080 Founders Edition consumed just 221W at 1080p. Furthermore, the only workload we found that would max out the RTX 4080's power budget was FurMark combined with a GPU overclock.

Nvidia also sought to highlight that its GPUs don't pull the full TGP during gaming in its own article on the subject, stating the RTX 4080 draws just 251W under average gaming conditions (albeit at 1440p resolution). This is an impressive feat considering Nvidia says its previous-gen RTX 3090 Ti consumes 398W during typical gaming workloads. That works out to a 37% power reduction for the RTX 4080, and that's not even considering the 4080's slightly faster gaming performance.
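For readers who want to check the arithmetic, the 37% figure follows directly from Nvidia's two published averages. This is a back-of-the-envelope sketch based on those numbers, not a benchmark of our own:

```python
# Sanity-check Nvidia's claimed reduction using its own published
# averages: 251 W for the RTX 4080 vs. 398 W for the RTX 3090 Ti.
rtx_4080_avg_watts = 251
rtx_3090_ti_avg_watts = 398

reduction = 1 - rtx_4080_avg_watts / rtx_3090_ti_avg_watts
print(f"RTX 4080 average power reduction: {reduction:.0%}")  # → 37%
```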

It's logical to expect AMD's forthcoming RDNA 3 GPUs to be plenty efficient, potentially rivaling or beating the 40-series in power efficiency metrics. However, as with all launch presentations, AMD would want to ensure that its card has a healthy enough lead to highlight the advantages of its architecture. Maybe the comparison against the RTX 4090 didn't work out quite as favorably as AMD expected. Conversely, AMD may have decided that comparing the two cards based on total power ratings alone wasn't the best approach, as the RTX 4090 likely doesn't pull its full TGP during gaming. Besides, the RX 7900 XTX competes more directly with the RTX 4080.

It's normal for some of the comparisons in any presentation to hit the cutting-room floor; for instance, as noted by ComputerBase, there is also no slide corresponding to AMD's "RX-837" footnote, and the "RX-838" footnote appears to be missing entirely.

We'll likely never know why AMD pulled the RX 7900 XTX versus RTX 4090 power efficiency comparison, but one thing is for sure — it is missing from the deck. 

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom's Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.

  • Luneder
    Does not matter when the 4080 is competing with the 4090 for the most expensive card of the year.
The choices are: go for the RX 7900, which is much cheaper with performance likely similar to the 4080 but a bit higher power use, or throw power consumption out the window and go for a 4090 with a 30-40% performance increase for a similar price to the 4080.

I've been keeping track of ibuypower and cyberpower custom builds and there is only a few hundred dollars' difference between a 4080 and a 4090 PC on builds that are all over $3k, except for the i7 versions that are just under $3k.
    Reply
  • Math Geek
    makes sense. if the 7900 will perform similar to the 4080 (blatant guess based on rumor mill) and use similar power, then obviously there is no efficiency gap to brag about.

even the extra power used by the 4090 is still pretty good on a perf/watt scale compared to the 4080/7900.

i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
    Reply
  • Luneder
    Math Geek said:
    makes sense. if the 7900 will perform similar to the 4080 (blatant guess based on rumor mill) and use similar power, then obviously there is no efficiency gap to brag about.

even the extra power used by the 4090 is still pretty good on a perf/watt scale compared to the 4080/7900.

i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
they've pretty much solved the connector thing. only 50 cards out of 124,000 sold had the problem (0.04% of cards), and all 50 cases were because the connector wasn't seated all the way. the fix is pretty simple: either add a better clip, or shorten the sensor pins so the cable won't send power if the connector isn't fully seated. the current sensor pins are too long, giving a false positive on being seated all the way.
    Reply
  • nimbulan
    Math Geek said:
    makes sense. if the 7900 will perform similar to the 4080 (blatant guess based on rumor mill) and use similar power, then obviously there is no efficiency gap to brag about.

even the extra power used by the 4090 is still pretty good on a perf/watt scale compared to the 4080/7900.

i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
    Yeah based on the few performance numbers AMD's posted for the 7900 XT, compared to independent reviews of the 4080, it puts them basically equal in rasterization performance. The 7900 XTX looks around 15-20% faster than that in rasterization, but the 4080 will still beat it by 30% in raytracing.

My guess is that AMD was expecting nVidia's cards to hit their TGP limit under most workloads, as has been the case in the past, and prepared a slide with comparisons based on that. But with the 40 series frequently using significantly less power than the TGP limit (and to be clear, I'm not expecting the same to be true for AMD), that will likely make nVidia's cards more power efficient this gen.
    Reply
  • Alvar "Miles" Udell
    To borrow an AMD phrase, did nVidia just "jebait" AMD into showing their hand?
    Reply
  • criticaloftom
All I know as a consumer is that with a 'hyper' expensive purchase either way, I won't be trusting Nvidia.
The press about how many insertions the plug is rated for, and the mere fact that any of them create a system fire, means they have done their dash.
Maybe in a few years consumers will forget, but for now that connector is a dead selling point.
    Reply
  • missingxtension
    Soo about the RYX 4080....
    Reply
  • russell_john
    Admin said:
AMD reportedly hid a performance per watt slide at the very last moment, relating to its RX 7900 XTX/XT coverage on November 15th. AMD's reasoning is unknown, but we suspect it's related to Nvidia's RTX 4080 and its excellent power efficiency.

    AMD Backs Off RDNA 3 Efficiency Comparison to RTX 4090 : Read more
Architecture only gives you small gains in efficiency, usually in the single-digit percentages. In the last 20 years nearly all the efficiency gains have come from using a smaller node size, and since the 4000 and 7000 series cards are on essentially the same node, they are likely to have very similar performance per watt. The edge will likely go to AMD since they don't have the additional Tensor Cores, which add overhead even when they aren't being used. However, when you look at performance per watt for ray tracing, Nvidia is still likely to blow AMD away, although AMD may whittle down the 23% efficiency edge Nvidia had last generation. I suspect not by much, though.
    Reply
  • russell_john
    missingxtension said:
    Soo about the RYX 4080....

Yes, proofreading seems to be a lost art .... Too much reliance on automatic proofreading that can't catch errors like that one.
    Reply
  • russell_john
    criticaloftom said:
    All I know as a consumer is that with a 'hyper' expensive purchase either way; I won't be trusting Nvidia.
    The press about how many inserts the plug is rated for; and the mere fact that they have any that create a system fire means they have done their dash.
    Maybe in a few years consumers will forget, but for now that connector is a dead selling point.

    You better get used to it because that connector is going to be on motherboards eventually (In fact that was what Intel originally designed it for) since it is part of the ATX 3.0 standard. Intel's intention with ATX 3.0 is to get rid of the 24 pin motherboard connector, and the 12V Molex connector and replace it with this single 12V connector and have all the voltage conversion/regulation for the other voltages on the motherboard itself. You'll basically have two of these 16 pin connectors coming from the power supply, one for the motherboard and another for the graphics cards.

What is throwing everyone for a loop is that Intel has been using this connector on server motherboards for almost two years without issue. But then again, those were installed by professionals.

BTW you can't start a fire without a flame, and the composite plastic material used for those connectors is flameproof .... Melting DOES NOT equal fire
    Reply