AMD Backs Off RDNA 3 Efficiency Comparison to RTX 4090

As noticed by ComputerBase, AMD reportedly removed performance-per-watt graphs from its recent RX 7900 XTX and 7900 XT graphics card presentations that compared the cards to Nvidia's RTX 4090.

The footnotes from AMD's November 15th performance presentation indicate there was supposed to be a performance-per-watt slide comparing the RX 7900 XTX to the RTX 4090, but AMD never presented it. The omission is made clear by the "RX-841" footnote, which has no corresponding slide in the deck.

(Image credit: AMD)
Aaron Klotz
Contributing Writer

Aaron Klotz is a contributing writer for Tom’s Hardware, covering news related to computer hardware such as CPUs and graphics cards.

  • Luneder
    Does not matter when the 4080 is competing with the 4090 for the most expensive card of the year.
    The choices are: go for the RX 7900, which is much cheaper with performance that will likely be similar to the 4080 but with a bit higher power use, or throw power consumption out the window and go for a 4090 with a 30-40% performance increase for a similar price to the 4080.

    I've been keeping track of ibuypower and cyberpower custom builds, and there is only a few hundred dollars' difference between a 4080 and a 4090 PC on builds that are all over $3k, except for the i7 versions that are just under $3k.
  • Math Geek
    makes sense. if the 7900 performs similarly to the 4080 (blatant guess based on rumor mill) and uses similar power, then obviously there is no efficiency gap to brag about.

    even the extra power used by the 4090 is still pretty good on a perf/watt scale compared to the 4080/7900.

    i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
  • Luneder
    Math Geek said:
    makes sense. if the 7900 performs similarly to the 4080 (blatant guess based on rumor mill) and uses similar power, then obviously there is no efficiency gap to brag about.

    even the extra power used by the 4090 is still pretty good on a perf/watt scale compared to the 4080/7900.

    i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
    they've pretty much solved the connector thing. only 50 cards out of 124,000 sold had the problem (0.04% of cards), and all 50 cases were because the connector wasn't seated all the way. the fix is pretty simple: either add a better clip, or shorten the sensor pins so the cable won't send power if the connector is not seated all the way. the current sensor pins are too long and give a false positive on being fully seated.
  • nimbulan
    Math Geek said:
    makes sense. if the 7900 performs similarly to the 4080 (blatant guess based on rumor mill) and uses similar power, then obviously there is no efficiency gap to brag about.

    even the extra power used by the 4090 is still pretty good on a perf/watt scale compared to the 4080/7900.

    i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
    Yeah, based on the few performance numbers AMD has posted for the 7900 XT, compared to independent reviews of the 4080, the two look basically equal in rasterization performance. The 7900 XTX looks around 15-20% faster than that in rasterization, but the 4080 will still beat it by 30% in ray tracing.

    My guess is that AMD was expecting nVidia's cards to hit their TGP limit under most workloads, as has been the case in the past, and prepared a slide with comparisons based on that. But with the 40 series frequently using significantly less power than the TGP limit (and to be clear, I'm not expecting the same to be true for AMD), that will likely make nVidia's cards more power efficient this gen.
  • Alvar "Miles" Udell
    To borrow an AMD phrase, did nVidia just "jebait" AMD into showing their hand?
  • criticaloftom
    All I know as a consumer is that with a 'hyper' expensive purchase either way, I won't be trusting Nvidia.
    The press about how many insertions the plug is rated for, and the mere fact that any of them create a system fire, means they have done their dash.
    Maybe in a few years consumers will forget, but for now that connector is a dead selling point.
  • missingxtension
    Soo about the RYX 4080....
  • russell_john
    Admin said:
    AMD reportedly hid a performance per watt slide at the very last moment, relating to its RX 7900 XTX/XT coverage on November 15th. AMD's reasoning is unknown, but we suspect it's related to Nvidia's RTX 4080 and its excellent power efficiency.

    AMD Backs Off RDNA 3 Efficiency Comparison to RTX 4090 : Read more
    Architecture only gives you small gains in efficiency, usually in the single-digit percentages. In the last 20 years, nearly all the efficiency gains have come from using a smaller node size, and since the 4000 and 7000 series cards are on essentially the same node, they are likely to have very similar performance per watt. The edge will likely go to AMD since they don't have the additional Tensor Cores, which add overhead even when they aren't being used. However, when you look at performance per watt for ray tracing, Nvidia is still likely to blow AMD away, although AMD may whittle down the 23% efficiency edge Nvidia had last generation. I suspect not by much, though.
  • russell_john
    missingxtension said:
    Soo about the RYX 4080....

    Yes, proofreading seems to be a lost art .... too much reliance on automatic proofreading that can't catch errors like that one.
  • russell_john
    criticaloftom said:
    All I know as a consumer is that with a 'hyper' expensive purchase either way; I won't be trusting Nvidia.
    The press about how many inserts the plug is rated for; and the mere fact that they have any that create a system fire means they have done their dash.
    Maybe in a few years consumers will forget, but for now that connector is a dead selling point.

    You better get used to it, because that connector is going to be on motherboards eventually (in fact, that is what Intel originally designed it for), since it is part of the ATX 3.0 standard. Intel's intention with ATX 3.0 is to get rid of the 24-pin motherboard connector and the 12V Molex connector and replace them with this single 12V connector, with all the voltage conversion/regulation for the other voltages handled on the motherboard itself. You'll basically have two of these 16-pin connectors coming from the power supply, one for the motherboard and another for the graphics cards.

    What is throwing everyone for a loop is that Intel has been using this connector on server motherboards for almost two years without issue, but then again, those were installed by professionals.

    BTW, you can't start a fire without a flame, and the composite plastic material used for those connectors is flameproof .... Melting DOES NOT equal fire.