AMD Brags That Radeon 16GB GPUs Start at $499, Unlike Nvidia

Radeon RX 7900 XTX (Image credit: AMD)

AMD's Radeon products are no strangers to the list of best graphics cards. However, the chipmaker has also been very vocal that modern graphics cards should have at least 16GB of memory to push the latest triple-A titles. Sasa Marinkovic, the senior director of gaming marketing at AMD, has highlighted in a new tweet that AMD's Radeon 16GB graphics cards start at $499, unlike Nvidia, where you don't get 16GB until the GeForce RTX 4080, which has a $1,199 MSRP.

Marinkovic's comparison is evidently based only on memory capacity and pricing; rasterization and ray tracing performance aren't taken into account. There appear to be some discrepancies in the pricing, though. For example, the AMD executive listed the Radeon RX 6800 with a $499 price tag, meaning he's not using MSRP for the comparison. The Navi 21-based graphics card debuted in 2020 at $579 but now sells for as low as $469. Meanwhile, he listed the competing GeForce RTX 3070 at $549, which is neither its MSRP ($499) nor the price of the cheapest model ($485) on the market.

Obviously, not every Radeon graphics card sports 16GB; the chipmaker uses the Radeon RX 6800 series as the starting point, while the Radeon RX 6700 series, such as the Radeon RX 6700 XT, maxes out at 12GB. But AMD isn't wrong. The chipmaker's Radeon graphics cards generally offer more onboard memory at lower prices than competing Nvidia GeForce RTX SKUs. In this generation, mainstream consumers don't get access to 16GB from Nvidia unless they spend more than $1,000, whereas AMD's latest Radeon RX 7900 XT and RX 7900 XTX offer 20GB and 24GB, respectively, with sub-$1,000 price tags.

It's funny that Marinkovic only compared AMD's Radeon products with Nvidia's GeForce offerings when there's a third player on the market. If we ignore performance and base the comparison on pricing alone, the title of cheapest 16GB graphics card belongs to Intel's Arc A770 16GB, which retails for $349.

(Image credit: Sasa Marinkovic/Twitter)

The thing about these kinds of bold statements is that you have to back them up, or they'll eventually come back and bite you in the ass. Remember the whole "Game Beyond 4GB" fiasco back in 2020? AMD emphasized how 4GB had become outdated and that future graphics cards should have 8GB as the minimum. A little over a year and a half later, the chipmaker launched the Radeon RX 6500 XT, which only had 4GB of memory. Coincidence or not, the blog post went missing around the Radeon RX 6500 XT launch, although AMD eventually republished it.

AMD still has a lot of price gaps to fill in its latest RDNA 3 lineup. The Radeon RX 7900 series has been out since November 2022, and consumers have been eagerly waiting for the Radeon RX 7800 or Radeon RX 7700 series. The former will likely have 16GB, but the latter will probably have less. AMD's Radeon RX x700-tier SKUs have never had more than 12GB, so AMD had better surprise consumers, or the chipmaker's latest swagger won't age well.

Modern games are getting increasingly demanding, although some of the blame falls on bad ports and poor optimization; Star Wars Jedi: Survivor, for instance, has proven to consume up to 21GB of VRAM. It's not an exaggeration to think that 16GB graphics cards will eventually become the new norm.
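If you want to sanity-check VRAM figures like these on your own system, a simple polling loop is enough. Here's a minimal sketch using the NVML Python bindings (assuming the nvidia-ml-py package is installed); the one-second interval and device index 0 are arbitrary choices, and this only covers Nvidia cards, since AMD exposes usage through different interfaces (e.g., the amdgpu driver's sysfs counters on Linux).

```python
# Minimal VRAM-usage logger via NVML (pip install nvidia-ml-py).
# Assumptions for this sketch: an Nvidia GPU at index 0 and a
# one-second polling interval.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        print(f"VRAM: {mem.used / 2**30:.1f} GiB used "
              f"of {mem.total / 2**30:.1f} GiB")
        time.sleep(1.0)  # sample once per second while the game runs
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Run it in a second terminal while a game is loaded; the peak figure it prints is the number that marketing claims like AMD's are really arguing about.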

Zhiye Liu
News Editor and Memory Reviewer

Zhiye Liu is a news editor and memory reviewer at Tom’s Hardware. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.

  • LabRat 891
AMD's own (older) offerings and Intel's ARC make this particular marketing point kinda moot.

    You can get single-mDP 16GiB HBM2 Vega cards for well less than $100, and Intel offers 16GiB GDDR6 cards well under $499 too.
    Sure, neither uArch meets or beats the RX6800s, but from a 'VRAM/longevity' PoV...
    "Meh"
    Reply
Okay, let's do a VRAM comparison between the two camps, generation by generation.
    NVIDIA Generational VRAM Comparison:
GPU | GeForce 40 Series | GeForce 30 Series | GeForce 20 Series | GeForce 10 Series
90-90 Ti Tier | 24 GB | 24 GB | N/A | N/A
80-80 Ti Tier | 16 GB | 10-12 GB | 8-11 GB | 8-11 GB
70-70 Ti Tier | 12 GB | 8 GB | 8 GB | 8 GB
60-60 Ti Tier | TBA | 8-12 GB | 6-8 GB | 3-6 GB
50-50 Ti Tier | TBA | 8 GB | 4-6 GB | 2-4 GB

AMD Generational VRAM Comparison:
GPU | Radeon 7000 Series | Radeon 6000 Series | Radeon 5000 Series
900 Tier | 24-20 GB | 16 GB | N/A
800 Tier | TBA | 16 GB | N/A
700 Tier | TBA | 12 GB | 8 GB
600 Tier | TBA | 8 GB | 6 GB
500 Tier | TBA | 4-8 GB | 4-8 GB
400 Tier | TBA | 4 GB | N/A
    Reply
  • atomicWAR
    LabRat 891 said:
AMD's own (older) offerings and Intel's ARC make this particular marketing point kinda moot.

    You can get single-mDP 16GiB HBM2 Vega cards for well less than $100, and Intel offers 16GiB GDDR6 cards well under $499 too.
    Sure, neither uArch meets or beats the RX6800s, but from a 'VRAM/longevity' PoV...
    "Meh"
True, but a number of games leave Nvidia cards (this gen and last, with 8/12GB) unable to run at settings they would otherwise have been capable of had these cards shipped with more VRAM. With games hitting the market left and right that show 8/12GB cards (depending on resolution) coming up short, it makes Nvidia's lack of VRAM all the more glaring.
    Reply
  • hotaru.hino
I at least appreciate the article writer for poking holes in AMD's marketing.
    Reply
AMD actually needs to be less vocal with its marketing campaigns and bragging rights, or else this is going to backfire sooner or later.
    Reply
  • Elusive Ruse
    AMD better be giving us at least two 16GB cards priced at $500 and lower.
    Reply
  • hotaru.hino
    Metal Messiah. said:
AMD actually needs to be less vocal with its marketing campaigns and bragging rights, or else this is going to backfire sooner or later.
If AMD really wants to tell me how awesome their graphics cards are, they should be able to do it without comparing themselves to the competition. Or at least without trying to put them down.

    It's like why I hate 99% of political campaign ads. They never tell me why I should vote for the person they're telling me to vote for, just why I shouldn't vote for the other guy.
    Reply
  • waltc3
It seems as if it took game developers forever to start using up to 8GB of VRAM; for years, 2GB and 4GB were the norm, and anything before that is going too far back for this post. Suddenly, we seem awash in games that can make use of 16GB+ of VRAM. I think that's great, and I don't have a problem with it at all. But I am concerned that, with developers trying to add "ray tracing" bullet points to their game advertising, the advantages of all that very fast VRAM are being dulled significantly by making those games far too CPU-limited. People don't buy expensive gaming GPUs to be CPU-limited, really. AMD is addressing that problem with its X3D CPUs, but it shouldn't be a problem, imo. Still, most people today turn off RT in games that demand high frame rates; for slow-paced games, it doesn't matter. Sort of a strange situation.
    Reply
  • oofdragon
AMD is right, though. Nvidia doesn't offer even 12GB below $600, while at that price we now get a 6950 XT 🥳 Honestly, why even bother with anything priced higher? AMD all the way for me. The 4060 Ti is coming with 8GB, and there will be fools buying it. There are people who actually buy a 4070 over a 6950 XT lol 😂 Nvidia charges $200 more for a 4070 Ti, and even that one is 12GB LOL 😂😂😂
    Reply
  • oofdragon
Nvidia has become for GPUs what Canon is for cameras and Apple is for smartphones... they only thrive because consumers are mostly non-thinking NPCs.

    $200.. 6600XT, Nvidia has 2060S, one tier lower
    $300.. 6700XT, Nvidia 3060, one tier lower
    $400.. 6800, Nvidia has 8GB 3070 lol, one tier lower
    $600.. 6950XT, Nvidia has 4070, one tier lower
    $950!.. 7900XTX, matches the $1600 4090 on a lot of games...... enough said

People who buy an $800 12GB card today from Nvidia, or even a $1,200 16GB one, are completely clueless. For $900 we have AMD today with better raster than both and 24GB; it's mind-blowing, actually, that people buy based on marketing, not on real value.
    Reply