Nvidia's H100 AI GPUs cost up to four times more than AMD's competing MI300X — AMD's chips cost $10K to $15K apiece; Nvidia's H100 has peaked beyond $40,000: Report

AMD Instinct MI300X GPU for AI and High-Performance Computing Workloads
(Image credit: AMD)

According to Citi's price projections for AMD's MI300 AI accelerators, Nvidia currently charges up to four times more for its competing H100 GPUs, highlighting its incredible pricing power as a shortage of H100 GPUs continues. We contacted AMD to confirm Citi's pricing projection, but an AMD representative told Tom's Hardware the company doesn't share that pricing publicly. 

AMD has formally started volume shipments of its CDNA 3-based Instinct MI300X accelerators and MI300A accelerated processing units (APUs), and some of the first customers have already received their MI300X parts. Pricing varies from customer to customer based on volumes and other factors, but in all cases, the Instinct parts are substantially cheaper than Nvidia's H100.

Citi (via SeekingAlpha) estimates that AMD sells its Instinct MI300X 192GB to Microsoft for roughly $10,000 a unit, as the software and cloud giant is believed to be the largest consumer of these products at this time (and it has already brought up GPT-4 on the MI300X in its production environment). Other customers reportedly pay around $15,000 for an Instinct MI300X for artificial intelligence (AI) and high-performance computing (HPC) applications. Both prices are far lower than what Nvidia charges for its hugely popular H100 80GB AI and HPC GPU.

Just like AMD, Nvidia does not officially disclose pricing for its H100 80GB products, as it depends on numerous factors, such as order size and the overall volume a particular client procures from Nvidia. But in recent quarters, we have seen Nvidia's H100 80GB HBM2e add-in card sell for $30,000, $40,000, and even much more on eBay. Meanwhile, the more powerful H100 SXM version with 80GB of HBM3 memory tends to cost more than the add-in card.

In general, prices of Nvidia's H100 vary greatly, but they are nowhere near $10,000 to $15,000. Furthermore, given the 192GB of HBM3 on the Instinct MI300X, it makes more sense to compare it to Nvidia's upcoming H200 141GB HBM3E and Nvidia's special-edition H100 NVL, a 188GB HBM3 dual-card solution designed specifically for training large language models (LLMs) that probably sells for an arm and a leg.
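To put the reported figures side by side, here is a minimal arithmetic sketch in Python; it simply divides the roughly $40,000 H100 street price cited above by Citi's MI300X estimates, and none of these numbers are official list prices:

    # Reported/estimated prices from the article, in USD (not official list prices).
    mi300x_estimates = {"Microsoft (Citi estimate)": 10_000,
                        "other customers (Citi estimate)": 15_000}
    h100_street = 40_000  # H100 80GB cards have been spotted at this level and beyond

    for buyer, price in mi300x_estimates.items():
        print(f"H100 at ${h100_street:,} vs. MI300X at ${price:,} "
              f"({buyer}): ~{h100_street / price:.1f}x")

Run as-is, this prints roughly 4.0x and 2.7x, which is where the "up to four times more" figure comes from.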

Given that tons of AI applications and workloads are optimized for Nvidia's CUDA software stack, demand for its compute GPUs is overwhelming, which is why the company can sell its Hopper-based products at a huge premium. Meanwhile, AMD is trying to attract clients to its CDNA 3-based Instinct MI300-series products, so it might have decided to sell them at a relatively low price. 

AMD expects sales of its data center GPUs, which include MI300-series devices, to exceed $3.5 billion this year, and the company says it still has some supply available, which stands in contrast to Nvidia's rumored 52-week wait times. In any case, analysts from Citi consider AMD's $3.5 billion forecast an underestimate: Christopher Danely, an analyst with Citi, believes AMD could generate $5 billion from data center GPUs this year and $8 billion in 2025.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • Pierce2623
    It’s kinda crazy that companies are so lazy they’ll pay 4x for the same performance just for an easier-to-use software stack. Even if AMD put a real push behind their software stack, it still wouldn’t matter, because Nvidia just has the mindshare, period.
  • oofdragon
    So the craze about Artificial Intelligence is basically because most people lack Natural Intelligence, seeing as everyone is paying 4x more for the same performance. There we have it... why RTX sells more than XTX... lack of NI.
  • Neilbob
    I'm sure said companies are just erring on the side of caution, trying to keep competition alive...

    And after all, everyone should be concerned. Nvidia is right on the edge of being completely destitute, so bad off they're approaching Apple levels of poverty. Doesn't it make your heart break?
  • Freestyle80
    oofdragon said:
    So the craze about Artificial Intelligence is basically because most people lack Natural Intelligence, seeing as everyone is paying 4x more for the same performance. There we have it... why RTX sells more than XTX... lack of NI.
    yeah, why don't they worship AMD like you? AMD are gods; more people should bow down to them and buy anything they release
  • Argolith
    Talking about the article... Hopefully, with more money coming in, they'll have more to invest in the gaming side of things, and maybe use these accelerators of theirs to build a strong(er) alternative to DLSS... but I feel like they have little to no incentive at the moment (after all, despite being similar to GPUs, these are AI accelerators we're talking about, and they sell to enterprise at much steeper prices), so we'll probably just end up seeing more production capacity shifted away from gaming. Who knows, one day some cool feature might trickle down the product stack... Maybe?
    Unfortunately, I'm starting to forget the days when Radeon moved a decent number of units or introduced cool stuff like HBM to GPUs your average Joe might buy.