Nvidia Boosts Orders of Compute GPUs for AI: Report

Nvidia Hopper H100 GPU and DGX systems
(Image credit: Nvidia)

Nvidia is boosting orders for advanced packaging services at TSMC due to increasing demand for its compute GPUs aimed at AI applications, reports DigiTimes. Nvidia appears to be so optimistic about demand for its compute GPUs with CoWoS (chip-on-wafer-on-substrate) packaging that it has placed additional orders covering the rest of the year. 

TSMC has reportedly committed to processing an additional 10,000 CoWoS wafers for Nvidia throughout 2023 to support growing demand for its widely used AI chips. The report estimates that this works out to an extra 1,000 to 2,000 wafers each month for the remainder of the year. The story does not reveal which compute GPUs Nvidia plans to increase production of, but at present the company has the A100, A30, H100, and the China-specific A800 and H800 GPUs in its lineup. 

TSMC's monthly CoWoS capacity ranges from 8,000 to 9,000 wafers, so providing Nvidia with an extra 1,000 to 2,000 wafers per month will tangibly increase the utilization rate of the foundry's advanced packaging facilities. As a result, other companies in the industry could face a shortage of CoWoS capacity.
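For a rough sense of scale, here is a minimal back-of-the-envelope sketch of how much of TSMC's monthly CoWoS output the extra orders would absorb. The capacity and order figures are the ranges cited in the report, not numbers confirmed by TSMC or Nvidia.

```python
# Back-of-the-envelope: share of TSMC's reported monthly CoWoS capacity
# that the extra Nvidia orders would occupy. Both ranges come from the
# DigiTimes report cited above; neither is confirmed by TSMC or Nvidia.

monthly_capacity = (8_000, 9_000)   # TSMC CoWoS wafers per month (reported range)
extra_for_nvidia = (1_000, 2_000)   # additional wafers per month for Nvidia (reported range)

low = extra_for_nvidia[0] / monthly_capacity[1]    # smallest extra order vs. largest capacity
high = extra_for_nvidia[1] / monthly_capacity[0]   # largest extra order vs. smallest capacity

print(f"Extra orders would absorb roughly {low:.0%} to {high:.0%} of monthly CoWoS output.")
# -> Extra orders would absorb roughly 11% to 25% of monthly CoWoS output.
```

Even at the low end, that is a double-digit share of the packaging line, which is why other CoWoS customers could feel the squeeze.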

Due to the wide adoption of Nvidia's CUDA platform for AI and other high-performance workloads, dozens of large-scale customers rely on the company's hardware to run their AI applications. Just yesterday Google announced its new A3 supercomputer, based on Nvidia's H100 and offering 26 exaFLOPS of AI performance. Meanwhile, Microsoft, Oracle, and even Elon Musk's upcoming AI project have procured tens of thousands of Nvidia's AI GPUs in recent quarters. 

Since the use of artificial intelligence is only set to grow, demand for Nvidia's high-end A100 and H100 chips for training neural networks will keep increasing. It therefore makes sense for Nvidia to secure additional packaging capacity now to meet demand for high-end AI GPUs.

It is noteworthy that Nvidia is boosting orders for compute GPUs even after it lost the ability to ship its most powerful processors to Chinese entities without permission from the U.S. government. Either Chinese companies are willing to buy the less powerful A800 and H800 GPUs, or demand from American, European, and Japanese companies has offset the decline in shipments to China.

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • edzieba
    For an extra 10,000 wafers, that's on the order of 640,000 H100 dies (minus defects that cannot be binned out). Assuming an average of $20k volume pricing per card, and a very conservative 50% yield, that's on the order of $6.4bn in revenue above what was previously expected. Split across 4 quarters, that'd be something like a 50% uptick in datacentre revenue. (A quick sketch reproducing this math follows the comments.)
  • Alvar "Miles" Udell
    And this is why AMD and especially Nvidia don't care about "affordable" consumer GPUs.
  • kal326
    Is this above and beyond the initial commitments they tried to scale back last year with the crypto collapse? Or did they allocate all of that out with the rush to ship compute cards pre-ban last fall?
  • bit_user
    Alvar Miles Udell said:
    And this is why AMD and especially Nvidia don't care about "affordable" consumer GPUs.
    No, AMD doesn't have that kind of demand for their compute-focused CDNA processors. The reason they're holding back on rolling out lower-end models is simply that they have too much 6000-series inventory piled up.

    This is very much a Nvidia story.
  • Metal Messiah.
    Just recently there were reports, or shall I say rumors, circulating on the web that AMD and Microsoft had joined forces to create a new artificial intelligence (AI) processor, codenamed Athena, which is said to be capable of training models and making inferences on new data.

    The news was later debunked though. Microsoft is simply going to use existing and upcoming AMD accelerators to power its AI agenda.

    View: https://twitter.com/dylan522p/status/1654283507357851648
    Still, AMD's shares rose as much as 12% on Thursday last week following the media report claiming the chipmaker was working with Microsoft.
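For what it's worth, edzieba's estimate above checks out arithmetically. The sketch below reproduces it; the dies-per-wafer count, average selling price, and yield are the commenter's assumptions rather than confirmed figures.

```python
# Reproduction of edzieba's back-of-the-envelope revenue estimate.
# All inputs are the commenter's assumptions, not figures from Nvidia or TSMC.

extra_wafers   = 10_000    # additional CoWoS wafers reportedly ordered for 2023
dies_per_wafer = 64        # ~814 mm^2 H100 die -> roughly 64 candidates per 300 mm wafer
yield_rate     = 0.50      # "very conservative" yield assumed in the comment
price_per_card = 20_000    # assumed average volume price per card, USD

good_dies = extra_wafers * dies_per_wafer * yield_rate
revenue   = good_dies * price_per_card

print(f"{good_dies:,.0f} sellable GPUs -> ${revenue / 1e9:.1f}B in extra revenue")
# -> 320,000 sellable GPUs -> $6.4B in extra revenue
```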