Spitballing Nvidia's RTX 5090 GPU die manufacturing costs — die could cost as little as $290 to make

Nvidia GeForce RTX 5090 (Image credit: Nvidia)

Nvidia's GB202 graphics processing unit has a die size of 761.56 mm², making it one of the largest GPUs for client PCs ever produced. The graphics card it powers, the GeForce RTX 5090, is also among the most expensive add-in boards ever sold. Perhaps that is because the GB202 chip costs a fortune to produce. We spitballed some figures for what it might cost Nvidia to punch out these massive dies for its flagship GPUs. However, outside of TSMC and Nvidia, actual yield details are closely guarded secrets, so take our calculations with a grain of salt. Let's analyze the possibilities.

A 300-mm wafer can fit roughly 72 GB202 candidates, assuming one die measures about 31.5 mm × 24.2 mm. That is not a lot, given that TSMC may charge as much as $16,000 per 300-mm wafer produced on its 4nm-class or 5nm-class fabrication technologies. Factoring in defect density and yields, Nvidia may have to spend around $290 to make a GeForce RTX 5090 graphics processor, rising to roughly $340 if only perfect dies were sellable. These are very rough napkin-math estimates; other factors, such as parametric yields, should also be considered, so calculating the end result involves more than a bit of fuzzy math.
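For illustration, here is a quick back-of-the-envelope sketch in Python of how such numbers could come together. The wafer price and gross die count come from the figures above, while the defect density and the share of defective dies salvageable for a cut-down RTX 5090 are assumptions chosen to land near these estimates, not disclosed values:

```python
# Napkin-math sketch of GB202 per-die cost. The wafer price, defect density,
# and salvage rate are assumptions for illustration, not disclosed figures.
import math

WAFER_PRICE = 16_000          # assumed TSMC 4nm/5nm-class wafer price, USD
DIE_W, DIE_H = 31.5, 24.2     # approximate GB202 die dimensions, mm
CANDIDATES = 72               # gross die per 300-mm wafer (figure used above)
DEFECT_DENSITY = 0.055        # assumed random defects per cm^2 on a mature node

die_area_cm2 = (DIE_W * DIE_H) / 100           # ~7.62 cm^2

# Simple Poisson yield model: share of dies with zero random defects.
yield_perfect = math.exp(-DEFECT_DENSITY * die_area_cm2)
perfect_dies = CANDIDATES * yield_perfect      # ~47 fully functional dies

# The RTX 5090 ships with a cut-down GB202 (170 of 192 SMs enabled),
# so assume some defective dies can still be salvaged for it.
SALVAGE_FRACTION = 0.33                        # assumed share of defective dies saved
sellable_dies = perfect_dies + (CANDIDATES - perfect_dies) * SALVAGE_FRACTION

print(f"Perfect dies per wafer:   {perfect_dies:.0f}")
print(f"Cost per perfect die:     ${WAFER_PRICE / perfect_dies:.0f}")   # ~$340
print(f"Sellable dies per wafer:  {sellable_dies:.0f}")
print(f"Cost per sellable die:    ${WAFER_PRICE / sellable_dies:.0f}")  # ~$290
```

With these assumed inputs, the model spits out roughly 47 perfect dies per wafer at about $340 apiece, or closer to $290 per die once salvaged parts are counted, which is how the headline figures above can be reproduced.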

Even if a fully packaged and binned GeForce RTX 5090 processor costs Nvidia $350, the company will still be able to make money on its $1,999 graphics board. However, not all of that is pure profit: other components, such as the VRAM and board assembly, add considerable cost as well. That is also not to mention driver development, support, supply chains, and a myriad of other expenses that go into the final product.

No less important, Nvidia may get as many as 47 fully functional GB202 dies per wafer, and these can be sold as the RTX 6000 ‘Blackwell Generation’ professional graphics card for CAD and DCC applications or as the L50s board for datacenter-grade AI inference workloads. Such solutions tend to cost thousands of dollars, so Nvidia can make a lot of money not only on the $2,000 GeForce RTX 5090 but also on the $6,000 ProViz and AI products powered by the GB202.
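To put that in perspective, here is a purely illustrative per-wafer revenue comparison using the figures above, treating board list prices as a rough stand-in for what each die ultimately sells into; the product mix and the salvaged-die count are assumptions, not real data:

```python
# Illustrative per-wafer revenue comparison under the article's figures.
# The product mix, salvaged-die count, and the idea that every die sells
# at board list price are assumptions for illustration only.
PERFECT_DIES = 47       # fully functional GB202 dies per wafer (figure above)
SALVAGED_DIES = 8       # assumed cut-down dies still usable for the RTX 5090

PRO_PRICE = 6_000       # ProViz / AI board price, USD (figure above)
CONSUMER_PRICE = 2_000  # GeForce RTX 5090 price, USD

all_consumer = (PERFECT_DIES + SALVAGED_DIES) * CONSUMER_PRICE
pro_mix = PERFECT_DIES * PRO_PRICE + SALVAGED_DIES * CONSUMER_PRICE

print(f"All dies sold as RTX 5090 boards:     ${all_consumer:,}")  # $110,000
print(f"Perfect dies sold as $6,000 products: ${pro_mix:,}")       # $298,000
```

Under those assumptions, steering the perfect dies into professional and AI products nearly triples the per-wafer revenue compared with selling everything as GeForce boards.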

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.