Nvidia Is Reportedly Building A 900W RTX 40 Series Graphics Card

GeForce RTX 3090 Ti launch images
(Image credit: Nvidia)

Nvidia is allegedly building a monstrous RTX 40 series graphics card capable of consuming 900W of power, according to anonymous Twitter hardware leaker @kopite7kimi. 

If the rumors are true — and they may not be, as this is the first we've heard of a 900W GPU and no documents have been leaked — this graphics card could be for testing purposes only and may never be released to the public.

Full core specifications for the alleged 900W monstrosity are unknown, though the rumor suggests it will be powered by the AD102, Nvidia's top-tier die for the RTX 40 series, fed through twin 16-pin power connectors. The rumor also claims the card will be equipped with 48GB of 24Gbps GDDR6X memory, the same type of memory module found on Nvidia's RTX 3090 Ti.
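That 24Gbps figure is a per-pin data rate; total bandwidth also depends on the memory bus width, which the leak doesn't specify. As a back-of-the-envelope sketch — assuming AD102 keeps the same 384-bit bus as GA102, which is our assumption and not part of the rumor — the numbers work out like this:

```python
# Back-of-the-envelope GDDR6X bandwidth estimate.
# Assumption (ours, not from the leak): AD102 keeps GA102's 384-bit memory bus.

def memory_bandwidth_gbs(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Total memory bandwidth in GB/s: per-pin rate (Gb/s) times bus width, over 8 bits per byte."""
    return per_pin_gbps * bus_width_bits / 8

# Rumored Ada config: 24 Gb/s modules on an assumed 384-bit bus.
print(memory_bandwidth_gbs(24, 384))  # 1152.0 GB/s

# RTX 3090 Ti for comparison: 21 Gb/s on a 384-bit bus.
print(memory_bandwidth_gbs(21, 384))  # 1008.0 GB/s
```

Under those assumptions, the rumored card would offer roughly 1.15TB/s of memory bandwidth, about 14% more than the RTX 3090 Ti.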

With power consumption going through the roof on modern GPUs, Nvidia's move to build a 900W graphics card for testing purposes makes sense. When Nvidia first broke the 300W barrier with Ampere, the first revision of its RTX 3080 and RTX 3090 boards suffered major power-delivery issues right at launch. Nvidia and its AIB partners presumably don't want to make that mistake again.

With a 900W test vehicle, Nvidia will know exactly which components and VRM configurations can handle such a high power capacity. We doubt the RTX 40 series GPUs will push 900W, at least for the launch models. But if Nvidia already knows how to deliver 900W safely and reliably ahead of time, it will have ample power and thermal headroom for Ada GPUs that may ship in the 450W-600W range.
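To illustrate why board design gets hard at this level, here's a rough current calculation. Every number below beyond the 900W figure and the 12V connector voltage is an illustrative assumption on our part (core voltage, conversion efficiency, and per-stage current rating), not a leaked spec:

```python
import math

# Rough, illustrative VRM math for a hypothetical 900W board.
BOARD_POWER_W = 900
INPUT_VOLTAGE_V = 12      # PCIe 16-pin connectors deliver 12V
CORE_VOLTAGE_V = 1.0      # assumed GPU core voltage
VRM_EFFICIENCY = 0.90     # assumed conversion efficiency
STAGE_RATING_A = 70       # assumed per-power-stage current rating

# Total 12V current the connectors and PCB traces must carry.
input_current_a = BOARD_POWER_W / INPUT_VOLTAGE_V

# Current the VRM must source on the ~1V side, treating all board
# power as core power for simplicity (memory and fans are ignored).
core_current_a = BOARD_POWER_W * VRM_EFFICIENCY / CORE_VOLTAGE_V

# Minimum power stages at the assumed rating, with no derating margin.
min_stages = math.ceil(core_current_a / STAGE_RATING_A)

print(input_current_a, core_current_a, min_stages)  # 75.0 810.0 12
```

Even this simplified sketch lands at 75A of 12V input current and a double-digit phase count before any safety margin, which is why connector and VRM validation at 900W would be worth doing ahead of time.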

For those of you doubting Ada will need that much power in the first place, the evidence suggests otherwise. Based on what we know about the RTX 40 series, we believe the AD102 will be another "Ampere phase" for Nvidia, meaning we'll see another huge jump in core counts and power consumption. We believe the top die, AD102, will pack as many as 71% more SMs than GA102, which should provide a monstrous jump in performance even before accounting for other architectural enhancements and optimizations.

Simply put, modern GPU architectures have become so efficient that it's harder and harder to extract more performance and efficiency per transistor. The only way forward is to push more power into these chips, as evidenced by recent Nvidia and Intel leaks suggesting the next generation of CPUs and GPUs will consume far more power than the current one.

RTX 4080 and RTX 4070 Leaks

In another tweet, @kopite7kimi also leaked the supposed core specifications of Nvidia's next-generation RTX 4080 and RTX 4070 GPUs, which will thankfully not consume 900W.

Again, this is just a rumor.

Nvidia's GeForce RTX 4080 will supposedly use AD103, the runner-up GPU die of the Ada generation, with power consumption said to be similar to that of the AD102. We aren't sure whether that comparison refers to Nvidia's rumored 900W AD102 test card or to an "RTX 4090" designed for public release.

Either way, it seems like this card could land somewhere around a 500W-800W TGP. On the memory side, it reportedly features 16GB of GDDR6X.

The RTX 4070, on the other hand, looks far tamer than the RTX 4080, with an alleged power consumption of just 300W. It will reportedly use the AD104 die and carry 12GB of GDDR6 memory (not G6X).

However, these almost certainly won't be the GDDR6 modules found on today's Ampere cards. We already know that faster 18Gbps GDDR6 will be ready in time for Ada, so we expect Nvidia to move its entire suite of non-G6X 40 series products to the newer modules.

Aaron Klotz
Contributing Writer

Aaron Klotz is a contributing writer for Tom's Hardware, covering news related to computer hardware such as CPUs and graphics cards.

  • bolweval
    That's like a space heater!
  • drtweak
    bolweval said:
    That's like a space heater!

    Right? Like 900 watts seems like you need a Chiller or LN2 not just air or water cooled to keep it cool.
  • ph00ny
    You can actually cool 900w power load with a decent water cooling. Quite a few folks are achieving this with their KP3090
  • FunSurfer
    ...But can it run Cyberpunk 4k @144hz ultra + ray tracing on psycho no dlss
  • Eximo
    Sounds more like they have a card with dual 12-pin (16-pin) connectors and if you double 450W, you get 900W.
  • derekullo
    LHC ain't got nothing on my future gaming rig !
  • tennis2
    Just gotta re-wire my house before I buy.
  • Flayed
    300W for the 4070 is too much for me
  • woot
    :oops: Time to upgrade your breakers/fuses
  • jkflipflop98
    Nvidia appears to be slowly figuring out what I've been saying for years now. An RTX card is basically an entire PC already. It just needs a general compute chiplet, then USB3 and NVMe added and you have an entire computer that is built for gaming.