GeForce RTX 4090 May Have 24GB of 21 Gbps GDDR6X VRAM

GPU (Image credit: Shutterstock)

Nvidia's forthcoming GeForce RTX 40-series (Ada Lovelace) graphics cards continue to make the rounds in the rumor mill. Hardware tipster kopite7kimi, who has an impeccable history with Ampere leaks, has revealed the potential memory configurations for the GeForce RTX 4090 and RTX 4070.

The GeForce RTX 4090 reportedly rocks the AD102 die, likely the flagship silicon for Nvidia's next-generation lineup. The graphics card may arrive with 24GB of GDDR6X memory at 21 Gbps, the same recipe Nvidia used for the GeForce RTX 3090 Ti. Therefore, Nvidia may not push the memory limit this generation, seemingly maxing out at 24GB as the chipmaker did with Ampere.

Assuming that the GeForce RTX 4090 retains the same 384-bit memory interface as the GeForce RTX 3090 Ti, the graphics card would deliver up to 1 TBps (1,008 GBps) of memory bandwidth. That means the GeForce RTX 4090 would offer only 7.7% more memory bandwidth than its predecessor, the GeForce RTX 3090.
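For those who want to check the math, here's a quick back-of-the-envelope sketch in Python using only the figures quoted above; keep in mind the RTX 4090's bus width and data rate are still rumored:

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte
def memory_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

rtx_4090 = memory_bandwidth_gbps(21.0, 384)   # rumored 21 Gbps GDDR6X on a 384-bit bus
rtx_3090 = memory_bandwidth_gbps(19.5, 384)   # shipping 19.5 Gbps GDDR6X on a 384-bit bus

print(f"RTX 4090 (rumored): {rtx_4090:.0f} GBps")                      # 1008 GBps, ~1 TBps
print(f"RTX 3090:           {rtx_3090:.0f} GBps")                      # 936 GBps
print(f"Uplift:             {100 * (rtx_4090 / rtx_3090 - 1):.1f}%")   # ~7.7%
```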

In addition to the memory upgrade, the GeForce RTX 4090 should also sport more CUDA cores. However, the exact count is unknown for now; our estimate is between 17,000 and 18,500. Logically, Nvidia would need to raise the power limit on the GeForce RTX 4090 to accommodate the extra CUDA cores. Early diagrams of a feasible PCB for the GeForce RTX 4090 show that the graphics card could pull up to 600W of power.

Nvidia GeForce RTX 40-Series Specifications

| | GeForce RTX 4090* | GeForce RTX 3090 Ti | GeForce RTX 3090 | GeForce RTX 4080* | GeForce RTX 3080 12GB | GeForce RTX 3080 | GeForce RTX 4070* | GeForce RTX 3070 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Architecture | AD102 | GA102 | GA102 | AD103 | GA102 | GA102 | AD104 | GA104 |
| VRAM Speed (Gbps) | 21 | 21 | 19.5 | ? | 19 | 19 | 18 | 14 |
| VRAM (GB) | 24 | 24 | 24 | 16 | 12 | 10 | 12 | 8 |
| TDP (watts) | 600 | 450 | 350 | 450 | 350 | 320 | 300 | 220 |

*Specifications are unconfirmed.

The GeForce RTX 4080, on the other hand, might tap into the AD103 silicon. The hardware leaker previously stated that it would wield 16GB of GDDR6X memory. That's 60% more memory than the vanilla GeForce RTX 3080 and a 33.3% upgrade over the later GeForce RTX 3080 12GB. Again, we're unsure of the GeForce RTX 4080's CUDA core count, but kopite7kimi believes that AD103 has a similar TGP (total graphics power) to GA102. The maximum limit on GA102 is 450W, meaning Nvidia may have to keep the GeForce RTX 4080's CUDA core count to around 10,000 to remain within GA102's TGP.

As for the GeForce RTX 4070, Nvidia could use the AD104 silicon for the graphics card. The number of CUDA cores may hover around the 7,500 mark, give or take. According to kopite7kimi, the GeForce RTX 4070 will arrive with 12GB of 18 Gbps GDDR6 memory. So not only would it have 50% more memory than the GeForce RTX 3070, but its GDDR6 chips would also be faster (18 Gbps versus 14 Gbps).

The new GeForce RTX 40-series graphics cards will probably hit the market before the year is over. Assuming that Nvidia continues with the same cadence as Ampere, we could see the GeForce RTX 4090 and RTX 4080 in September, with the slower models arriving months later.

Zhiye Liu
RAM Reviewer and News Editor

Zhiye Liu is a Freelance News Writer at Tom’s Hardware US. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.

  • VforV
    RTX 4090, PG137/139-SKU330, AD102-300, 21Gbps 24G GDDR6X, 600W
    RTX 4070, PG141-SKU341, AD104-400, 18Gbps 12G GDDR6, 300W
    Ouch and double ouch!

    I don't care for stupid 600W GPUs, but 300W for the RTX 4070 I do not like at all!
    I'll wait for RDNA3 RX 7700 XT at 250W.
  • gargoylenest
    guess you will need a special kind of agreement with electric company to power a 4090ti...
  • ezst036
    Looks like any future GPU review ought to give projections on what the monthly electric bill impact will be.
  • JarredWaltonGPU
    VforV said:
    Ouch and double ouch!

    I don't care for stupid 600W GPUs, but 300W for the RTX 4070 I do not like at all!
    I'll wait for RDNA3 RX 7700 XT at 250W.
    RTX 3070 Ti is already basically at 300W (290W reference, more than that on custom cards). ¯\_(ツ)_/¯
  • JarredWaltonGPU
    ezst036 said:
    Looks like any future GPU review ought to give projections on what the monthly electric bill impact will be.
    Even if you play games for 12 hours per day, the difference between a 200W and 300W GPU would be pretty negligible.

    100W * 12 hours = 1.2 kWh. If you have super expensive electricity, that might work out to almost $0.50 per day. But if you can afford to play games for 12 hours every day, I can't imagine the $15 per month would be much of a hardship.
  • hotaru.hino
    Also if you really want to call yourself a "power" user, you would do well to tune your cards for efficiency and not chase after absolute maximum performance. That last 25% or so of the power envelope may just be for squeezing out another 5-10% more performance at best. With undervolting, you could drop the power consumption significantly without losing much performance.
  • spongiemaster
    hotaru.hino said:
    Also if you really want to call yourself a "power" user, you would do well to tune your cards for efficiency and not chase after absolute maximum performance. That last 25% or so of the power envelope may just be for squeezing out another 5-10% more performance at best. With undervolting, you could drop the power consumption significantly without losing much performance.
    I guess this depends on the cost of your electricity. If you're mining, it makes total sense to optimize power usage. If I was strictly a gamer, this would be a total waste of time for me. Starting with a 300W GPU, if I cut power consumption by 25%, that's 75W. If I played games 6 hours a day, every day of the year, and that maxed out the GPU 100% of the time (all impractical in reality), then at my current electricity rate of 9.84c/kWh, that comes out to a savings of 4.5 cents a day, $1.33/month, and $16.16 per year. Who cares? Then as a bonus, I lose 5-10% of my performance. I understand, not everyone has electricity rates that low, but even if you double the electricity cost, you're looking at only $32 for an entire year. The hysteria over the additional cost for electricity for these cards is completely blown out of proportion.
  • LastStanding
    I think the argument about the extra "power usage" will never go away and in most advanced territories, many will not lose any sleep thinking about it.

    But, in my opinion, the major attention/concern, and this outweighs everything in all electronic components, that SHOULD be focused on is... thermals!
  • Coolmemesbudd
    JarredWaltonGPU said:
    RTX 3070 Ti is already basically at 300W (290W reference, more than that on custom cards). ¯\_(ツ)_/¯
    Feels like a stretched point. At least compare generational equivalents; the 3070 is a ~220W card. That 290W stands to prove an inefficiency that you are seemingly happy to swallow for a mere 6-8%? Reckon the 4070 Ti would draw upper-300s at this rate.
  • watzupken
    JarredWaltonGPU said:
    Even if you play games for 12 hours per day, the difference between a 200W and 300W GPU would be pretty negligible.

    100W * 12 hours = 1.2 kWh. If you have super expensive electricity, that might work out to almost $0.50 per day. But if you can afford to play games for 12 hours every day, I can't imagine the $15 per month would be much of a hardship.
    In my experience, it's not just about the extra power cost of going from a 200W to a 300W GPU. For example, with previous-gen GPUs I've tested, i.e. GTX 1080 Ti, RTX 2060 Super, etc., I never once experienced my room heating up fairly rapidly. With the RTX 3080, I could feel my room warming up even with the air conditioning on, and also see the rise in temps on the thermometer. The most I logged was a one degree Celsius increase in slightly over 30 minutes of gaming. You can imagine the heat that gets dumped out of a GPU drawing north of 400W is going to be worse. To me, the limit is probably going to be below 350W so as not to turn my room into a sauna, or kill my air conditioning system.
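
As a footnote to the electricity-cost back-and-forth in the comments above, here is a minimal Python sketch that reproduces the commenters' arithmetic. The wattage deltas, hours per day, and per-kWh rates are the commenters' own example figures, not measurements; the $0.40/kWh "super expensive" rate is an assumption that roughly matches the $0.50-per-day figure quoted:

```python
# Rough extra electricity cost from a higher-power GPU, per the comment thread above.
def monthly_cost_usd(extra_watts: float, hours_per_day: float,
                     usd_per_kwh: float, days: int = 30) -> float:
    extra_kwh_per_day = extra_watts / 1000 * hours_per_day   # e.g. 100W for 12h = 1.2 kWh
    return extra_kwh_per_day * usd_per_kwh * days

# JarredWaltonGPU's scenario: 100W more, 12 hours/day, pricey electricity (assumed ~$0.40/kWh)
print(f"${monthly_cost_usd(100, 12, 0.40):.2f}/month")    # ~$14.40, i.e. roughly $15
# spongiemaster's scenario: 75W saved, 6 hours/day, 9.84 cents/kWh
print(f"${monthly_cost_usd(75, 6, 0.0984):.2f}/month")    # ~$1.33
```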