
GeForce RTX 4080 May Come In 16GB and 12GB Flavors

A GeForce RTX 3090
(Image credit: Nvidia)

The alleged specs of some of the first Nvidia GeForce RTX 40-series graphics cards have appeared online. The source, MEGAsizeGPU, today provided some details about the RTX 4090 reference design. However, the biggest surprise was his assertion that two RTX 4080 reference designs are reportedly brewing in Jensen Huang’s green cauldron. Before going on, please remember that this rumor may or may not be accurate, so add a dash of salt to the broth.

Some reckon that Nvidia’s Ada Lovelace architecture GPUs for consumers could start to trickle into public view (officially) as soon as this month. The launch may begin at the high end, just as it did with the Ampere generation. So it is little surprise that the first confident spec leaks emerging via the Twitterverse, other social media, and tech forums concern purported products like the GeForce RTX 4090 and RTX 4080.

MEGAsizeGPU’s assertions regarding the RTX 4090 are pretty straightforward. He says the RTX 4090 allegedly comes with 24GB of GDDR6X and a reference design PCB constructed from 14 layers. To put that PCB spec into perspective, the Nvidia RTX 3090 Ti FE board featured a 12-layer design, though top-end models built for the most strenuous demands (like the K|NGP|N edition) came with 14-layer PCBs to handle greater power delivery.

The Ada Lovelace specs tweet gets edgier where the RTX 4080 is concerned. MEGAsizeGPU claims there are already two “completely different” designs in the pipeline: a 12GB model with a ten-layer PCB, and a 16GB model with a 12-layer PCB. Both use GDDR6X, like the RTX 4090. Nvidia has put out SKUs in previous generations that vary only (or primarily) by memory quota, but it might be more logical if this lower-spec GPU turns out to be an RTX 4070 (Ti). MEGAsizeGPU later added that the 12GB sample has a 192-bit memory interface. That spec is arguably too different to market under the RTX 4080 name, so the theory that the weakest GPU highlighted here is really the RTX 4070 also lines up better with our previous coverage.
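As an illustrative aside (our arithmetic, not part of the leak): GDDR6X capacity tracks bus width, since each memory chip connects over its own 32-bit channel. Assuming the 2GB-per-chip density common today, the rumored bus widths map neatly onto the rumored capacities:

```python
# Sketch of the bus-width-to-capacity relationship for GDDR6/GDDR6X.
# Each chip sits on a 32-bit channel; the 2GB-per-chip density is an
# assumption for illustration, not a confirmed spec.
def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int = 2) -> int:
    """Total VRAM when one memory chip populates each 32-bit channel."""
    chips = bus_width_bits // 32
    return chips * chip_density_gb

for bus in (192, 256, 384):
    print(f"{bus}-bit bus -> {bus // 32} chips -> {vram_capacity_gb(bus)}GB")
# 192-bit -> 12GB, 256-bit -> 16GB, 384-bit -> 24GB
```

On these assumptions, a 192-bit card lands at 12GB and a 256-bit card at 16GB, which is one reason the two rumored RTX 4080 configurations read like different silicon rather than a simple memory-size option.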

If you are worried that one of the first high-end RTX 40 cards to launch might come packing a 192-bit memory interface, like previous-gen xx60-tier cards, please check our frequently updated Nvidia Ada Lovelace everything-we-know feature. In short, regarding the memory bus, Nvidia is moving to a large L2 cache with RTX 40-series cards, which should help compensate for the reduced raw memory bandwidth at their targeted gaming resolutions.
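For a rough sense of what the bus width costs in raw throughput, peak bandwidth is simply bus width (in bytes) times the per-pin data rate. The 21 Gbps GDDR6X rate below is our assumption for illustration, not a leaked figure:

```python
# Back-of-the-envelope peak memory bandwidth:
# (bus bits / 8) bytes per transfer, times the per-pin data rate in Gbps.
# The 21 Gbps GDDR6X rate is an assumed figure, not a confirmed spec.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float = 21.0) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(192))  # 504.0 GB/s on a 192-bit bus
print(peak_bandwidth_gb_s(384))  # 1008.0 GB/s on a 384-bit bus
```

A 192-bit card would deliver roughly half the raw bandwidth of a 384-bit one at the same data rate, which is the gap the enlarged L2 cache is reportedly meant to paper over.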

Mark Tyson
Freelance News Writer

Mark Tyson is a Freelance News Writer at Tom's Hardware US. He enjoys covering the full breadth of PC tech, from business and semiconductor design to products approaching the edge of reason.

  • atomicWAR
    I truly hope this is not the case. A high-end GPU should have no less than 16GB of RAM. Anything shy of that makes a high-end product not worth buying in 2022 and beyond.
  • coromonadalix
    How about 24GB for a change? And yes, 12GB is worthless; 16GB is the minimum.
  • watzupken
    I always question whether more is indeed better all the time. If the price between the 16GB and 12GB versions is the same, then obviously go with the 16GB version. But the reality is that this is generally not going to be the case, assuming the same build quality from the same brand. 12GB in my opinion is plenty for gaming at 1440p, which I would guess is more popular than 4K. More VRAM may translate to better textures being used, but I guess it can at the same time encourage sloppy optimization by developers.
  • atomicWAR
    watzupken said:
    I always question whether more is indeed better all the time. If the price between the 16GB and 12GB versions is the same, then obviously go with the 16GB version. But the reality is that this is generally not going to be the case, assuming the same build quality from the same brand. 12GB in my opinion is plenty for gaming at 1440p, which I would guess is more popular than 4K. More VRAM may translate to better textures being used, but I guess it can at the same time encourage sloppy optimization by developers.

    I can't speak to 12GB of VRAM directly, but I can speak to 11GB having issues with some games even at 1440p, and that has been true since I got my card in 2019. Granted, it's only a handful of games at this point, but it started three years ago with one game when I got the card and has slowly become worse. Game VRAM usage only goes up with time.

    If 11GB wasn't enough in 2019 for the first time (for me) at 1440p, it will certainly cause problems in the near future. So yeah, 12GB is not enough for the high end. If it were just the RTX 4070 series or something further down the stack pulling this move (i.e., cards for 1080p high-refresh gaming), I could get on board. But it's not, so I strongly but respectfully disagree with your stance (minus encouraging sloppy devs; 100% with you there).

    Edit/note: while I typically game at 4K, I do frequent 1440p so that max settings are playable as well, which is where the experiences in this post come from.
  • BILL1957
    watzupken said:
    I always question whether more is indeed better all the time. If the price between the 16GB and 12GB versions is the same, then obviously go with the 16GB version. But the reality is that this is generally not going to be the case, assuming the same build quality from the same brand. 12GB in my opinion is plenty for gaming at 1440p, which I would guess is more popular than 4K. More VRAM may translate to better textures being used, but I guess it can at the same time encourage sloppy optimization by developers.
    One thing to consider is that when buying an upper-end GPU, the purchaser should reasonably expect it to play games for at least the next three years without needing to start tweaking settings lower.
    With 12GB of VRAM I would be a lot more leery of that holding true than with the same card carrying 16GB.

    I have been looking forward to the 4080 release and plan on buying one. But if that vanilla 4080, with an MSRP of probably $850 or more, only has 12GB of VRAM, then for the first time in my life: move over Nvidia, team red here I come. And I have been a staunch Nvidia GPU purchaser for over two decades!