Nvidia's GeForce RTX 3080 12GB Is Probably At The Door

EVGA GeForce RTX 3080 12GB FTW3 Ultra Gaming
EVGA GeForce RTX 3080 12GB FTW3 Ultra Gaming (Image credit: VideoCardz)

The GeForce RTX 2060 12GB isn't the only recent Nvidia graphics card with upgraded memory. VideoCardz has shared two renders of a custom GeForce RTX 3080 12GB, an Ampere graphics card that has been in the rumor mill for a couple of months now.

As far as the ingredients go, the GeForce RTX 3080 12GB will presumably continue to use Nvidia's GA102 die, the same silicon that's inside the GeForce RTX 3080. However, the rumors point to a more generous number of enabled Streaming Multiprocessors (SMs). While the regular GeForce RTX 3080 has 68 SMs (8,704 CUDA cores), the 12GB variant may arrive with two additional SMs. That would reportedly bring the counts to 8,960 CUDA cores, 280 Tensor cores, and 70 RT cores. It's not a massive upgrade, but roughly 3% more CUDA cores should give the GeForce RTX 3080 12GB a small but measurable edge in benchmarks.
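The rumored counts all follow from the SM total, since Nvidia's published Ampere (GA10x) layout puts 128 FP32 CUDA cores, 4 Tensor cores, and 1 RT core in each SM. A quick sketch of that arithmetic:

```python
# Per-SM resources per Nvidia's Ampere (GA10x) architecture:
# 128 FP32 CUDA cores, 4 Tensor cores, 1 RT core.
CORES_PER_SM = 128
TENSOR_PER_SM = 4
RT_PER_SM = 1

def ampere_counts(sms: int) -> dict:
    """Derive core counts from an SM count on a GA10x die."""
    return {
        "cuda": sms * CORES_PER_SM,
        "tensor": sms * TENSOR_PER_SM,
        "rt": sms * RT_PER_SM,
    }

rtx_3080_10gb = ampere_counts(68)  # 8,704 CUDA cores
rtx_3080_12gb = ampere_counts(70)  # 8,960 CUDA cores (rumored)

uplift = rtx_3080_12gb["cuda"] / rtx_3080_10gb["cuda"] - 1
print(f"CUDA core uplift: {uplift:.1%}")  # about 2.9%, the ~3% cited above
```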

The GeForce RTX 3080 12GB, as its name denotes, will arrive with 12GB of GDDR6X memory, 2GB more than the standard GeForce RTX 3080. The memory modules likely won't change, so we'll probably see the same 19 Gbps chips on the GeForce RTX 3080 12GB. However, the revamped Ampere graphics card presumably comes with a 384-bit memory interface. The wider bus would allow the GeForce RTX 3080 12GB to hit a memory bandwidth of 912 GBps, 20% higher than the 760 GBps of the regular GeForce RTX 3080 and its 320-bit memory interface.
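The bandwidth figures fall straight out of the standard formula: per-pin data rate times bus width, converted from bits to bytes. A minimal sketch:

```python
def gddr_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width, in bytes."""
    return data_rate_gbps * bus_width_bits / 8

bw_10gb = gddr_bandwidth_gbps(19, 320)  # standard RTX 3080
bw_12gb = gddr_bandwidth_gbps(19, 384)  # rumored 12GB variant

print(bw_10gb, bw_12gb)                  # 760.0 912.0
print(f"{bw_12gb / bw_10gb - 1:.0%}")    # 20%
```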

(Image credit: VideoCardz)

Having more cores and memory means that the GeForce RTX 3080 12GB should have a higher TDP. The GeForce RTX 3080 has a 320W TDP, so it wouldn't come as a surprise if the GeForce RTX 3080 12GB ends up debuting with a TDP rating between 340W and 350W. If there is any truth to the rumor, the GeForce RTX 3080 12GB's TDP could be in the same league as the GeForce RTX 3080 Ti's (350W).

Regarding the EVGA GeForce RTX 3080 12GB FTW3 Ultra Gaming, the graphics card appears to feature the same 2.75-slot design as the original GeForce RTX 3080 FTW3 Ultra Gaming. The beefy iCX3 cooling system is still present, and the graphics card continues to rely on three 8-pin PCIe power connectors for all the juice it requires. If it weren't for the "12GB GDDR6X" label on the packaging, you wouldn't be able to tell the two apart.

VideoCardz believes that Nvidia could launch the GeForce RTX 3080 12GB tomorrow while lifting the review embargo and giving retailers the green light to sell the new Ampere monster. Given the graphics card shortage, there will likely be limited stock. Whatever is available may not be at MSRP, either.

  • VforV
    Reference fairy tale MSRP: $1000. *(I would be really surprised if it's $900, not that it would be much better)
    Actual shop price: $1800, or more.

    Jensen needs that new kitchen so he can "cook" those new Lovelace GPUs at over 550W to fight RDNA3. So now nvidia will have 3080 12GB, 3080Ti, 3090 and 3090Ti (at $3000 !? real price) all of them vs RX 6900 XT(H)... that's pretty funny. But we all know he cares more about miners, that's why he gave the 3050 8GB Vram, for them.

    Despite me not liking nvidia and making fun of Jensen, I think he's brilliant at what he does, especially at marketing. I consider fools those with more money than sense that buy these extremely overpriced GPUs. Not to mention how all these bricks will be mocked by next gen GPUs with their 2x or more perf over them.

    Buy a 3090 Ti at $3000 next month, only to see a 4090(Ti) 6-8 months later beat it by 2x perf for almost the same price. :ROFLMAO:
    edit: Here we go, looks like I was right about the price:
    MSI GeForce RTX 3080 12GB cards already on sale in Germany at 1699 EUR
    Reply
  • RodroX
I guess it's nice for someone who can actually find one, has the money, and plays games at 4K that need 12GB of VRAM. That's really a small bubble of users.
    Reply
  • VforV
    RodroX said:
I guess it's nice for someone who can actually find one, has the money, and plays games at 4K that need 12GB of VRAM. That's really a small bubble of users.
Pricing aside, the 10GB version was never a 4K GPU, and going forward that will only become more apparent.

So yes, 12GB is the minimum a GPU like this should have. I believe this "intentional mistake" will not happen again from nvidia, especially now that intel is entering the GPU wars too.
The 3080 12GB and 3070 Ti 16GB are both a direct response to exactly that, countering intel Arc in terms of Vram.

    Only the 3090 Ti is the e-peen move to beat the RX 6900 XT(XH) at all resolutions. Jensen does not like that the RX 6900 XT(XH) wins at 1080p high refresh gaming. He really does not like that Steve from HUB is using it in his CPU benchmarks (justifiably so, unlike most of the other sites and YT channels that continue to promo the 3090) and intends to change his mind with the 3090 Ti.
    It's about perception and mind share here... besides the obvious much higher margins and getting us ready psychologically to accept both more expensive and more power hungry (550W+) next gen Lovelace GPUs.
    Reply
  • RodroX
In any case, with intel being... well, intel, it doesn't matter how many players we get in the GPU segment. I think we gamers are still a long way from seeing fair GPU prices again, if ever.

As for GPU power requirements, that's something we may have to wait on until we get the real product and see some benchmarks and tests. After all, if it's 2x the performance for around 55% more watts than the current top of the line, it doesn't seem like too bad of a deal. Don't get me wrong! It is of course not great or ideal for people like me who care a lot about power usage.
But if you just want, like me, something better (performance-wise) than, for example, my current card, we may be able to get it at a decent power usage, if things get better with stock and prices next gen.
    Reply
  • VforV
    RodroX said:
In any case, with intel being... well, intel, it doesn't matter how many players we get in the GPU segment. I think we gamers are still a long way from seeing fair GPU prices again, if ever.

As for GPU power requirements, that's something we may have to wait on until we get the real product and see some benchmarks and tests. After all, if it's 2x the performance for around 55% more watts than the current top of the line, it doesn't seem like too bad of a deal. Don't get me wrong! It is of course not great or ideal for people like me who care a lot about power usage.
But if you just want, like me, something better (performance-wise) than, for example, my current card, we may be able to get it at a decent power usage, if things get better with stock and prices next gen.
    I agree on all points.

For me, I'm not interested in GPUs over 250W, but my concern is that if the top end gets to 550-600W, then those mid-range cards I'm interested in may no longer be at 250W but at 300-350W, and that I would not like at all.

The same goes for CPUs: I'm not interested in anything over 125W. At least there things have not gone so crazy, and after the failure of Rocket Lake, Alder Lake is more efficient now. Zen 4 and Raptor Lake should also both be as efficient or more, so that's good on that front.
    Reply