Alleged Nvidia RTX 40-series Super GPU specs and launch dates leaked

Nvidia GeForce game ready driver update
(Image credit: Nvidia)

Rumors that Nvidia plans to refresh its desktop lineup with "Super" variants of its GeForce RTX 40-series GPUs have been circulating for a while, but they did not initially seem credible. Now, however, several leaks allege that Nvidia will introduce its GeForce RTX 4080 Super, GeForce RTX 4070 Ti Super, and GeForce RTX 4070 Super at an event on January 8, and then gradually roll them out over the course of the month.

The new GeForce RTX 40-series Super graphics cards are set to be officially introduced one day before CES 2024 kicks off, according to IT Home, EXPreview, and Hassan Mujtaba, who published an alleged excerpt from an Nvidia document. The GeForce RTX 4070 Super is expected to hit store shelves on January 17, the GeForce RTX 4070 Ti Super on January 24, and the GeForce RTX 4080 Super on January 31.

Another interesting part of these reports is the reiterated specifications of Nvidia's purported GeForce RTX 4080 Super, GeForce RTX 4070 Ti Super, and GeForce RTX 4070 Super, which will likely join the ranks of the best graphics cards if they hit the market.

Alleged Nvidia RTX 40-Series Super Specifications

| Card | GPU | FP32 CUDA Cores | Memory Configuration | L2 Cache | TBP | MSRP |
|---|---|---|---|---|---|---|
| *GeForce RTX 4090 Ti | AD102 | 18176 (?) | 24GB 384-bit 24 GT/s GDDR6X (?) | 96 MB (?) | 600W (?) | Arm+Leg |
| GeForce RTX 4090 | AD102 | 16384 | 24GB 384-bit 21 GT/s GDDR6X | 72 MB | 450W | $1,599 |
| *GeForce RTX 4080 Super | AD103 | 10240 | 16GB 256-bit 24 GT/s GDDR6X | 64 MB | 320W | $999–$1,099 |
| GeForce RTX 4080 | AD103 | 9728 | 16GB 256-bit 22.4 GT/s GDDR6X | 64 MB | 320W | $1,199 |
| *GeForce RTX 4070 Ti Super | AD103-275/AD102-175 | 8448 | 16GB 256-bit 22.4 GT/s GDDR6X | 48 MB | 285W | $799–$849 |
| GeForce RTX 4070 Ti | AD104 | 7680 | 12GB 192-bit 21 GT/s GDDR6X | 48 MB | 285W | $799 |
| *GeForce RTX 4070 Super | AD104-350/AD103-175 | 7168 | 12GB 192-bit 21 GT/s GDDR6X | 48 MB | 225W | $599–$649 |
| GeForce RTX 4070 | AD104 | 5888 | 12GB 192-bit 21 GT/s GDDR6X | 36 MB | 200W | $599 |
| GeForce RTX 4060 Ti | AD106 | 4352 | 8GB/16GB 128-bit 18 GT/s GDDR6 | 32 MB | 160W | $399/$499 |
| GeForce RTX 4060 | AD106 | 3072 | 8GB 128-bit 17 GT/s GDDR6 | 24 MB | 115W | $299 |

*Specifications are unconfirmed.
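None of the starred figures are confirmed, but if the leaked memory configurations hold, the implied peak memory bandwidth follows directly from bus width and data rate (bandwidth in GB/s = bus width in bits ÷ 8 × data rate in GT/s). A quick illustrative sketch, using only the numbers from the table above:

```python
# Derive peak memory bandwidth (GB/s) from the leaked bus width and data rate.
# Figures marked "unconfirmed" come from the rumored specs table above.
leaked_configs = {
    "RTX 4080 Super":    (256, 24.0),  # unconfirmed
    "RTX 4080":          (256, 22.4),
    "RTX 4070 Ti Super": (256, 22.4),  # unconfirmed
    "RTX 4070 Super":    (192, 21.0),  # unconfirmed
    "RTX 4070":          (192, 21.0),
}

def bandwidth_gbps(bus_bits: int, rate_gtps: float) -> float:
    """Bytes per second: bus width in bytes times transfers per second."""
    return bus_bits / 8 * rate_gtps

for card, (bus, rate) in leaked_configs.items():
    print(f"{card}: {bandwidth_gbps(bus, rate):.1f} GB/s")
```

If the leak is accurate, the RTX 4080 Super's faster 24 GT/s GDDR6X would push it to 768 GB/s versus 716.8 GB/s on the vanilla RTX 4080, while the 4070 Ti Super's wider 256-bit bus would give it a large bandwidth jump over the 192-bit RTX 4070 Ti.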

As it turns out, Nvidia's GeForce RTX 4080 Super is expected to come with a fully enabled AD103 GPU and slightly faster memory than its predecessor, while maintaining the same total board power (TBP) of 320W. Interestingly, the RTX 4080 Super is projected to cost between $999 and $1,099, which would effectively render the original $1,199 RTX 4080 obsolete.

Meanwhile, the GeForce RTX 4070 Ti Super could come equipped with either an AD103-275 or an AD102-175 GPU featuring 8448 CUDA cores, as well as 16GB of GDDR6X memory. The official recommended retail price is rumored to be between $799 and $849, which would force graphics card makers either to phase out the non-Super GeForce RTX 4070 Ti fairly quickly or to cut its price significantly.

As for the GeForce RTX 4070 Super, it is believed to be based on the AD104-350 or AD103-175 GPU with 7168 CUDA cores, significantly more than the 5888 CUDA cores of the non-Super RTX 4070. Its rumored MSRP of $599 to $649 would mean the original GeForce RTX 4070 either becomes obsolete or gets a significant price cut.
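Taken together, the rumored core-count uplifts over the existing cards are easy to quantify from the leaked (and, again, unconfirmed) figures:

```python
# Percentage increase in FP32 CUDA cores, Super variant vs. existing card.
# All core counts are from the leaked specs and remain unconfirmed.
pairs = {
    "RTX 4080 Super vs RTX 4080":       (10240, 9728),
    "RTX 4070 Ti Super vs RTX 4070 Ti": (8448, 7680),
    "RTX 4070 Super vs RTX 4070":       (7168, 5888),
}

def core_uplift_pct(super_cores: int, base_cores: int) -> float:
    """Relative increase in core count, expressed as a percentage."""
    return (super_cores / base_cores - 1) * 100

for label, (super_cores, base_cores) in pairs.items():
    print(f"{label}: +{core_uplift_pct(super_cores, base_cores):.1f}% cores")
```

The RTX 4070 Super would see by far the biggest jump (roughly +22% cores) at the same rumored $599 starting price, which is consistent with the expectation that it displaces the original RTX 4070.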

While the information comes from multiple sources, it should be taken with the usual serving of salt since plans tend to change. Pricing, specs, and availability are not officially confirmed, though it's a relatively safe bet we'll see all three new GPUs next month.

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.

  • TCA_ChinChin
    This whole super refresh makes the prices all a lot more bearable, even if I still think they are on the high end. Especially the potential 16GB RTX 4070ti super. The potential MSRP drop for the RTX 4080 super is also a nice update to compete on a price level with AMDs generally cheaper flagships.
  • Jagar123
    No thanks Nvidia
  • valthuer
    GeForce RTX 4070 Super? Seriously Nvidia?

    With just 12 GBs of VRAM, i'm telling you, there's nothing super about this thing!
  • JarredWaltonGPU
    valthuer said:
    GeForce RTX 4070 Super? Seriously Nvidia?

    With just 12 GBs of VRAM, i'm telling you, there's nothing super about this thing!
    There's nothing wrong with 12GB, and at least it has a 192-bit interface. The RTX 4070 is effectively tied with the RTX 3080 10GB. RTX 4070 Ti effectively ties the RTX 3080 Ti, despite having half the interface width. A new RTX 4070 Super will have both enough capacity and bandwidth (factoring in the L2 cache) to provide good performance. There will be edge cases where you can exceed 12GB, but do people really expect to run 4K with maxed out settings (including RT) in every game on a ~$600 GPU? If so, good luck!

    Most games will probably do just fine and break 60 fps, particularly with DLSS upscaling plus the potential for Frame Generation (if you want to count that). But there will always be exceptions to the rule, and in such cases, stepping down the settings a notch should suffice.

    If you want more than 12GB from Nvidia, you'll need to pony up for the 4080 Super.
  • valthuer
JarredWaltonGPU said:
There's nothing wrong with 12GB, and at least it has a 192-bit interface. …

    Sorry Jarred.

    It's just that i had a terrible experience from 4070 Ti, so i'm not expecting the Super version to be any better.

    For the past 12 years or so, i had been a 1080p gamer and, believe it or not, i was determined to stay that way.

    I live in Greece, and, back in early 2023, i bought 4070 Ti for approximately 1,000€.

    Combined it with a 13900K CPU, an ASROCK Z790 PG Lightning Motherboard, 32 GBs of XPG Lancer modules... The whole shebang.

    At first, i thought it was impressive.

That was until i saw games like Resident Evil 4 remake, Atomic Heart, and Far Cry 6 crash to desktop (CTD) repeatedly at 1080p, with Direct3D fatal errors, due to the 4070 Ti's insufficient VRAM.

    That's what got me disappointed, or rather enraged, considering the fact that i had just spent my money on a new generation GPU.

    If a card like 4070 Ti can't handle Full HD, i realised it was rather improbable it could handle all the AAA games that were gonna be released during the next few months.

    I wouldn't dare to have 4K Ultra aspirations with a 4070 Ti. But, at 1080p, and with a high-end rig, i demand nothing less than playable framerates at max settings.

So, in June, i pulled the trigger, got rid of my 4070 Ti and bought the 4090 - which, in turn, made me buy a 4K monitor.

    Cards like 4070 Ti and Super, COULD have been great, HAD Nvidia taken things seriously and equipped them with 16 GB.

But, at 12 GB, they are just pathetic GPUs, not worth the money.

    I know i sound demanding, but, in reality, i'm not: the games i try to play are!

    I mean, what can i really do about a title like Resident Evil 4 remake, when it needs no less than 14 GBs of VRAM at 1080p Ultra? What margin of choice does that leave me?

    Having a brand new, expensive product, that makes you lower game settings right off the bat, is not something i intend to settle for.

    P.S. Knowing what i know now, i'm not sure why anyone would even consider 12GB VRAM in 2023. 1080 Ti from 2017 had 11GB for crying out loud.

    Nvidia must be drunk. Even if the 4090 had 30% less performance I'd still buy it for the 24GB VRAM alone.
  • thisisaname
JarredWaltonGPU said:
There's nothing wrong with 12GB, and at least it has a 192-bit interface. …
The 1080 was good for 1080p gaming; does that mean that, four generations later, the 4070 Ti is still only a 1080p card? If so, there has been little progress since then.
  • JarredWaltonGPU
valthuer said:
Sorry Jarred. It's just that i had a terrible experience from 4070 Ti, so i'm not expecting the Super version to be any better. …
    If you were crashing to desktop regularly, that sounds like something other than insufficient VRAM. I can only point to Hogwarts Legacy as a game I’ve seen in the past year or so where 12GB on an Nvidia GPU had problems, and then only at 4K and max settings. 8GB cards had issues at 1440p as well, but IIRC they still did fine at 1080p.

    And like I said, a few exceptions where you need to run High instead of Ultra settings is hardly the end of the world. The whole situation with people raging about lack of VRAM has gotten quite ridiculous.

    4070 Ti was not a great card because it was $800 for a card that replaced a $600 model. 4070 at $600 was far more reasonable, so a faster 4070 Super for the same price would be even better. But the 4070 Super won’t match the 4070 Ti in performance either, it will just be cheaper.
  • valthuer
JarredWaltonGPU said:
If you were crashing to desktop regularly, that sounds like something other than insufficient VRAM. …

    No, trust me: it WAS a VRAM only problem, whenever i tried to raise the settings over a specific limit.

    It happened on certain games and i was able to trace the error codes. Each and every time, i would find users facing the exact same issues, the only solution being... lower the settings and everything will be fine! And every time i read that, i was like: "how come I didn't think of that?" :ROFLMAO: :ROFLMAO:

The proposed "solution" worked for me as well, but, as you can imagine, i wasn't very satisfied.

My rig was otherwise performing as expected.

    I could live with frame rate drops: i have experienced them repeatedly and it's no big deal.

    I could easily stomach stuttering: not the end of the world, as far as i'm concerned.

    However, i will not/i shall not/i cannot accept CTDs at 1080p, from a latest generation 1,000€ GPU.

    This is where i personally drew the line.

The whole situation with people raging about lack of VRAM is a real problem. I just had to pay dearly before i realised how big it is.
  • JarredWaltonGPU
valthuer said:
No, trust me: it WAS a VRAM only problem, whenever i tried to raise the settings over a specific limit. …
    Was this a quick CTD after launching, or something that only happened after playing for a while? Because I’m serious when I say I almost never see CTD that I’d place the blame on VRAM. And if it happens after 30 or more minutes, that again implies poor coding. Maybe it’s coding related to VRAM management, but lack of VRAM is either a fast CTD or you get page swapping. 🤷‍♂️
  • CelicaGT
valthuer said:
No, trust me: it WAS a VRAM only problem, whenever i tried to raise the settings over a specific limit. …
    I actually looked this up, it seems that D3D crash in RE4 was likely due to a bad RT implementation, and it was crashing 4080's and 4090's too. The tie in with VRAM was because people mistook allocated VRAM with VRAM usage. You can actually exceed your physical VRAM size but it may behave as Jarred explained, textures will simply be streamed into the buffer as required which can cause hitches, stutters, and blurred or unloaded textures. Or sometimes nothing, it really depends on the game. If you had crashes in other games then indeed, you had either a hardware issue with that specific card, or a software issue elsewhere.