New RTX 4070 May Come With Salvaged RTX 4080 Dies

GeForce RTX 4070 (Image credit: Nvidia)

It didn't take long for the GeForce RTX 4070 to position itself as one of the best graphics cards for gamers. However, according to new information from kopite7kimi, one of the more credible leakers in the hardware scene, a second variant of the GeForce RTX 4070 may already be in the pipeline.

Chipmakers have been salvaging silicon for ages, and rightfully so these days, since wafer prices aren't getting any cheaper. Take the GeForce RTX 3070 Ti, which initially launched with GA104 silicon. Nvidia later recycled defective GA102 silicon and slipped it into the GeForce RTX 3070 Ti. However, that die swap occurred a little over a year after the graphics card's launch. The GeForce RTX 4070 came out just last month, so it seems a bit early for Nvidia to put out a second variant.

Currently, there are two theories. The more obvious says Nvidia is just salvaging dies due to pricey wafers — what else can it do with AD103 chips where one of the memory controllers doesn't work? The other rumor claims that switching to the new die could pave the way for a GeForce RTX 4070 with 16GB of memory.

The GeForce RTX 4070 currently utilizes AD104 silicon, which houses 60 Streaming Multiprocessors (SMs), though only 46 are enabled on the graphics card. The current rumor is that Nvidia may repurpose AD103 silicon for the GeForce RTX 4070. The AD103 die, which powers the GeForce RTX 4080, is bigger than AD104, so it's a more than adequate substitute. The GeForce RTX 4070 has only 5,888 CUDA cores (46 SMs), meaning an AD103-based GeForce RTX 4070 would have 34 of the chip's 80 SMs disabled, or over 40% of the die.
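As a quick sanity check, the die math works out as follows. This is a rough sketch using Nvidia's published figures (128 CUDA cores per SM on Ada Lovelace, a 60-SM AD104 die, and an 80-SM AD103 die):

```python
# Rough die math for a hypothetical AD103-based RTX 4070,
# using publicly listed specs for Ada Lovelace dies.
CUDA_CORES_PER_SM = 128  # Ada Lovelace SM configuration

rtx_4070_cores = 5888
rtx_4070_sms = rtx_4070_cores // CUDA_CORES_PER_SM  # 46 SMs enabled

ad104_full_sms = 60  # full AD104 die
ad103_full_sms = 80  # full AD103 die (the RTX 4080 ships with 76 enabled)

disabled_on_ad104 = ad104_full_sms - rtx_4070_sms  # 14 SMs fused off
disabled_on_ad103 = ad103_full_sms - rtx_4070_sms  # 34 SMs fused off

print(f"AD104: {disabled_on_ad104}/{ad104_full_sms} SMs disabled "
      f"({disabled_on_ad104 / ad104_full_sms:.0%})")
print(f"AD103: {disabled_on_ad103}/{ad103_full_sms} SMs disabled "
      f"({disabled_on_ad103 / ad103_full_sms:.0%})")
```

In other words, an AD103-based RTX 4070 would leave a lot more silicon dark than the AD104 version, which is exactly what makes it a salvage part.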

Nvidia has received much criticism, including from AMD, for only incorporating 12GB of memory on the GeForce RTX 4070. AMD gloated that its 16GB Radeon graphics cards now start from $479, while consumers don't get the same amount of memory on the latest GeForce RTX 40-series graphics cards until they hit the GeForce RTX 4080 tier, which starts at $1,199. This could push Nvidia to release a GeForce RTX 4070 16GB. Gigabyte previously listed a GeForce RTX 4070 16GB on its website, lending some credence to the theory.

Gamers would probably like to see a GeForce RTX 4070 16GB, given the rising VRAM requirements of modern triple-A games. At this point, however, it's unlikely that one will launch. Releasing a 16GB model this early would only anger early GeForce RTX 4070 adopters, and let's not forget the GeForce RTX 4070 Ti, which also has 12GB. The regular GeForce RTX 4070, at $599, is already hard to swallow, and adding more memory would push the MSRP even higher, although that wouldn't be a severe issue within the GeForce RTX 40-series product stack, since every SKU so far carries a premium price tag.

According to a reliable source, Nvidia will allegedly announce the GeForce RTX 4060 Ti this month. We've only seen retailer listings for an 8GB model; however, a 16GB variant is reportedly launching too. The GeForce RTX 4060 Ti 16GB, seemingly hitting the retail market in July, will be the cheapest Ada Lovelace SKU with 16GB that gamers can purchase. Nvidia could use clamshell mode on the VRAM and just put memory on both sides of the PCB as well, which would allow for 16GB with only a 128-bit interface.
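The clamshell arithmetic is straightforward. A minimal sketch, assuming the standard 16Gb (2GB) GDDR6/GDDR6X modules these cards use, each sitting on a 32-bit channel, with clamshell mode pairing two modules per channel:

```python
# Memory capacity for various bus widths, assuming standard
# 16Gb (2GB) GDDR6/GDDR6X modules on 32-bit channels.
BITS_PER_CHANNEL = 32
GB_PER_MODULE = 2  # 16Gb module density

def capacity_gb(bus_width_bits: int, clamshell: bool = False) -> int:
    channels = bus_width_bits // BITS_PER_CHANNEL
    # Clamshell mode places a second module on the back of the PCB,
    # doubling capacity without widening the memory interface.
    modules = channels * (2 if clamshell else 1)
    return modules * GB_PER_MODULE

print(capacity_gb(128))                  # 8 GB  (RTX 4060 Ti 8GB)
print(capacity_gb(128, clamshell=True))  # 16 GB (rumored RTX 4060 Ti 16GB)
print(capacity_gb(192))                  # 12 GB (RTX 4070 / 4070 Ti)
print(capacity_gb(256))                  # 16 GB (RTX 4080)
```

The trade-off is that clamshell doubles capacity but not bandwidth, so a 16GB card on a 128-bit bus still moves data no faster than its 8GB sibling.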

We'll have to wait to find out more about the RTX 4060 Ti and whether or not a 4070 16GB model is forthcoming. We fully expect to see an 8GB 4060 Ti, and the RTX 3060 also launched with a 12GB model even though the RTX 3060 Ti, RTX 3070, and RTX 3070 Ti were all limited to just 8GB. So it's not out of the question for a lower tier part to be available with more VRAM than higher tier parts.

Zhiye Liu
RAM Reviewer and News Editor

Zhiye Liu is a Freelance News Writer at Tom’s Hardware US. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.

  • hannibal
    4070 super?
    What would they call it...
    4080se would sound more likely...
    Only $999
    :)
  • Jagar123
    I actually audibly laughed out loud reading the first sentence of this article, "It didn't take long for the GeForce RTX 4070 to position itself as one of the best graphics cards for gamers."

    I know management is making you put that in there to get clicks, but from everything I've read gamers everywhere are actually shunning the 4070 pretty hard. I know I am.

    I'd recommend maybe choosing a different opening in future articles? It just sounds inauthentic the way it is written now.
  • JarredWaltonGPU
    Jagar123 said:
    I actually audibly laughed out loud reading the first sentence of this article, "It didn't take long for the GeForce RTX 4070 to position itself as one of the best graphics cards for gamers."

    I know management is making you put that in there to get clicks, but from everything I've read gamers everywhere are actually shunning the 4070 pretty hard. I know I am.

    I'd recommend maybe choosing a different opening in future articles? It just sounds inauthentic the way it is written now.
    The SEO gods require tribute. Also, I don't think the 4070 is all that bad, considering the whole package. It's ~3080 performance, plus DLSS3 and a couple extra GB of VRAM, for $100 less, and with ~60% of the power draw. It's not AWESOME and you wouldn't upgrade from a 3080 or 6800 (or higher) to the 4070. But if you have an RTX 20-series or RX 5000-series card and are looking for a potential upgrade, for the time being it's the "best $600" card in my book.

    And no, that doesn't mean you should run out and buy a $600 GPU right now. Give it another month and we'll see what 4060 Ti and maybe even RX 7600 XT look like, hopefully for a lot less than $600. I also want to see where RX 7800/7700 land, in performance, features, and price. But everything going on right now suggests things aren't actually going to get much better than the current status quo. I'd love to see an RX 7800 XT that clearly beats the RTX 4070 for about the same price. I'm not convinced that will happen.
  • Metal Messiah.
    Another theory is that Nvidia will use the RTX 4090 Laptop GPU, where some of the dies are defective.

    The Laptop 4090 SKU has the same specs as the NVIDIA GeForce RTX 4080 Desktop GPU, as far as core configurations are concerned. And more importantly, the RTX 4090 mobile uses 16 GB of GDDR6 dedicated graphics memory with a clock speed of 20 Gbps (effective).

    So NVIDIA could re-use some of the laptop chips as well, since both the desktop 4080 and laptop 4090 chips have a 16GB memory capacity and 256-bit bus.

    At the same time, the 4090 is still available in a multitude of variants, with power packages between 80 and 150W and support for up to 25W extra with Dynamic Boost 2.0
  • helper800
    Metal Messiah. said:
    Another theory is that Nvidia will use the RTX 4090 Laptop GPU, where some of the dies are defective.

    The Laptop 4090 SKU has the same specs as the NVIDIA GeForce RTX 4080 Desktop GPU, as far as core configurations are concerned. And more importantly, the RTX 4090 mobile uses 16 GB of GDDR6 dedicated graphics memory with a clock speed of 20 Gbps (effective).

    So NVIDIA could re-use some of the laptop chips as well, since both the desktop 4080 and laptop 4090 chips have a 16GB memory capacity and 256-bit bus.
    Don't they hardwire in wattage limits for those mobile chips? If so I don't think they could really reuse them for desktop cards. Though it would not be the first time mobile GPU chips ended up in desktop graphics cards.
  • Jagar123
    JarredWaltonGPU said:
    The SEO gods require tribute. Also, I don't think the 4070 is all that bad, considering the whole package. It's ~3080 performance, plus DLSS3 and a couple extra GB of VRAM, for $100 less, and with ~60% of the power draw. It's not AWESOME and you wouldn't upgrade from a 3080 or 6800 (or higher) to the 4070. But if you have an RTX 20-series or RX 5000-series card and are looking for a potential upgrade, for the time being it's the "best $600" card in my book.

    And no, that doesn't mean you should run out and buy a $600 GPU right now. Give it another month and we'll see what 4060 Ti and maybe even RX 7600 XT look like, hopefully for a lot less than $600. I also want to see where RX 7800/7700 land, in performance, features, and price. But everything going on right now suggests things aren't actually going to get much better than the current status quo. I'd love to see an RX 7800 XT that clearly beats the RTX 4070 for about the same price. I'm not convinced that will happen.
    I know my aging 2080 is needing an upgrade. I just don't feel anything released so far warrants it. I am content continuing to wait. If that means I skip this generation because I don't think the prices are where they should be then so be it. I do want AMD to do better with their 7800 XT release but I don't have much hope for it being priced that well either.
  • helper800
    Jagar123 said:
    I know my aging 2080 is needing an upgrade. I just don't feel anything released so far warrants it. I am content continuing to wait. If that means I skip this generation because I don't think the prices are where they should be then so be it. I do want AMD to do better with their 7800 XT release but I don't have much hope for it being priced that well either.
    Just wait until a 400 dollar card gives you +100% performance. I usually do not jump on a graphics card upgrade until I get at least +100%.
  • Metal Messiah.
    helper800 said:
    Don't they hardwire in wattage limits for those mobile chips? If so I don't think they could really reuse them for desktop cards. Though it would not be the first time mobile GPU chips ended up in desktop graphics cards.

    No, the 4090 is actually available in a multitude of variants, with power packages between 80 and 150W and support for up to 25W extra with Dynamic Boost 2.0. So they can re-use some of these chips if need be. I just edited my post now btw to add this text.
  • JarredWaltonGPU
    Metal Messiah. said:
    Another theory is that Nvidia will use the RTX 4090 Laptop GPU, where some of the dies are defective.

    The Laptop 4090 SKU has the same specs as the NVIDIA GeForce RTX 4080 Desktop GPU, as far as core configurations are concerned. And more importantly, the RTX 4090 mobile uses 16 GB of GDDR6 dedicated graphics memory with a clock speed of 20 Gbps (effective).

    So NVIDIA could re-use some of the laptop chips as well, since both the desktop 4080 and laptop 4090 chips have a 16GB memory capacity and 256-bit bus.

    At the same time, the 4090 is still available in a multitude of variants, with power packages between 80 and 150W and support for up to 25W extra with Dynamic Boost 2.0
    I'm not sure what you're getting at. It's why we said "AD103" in the article, which is the same chip in both desktop 4080 and laptop 4090. Any differences are due to binning and finalizing of what voltage to use. Maybe you're just referring to the use of GDDR6 in the mobile 4090 rather than GDDR6X in the desktop 4080, but I'd be surprised for Nvidia to use GDDR6 in a 16GB desktop 4070 now. I actually expected the RTX 4070 to use GDDR6 before it was announced, and the fact that it doesn't probably means Nvidia has plenty of GDDR6X to go around.
  • Metal Messiah.
    JarredWaltonGPU said:
    I'm not sure what you're getting at. It's why we said "AD103" in the article, which is the same chip in both desktop 4080 and laptop 4090. Any differences are due to binning and finalizing of what voltage to use. Maybe you're just referring to the use of GDDR6 in the mobile 4090 rather than GDDR6X in the desktop 4080, but I'd be surprised for Nvidia to use GDDR6 in a 16GB desktop 4070 now. I actually expected the RTX 4070 to use GDDR6 before it was announced, and the fact that it doesn't probably means Nvidia has plenty of GDDR6X to go around.

    I'm just saying that the Laptop 4090 SKU is also based on the AD103 chip as the desktop RTX 4080 variant. No other GPU is based on AD103, except the RTX 5000 Mobile Ada Generation SKU.

    So there might be a possibility for Nvidia to salvage some of these laptop chips, though using GDDR6 memory would be kind of odd. Even I think this would be weird, unless GDDR6X chips are costing Nvidia too much (seems a bit unlikely, though)?

    Or maybe there are not many defective DESKTOP 4080 dies to begin with?