Nvidia Flip-Flopping Over GeForce RTX 4070 Specs, Says Leaker

Nvidia GeForce RTX 4070
(Image credit: Nvidia)

We've covered several leaks and rumors regarding the specifications of Nvidia's RTX 40-series graphics cards over recent weeks. According to a prominent leaker, as we reach the end of September 2022, Nvidia is still in two minds about the final specifications of the RTX 4070. Today, Twitter-based leaker Kopite7Kimi laid his cards on the table, saying Nvidia is flip-flopping between two very different spec levels for its important RTX 4070 SKU, which is poised to go up against the best graphics cards.

To make things clear for our readers, we have tabulated Kopite7Kimi's rumored specs for the RTX 4070 below. The first spec column lists the figures the leaker shared nearly three weeks ago; this is the more powerful and power-hungry PG141-SKU340/341 mentioned in today's tweet. The second spec column shows the new alternative specification (PG141-SKU336/337), which some would describe as significantly cut back.

| Nvidia GeForce | RTX 4070 | RTX 4070 |
| --- | --- | --- |
| Board code | PG141-SKU340/341 | PG141-SKU336/337 |
| CUDA cores | 7,680 | 7,168 |
| Base / Boost / Max (MHz) | 2,310 / 2,610 / 2,800 | N/A |
| Memory config | 12GB 21Gbps GDDR6X | 10GB 21Gbps GDDR6X |
| TGP | 285W | 250W |
| Time Spy Extreme | <11,000 | <10,000 |
| Date spec first reported | Aug 4, 2022 | Aug 29, 2022 |

The second SKU tones down the GPU core count, cuts the memory quota, and is approximately 10% slower in synthetic benchmarks. In his update, Kopite7Kimi didn't share any GPU clock speed indicators, so we have no clues as to whether GPU clocks have been adjusted as part of the drive down to 250W and a potentially lower entry price.
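For a rough sense of scale, here's a quick back-of-envelope comparison of the two rumored configurations. The figures are the leaked numbers tabulated above, with the Time Spy Extreme entries taken at their quoted "<" ceilings, so treat the output as arithmetic on rumors rather than confirmed specs.

```python
# Back-of-envelope deltas between the two rumored RTX 4070 configurations.
# All numbers are the leaked figures from the table above; the Time Spy
# Extreme values are the "<11,000" / "<10,000" ceilings, not measured scores.
sku_340_341 = {"CUDA cores": 7680, "VRAM (GB)": 12, "TGP (W)": 285, "TSE ceiling": 11000}
sku_336_337 = {"CUDA cores": 7168, "VRAM (GB)": 10, "TGP (W)": 250, "TSE ceiling": 10000}

for metric, a in sku_340_341.items():
    b = sku_336_337[metric]
    print(f"{metric}: {a} -> {b} ({(b - a) / a:+.1%})")
# Prints roughly: CUDA cores -6.7%, VRAM -16.7%, TGP -12.3%, TSE ceiling -9.1%
```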

If the rumors of Nvidia dithering between these rather different RTX 4070 configurations are true, it looks like the green team is finding it hard to finalize its Ada Lovelace family. We assume Nvidia is juggling these specs to come up with a product that can hold its own against what its major competitor AMD has brewing with its Radeon RX 7000 Series.


In the longer term, Nvidia can fill out its RTX 40 range with standard models and then supplement those with Ti models, "Super" variants, or whatever to reach a very wide range of performance / price points. However, it would be optimal to get the performance / price positioning of the RTX 4070 right from the start.

Nvidia's xx70 cards are often viewed as the sweet spot that balances price and performance. While the GTX 1060, RTX 2060, and RTX 3060 outsold the 1070, 2070, and 3070, the latter are typically categorized as "high-end" cards while the former are merely "midrange" offerings. Plus, we think it's likely that Nvidia won't launch an RTX 4060 until next year, meaning the RTX 4070 could be the least expensive of the new Ada Lovelace offerings we'll see this year. Current rumors suggest it should come close to matching the outgoing RTX 3090 Ti in performance.

The next-generation video cards from both Nvidia and AMD are going to be some of the most exciting we've seen, and in these last weeks ahead of the first launches we may be witnessing some 4D chess shenanigans from the rivals. Is AMD jebaiting the green team again? We shouldn't have to wait much longer to find out.

Mark Tyson
Freelance News Writer

Mark Tyson is a Freelance News Writer at Tom's Hardware US. He enjoys covering the full breadth of PC tech, from business and semiconductor design to products approaching the edge of reason.

  • AndrewJacksonZA
    With the xx60, xx70, xx80 and the competing names from AMD having their pricing go bonkers, I've stopped using the names of the cards to segment them into "budget," "mid-range," etc., and now look at the price points. Companies love trying to pull psychological tricks with marketing, so I try to avoid allowing them to do so, especially with things like "the xx60 is now as powerful as the previous generation xx80" (as an example).

    Just look at the price compared to the previous several generations, plus inflation for each.
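For readers who want to try the price-plus-inflation comparison AndrewJacksonZA describes, here's a minimal sketch. The MSRPs are the commonly cited US launch prices for the previous xx70 cards, and the inflation multipliers are rough placeholder factors to 2022 dollars rather than official CPI figures, so swap in your own numbers.

```python
# A minimal sketch of the comparison suggested above: take each xx70 card's
# launch MSRP and scale it by a rough inflation factor into 2022 dollars.
# The multipliers below are approximate placeholders, not official CPI data.
launch_msrp_usd = {
    "GTX 1070 (2016)": 379,
    "RTX 2070 (2018)": 499,
    "RTX 3070 (2020)": 499,
}
rough_inflation_to_2022 = {
    "GTX 1070 (2016)": 1.25,  # ~25% cumulative inflation since 2016 (approx.)
    "RTX 2070 (2018)": 1.17,  # approx.
    "RTX 3070 (2020)": 1.13,  # approx.
}

for card, msrp in launch_msrp_usd.items():
    adjusted = msrp * rough_inflation_to_2022[card]
    print(f"{card}: ${msrp} at launch is roughly ${adjusted:.0f} in 2022 dollars")
```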
  • bigdragon
    The minimum amount of VRAM for 2023 is 16GB. AMD set that standard back in 2020, and so did the consoles. Both 12GB and 10GB are unacceptable for hardware priced as a high-end GPU. Nvidia needs to increase their 4070 VRAM or lower their price.

    I wonder if Nvidia will try to make more space between the 4080 and 4060. There wasn't much of a gap in the 30-series.
  • AgentBirdnest
    Kopite7Kimi Flip-Flopping Over RTX 40 Series, Says AgentBirdnest.

    I'm tired of stories about this "leaker".
  • DougMcC
    bigdragon said:
    The minimum amount of VRAM for 2023 is 16GB. AMD set that standard back in 2020, and so did the consoles. Both 12GB and 10GB are unacceptable for hardware priced as a high-end GPU. Nvidia needs to increase their 4070 VRAM or lower their price.

    I wonder if Nvidia will try to make more space between the 4080 and 4060. There wasn't much of a gap in the 30-series.

    Yeah, but xx70 isn't considered high end, right? That's the top of the mid-range. xx80 and above is the high end of GPUs.
  • jkflipflop98
    Sounds like this "leaker" is pulling values directly from his backside and shotgunning as much crap as he can against the wall. At least one of his thousand "predictions" is bound to come true, and then he gets to say "See?! I told ya so!!"
  • Eximo
    It comes down to how many cards are manufactured already and how far along the AIBs are into packaging.

    Also, nothing says those leaks weren't for laptop GPUs but still on development boards. It might very well be both versions of the RTX 4070.

    5888 cores on the RTX 3070
    5120 cores on the RTX 3070 mobile
  • alceryes
    bigdragon said:
    The minimum amount of VRAM for 2023 is 16GB. AMD set that standard back in 2020, and so did the consoles. Both 12GB and 10GB are unacceptable for hardware priced as a high-end GPU. Nvidia needs to increase their 4070 VRAM or lower their price.
    Agreed.
    Remember though, GPU chip mfgs. (NVIDIA/AMD) build limits into GPUs on purpose. The RTX 3080 10GB is a perfect example. It will run out of VRAM way before the rasterization power becomes an issue. Limiting cards to 8/10/12GBs is another way of the mfg. saying, "you will be looking at purchasing another GPU in ~3 or so years."
  • JamesJones44
    bigdragon said:
    The minimum amount of VRAM for 2023 is 16GB. AMD set that standard back in 2020, and so did the consoles. Both 12GB and 10GB are unacceptable for hardware priced as a high-end GPU. Nvidia needs to increase their 4070 VRAM or lower their price.

    I wonder if Nvidia will try to make more space between the 4080 and 4060. There wasn't much of a gap in the 30-series.

    I think it depends on the use case/target. If you are talking about a top-of-the-line card advertised for beyond-4K gaming, I agree. However, for a midrange card that tops out at 4K it's not really a requirement; 10 to 12 GB is plenty for the games out there and in the pipe. Gaming at 4K may never really use a full 16 GB of memory. Remember, VRAM is largely used for textures, and at 4K each uncompressed texture is ~67 MB; at 16 GB that's well over 200 individual uncompressed textures per frame (238 theoretical), and not many games out there are even close to that. However, as you approach 8K it becomes a bigger requirement: an uncompressed 8K texture is closer to 270 MB, which leaves you with a theoretical max of 59 uncompressed individual textures per frame (there is overhead and other factors, so you probably won't be able to address the full 16 GB just for texture mapping).
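For readers who want to check JamesJones44's texture math, the sketch below reproduces the figures in that comment. The comment doesn't state the assumed texture format, so this assumes 4096x4096 and 8192x8192 textures at 4 bytes per pixel (uncompressed RGBA8) and a decimal 16 GB, which lines up with the ~67 MB / 238 and ~270 MB / 59 numbers quoted.

```python
# Reconstruction of the VRAM arithmetic in the comment above. Assumptions
# (mine, not stated in the comment): "4K" and "8K" textures are 4096x4096
# and 8192x8192 at 4 bytes per pixel (uncompressed RGBA8), and "16 GB"
# means decimal gigabytes.
VRAM_BYTES = 16_000_000_000

def texture_bytes(side_px: int, bytes_per_pixel: int = 4) -> int:
    """Size of one uncompressed square texture in bytes."""
    return side_px * side_px * bytes_per_pixel

for label, side in (("4K", 4096), ("8K", 8192)):
    size = texture_bytes(side)
    print(f"{label}: {size / 1e6:.0f} MB per texture, "
          f"{VRAM_BYTES // size} uncompressed textures fit in 16 GB")
# Output: roughly 67 MB / 238 textures at 4K and 268 MB / 59 textures at 8K,
# matching the figures quoted in the comment.
```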
  • Phaaze88
    alceryes said:
    Agreed.
    Remember though, GPU chip mfgs. (NVIDIA/AMD) build limits into GPUs on purpose. The RTX 3080 10GB is a perfect example. It will run out of VRAM way before the rasterization power becomes an issue. Limiting cards to 8/10/12GBs is another way of the mfg. saying, "you will be looking at purchasing another GPU in ~3 or so years."
    3060 being the opposite: rasterization power falling short before the extra Vram can be taken advantage of.
    Either way, "you will be looking at purchasing another GPU in ~3 or so years."

    3090(Ti)? Similar boat as the 3060; few people will take advantage of 24GBs of Vram for games. Sell it as a poor man's Quadro when it's no longer satisfactory, I guess.
    Next gen should have a lower tier model matching it while being more affordable and requiring less power to run.


    "Where's your futureproofing now?"
    /S
  • cAllen
    Future-proofing is better defined as time-limited deferral.