"Very Few Are Interested" in RTX 4060 Ti 16GB GPUs, Nvidia AIB Sources Reportedly Say
Is the proposed $499 pricing too close to the RTX 4070?
According to Andreas Schilling, an editor at Germany's HardwareLuxx, Nvidia add-in-board (AIB) partners are largely disinterested in preparing and promoting GeForce RTX 4060 Ti 16GB graphics cards. That could mean fewer choices than expected for anyone eyeing this GPU at its $499 price point.
The German tech magazine editor tweeted (h/t VideoCardz) that he had discussed the upcoming launch of the RTX 4060 Ti 16GB with several AIBs. In particular, he pointed to "how many fewer models there are compared to the standard variant." Reading between the lines, AIBs may only be going through the motions with this launch, which would ultimately mean fewer models, perhaps even some market scarcity.
"Talked to some AIB partners: It looks like very few are interested in promoting the GeForce RTX 4060 Ti 16 GB in a big way. You can already see that in how many fewer models there are compared to the standard variant. Its getting to close to the GeForce RTX 4070." (Andreas Schilling, July 6, 2023)
Market scarcity could be a bad thing for consumers, as it usually stymies price competition. However, in this case, the mix of performance, features, memory, and the $499 target price might mean no one will be interested in this product anyway. A major issue for this upcoming GeForce RTX 40 family member is that the $499 price point is getting perilously close to the gravitational pull of the RTX 4070 zone.
Graphics Card | RTX 4060 Ti | RTX 4060 Ti 16GB | RTX 4060 | RTX 4070 | RTX 3070 | RTX 3060 Ti | RX 6750 XT | RX 6700 | Arc A770 16GB |
---|---|---|---|---|---|---|---|---|---|
Architecture | AD106 | AD106 | AD107 | AD104 | GA104 | GA104 | Navi 22 | Navi 22 | ACM-G10 |
Process Technology | TSMC 4N | TSMC 4N | TSMC 4N | TSMC 4N | Samsung 8N | Samsung 8N | TSMC N7 | TSMC N7 | TSMC N6 |
Transistors (Billion) | 22.9 | 22.9 | 18.9 | 32 | 17.4 | 17.4 | 17.2 | 17.2 | 21.7 |
Die size (mm^2) | 187.8 | 187.8 | 158.7 | 294.5 | 392.5 | 392.5 | 336 | 336 | 406 |
SMs / CUs / Xe-Cores | 34 | 34 | 24 | 46 | 46 | 38 | 40 | 36 | 32 |
GPU Cores (Shaders) | 4352 | 4352 | 3072 | 5888 | 5888 | 4864 | 2560 | 2304 | 4096 |
Tensor Cores | 136 | 136 | 96 | 184 | 184 | 152 | N/A | N/A | 512 |
Ray Tracing "Cores" | 34 | 34 | 24 | 46 | 46 | 38 | 40 | 36 | 32 |
Boost Clock (MHz) | 2535 | 2535 | 2460 | 2475 | 1725 | 1665 | 2600 | 2450 | 2100 |
VRAM Speed (Gbps) | 18 | 18 | 17 | 21 | 14 | 14 | 18 | 16 | 17.5 |
VRAM (GB) | 8 | 16 | 8 | 12 | 8 | 8 | 12 | 10 | 16 |
VRAM Bus Width (bits) | 128 | 128 | 128 | 192 | 256 | 256 | 192 | 160 | 256 |
L2 / Infinity Cache (MB) | 32 | 32 | 24 | 36 | 4 | 4 | 96 | 80 | 16 |
ROPs | 48 | 48 | 48 | 64 | 96 | 80 | 64 | 64 | 128 |
TMUs | 136 | 136 | 96 | 184 | 184 | 152 | 160 | 144 | 256 |
TFLOPS FP32 (Boost) | 22.1 | 22.1 | 15.1 | 29.1 | 20.3 | 16.2 | 13.3 | 11.3 | 17.2 |
TFLOPS FP16 (FP8) | 177 (353) | 177 (353) | 121 (242) | 233 (466) | 163 | 130 | 26.6 | 22.6 | 138 |
Bandwidth (GBps) | 288 | 288 | 272 | 504 | 448 | 448 | 432 | 320 | 560 |
TDP (watts) | 160 | 160 | 115 | 200 | 220 | 200 | 250 | 175 | 225 |
Launch Date | May 2023 | Jul 2023 | Jul 2023 | Apr 2023 | Oct 2020 | Dec 2020 | May 2022 | Mar 2021 | Sep 2022 |
Launch Price | $399 | $499 | $299 | $599 | $499 | $399 | $549 | $479 | $349 |
Current Price | $399 | N/A | N/A | $599 | $442 | $377 | $379 | $269 | $349 |
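For reference, the FP32 throughput and bandwidth figures in the table follow directly from the listed shader counts, boost clocks, bus widths, and memory speeds. The short sketch below (illustrative Python, not output from any vendor tool) reproduces the numbers for the two cards at the heart of this story.

```python
# Rough sketch of how the table's FP32 TFLOPS and bandwidth figures are derived.
# Specs are taken from the table above; the formulas are the standard
# shaders x 2 FMA ops per clock x clock, and bus width x per-pin data rate.

def fp32_tflops(shaders: int, boost_clock_mhz: int) -> float:
    """Peak FP32 throughput: shaders * 2 ops per clock (FMA) * boost clock."""
    return shaders * 2 * boost_clock_mhz * 1e6 / 1e12

def bandwidth_gbps(bus_width_bits: int, vram_speed_gbps: float) -> float:
    """Memory bandwidth in GB/s: bus width in bytes * per-pin data rate."""
    return bus_width_bits / 8 * vram_speed_gbps

cards = {
    # (shaders, boost clock MHz, bus width bits, VRAM speed Gbps)
    "RTX 4060 Ti 16GB": (4352, 2535, 128, 18),
    "RTX 4070":         (5888, 2475, 192, 21),
}

for name, (shaders, clock, bus, speed) in cards.items():
    print(f"{name}: {fp32_tflops(shaders, clock):.1f} TFLOPS FP32, "
          f"{bandwidth_gbps(bus, speed):.0f} GB/s")
# RTX 4060 Ti 16GB: 22.1 TFLOPS FP32, 288 GB/s
# RTX 4070: 29.1 TFLOPS FP32, 504 GB/s
```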
The $499 MSRP of the GeForce RTX 4060 Ti 16GB is far higher than typical for an Nvidia 60-class product. Meanwhile, the RTX 4070 adds a significant performance boost for approximately $100 more, with its 5,888 CUDA cores representing over 35% more shaders. Another big advantage of the RTX 4070 stems from its memory subsystem: faster 21 Gbps GDDR6X on a wider 192-bit bus gives it 504 GB/s of bandwidth versus 288 GB/s, a 75% uplift.
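To put that positioning problem in numbers, here is a minimal sketch, again using only figures from the table above, of what stepping up from the $499 RTX 4060 Ti 16GB to the $599 RTX 4070 buys versus what it costs.

```python
# Relative uplift of the RTX 4070 over the RTX 4060 Ti 16GB, per the spec table.
specs = {
    # (CUDA cores, memory bandwidth GB/s, launch price USD)
    "RTX 4060 Ti 16GB": (4352, 288, 499),
    "RTX 4070":         (5888, 504, 599),
}

labels = ("CUDA cores", "memory bandwidth", "price")
base, step_up = specs["RTX 4060 Ti 16GB"], specs["RTX 4070"]

for label, b, s in zip(labels, base, step_up):
    print(f"{label}: +{(s / b - 1) * 100:.0f}%")
# CUDA cores: +35%
# memory bandwidth: +75%
# price: +20%
```

On those figures, the RTX 4070's core and bandwidth advantages come at a proportionally smaller price premium, which is exactly the problem Schilling's AIB sources are pointing to.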
Our Nvidia GeForce RTX 4060 Ti review made the RTX 4070 12GB's superiority very clear, and doubling up on bandwidth-constrained memory isn't going to help the 4060 Ti very much.
Based on a leak we reported yesterday, the GeForce RTX 4060 Ti 16GB's release probably isn't far off. A reliable Twitter tipster shared an image indicating that the 16GB version of the RTX 4060 Ti will be on shelves from July 18, and we typically see reviews a day before.
Returning to Andreas Schilling's tweets, the HardwareLuxx editor asserts that the only reason the GeForce RTX 4060 Ti 16GB SKU exists is "to quiet those critics who find that 8GB are not enough." So stay tuned for our review of this questionable graphics card, where we'll let you know whether the memory-bumped GPU is worthwhile and whether it earns a place among the best graphics cards.
Mark Tyson is a news editor at Tom's Hardware. He enjoys covering the full breadth of PC tech; from business and semiconductor design to products approaching the edge of reason.
vinay2070
Huang won! He wants people to lean towards the 4070 instead of the 4060 Ti. The 4070 is technically a 4060, so he is making people buy a 4060 for $599! What a master plan!
InvalidError "No one is interested in a 16GB RTX4060Ti."Reply
I'm sure plenty of people would be interested if it was $350 instead of $500-and-up.
Though 16GB on a 128-bit bus might pose somewhat of a bandwidth challenge. High-res textures and buffers are the main things that consume VRAM, and the VRAM bandwidth requirement will rise accordingly. Looking forward to benchmarks showing how badly the 128-bit bus might fall flat on its face because of it, with 16GB models taking VRAM exhaustion out of the equation.
hotaru.hino
"Another big advantage of the RTX 4070 stems from its memory subsystem: using faster GDDR6, and a wider memory bus, for almost double the memory bandwidth."
That bandwidth is needed to feed the extra cores. It does not mean it gives it any actual advantage over the 4060 Ti if you normalize things (though I'm not sure how you'd normalize it). Or to put it in another way, it's like saying a Threadripper 3970X has a memory subsystem advantage over Ryzen due to being on a quad channel platform... even though there's at least twice as many cores to feed at this point. Or to reverse it, we feed 16-core Ryzen CPUs with dual channel memory and nobody seems to bat an eye.
In addition, NVIDIA's method to combat this is to load up the GPU with a lot of L2 cache. The GeForce 40 series has, give or take, anywhere from 8 to 12 times as much L2 cache as the GeForce 30 series. It's not that dissimilar from AMD putting a lot of LLC on RDNA 3.
And, curious whether memory bandwidth really had any significant impact, I dug around to see if I could find two video cards where the only thing that changed (more or less) was the bandwidth. The only card I was able to find within the last few generations was the RTX 3060 Ti, which got a GDDR6X upgrade while the rest of the specs stayed identical. It also saw little (as in <5%) improvement.
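For context, one rough way to do the normalization hotaru.hino mentions is to divide each card's memory bandwidth by its FP32 throughput. The sketch below is illustrative Python using only the spec-table figures earlier in the article.

```python
# Bytes of memory bandwidth available per FP32 FLOP, per the article's spec table.
cards = {
    # (bandwidth GB/s, FP32 TFLOPS)
    "RTX 4060 Ti": (288, 22.1),
    "RTX 4070":    (504, 29.1),
    "RTX 3060 Ti": (448, 16.2),
}

for name, (bw, tflops) in cards.items():
    print(f"{name}: {bw / tflops:.2f} GB/s per TFLOP")
# RTX 4060 Ti: 13.03 GB/s per TFLOP
# RTX 4070: 17.32 GB/s per TFLOP
# RTX 3060 Ti: 27.65 GB/s per TFLOP
```

By that crude metric, both Ada cards get by with far less bandwidth per unit of compute than the RTX 3060 Ti, leaning instead on their much larger L2 caches, though the RTX 4070 still carries more bandwidth per TFLOP than the 4060 Ti.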
salgado18
No one is interested in paying $100 for an extra 8 GB of VRAM. $50 would be a nice price difference, but $100 (which is a 25% price increase over the 8 GB model) is way too much for the benefit.
I mean, it sure is good to have lots of VRAM. My nephew has an RX 460 2 GB that doesn't play many games, not because it is a 460, but because it is not the 4 GB model. I know that is a terribly slow card for today, but the future will come, and someone will be using the card.
evdjj3j
salgado18 said: "No one is interested in paying $100 for extra 8 GB of VRAM."
I had to use a 2 GB 770 during the GPU shortage, and 2 more GB of RAM would have made a huge difference.
Kamen Rider Blade
"GDDR6 VRAM Prices Plummet: 8GB of Memory Now Costs $27"
If GDDR6 VRAM costs $27 for 8 GB, why would the end user want to pay more than $27 or $30 extra for the exact same card?
You already have the original profit margin on the 4060 Ti; nobody wants to give Nvidia more profit margin for what is effectively a little bit longer time on the pick-and-place machine.
Hypothetically, $3 should cover the extra time and costs for the machines to do their jobs and place the extra GDDR6 VRAM packages on the board.
bigdragon
$100 more for 8GB? Does Nvidia not realize how inexpensive memory is now? We can all go to our favorite storefronts and see how much memory prices have come down. Nvidia is abusing its market position to intentionally kneecap what should have come standard on the base model 4060 Ti, if not the 4060. It's startling just how out of touch Nvidia is with its customers now. The lack of 4060 Ti 16GB models could be a positive, however. We don't need 5+ variants of the same tier card!
8GB of VRAM no longer meets the minimum requirements of many AAA games. 12GB won't be enough much longer. It's incumbent on Nvidia to either provide more VRAM on their products or work directly with game engine developers to improve the way cross-platform ports utilize memory.
In an ideal world, VRAM would no longer be a thing and we'd have some sort of unified memory separate from the GPU...and user upgradable!
InvalidError
hotaru.hino said: "That bandwidth is needed to feed the extra cores. It does not mean it gives it any actual advantage over the 4060 Ti if you normalize things (though I'm not sure how you'd normalize it). Or to put it in another way, it's like saying a Threadripper 3970X has a memory subsystem advantage over Ryzen due to being on a quad channel platform..."
If you increase bandwidth while keeping the workload exactly the same, you aren't going to see much change unless you already had a non-trivial bandwidth bottleneck.
If you double VRAM and then change the workload to fill that VRAM, such as by loading 4K/6K/8K texture packs, chances are that reads are now scattered that much wider than they were before and bandwidth may play a much larger role, especially when most of the extra usage is high-res textures, which have minimal impact on GPU-power requirements, only VRAM size and bandwidth.
Kamen Rider Blade said: "If GDDR6 VRAM costs $27 for 8 GB, why would the end user want to pay more than $27 or $30 extra for the exact same card?"
Because nobody wants to make a bigger effort with higher expenses and higher liabilities for zero profit. AMD and Nvidia want 40+% and 60+% profit margins, so you need to add 40+% to whatever they put in the GPU kits they sell to AIBs. Then AIBs have to slap their own profit margin on top, then distributors and retailers.
bigdragon said: "8GB of VRAM no longer meets the minimum requirements of many AAA games. 12GB won't be enough much longer."
People who say that are being misleading by omission. Practically all AAA games will play perfectly fine on 8GB GPUs; you just need to lower details a bit to make it fit comfortably.
The statement you should be making is: "8GB of VRAM no longer meets the minimum requirements of many AAA games at 1080p high details and beyond."
Most people looking for GPUs under $300 are perfectly fine dialing things down to save $100+ vs the next sensible step up.
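For a sense of how that margin-stacking argument plays out numerically, here is an illustrative sketch. The $27 figure is the quoted GDDR6 spot price for 8GB; the percentage markups at each stage are assumptions for illustration only, not actual Nvidia, AIB, or retail margins.

```python
# Illustrative margin-stacking on the extra 8GB of GDDR6.
# The $27 BOM adder comes from the quoted spot price; every markup below is assumed.
bom_adder = 27.0  # extra 8GB of GDDR6 at quoted spot pricing

hypothetical_margins = {
    "GPU vendor": 0.40,           # markup on the kit sold to the AIB (assumed)
    "AIB": 0.15,                  # board partner markup (assumed)
    "distributor/retail": 0.10,   # channel markup (assumed)
}

price_adder = bom_adder
for stage, margin in hypothetical_margins.items():
    price_adder *= 1 + margin
    print(f"after {stage} (+{margin:.0%}): ${price_adder:.2f}")
# after GPU vendor (+40%): $37.80
# after AIB (+15%): $43.47
# after distributor/retail (+10%): $47.82
```

Even with those generous assumed markups at every step, the arithmetic lands well short of the $100 gap between the two RTX 4060 Ti SKUs.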
Kamen Rider Blade
bigdragon said: "In an ideal world, VRAM would no longer be a thing and we'd have some sort of unified memory separate from the GPU...and user upgradable!"
Sadly, that isn't the world we live in.