Nvidia Announces GeForce RTX 3050, RTX 3090 Ti and Laptop 3070 Ti and 3080 Ti

(Image credit: Nvidia)

Nvidia has finally announced a "budget" RTX card with the GeForce RTX 3050. Not to be confused with the laptop RTX 3050 and 3050 Ti, the desktop variant is a slightly different beast as it doubles down on the VRAM. Along with the 3050 desktop card, Nvidia also revealed details on the already leaked RTX 3090 Ti, and announced RTX 3070 Ti and RTX 3080 Ti laptop GPUs. These will all compete to join our list of the best graphics cards. Here's what you need to know.

The RTX 3050 desktop card will use the same trimmed-down GA107 GPU found in the mobile 3050/3050 Ti. That's not great news, since GA107 only has a 128-bit memory interface and tops out at 2560 CUDA cores. The desktop 3050 will use the maximum 20 SMs (streaming multiprocessors), and at 128 FP32 CUDA cores per Ampere SM that works out to 2560 GPU cores. As with the rest of the RTX 30-series, the 3050 also features RT cores for ray tracing and tensor cores for DLSS and other applications.
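
As a quick sanity check on that math, here's a minimal sketch of the SM-to-core arithmetic (128 cores per SM is Nvidia's published figure for consumer Ampere; the helper function is ours, purely for illustration):

```python
# Back-of-the-envelope check: consumer Ampere has 128 FP32 CUDA cores per SM.
CORES_PER_SM = 128

def cuda_cores(sm_count: int) -> int:
    """Total FP32 CUDA cores for a given SM count."""
    return sm_count * CORES_PER_SM

print(cuda_cores(20))  # RTX 3050 desktop: 20 SMs -> 2560 CUDA cores
```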

There is some good news in that the 3050 desktop card will at least feature 8GB of VRAM — much better than the 4GB used on the mobile 3050 Ti and 3050. Nvidia hasn't revealed whether that's 16Gbps GDDR6 or something else (18Gbps would be nice…), but we'll find out more in the next few weeks. Also on the good news front is that the TGP (total graphics power) is just 130W for the reference specs, though it sounds as though we'll see plenty of AIB partner cards still equipped with an 8-pin power connector — that's sufficient for 225W of total power, including the PCIe x16 slot.
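
For context, the bandwidth and power numbers are simple arithmetic. The sketch below assumes the speculative 16Gbps and 18Gbps memory speeds mentioned above (not confirmed specs), plus the standard PCIe power limits of 75W from the x16 slot and 150W from an 8-pin connector:

```python
# Peak memory bandwidth: bus width (bits) / 8 * data rate (Gbps) -> GB/s
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(bandwidth_gb_s(128, 16))  # 256.0 GB/s if the 3050 ships with 16Gbps GDDR6
print(bandwidth_gb_s(128, 18))  # 288.0 GB/s with the hoped-for 18Gbps memory

# Power headroom: PCIe x16 slot (75W) plus one 8-pin connector (150W)
PCIE_SLOT_W, EIGHT_PIN_W = 75, 150
print(PCIE_SLOT_W + EIGHT_PIN_W)  # 225W available, well above the 130W reference TGP
```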

The GeForce RTX 3050 will go on sale on January 27, with a suggested starting price of $249. It should also feature Nvidia's LHR anti-mining hardware, though as we've seen already, that's unlikely to truly stop miners from trying to grab the cards. Maybe Ethereum's long-awaited switch to proof of stake will help when that finally arrives sometime this year (fingers crossed). The RTX 3050 should also help to fill the gap between the RTX 3060 and the previous generation RTX 2060, though we'll have to wait and see how it performs once we can get cards in for testing.

(Image credit: Nvidia)

Next up, the previously leaked RTX 3090 Ti will give a minor bump in GPU core counts at the top of Nvidia's GeForce product stack. Given that we now have Ti cards for the 3090, 3080, 3070, and 3060, it's a bit interesting that there isn't a desktop RTX 3050 Ti, though that model does exist on laptops — except with lower performance than the desktop 3050. Anyway, the RTX 3090 Ti will utilize a fully enabled GA102 GPU, giving it 84 SMs and 10752 CUDA cores, compared to the RTX 3090's 82 SMs and 10496 CUDA cores.

Nvidia will likely also bump up the maximum GPU clocks on the 3090 Ti (it hasn't revealed exact specs yet), along with running higher GDDR6X memory speeds. The 3090 used 24GB of 19.5Gbps memory spread across twenty-four 8Gb chips, half of them on the back of the PCB, but the RTX 3090 Ti will feature 24GB of 21Gbps GDDR6X memory, presumably using twelve 16Gb chips. If correct, that means all of the memory will now reside on one side of the PCB, which should help with cooling the hot and power-hungry GDDR6X.
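
The same back-of-the-envelope math covers the 3090 Ti, this time with GA102's known 384-bit memory bus; note that the twelve 16Gb chips are the presumption from the capacity math above, not a confirmed spec:

```python
CORES_PER_SM = 128  # FP32 CUDA cores per consumer Ampere SM

print(82 * CORES_PER_SM)  # RTX 3090:    10496 CUDA cores
print(84 * CORES_PER_SM)  # RTX 3090 Ti: 10752 CUDA cores (fully enabled GA102)

# Same 24GB total capacity either way; chip density determines the chip count
print(24 * 8 // 8)   # 24 chips x 8Gb  = 24GB (RTX 3090, both sides of the PCB)
print(12 * 16 // 8)  # 12 chips x 16Gb = 24GB (RTX 3090 Ti, one side only)

# Bandwidth gain from the faster GDDR6X on the 384-bit bus
print(384 / 8 * 19.5)  # 936.0 GB/s at 19.5Gbps (RTX 3090)
print(384 / 8 * 21.0)  # 1008.0 GB/s at 21Gbps (RTX 3090 Ti)
```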

Whatever the core specs, the RTX 3090 Ti Founders Edition will keep the same massive 3-slot design of the RTX 3090 Founders Edition. Nvidia's partners will naturally experiment with their own custom designs, which will include factory overclocks. Get ready for a new halo GPU… at least until Lovelace and the RTX 40-series launch, which we still expect to happen before the end of 2022.

Nvidia didn't announce a launch price for the RTX 3090 Ti, but considering the RTX 3090 routinely sells for over $2,000 already, we suspect it will start at $1,999. We also expect that, like the 3090, the 3090 Ti won't implement Nvidia's LHR technology, meaning it will offer full mining performance and should be capable of over 120 MH/s in Ethereum (at least until Ethereum 2.0 kills off mining). Nvidia will provide additional details on the 3090 Ti later this month.

Last but not least, Nvidia announced the new RTX 3070 Ti and RTX 3080 Ti for laptops. The existing RTX 3080 laptop GPU uses the same GA104 GPU that powers the desktop 3070 and 3060 Ti. The RTX 3080 Ti laptop GPU will apparently stick with GA104 (Nvidia again didn't provide specs, but we'll update once those become available), though it does include 16GB of 16Gbps GDDR6 memory — double the capacity of the existing RTX 30-series laptop GPUs. As for the RTX 3070 Ti, other than improved performance relative to the existing RTX 3070 laptop GPU, we're still waiting for additional details.

Nvidia also announced new updates to its Max-Q technology. The fourth generation of Max-Q will deliver better power optimizations to boost battery life while keeping performance high. Laptops with the 3080 Ti and 3070 Ti will be available starting in February 2022.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • COLGeek
    Let the price gouging begin! o_O
    Reply
  • JarredWaltonGPU
    COLGeek said:
    Let the price gouging begin! o_O
    No doubt. I'd love to see a lot of these cards actually go on sale for $249. Somehow, I don't think that's going to happen.
    Reply
  • avenge
    So again...low end GPU with 8 GB RAM? Who will need that for RTX 3050? Maybe it will run games @2k resolution or more and max settings? What happened with 3070 TI 16 GB?
    Reply
  • JarredWaltonGPU
    avenge said:
    So again...low end GPU with 8 GB RAM? Who will need that for RTX 3050? Maybe it will run games @2k resolution or more and max settings? What happened with 3070 TI 16 GB?
    Last rumor/leak was that it was "delayed" a few months, probably due to a lack of sufficient 16Gb GDDR6X modules. Which are probably going into the 3090 Ti first.

    As for the 8GB, it's better than 4GB. Granted, most of the time a GPU of this level won't strictly need 8GB, but I'd much rather have 8GB than 4GB at this stage — even if most of the time I was playing using settings that didn't exceed 4GB, just having the potential to push a bit higher is nice. Then again, most of the time I leave an RTX 3090 in my system. :D
    Reply
  • hotaru.hino
    avenge said:
    So again...low end GPU with 8 GB RAM? Who will need that for RTX 3050?
    It's either that or 4GB, and given that the RTX 3050 should be better than the RTX 2060 judging by the specs, I'd rather have the 8GB.
    Reply
  • spongiemaster
    hotaru.hino said:
    It's either that or 4GB, and given that the RTX 3050 should be better than the RTX 2060 judging by the specs, I'd rather have the 8GB.
    8GB is what makes it a viable target for Ethereum miners. 4GB would mean you're only competing with other gamers for the cards. If you're stuck paying $350 for a "low-end" GPU, would you rather it be the current 1650 or a 4GB RTX 3050? The price is still jacked up, but not having to compete with miners' bottomless demand means you may actually be able to buy a 4GB card and not have to pay a 100%+ markup over MSRP.
    Reply
  • hotaru.hino
    spongiemaster said:
    8GB is what makes it a viable target for Ethereum miners. 4GB would mean you're only competing with other gamers for the cards. If you're stuck paying $350 for a "low-end" GPU, would you rather it be the current 1650 or a 4GB RTX 3050? The price is still jacked up, but not having to compete with miners' bottomless demand means you may actually be able to buy a 4GB card and not have to pay a 100%+ markup over MSRP.
    Shipping it with 4GB is going to limit its useful longevity, and your concern is only one for the DIY market. System builders won't have this issue.

    EDIT: Also the card's going to get scalped and price jacked regardless. Even the 1650 is going for nearly double its original MSRP.
    Reply
  • spongiemaster
    hotaru.hino said:
    Shipping it with 4GB is going to limit its useful longevity, and your concern is only one for the DIY market. System builders won't have this issue.
    Why choose one or the other? Why aren't two different memory configurations an option? I wouldn't recommend anyone buy for useful longevity in this market. Either buy what you want, or buy the cheapest option you can use as a stopgap. If PC gaming wants a prosperous future, GPU prices will have to normalize within the next year or two. $300 dGPUs that are barely better than an IGP, $1,000 midrange cards, and $2,000+ high-end cards, with off-hours basement mining to offset the cost, is not sustainable long term.
    Reply
  • InvalidError
    avenge said:
    So again...low end GPU with 8 GB RAM? Who will need that for RTX 3050?
    Firefox and Chrome alone can gobble up around 2GB of VRAM each if you leave GPU acceleration turned on. I had to force browsers to run on the IGP to stop them from hogging my GTX1050's memory and causing artifacts while gaming. For people who like using browsers with GPU acceleration enabled while gaming, 4GB would be an awfully close shave. I'd say 6GB is the bare minimum for comfort.
    Reply
  • hotaru.hino
    spongiemaster said:
    Why choose one or the other? Why aren't 2 different memory configurations an option?
    Because this creates a problem with production and essentially doubles the complexity of the supply chain. AMD and NVIDIA seem to be simplifying their VRAM chip orders to 2GB chips only. Adding 1GB chips into the mix will cut into the 2GB chip supply in some form or fashion. Not to mention if the lower configuration doesn't pan out for some reason, you're now left with stock that you can't use. Sure you could nerf the 2GB chips to 1GB, but... why not just use 2GB? And while I can't say this is a thing for memory, I'm sure the yield rate for 2GB chips is high enough now that there's no point in trying to make 1GB chips out of the defects.

    spongiemaster said:
    I wouldn't recommend anyone buy for useful longevity in this market.
    And some people only have so much money to spend and want something that will last at least 4-5 years.

    But again, your conjecture that selling a 4GB card will give the 3050 a better chance in the DIY market because it won't be price-jacked by miners doesn't look promising when 1650s are going for way more than their MSRP.
    Reply