Nvidia GeForce RTX 5060 Ti 16GB review: More VRAM and a price 'paper cut' could make for a compelling GPU

Retail availability and pricing will be critical factors.

Nvidia GeForce RTX 5060 Ti 16GB card photos
(Image: © Tom's Hardware)


Nvidia GeForce RTX 5060 Ti 16GB Test Setup

(Image credit: Tom's Hardware)

This is mostly going to be a rehash of what we've said in other recent reviews, as our testing hasn't changed. At the end of last year, just in time for the Arc B580 launch, we revamped our test suite and our test PC, wiping the slate clean and requiring new benchmarks for every graphics card we want to have in our GPU benchmarks hierarchy.

We have finally updated the GPU benchmarks to use the new test suite and PC (older results are on pages two and three), and we're nearly finished with testing all current and previous generation GPUs. It's been a busy five months, with nine new GPU launches (ten if you count the 5060 Ti 8GB as a separate item), plus retesting previous generation cards.

While Nvidia offers extra software like DLSS that can boost performance and potentially even improve image quality (the DLSS 4 transformer model in quality mode can look better than native rendering with traditional TAA), all of our primary testing omits upscaling and frame generation technologies. That's because the different algorithms — DLSS, FSR, and XeSS — don't always look or feel the same, so we feel it's best to start with the baseline level of performance you can expect.

Keep in mind that quality mode upscaling roughly equates to dropping resolution one 'notch' — so 4K with quality mode upscaling renders at 1440p. Performance mode upscaling drops the render resolution another notch (4K renders at 1080p before upscaling). The higher the upscaling factor, the more potential there is for noticeable upscaling artifacts.
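As a rough illustration of that 'notch' math, here's a quick sketch of how the internal render resolution works out for the common upscaling presets. The scale factors below are the widely published DLSS/FSR/XeSS defaults; exact values can vary by vendor and version, so treat this as an approximation rather than a spec.

```python
# Approximate per-axis render-scale factors for common upscaling presets.
# These match the commonly published DLSS/FSR/XeSS defaults, but exact
# values can differ by vendor and version (illustration only).
UPSCALE_FACTORS = {
    "quality": 1 / 1.5,            # 4K output renders at 1440p
    "balanced": 1 / 1.7,
    "performance": 1 / 2.0,        # 4K output renders at 1080p
    "ultra_performance": 1 / 3.0,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal render resolution before the upscaler runs."""
    scale = UPSCALE_FACTORS[mode]
    return round(out_w * scale), round(out_h * scale)

print(render_resolution(3840, 2160, "quality"))      # (2560, 1440)
print(render_resolution(3840, 2160, "performance"))  # (1920, 1080)
```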

Frame generation — including the new MFG (Multi Frame Generation) of the RTX 50-series — makes things even more complex. It can smooth out the presentation of frames to your display, while at the same time reducing the number of user input samples that get taken relative to the framerate, and introducing some additional input latency. The overall experience can vary quite a bit from game to game, as well as between different technologies like DLSS 3 framegen, DLSS 4 MFG, FSR 3.1 framegen, FSR 4 framegen, and even XeSS 2 framegen.
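To see why framegen smoothness isn't the same thing as responsiveness, here's a deliberately simplified model. It assumes input is sampled once per rendered frame and ignores the base frame rate cost that enabling frame generation actually incurs, so real-world numbers will be somewhat less favorable.

```python
def framegen_model(rendered_fps: float, gen_factor: int) -> dict:
    """Toy model of frame generation: each rendered frame is followed by
    (gen_factor - 1) generated frames. Input is only sampled on rendered
    frames, so responsiveness tracks the render rate, not the presented
    rate. The base-frame-rate cost of enabling framegen is ignored."""
    return {
        "presented_fps": rendered_fps * gen_factor,  # what the monitor shows
        "input_sample_rate_hz": rendered_fps,        # what your mouse 'feels'
    }

# A 60 fps base with 2x framegen (DLSS 3 style) vs 4x MFG (DLSS 4 style):
print(framegen_model(60, 2))  # {'presented_fps': 120, 'input_sample_rate_hz': 60}
print(framegen_model(60, 4))  # {'presented_fps': 240, 'input_sample_rate_hz': 60}
```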

In short, trying to test and quantify performance for all of the various upscaling and frame generation algorithms adds a lot of complexity and uncertainty. The TLDR is that all upscaling and framegen solutions will boost performance (and/or smoothness), potentially at the cost of some image fidelity. If GPU X runs faster than GPU Y at native 1080p rendering, it should also be faster at 4K with performance mode upscaling (which renders internally at 1080p), but depending on the supported algorithms, the game rendering may or may not look the same.

Our GPU test PC has an AMD Ryzen 7 9800X3D processor, the fastest current CPU for gaming purposes. We also have 32GB of DDR5-6000 memory from G.Skill with AMD EXPO timings enabled (CL30), on an ASRock X670E Taichi motherboard. We're running Windows 11 24H2, with the latest drivers at the time of testing.

We used AMD's 25.3.2 drivers for the 7700/7800 GPUs, AMD's preview 24.30.31.03 drivers for the 9070, and older drivers on the 7600/7600 XT. The Nvidia GPUs were tested with several different drivers from the 572 family, most of them on the latest 572.83 release. The RTX 5060 Ti 16GB cards were tested with preview 575.94 drivers. We haven't had time to retest everything on the latest releases, unfortunately, but we have retested a few games and apps where earlier results didn't correlate with later testing.

Our PC is hooked up to an MSI MPG 272URX QD-OLED display, which supports G-Sync and Adaptive-Sync, allowing us to properly experience the higher frame rates that RTX 50-series GPUs with MFG are supposed to be able to reach. Most games won't get anywhere close to the 240Hz limit of the monitor at 4K when rendering at native resolution, which is where framegen and MFG can be particularly helpful.

Our GPU test suite has been trimmed down to 18 games for now, as we had to cut a few that were showing oddities. We're in the process of retesting Control Ultimate using the updated Ultra settings, and we have a couple of other games we'll add as well once additional testing is complete. For now, we have four games with RT support enabled, and the remaining 14 games are run in pure rasterization mode.

We'll look at supplemental testing in the coming days to further investigate full RT along with DLSS 4 upscaling and MFG. While we've had a bit more time for this launch, it wasn't sufficient to go and test 11 other GPUs on additional games.

All games are tested using 1080p 'medium' settings (the specifics vary by game and are noted in the chart headers), along with 1080p, 1440p, and 4K 'ultra' settings. This provides a good overview of performance in a variety of situations. Depending on the GPU, some of those settings don't make as much sense as others, but everything so far has managed to (mostly) run up to 4K ultra.

Our OS has all the latest updates applied. We're also using Nvidia's PCAT v2 (Power Capture Analysis Tool) hardware, which means we can capture real power use, GPU clocks, and more during our gaming benchmarks. We'll cover those results on page eight.

Finally, because GPUs aren't purely for gaming these days, we also run professional and AI application tests. We've previously tested Stable Diffusion using various custom scripts, but to level the playing field and hopefully make things a bit more manageable (AI is a fast-moving field!), we're turning to standardized benchmarks.

We use Procyon and run the AI Vision test as well as the Stable Diffusion 1.5 and XL tests; MLPerf Client 0.5 preview for AI text generation; SPECworkstation 4.0 for Handbrake transcoding, AI inference, and professional applications; 3DMark DXR Feature Test to check raw hardware RT performance; and finally Blender Benchmark 4.3.0 for professional 3D rendering.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • Amdlova
    I want to say that it's a nice card... But with a 180W TBP for those numbers, it's a waste of sand.
    Nvidia is afraid to bench the 8GB cards... Just buy an AMD card and be happy
  • palladin9479
    Once the 8GB model comes out, it would be nice to see a quick article focusing just on the 4060 and 5060 8GB/16GB variants, finding the settings where 8GB stops being capable. The 1080p medium graph shows that the 8GB cards work fine at that level, but then the next step is "ultra", which usually has ridiculous texture sizes. It would be nice to see 1080p, 1440p, and 2160p at "high" or "very high", one step down from ultra, and see how well those cards do. Someone buying an xx60-class card isn't going to have a good experience playing at 4K "ultra".

    Amdlova said:
    Nvidia is afraid to bench the 8GB cards... Just buy an AMD card and be happy

    It's right in the article; just compare both versions of the 4060. Each test has a graph at the very end using 1080p medium, and you can see the 8GB model does very well there. They didn't have time to do additional testing with "high" or "very high" intermediate levels, and ultra has ridiculously large texture sizes that start to hurt 8GB cards. I see them as doing well at 1080p/1440p with "high" settings, basically a budget gamer using whatever they can get their hands on. We laugh, but I know a ton of guys like that at work: they have wives and kids and upgrade a piece at a time.
  • JarredWaltonGPU
    palladin9479 said:
    Once the 8GB model comes out, it would be nice to see a quick article focusing just on the 4060 and 5060 8GB/16GB variants, finding the settings where 8GB stops being capable. The 1080p medium graph shows that the 8GB cards work fine at that level, but then the next step is "ultra", which usually has ridiculous texture sizes. It would be nice to see 1080p, 1440p, and 2160p at "high" or "very high", one step down from ultra, and see how well those cards do. Someone buying an xx60-class card isn't going to have a good experience playing at 4K "ultra".
    In our test suite, 1080p ultra is still playable in all 18 games on an 8GB card, or at least an 8GB Nvidia card. (The RX 7600 may have some issues in one or two games.) There are, however, games like Indiana Jones where 8GB represents a real limit to the settings you can use. The TLDR is that it varies by game, but 1080p/1440p "high" should be fine on an 8GB card. I'd still pay the extra $50 if I were in the market for this sort of GPU (assuming it's only a $50 difference, naturally).
  • cknobman
    So the new-gen 60 Ti-class card can't even come close to matching the last-gen vanilla 70-class card?
    Seems like a really bad "upgrade" to me.
    Definitely a 3-star, not 4-star, kind of score.

    Also, if you have been keeping up with the news, Nvidia is purposely not letting 8GB cards get reviewed.
    They told partners they are not allowed to sample those cards out for review.
    The only way you will get 8GB card reviews is AFTER release, when they are purchased at retail by reviewers.

    The only reason this is happening is that Nvidia knows the 8GB cards are crap. Making reviews wait until after retail availability ensures that at least the first batch will fly off shelves regardless.

    Nvidia is a terrible company.
  • palladin9479
    JarredWaltonGPU said:
    In our test suite, 1080p ultra is still playable in all 18 games on an 8GB card, or at least an 8GB Nvidia card. (The RX 7600 may have some issues in one or two games.) There are, however, games like Indiana Jones where 8GB represents a real limit to the settings you can use. The TLDR is that it varies by game, but 1080p/1440p "high" should be fine on an 8GB card. I'd still pay the extra $50 if I were in the market for this sort of GPU (assuming it's only a $50 difference, naturally).

    Yeah, it's all price dependent: $50 USD to go from 8GB to 16GB is a no-brainer, but there is a large market for older stuff, including used cards (see your other article). The Steam survey has 1080p at 56.40% of the market and 1440p at 19.06%; that's three quarters of the gaming market between those two resolutions. 8GB VRAM was at 35.52%, with 12GB at 18.42% and 6GB at 11.93%. Over 60% of the market was at 8GB or less, and only ~7.2% had 16GB or more. We've obviously got a center mass of sorts around 1080p/1440p with 8GB, kind of the definition of "mainstream", and why I'm interested in that bracket despite YouTubers claiming that an 8GB card can't run solitaire in 2025.

    It's not sexy, but it's the vast majority of the consumer gaming market, and with economies being what they are and prices going up, that market segment wants to squeeze as much as possible out of its limited disposable income.

    I mean, an RX 7600 8GB at $290 USD is dirt cheap by today's standards. The poster child of "1080p medium/high".

    https://www.amazon.com/PowerColor-Hellhound-Radeon-Gaming-Graphics/dp/B0C48LM7NN/
  • Alvar "Miles" Udell
    I'd say this is a 3-star card.

    Should have knocked a star off just because the 8GB model exists to upcharge for the 16GB model.

    The 19% rasterization performance improvement deserves another deduction because it is a terrible gen-over-gen increase, the same across the 5000-series stack. Yes, it's just a refinement generation, but even at MSRP you're talking over $400 for 1080p75/1440p60 in 2025, without even matching last gen's 4070, and it will only get worse once custom editions tack on their premiums of upwards of $100.

    Granted, this is an upper entry-level gaming card, but a PC built around it is still much more expensive than a console and needs performance that justifies that.
  • DRagor
    I have checked my local market. All 8GB cards were sold out, while 16GB cards were still in stock, some even at MSRP (although many had prices close to the 5070, lol). For me it is clearly foul play by NVIDIA: let people watch reviews of the 16GB version and then go buy the cheaper 8GB models, because they're cheap and people don't understand the difference.
  • Gururu
    B580s are still in stock... With so many similar cards tested from the big two, why not toss in the Arc B580 for buyer options? We know it sits in the 7600/4060 class or higher.
  • Roland Of Gilead
    cknobman said:
    So the new-gen 60 Ti-class card can't even come close to matching the last-gen vanilla 70-class card?
    Totally agree with you. I was kinda hoping the 5060 Ti would get a similar bump to the one the 3060 Ti did, which was faster than a 2080 Super. I kinda figured from the reviews of the new-gen 50-series models that it wouldn't really hit that point. But to fall short so unspectacularly is not good.

    As pointed out, at only $50 more, the 5060 Ti 16GB is the only choice. It's a no-brainer.

    I'm quite happy now with my 4070 Super and have no FOMO. Well, maybe apart from the 9070 XT, which I think is hands down the award winner of the latest GPU rollout. Defo the standout card right now, if they are available.
  • ThereAndBackAgain
    DRagor said:
    I have checked my local market. All 8GB cards were sold out, while 16GB cards were still in stock, some even at MSRP (although many had prices close to the 5070, lol). For me it is clearly foul play by NVIDIA: let people watch reviews of the 16GB version and then go buy the cheaper 8GB models, because they're cheap and people don't understand the difference.
    Honestly, if people don't understand the difference between 8GB of VRAM and 16GB of VRAM, they shouldn't be spending $400+ on a GPU in the first place. But it's kind of hard to imagine someone knowledgeable enough to build their own PC not comprehending VRAM. The people who bought those cards most likely knew exactly what they were getting.