Nvidia GeForce RTX 4060 Ti 16GB Review: Does More VRAM Help?

Double the memory, still the same underlying chip.

Gigabyte's Nvidia GeForce RTX 4060 Ti 16GB Gaming OC
(Image: © Tom's Hardware)


Gigabyte RTX 4060 Ti 16GB Gaming OC

(Image credit: Tom's Hardware)

We purchased our RTX 4060 Ti 16GB sample at retail from Newegg. It sells at the base $499 MSRP and comes with a triple-fan cooler and a factory overclock, which we figured would represent something of a best-case scenario for performance. Whether that's correct or not, we can't say, as this is the only 4060 Ti 16GB we've tested.

There's no Founders Edition, as mentioned already, so if you're interested in the 4060 Ti 16GB, you'll have to choose from among the various AIB models. The Gaming OC comes in typical Gigabyte packaging, with no extras to speak of in the box: a warranty pamphlet, padding around the card itself, and that's it.

One interesting detail is that the card doesn't come with plastic cling-wrap covering its surfaces. That wrap became a trend about a decade ago, though Nvidia and AMD generally skip it on their reference cards. The idea, we think, is to keep fingerprints off the card until the end user has unpacked and installed it. Some people probably love peeling it off, but we don't miss it, since the card is already protected by an anti-static bag.

The box is also relatively compact for a triple-fan GPU, which means less packaging waste. The Gigabyte Gaming OC measures 281x117x52mm, making it a 2.5-slot card, and it weighs a relatively light 830g. The three 74mm fans aren't particularly large, and they lack the integrated rims found on higher-end fans. For a 160W TGP card they should be more than sufficient, though fan speeds will end up slightly higher than they would with larger fans.

There's very little RGB lighting on tap here, with the only lit portion of the card being the Gigabyte logo on top. That's perhaps to be expected from a base-price model, and some will appreciate the lack of extra lighting. Others will want to look elsewhere for an appropriately blinged-out graphics card.

As is typical these days for Nvidia's RTX 40-series cards, there are four video outputs. However, instead of the usual three DisplayPort and one HDMI arrangement, Gigabyte opts for dual DisplayPort 1.4a and dual HDMI 2.1 outputs. DisplayPort 1.4a tops out at 25.92 Gbps of data bandwidth, compared to HDMI 2.1's 42 Gbps (after subtracting encoding overhead), so the second HDMI port is theoretically a win for higher-resolution displays. Still, both connections can drive 4K at 240Hz with DSC (Display Stream Compression), which is what our Samsung Odyssey Neo G8 supports.
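
To put those figures in perspective, here's a quick back-of-the-envelope calculation showing why DSC is required for 4K at 240Hz on either connector. This is a rough sketch: it assumes 10-bit color (30 bits per pixel) and ignores blanking intervals, so treat the results as ballpark numbers.

```python
# Rough estimate of uncompressed display bandwidth vs. DP 1.4a and HDMI 2.1.
# Assumes 10-bit color (30 bpp) and ignores blanking intervals.

def required_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Uncompressed video data rate in Gbps."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uncompressed = required_gbps(3840, 2160, 240)  # 4K at 240 Hz
dp14a_gbps = 25.92   # DisplayPort 1.4a max data rate (HBR3, after 8b/10b encoding)
hdmi21_gbps = 42.0   # HDMI 2.1 max data rate (after FRL encoding overhead)

print(f"4K 240Hz uncompressed: ~{uncompressed:.1f} Gbps")
print(f"DSC ratio needed over DP 1.4a:  {uncompressed / dp14a_gbps:.1f}:1")
print(f"DSC ratio needed over HDMI 2.1: {uncompressed / hdmi21_gbps:.1f}:1")
```

The uncompressed stream works out to roughly 60 Gbps, so DisplayPort 1.4a needs a bit over 2:1 compression while HDMI 2.1 needs less than 1.5:1, both well within what DSC typically handles.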

One final item of note is that Gigabyte opted for a single 8-pin power connector on top. That's plenty for a 160W card, and it avoids the often janky 8-pin to 16-pin adapters found on Nvidia's Founders Edition and many of the higher-end RTX 40-series cards. We haven't really seen anything useful come from the 16-pin connector's four sense pins either, so it's good to see common sense at play here.

Gigabyte RTX 4060 Ti 16GB Overclocking 

Gigabyte's Nvidia GeForce RTX 4060 Ti 16GB Gaming OC

(Image credit: Tom's Hardware)

Our overclocking process doesn't aim to fully redline the hardware, but instead looks to find "reasonably stable and safe" overclocks. We start by maxing out the power limit (using MSI Afterburner), which is 130% for the Gigabyte 4060 Ti 16GB Gaming OC; that limit varies by card manufacturer and model. 130% is on the higher end of what we typically see, and it suggests that Gigabyte didn't increase the default TGP. Given that the card has twice as much memory drawing from the same power budget, that might actually reduce performance slightly at stock settings.
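
As a quick aside, the math on that power slider is straightforward. The sketch below assumes the 160W TGP cited earlier in the review.

```python
# Maximum board power allowed by the 130% power limit slider,
# assuming the 160W TGP mentioned earlier in the review.
stock_tgp_watts = 160
power_limit_slider = 1.30

max_power_target = stock_tgp_watts * power_limit_slider
print(f"Max power target: {max_power_target:.0f} W")  # 208 W
```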

Next, we look for the maximum stable GPU core overclock. Our initial testing hit +225 MHz on the RTX 4060 Ti, but we eventually settled at +215 MHz after experiencing intermittent crashes. There's no way to increase the GPU voltage short of a hardware voltage mod, which is often the limiting factor for RTX 40-series overclocks, and that's likely at play here.

The memory is rated for 18 Gbps, and we eventually settled on a +1500 MHz overclock after +1750 MHz showed display corruption followed by a game crash. That +1500 MHz Afterburner offset translates to an extra 3 Gbps of effective speed, for a final 21 Gbps data rate. As we've mentioned, Nvidia has error detection and retry for the VRAM, so you don't want to fully max out the memory speed. But with only a 128-bit memory interface, the extra bandwidth from higher memory clocks can be genuinely beneficial.
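
Here's what that memory overclock means for total bandwidth on the 128-bit bus, as a quick worked example using the stock and overclocked data rates quoted above.

```python
# Memory bandwidth before and after the overclock on the 128-bit interface.
bus_width_bits = 128
stock_rate_gbps = 18.0  # rated GDDR6 speed per pin
oc_rate_gbps = 21.0     # after the +1500 MHz (+3 Gbps effective) offset

def bandwidth_gbs(data_rate_gbps, bus_bits=bus_width_bits):
    """Total bandwidth in GB/s: per-pin data rate times bus width, converted to bytes."""
    return data_rate_gbps * bus_bits / 8

print(f"Stock:       {bandwidth_gbs(stock_rate_gbps):.0f} GB/s")  # 288 GB/s
print(f"Overclocked: {bandwidth_gbs(oc_rate_gbps):.0f} GB/s")     # 336 GB/s
```

That's roughly a 17% bandwidth increase, which goes a long way toward explaining why memory overclocking pays off on a card with such a narrow interface.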

With both the GPU and VRAM overclocks and a fan curve set to ramp from 30% at 30°C up to 100% at 75°C, we could run our full suite of gaming tests at 1080p and 1440p ultra without any issues. We'll include those results in the charts. Across our test suite, the manual overclock improved performance by about 8% on average.
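
For anyone curious how that fan curve behaves between its two endpoints, the snippet below is a minimal sketch of a linear ramp with those settings. It's a hypothetical helper for illustration, not Gigabyte's or Afterburner's actual fan control logic.

```python
def fan_duty_percent(temp_c, t_min=30.0, t_max=75.0, duty_min=30.0, duty_max=100.0):
    """Linearly ramp fan duty from 30% at 30°C to 100% at 75°C, clamped at both ends."""
    if temp_c <= t_min:
        return duty_min
    if temp_c >= t_max:
        return duty_max
    return duty_min + (duty_max - duty_min) * (temp_c - t_min) / (t_max - t_min)

for temp in (30, 50, 65, 75):
    print(f"{temp}°C -> {fan_duty_percent(temp):.0f}% fan speed")
```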

Gigabyte RTX 4060 Ti 16GB Test Setup

(Image credit: Tom's Hardware)

We updated our GPU test PC at the end of last year with a Core i9-13900K, though we also continue to test reference GPUs on our 2022 Core i9-12900K system for our GPU benchmarks hierarchy. (We'll be updating that shortly.) Our RTX 4060 Ti 16GB review uses the 13900K results for gaming tests, which ensures, as much as possible, that we're not CPU limited. We also use the 2022 PC for AI tests and professional workloads.

Multiple games have been updated over the past few months. We tested the Gigabyte RTX 4060 Ti 16GB using Nvidia's latest 536.99 drivers, and we retested the 4060 Ti 8GB Founders Edition with those same drivers for good measure. The remaining cards use results from our testing over the past few months, though we've updated and retested every card in Minecraft.

Our current test suite consists of 15 games. Of these, nine support DirectX Raytracing (DXR), but we only enable the DXR features in six games. At the time of testing, 12 games support DLSS 2, five support DLSS 3, and five support FSR 2. We covered DLSS performance in our initial RTX 4060 Ti review, and feel those results are sufficient for the 16GB model as well.

For this review, we're adding three games with limited testing: F1 23, Hogwarts Legacy, and The Last of Us: Part 1. All were tested with the latest patches and drivers, and we're mostly interested in whether a few recent games that are generally more demanding of VRAM show larger performance differences than our existing and somewhat older test suite.

We tested the 4060 Ti 16GB at 1080p (medium and ultra), 1440p ultra, and 4K ultra — ultra being the highest supported preset if there is one, and in some cases maxing out all the other settings for good measure (except for MSAA or super sampling). We'll only have limited commentary on the 4K results, as the 4060 Ti 16GB generally isn't intended to work well at that resolution (at least not in more demanding games).

Our PC is hooked up to a 32-inch Samsung Odyssey Neo G8, one of the best gaming monitors around, allowing us to fully experience some of the higher frame rates that might be available. G-Sync and FreeSync were enabled, as appropriate. As you can imagine, getting anywhere close to the monitor's 240 Hz limit proved difficult, as we don't have any esports games in our test suite.

We installed all the then-latest Windows 11 updates when we assembled the new test PC. We're running Windows 11 22H2, but we've used InControl to lock our test PC to that major release for the foreseeable future (though critical security updates still get installed monthly).

Our new test PC includes Nvidia's PCAT v2 (Power Capture and Analysis Tool) hardware, which means we can grab real power use, GPU clocks, and more during all of our gaming benchmarks. We'll cover those results on our page on power use.

Finally, because GPUs aren't purely for gaming these days, we've run some professional application tests, and we also ran some Stable Diffusion benchmarks to see how AI workloads scale on the various GPUs.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • newtechldtech
    This card is good if you want to edit movies and need more VRAM.
  • JarredWaltonGPU
    I just want to clarify something here: The score is a result of pricing as well as performance and features, plus I took a look (again) at our About Page and the scores breakdown. This is most definitely a "Meh" product right now. Some of my previous reviews may have been half a point (star) higher than warranted. I've opted to "correct" my scale and thus this lands at the 2.5-star mark.

    I do feel our descriptions of some of the other scores are way too close together. My previous reviews were based more on my past experience and an internal ranking that's perhaps not what the TH text would suggest. Here's how I'd break it down:

    5 = Practically perfect
    4.5 = Excellent
    4 = Good
    3.5 = Okay, possibly a bad price
    3 = Okay but with serious caveats (pricing, performance, and/or other factors)
    2.5 = Meh, niche use cases
    ...

    The bottom four categories are still basically fine as described. Pretty much the TH text has everything from 3-star to 5-star as "recommended," and that doesn't really jibe with me. 🤷‍♂️

    This would have been great as a 3060 Ti replacement if it had 12GB and a 192-bit bus with a $399 price point. Then the 4060 Ti 8GB could have been a 3060 replacement with 8GB and a 128-bit bus at the $329 price point. And RTX 4060 would have been a 3050 replacement at $249.

    Fundamentally, this is a clearly worse value and specs proposition than the RTX 4060 Ti 8GB and the RTX 4070. It's way too close to the former and not close enough to the latter to warrant the $499 price tag.

    All of the RTX 40-series cards have generally been a case of "good in theory, priced too high." Everything from the 4080 down to the 4060 so far got a score of 3.5 stars from me. There's definitely wiggle room, and the text is more important than just that one final score. In retrospect, I still waffle on how the various parts actually rank.

    Here's an alternate ranking, based on retrospect and the other parts that have come out:

    4090: 4.5-star. It's an excellent halo part that gives you basically everything. Expensive, yes, but not really any worse than the previous gen 3090 / 3090 Ti and it's actually justifiable.

    4080: 3-star. It's fine on performance, but the generational price increase was just too much. 3080 Ti should have been a $999 (at most) part, and this should be $999 or less.

    4070 Ti: 3-star. Basically the same story as the 4080. It's fine performance, priced way too high generationally.

    4070: 3.5-star. Still higher price than I'd like, but the overall performance story is much better.

    4060 Ti 16GB: 2.5-star. Clearly a problem child, and there's a reason it wasn't sampled by Nvidia or its partners. (The review would have been done a week ago but I had a scheduled vacation.) This is now on the "Jarred's adjusted ranking."

    4060 Ti 8GB: 3-star. Okay, still a higher price than we'd like and the 128-bit interface is an issue.

    4060: 3.5-star. This isn't an amazing GPU, but it's cheaper than the 3060 launch price and so mostly makes up for the 128-bit interface, 8GB VRAM, and 24MB L2. Generally better as an overall pick than many of the other 40-series GPUs.

    AMD's RX 7000-series parts are a similar story. I think at the current prices, the 7900 XTX works as a $949 part and warrants the 4-star score. 7900 XT has dropped to $759 and also warrants the 4-star score, maybe. The 7600 at $259 is still a 3.5-star part. So, like I said, there's wiggle room. I don't think any of the charts or text are fundamentally out of line, and a half-star adjustment is basically easily justifiable on almost any review I've done.
  • Lord_Moonub
    Jarred, thanks for this review. I do wonder if there is more silver lining on this card we might be missing though. Could it act as a good budget 4K card? What happens if users dial back settings slightly at 4K (eg no Ray tracing, no bleeding edge ultra features ) and then make the most of DLSS 3 and the extra 16GB VRAM? I wonder if users might get something visually close to top line experience at a much lower price.
  • JarredWaltonGPU
    Lord_Moonub said:
    Jarred, thanks for this review. I do wonder if there is more silver lining on this card we might be missing though. Could it act as a good budget 4K card? What happens if users dial back settings slightly at 4K (eg no Ray tracing, no bleeding edge ultra features ) and then make the most of DLSS 3 and the extra 16GB VRAM? I wonder if users might get something visually close to top line experience at a much lower price.
    If you do those things, the 4060 Ti 8GB will be just as fast. Basically, dialing back settings to make this run better means dialing back settings so that more than 8GB isn't needed.
  • Elusive Ruse
    Damn, @JarredWaltonGPU went hard! Appreciate the review and the clarification of your scoring system.
  • InvalidError
    More memory doesn't do you much good without the bandwidth to put it to use. The 4060(Ti) needed 192 bits to strike the practically perfect balance between capacity and bandwidth. It would have brought the 4060(Ti) launches from steaming garbage to at least being a consistent upgrade over the 3060(Ti).
  • Greg7579
    Jarred, I'm building with the 4090 but love reading your GPU reviews, even the ones that are far below what I would build with because I learn something every time.
    I am not a gamer but a GFX Medium Format photographer and have multiple TB of high-res 200MB raw files that I work extensively with in Lightroom and Photoshop. I build every 4 years and update as I go. I build the absolute top-end of the PC arena, which is way overkill, but I do it anyway.
    As you know, Lightroom has many new amazing AI masking and noise reduction features that are like magic but so many users (photographers) are now grinding to a halt on their old rigs and laptops. Photographers tend to be behind the gamers on PC / laptop power. It is common knowledge on the photo and Adobe forums that these new AI capabilities eat VRAM like Skittles and extensively use the GPU for the grind. (Adobe LR & PS were always behind on using the GPU with the CPU for editing and export tasks but are now going at it with gusto.) When I run an AI DeNoise on a big GFX 200MB file, my old rig with the 3080 (I'm building again soon with the 4090) takes about 12 seconds to grind out the AI DeNoise task. Other rigs photographers use take several minutes or just crash. The Adobe and Lightroom forums are full of howling and gnashing of teeth about this. I tell them to start upgrading, but here is my question.... I can't wait to see what the 4090 will do with these photography-related workflow tasks in LR.
    Can you comment on this and tell me if indeed this new Lightroom AI masking and DeNoise (which is a miracle for photographers) is so VRAM intensive that doubling the VRAM on a card like this would really help a lot? Isn't it true that Nvidia made some decisions 3 years ago that resulted in not having enough (now far cheaper) VRAM in the 40 series? It should be double or triple what it is, right? Anything you can teach me about increased GPU power and VRAM in Adobe LR for us photographers?
  • Brian D Smith
    Good card for the OEM masses. No one else need worry about it.
  • hotaru251
    The 4060 Ti should have been closer to a 4070.

    the gap between em is huge and the cost is way too high. (doubly so that it requires dlss3 support to not get crippled by the limited bus)
  • atomicWAR
    JarredWaltonGPU said:
    Some of my previous reviews may have been half a point (star) higher than warranted. I've opted to "correct" my scale and thus this lands at the 2.5-star mark.

    Thank you for listening, Jarred. I was one of those claiming on multiple recent GPU reviews that your scores were about a half star off, though I wasn't alone in that sentiment either. I was quick to defend you from trolls, though, as you clearly were not shilling for Nvidia. This post proves my faith was well placed in you. Thank you for being a straight arrow!