Why you can trust Tom's Hardware
Nvidia GeForce RTX 5060 Ti 16GB Rasterization Gaming Performance
We divide gaming performance into two categories: traditional rasterization games and ray-tracing games. We benchmark each game using four different test settings: 1080p medium, 1080p ultra, 1440p ultra, and 4K ultra. For the RTX 5060 Ti 16GB, the most important results will be 1080p ultra and 1440p ultra — which also serve as proxies for 4K with performance mode and quality mode upscaling, respectively.
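That proxy relationship is simple arithmetic: upscalers render internally at a fraction of the output resolution, then reconstruct the final image. A quick sketch in Python — the 0.5x and 0.667x per-axis scale factors below are the commonly published DLSS/FSR defaults for performance and quality modes, not numbers taken from our test data:

```python
# Internal render resolution for a given output resolution and upscaling mode.
# Per-axis scale factors: quality ~0.667x, performance 0.5x (typical published
# defaults for DLSS/FSR; assumed here for illustration).
SCALE = {"quality": 2 / 3, "performance": 1 / 2}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with performance-mode upscaling renders at roughly 1080p,
# while quality mode renders at roughly 1440p — hence the proxy comparisons.
print(render_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(render_resolution(3840, 2160, "quality"))      # (2560, 1440)
```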
We'll start with the rasterization suite of 14 games, as that's arguably still the most useful measurement of gaming performance. Plenty of games with ray tracing support end up running so poorly that RT is more of a feature checkbox than something useful. (Note that we've dropped Hogwarts Legacy and Star Wars Outlaws from our test suite due to inconsistencies in the results caused by variable weather and game updates.)
We'll provide limited to no commentary on most of the individual game charts, letting the numbers speak for themselves. The Geomean charts will be the main focus, since those provide the big picture overview of how the RTX 5060 Ti 16GB competes with the other GPUs.
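For readers curious how a "Geomean" chart is aggregated: the geometric mean of per-game FPS is the standard way to average benchmark results, since it keeps one very high-FPS game from dominating the average. A minimal sketch with made-up FPS numbers (illustrative only, not actual benchmark data):

```python
import math

def geomean(values):
    # nth root of the product, computed via logs for numerical stability
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical per-game FPS results for two cards (not real data)
card_a = [60.0, 120.0, 45.0, 90.0]
card_b = [66.0, 126.0, 50.0, 99.0]

print(round(geomean(card_a), 1))
print(round(geomean(card_b), 1))
```

Note how a single outlier matters less than with an arithmetic mean: doubling one game's FPS raises the geomean of four games by only about 19%, not 25%.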




First, let's be clear: Just because the RTX 5060 Ti 16GB lands in the middle of our charts doesn't mean it's a slow or middle-of-the-road GPU. Performance relative to price tends to be the real metric, and we don't yet know for sure where pricing will land, whether today, next week, or in the coming months. Obviously, the more it costs, the less desirable it becomes. The same goes for the other GPUs in our charts: the 9070, 5070, and some of the previous generation parts sit at a higher price and performance tier, while the 4060 and 7600 cards sit at a lower tier. They're present to provide context for the 5060 Ti 16GB.
Looking quickly at the Asus and PNY RTX 5060 Ti 16GB results, it will be immediately obvious that there's almost zero discernible difference in performance between the two cards. The PNY card is almost universally faster, presumably thanks to its higher boost clock, but it's less than 1% faster overall, with none of the results showing more than a 2% difference. You should choose a specific card based on price and other factors like cooling, aesthetics, size, and noise rather than performance — all RTX 5060 Ti 16GB cards will generally perform within a few percentage points of each other.
The important comparison points are the existing and previous generation parts. Obviously, the RTX 5070 should be faster than the 5060 Ti 16GB: on paper, it has 30% more compute and 50% more memory bandwidth, though it also has 33% less VRAM capacity. In our rasterization gaming suite, the 5070 ends up 27% faster at 1080p ultra, 31% faster at 1440p ultra, and 33% faster at 4K ultra. So even though it doesn't have as much VRAM, compute and bandwidth still win out, especially at 4K. It also costs, theoretically at least, 28% more.
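For those who like to reduce such comparisons to a single figure, relative performance divided by relative price gives a rough performance-per-dollar ratio. A minimal sketch using the on-paper figures above — street pricing will obviously change the outcome:

```python
# Relative value = relative performance / relative price.
# Inputs are ratios versus a baseline card (1.0 = baseline).
def relative_value(perf_ratio: float, price_ratio: float) -> float:
    return perf_ratio / price_ratio

# RTX 5070 vs. RTX 5060 Ti 16GB at 1440p: ~31% faster for a theoretical
# 28% price premium (MSRP assumption; real prices vary).
v = relative_value(1.31, 1.28)
print(f"{v:.3f}")  # ~1.023: near parity in perf per dollar at MSRP
```

A ratio just above 1.0 means the pricier card delivers marginally better performance per dollar at MSRP, which is why actual street prices matter so much to the final verdict.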
AMD's RX 9070 widens that gap, since it's generally faster than the RTX 5070. It's up to 50% faster than the 5060 Ti 16GB, with a theoretical 28% higher price tag that, in practice, is currently much higher (and the comparison also depends on where actual 5060 Ti 16GB prices land). Compared to the previous generation RX 7800 XT and 7700 XT, the 5060 Ti 16GB splits the difference: It's 2–6 percent faster than the 7700 XT on average, and 7–10 percent slower than the 7800 XT.
The direct previous generation comparison shows that, absent MFG as a "performance enhancer," there's not a huge generational uplift with the 5060 Ti 16GB. It's 16–22 percent faster than the RTX 4060 Ti 16GB in our rasterization test suite.
But what about the 8GB 4060 Ti? We don't have 5060 Ti 8GB results (yet), but while the 5060 Ti 16GB is only 15–17 percent faster than the 8GB 4060 Ti at 1080p, the margin increases to 27% at 1440p, and then a massive 69% at 4K. We expect the 5060 Ti 8GB will have similar issues at 4K native.
The 14 individual rasterization game performance charts are below, with no commentary other than a few notes about the various benchmarks.




Assassin's Creed Mirage uses the Ubisoft Anvil engine and DirectX 12. It's also an AMD-promoted game, though these days, that doesn't necessarily mean it always runs better on AMD GPUs. It could be CPU optimizations for Ryzen, or more often, it just means a game has FSR2 or FSR3 support — FSR2 in this case. It also supports DLSS and XeSS upscaling. We run a manual test sequence around the rooftops for this test, rather than using the (flaky) built-in benchmark.




Baldur's Gate 3 is our sole DirectX 11 holdout — it also supports Vulkan, but that performed worse on the GPUs we checked, so we opted to stick with DX11. Built on Larian Studios' Divinity Engine, it's a top-down perspective game, which is a nice change of pace from the many first-person games in our test suite. The faster GPUs hit CPU bottlenecks in this game, especially at 1080p. Our test sequence takes place in the city of Baldur's Gate, which has a lot of NPCs and hits the CPU relatively hard.




Black Myth: Wukong is one of the newer games in our test suite. It's built on Unreal Engine 5, which supports full ray tracing as a high-end option, but we opted to test using pure rasterization mode. Full RT may look a bit nicer, but the performance hit is quite severe. (Check our linked article for our initial launch benchmarks if you want to see how it runs with full RT enabled. We've got supplemental testing coming as well.) This is one of the few games where we use the built-in benchmark, as unpredictable combat sequences make deterministic manual benchmarking difficult.




Dragon Age: The Veilguard uses the Frostbite engine and runs via the DX12 API. It's one of the newest games in our test suite, having launched this past October. It's been received quite well, though, and in terms of visuals, we'd put it right up there with Unreal Engine 5 games — with less of the noticeable LOD pop-in that happens so frequently with UE5. We run a loop around the Arlathan area where the Veil Jumpers camp is located, as it was more demanding than many of the other early areas we checked.




Final Fantasy XVI came out for the PS5 in 2023, but the Windows release didn't arrive until 2024. It's also either incredibly demanding or quite poorly optimized (or both), but it does tend to be very GPU limited. Our test sequence consists of running a set path around the town of Lost Wing.




We've been using Flight Simulator 2020 for several years, and there's a new release below. But it's so new that we also wanted to keep the original around a bit longer as a point of reference. We've switched to using the 'beta' (eternal beta) DX12 path for our testing now, as it's required for DLSS frame generation, even if it runs a bit slower on Nvidia GPUs. We use the landing challenge for Ísafjörður as the test sequence.




Flight Simulator 2024 is the latest release of the storied franchise, and it's even more demanding than the above 2020 release — with some differences in what sort of hardware it seems to like best. Where the 2020 version really appreciated AMD's X3D processors, the 2024 release tends to be more forgiving to Intel CPUs, thanks to improved DirectX 12 code (DX11 is no longer supported). Again, we use the landing challenge for Ísafjörður as the test sequence (which looks slightly different in the new engine).




God of War Ragnarök released on PlayStation two years ago and only recently received a Windows version. It's AMD-promoted, but it also supports DLSS and XeSS alongside FSR3. We run around a village in Svartalfheim, which is one of the most demanding areas we've encountered in the game.




Horizon Forbidden West is another two-year-old PlayStation port, using the Decima engine. The graphics are good, though at least a few people think it looks worse than its predecessor, with excessive blurriness being a key complaint. But after using Horizon Zero Dawn for a few years, it felt like a good time to replace it. Our benchmark follows a set path in the city where you (previously) defeated HADES.




The Last of Us, Part 1 is another PlayStation port, though it's been out on PC for about 20 months now. It's also an AMD-promoted game and really hits the VRAM hard at higher-quality settings, though cards with 12GB or more memory usually do fine. Our test takes place outside of the ruins of the city.




A Plague Tale: Requiem uses the Zouna engine and runs on the DirectX 12 API. It's an Nvidia-promoted game that supports DLSS 3 but neither FSR nor XeSS. (It was one of the first DLSS 3-enabled games as well.) It has RT effects, but only for shadows, so enabling them doesn't really improve the look of the game while still tanking performance. We run a set path through an early part of the game.




Stalker 2 is another Unreal Engine 5 game, but without any hardware ray tracing support — UE5's Lumen system does "software RT" that's basically just fancy rasterization as far as the visuals are concerned, though it's still quite taxing. VRAM can also be a serious problem when trying to run the epic preset, with 8GB cards struggling at most resolutions. There's also quite a bit of microstuttering in Stalker 2, and it tends to be more CPU limited than other recent games. Our test sequence follows a path through the town of Zalissya.




Starfield uses Creation Engine 2, Bethesda's updated engine whose previous version powered the Fallout and Elder Scrolls games. It's another fairly demanding game, and we run around the city of Akila, one of the more taxing locations in the game. It's a bit more CPU limited, particularly at lower resolutions.




Wrapping things up, Warhammer 40,000: Space Marine 2 is yet another AMD-promoted game. It runs on the Swarm engine and uses DirectX 12, without any support for ray tracing hardware. We use a sequence from the introduction, which is generally less demanding than the various missions you get to later in the game but has the advantage of being repeatable and not having enemies everywhere.
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
Amdlova: I want to say that it's a nice card... But with 180W TBP for those numbers, it's a waste of sand.
Nvidia is afraid to bench the 8GB cards... Just buy an AMD card and be happy.
palladin9479: Once the 8GB model comes out, it would be nice to see a quick article focusing just on the 4060 and 5060, 8GB and 16GB, finding the settings where 8GB stops being capable. The 1080p medium graph shows that the 8GB cards work fine at that level, but then the next step is "ultra," which usually has ridiculous texture sizes. It would be nice to see 1080p, 1440p, and 2160p at "high" or "very high," one step down from ultra, and see how well those cards do. Someone buying an xx60-class card isn't going to have a good experience playing at 4K "ultra."

Amdlova said: nvidia is afraid to bench the 8GB cards... Just buy an AMD card and be happy

It's right in the article: just compare both versions of the 4060. Each test has a graph at the very end using 1080p medium, and you can see the 8GB model does very well there. They didn't have time to do additional testing with "high" or "very high" intermediate levels, and ultra has ridiculously large texture sizes that start to hurt 8GB cards. I see them as doing well at 1080p/1440p with "high" settings, basically a budget gamer using whatever they can get their hands on. We laugh, but I know a ton of guys like that at work; they have wives and kids and upgrade a piece at a time.
JarredWaltonGPU
palladin9479 said: Once the 8GB model comes out would be nice to see a quick article focusing just on 4060 and 5060 8 / 16 finding the settings where 8GB stops being capable. The 1080p medium graph shows that the 8GB cards work fine at that level, but then the next step is "ultra" which usually has ridiculous texture sizes. Would be nice to see 1080, 1440, 2160 "high" or "very high", one step down from ultra and see how well those cards do. Someone buying a xx60 class card isn't going to have a good experience playing at 4K "ultra".

In our test suite, 1080p ultra is still playable in all 18 games on an 8GB card, or at least an 8GB Nvidia card. (The RX 7600 may have some issues in one or two games.) There are, however, games like Indiana Jones where 8GB represents a real limit to the settings you can use. The TL;DR is that it varies by game, but 1080p/1440p "high" should be fine on an 8GB card. I'd still pay the extra $50 if I were in the market for this sort of GPU (assuming it's only a $50 difference, naturally).
cknobman: So the new-gen 60 Ti-class card can't even come close to matching the last-gen vanilla 70-class card?
Seems like a really bad "upgrade" to me.
Definitely a 3-star, not 4, kind of score.
Also, if you have been keeping up with the news, Nvidia is purposely not letting 8GB cards get reviewed.
They told partners they are not allowed to sample those cards out for review.
The only way you will get 8GB card reviews is AFTER release, when they are purchased at retail by reviewers.
The only reason this is happening is because Nvidia knows the 8gb cards are crap. Making reviews wait until after retail availability ensures that at least the first batch will fly off shelves regardless.
Nvidia is a terrible company.
palladin9479
JarredWaltonGPU said: In our test suite, 1080p ultra is still playable in all 18 games on an 8GB card, or at least an 8GB Nvidia card. (The RX 7600 may have some issues in one or two games.) There are however games like Indiana Jones where 8GB represent a real limit to the settings you can use. The TLDR is that it varies by game, but 1080p/1440p "high" should be fine on an 8GB card. I'd still pay the extra $50 if I were in the market for this sort of GPU (assuming it's only a $50 difference, naturally).
Yeah, it's all price dependent. $50 USD to go from 8GB to 16GB is a no-brainer, but there is a large market for older stuff, including used cards (see your other article). The Steam survey has 1080p at 56.40% of the market and 1440p at 19.06%; that's three-quarters of the gaming market between those two resolutions. 8GB VRAM was 35.52%, with 12GB at 18.42% and 6GB at 11.93%. Over 60% of the market was at 8GB or less, and only ~7.2% had 16GB or more. We've obviously got a center mass of sorts around 1080p/1440p with 8GB, kind of the definition of "mainstream," and why I'm interested in that bracket despite youtubers claiming that an 8GB card can't run solitaire in 2025.
It's not sexy, but it's the vast majority of the consumer gaming market, and with economies being what they are and prices going up, that market segment wants to squeeze as much out of its limited disposable income as possible.
I mean, RX 7600 8GB at $290 USD. Dirt cheap by today's standards. The poster child of "1080p medium/high."
https://www.amazon.com/PowerColor-Hellhound-Radeon-Gaming-Graphics/dp/B0C48LM7NN/
Alvar "Miles" Udell: I'd say this is a 3-star card.
Should have knocked a star off just because the 8GB model exists to upcharge for the 16GB model.
The 19% rasterization performance improvement deserves another deduction because it is a terrible gen-over-gen increase, the same across the 5000-series stack. Yes, it's just a refinement generation, but even at MSRP you're talking over $400 for 1080p75/1440p60 in 2025, and not even matching last gen's 4070, which will be made all the worse once custom editions tack on their upwards-of-$100 premiums.
Granted, this is an upper entry-level gaming card, but a PC built around it is still much more expensive than a console and needs performance that justifies it.
DRagor: I have checked my local market. All 8GB cards were sold out while 16GB cards were still in stock, some even at MSRP (although many had prices close to the 5070, lol). For me it is clearly foul play by Nvidia: let people watch 16GB version reviews and then go buy the cheaper 8GB models because they're cheaper and people don't understand the difference.
Gururu: B580s are still in stock... So many similar cards tested from the big two; why not toss in the Arc B580 for buyer options? We know it sits in the 7600/4060 class or higher.
Roland Of Gilead
cknobman said: So the new gen 60 TI class card cant even come close to matching the last gen vanilla 70 class card?

Totally agree with you. I was kinda hoping the 5060 Ti would have a similar bump like the 3060 Ti did, being faster than a 2080 Super. I kinda figured from the reviews of the new-gen 50-series models that it wouldn't really hit the point, but to do so, so unspectacularly, is not good.
As pointed out, the 5060 Ti 16GB is the only choice for only $50 more. It's a no-brainer.
I'm quite happy now with my 4070 Super and have no FOMO. Well, maybe apart from the 9070 XT, which I think is hands down the award winner in the last rollout of GPUs. Defo the standout card right now, if they are available.
ThereAndBackAgain
DRagor said: I have checked my local market. All 8Gb cards were sold out while 16Gb were still in stock, some even at MSRP (although many had price close to 5070 lol). For me it is clearly foul play by NVIDIA: let the ppl watch 16Gb version reviews and then go buy cheaper 8Gb models coe they're cheapo and ppl don't understand difference.

Honestly, if people don't understand the difference between 8GB VRAM and 16GB VRAM, they shouldn't be spending $400+ on a GPU in the first place. But it's kind of hard to imagine someone knowledgeable enough to build their own PC not comprehending VRAM. The people who bought those cards most likely knew exactly what they were getting.