RTX 5060 Ti and RTX 5060 may arrive in March to steal AMD's spotlight — Chaintech hints at higher Average Selling Prices
Let's hope we break the chain of 8GB models.

Nvidia's budget RTX 5060 Ti and RTX 5060 GPUs are reportedly set to launch next month, per a slide shared by Taiwanese hardware manufacturer Chaintech (via wxnod). Chaintech asserts the RTX 5060 series will follow Nvidia's RTX 5070 Ti and RTX 5070 in March. This claim should be taken with a grain of salt, as Nvidia didn't announce these GPUs at CES despite their launch reportedly being so near.
Colorful, one of the largest GPU makers and highly visible in the South Asian and Chinese markets, is a customer and partner of Chaintech. Given the lack of information and context, the stated timeframes might be placeholders, or simply Chaintech's expectation of Nvidia's plans rather than confirmed dates.
The slide reads, "NV [Nvidia] will launch new RTX 50 series products in early 2025, with an increased ASP [Average Selling Price]." The claim of a higher ASP makes sense for the RTX 5090 but not the RTX 5080. Nvidia charges $400 more for the RTX 5090 than it did for the RTX 4090, while the RTX 5080 sees a $200 cut compared to the RTX 4080; realistically, you can't get either at MSRP anyway.
Hence, it isn't easy to judge whether Nvidia would eye a price hike for its already pricey 60-series. Then there's the elephant in the room: VRAM. EEC filings suggest that Nvidia, like last generation, will introduce two variants of the RTX 5060 Ti, one with 8GB of memory and the other with 16GB. The $499 price tag for the RTX 4060 Ti 16GB previously raised a lot of eyebrows and sparked criticism, scoring just 2.5 stars in our review. Anything higher would be a tough sell, given that the RTX 5070 12GB is priced at $549. You can expand the tweet below to see the slide in question.
A more realistic pricing approach would be to keep the RTX 5060 Ti series within $400 territory, but that will depend on how AMD's GPUs hold up. Specifications are still in the dark; however, we suspect the RTX 5060 Ti might employ the GB206 die, which is rumored to feature 36 SMs (4,608 CUDA cores) and a 128-bit memory interface with GDDR7 support. Cut-down GB205 dies (used in the RTX 5070) are also possible. For the RTX 5060, GB207 is an unlikely fit as it reportedly offers just 20 SMs (2,560 CUDA cores), but you never know.
AMD's RDNA 4 series will debut in March and compete against the RTX 5070 and possibly even the RTX 5080. AMD's Radeon RX 9060/9050 offerings will go up against the RTX 5060 Ti/RTX 5060, though those mainstream Radeons will likely debut after March, possibly at Computex.
Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.
TheyStoppedit
I will never understand what all the fuss is about 8GB cards. 8GB is still plenty to run 99.99999999999% of games at max settings at 1080p. There are very, very few games that need more. The 60 and 60 Ti cards are supposed to be budget 1080p cards, not 4K240 cards. 8GB is still lots, even for 2025, at the 60 tier for 1080p gaming.
DshadoW95
This is true about well-optimized games. Unfortunately, a lot of new ports are straight dog water when it comes to texture streaming/compression, so they tear through VRAM.
On the other hand, in no way, shape, or form are the 60 and especially the 60 Ti cards supposed to be "budget" options. Historically, they have always been the mainstream gaming GPU option (I would go back as far as the GeForce 6600 GT).
The fact that Nvidia can't be bothered to release 50 and 50 Ti cards anymore doesn't change the fact that a $350-400 card cannot be considered budget. The 60 series is generally the mainstream gamer card, the 70 series is for enthusiasts, and the 80 series is for high-end gaming. The 90 series or Titan were always "halo" products, only for the crazies or professionals.
TheyStoppedit
I think with the absence of xx50 cards, and 90-class cards north of $2,000, we can call $350-$400 cards budget. I think that's the point we're at now. Lower cards, like $249 Intel GPUs, I would call entry level, a step below budget. It's sad that $400 is considered "budget" now, but that's kind of how it is. I can remember in 2003, when I built my first PC, the 5090 of that era was $450. Unfortunately, times have changed, and when you look at GPU prices now, $350-400 is "budget," as sad as it is to say. It shouldn't be that way, but if we're honest, it really is. In the face of $2,000-3,000 32GB cards, 8GB at $400 is what I would expect, and still plenty for 99.9999999% of games at 1080p.
Mac75
They already released the 5060; it's called the 5070, look at the specs. And it's about to get wiped by the 9070 series anyway, so an even weaker card has no hope. The question is what the 9070 XT does to the misnamed 5080, aka the actual 5070.
Gururu
Jensen is all too pleased to have you call $400 budget. He would also have you believe that his 8GB of VRAM cuts it because you are only doing 1080p. Too bad the $250 12GB cards he'd call entry level are now doing 1440p (3440x1440 in my case).
thestryker
8GB is not enough with how games are currently being designed. It's one of those corners being cut, which means it should only appear in the budget space. I don't think anyone rational would complain about getting 8GB of VRAM on a $200 card. This isn't how pricing currently works, though, since the 4060 is a $300 card (4060 Ti, $400), and there's no reason to believe a 5060 will cost less even with 8GB of VRAM.
You really shouldn't approach it this way, because what you're saying is: hey, giant company, fleece money from me because you've raised the prices. This is something most of the tech space has just gone along with, and it doesn't make sense.
The $300 price point has never really been a budget offering, but rather the lower end of midrange. We saw this shift with the 30 series because the big die was where all the money was, and AMD has been happy to play Nvidia's pricing games since it launched the first RDNA-based card.
What's actually going on is that Nvidia and AMD have no interest in providing value for the consumer, because they've proven they don't have to. This is simply about keeping high margins on every GPU product being sold. $300-400 isn't a budget price range, but it certainly is low end performance-wise.
Tim from HUB did a 40 series video regarding what the market is actually getting and just did one for the 50 series products that we have details on. It should come as no surprise that every performance tier is lower than it really should be and consumers are getting less for their money: https://youtu.be/J72Gfh5mfTk?si=KrST-B6tTsv5t73W
Alvar "Miles" Udell TheyStoppedit said:I will never understand what all the fus is about 8GB cards? 8GB is still plenty to run 99.99999999999% of games at max settings at 1080p. There are very, very few games that need more. The 60 and 60Ti cards are supposed to be budget 1080p cards, not 4K240 cards. 8GB is still lots, even for 2025, at the 60 tier for 1080p gaming.
I can think of 3 reasons:
1) Price. Nvidia is charging more than ever for entry-level gaming cards, yet it's using VRAM size as a way to segment products. There have been articles on TH in the past couple of years about modders who upped the VRAM on these cards and showed a not-insignificant gain, so the performance of these cards is being artificially gimped by Nvidia.
2) More and more games are using near or above 8GB of VRAM at 1920x1080, especially when ray tracing is enabled. Techspot did a decent article about it last year; I suggest you give it a read.
3) DLSS frame generation. Those extra frames have to fit in memory, and to borrow a chart from that Techspot article to demonstrate: those extra frames can easily push VRAM usage over 8GB at 1920x1080, so imagine what memory usage will be once DLSS 4's additional frames add to that.
https://www.techspot.com/articles-info/2856/bench/SSH-f-p.webp
blppt
Given how disappointing the 5080 is so far, why would the 5060/Ti be a threat to anything?
oofdragon
99.999% of games means OLD games, and a GTX 1060, an OLD card, will run them fairly well, yeah. 8GB is not enough for gaming in 2025, that's what it is. The 90 series became the card for 4K users, the 80 series for 2K users, and the 70 series for 1K users, that's what it is. We all know, by the way, that the 4060 was really the 4050, the 4070 was really the 4060, and so on. Now we have an 80-series card costing two grand, a 70-series costing one grand, and a 60-series costing half a grand; that's reality. What happened is that starting with the 4000 series, Nvidia changed its naming scheme and didn't upgrade cards but rebranded them. The 3080 12GB became the 4070 12GB, and now it will be called the 5070; it's really a 3080 12GB. In all fairness, though, the 3080's MSRP at launch was $700, it came down to $600 when rebranded as the 4070, and now it has almost hit $500 as the 5070. If the 5060 Ti had its power at 16GB, it wouldn't be a bad deal either, because it would be way more useful at 4K... but then how would Nvidia sell the 4070 Ti? So it's nerfed once again. Yeah, games didn't evolve for the last 4 years and won't until the PS6 comes; that's when GPUs will finally upgrade a tier above the last gen (3000 series).
hotaru251
Old games? Yes, but are you spending $450+ for older games, and not for upcoming games over the next few years that WILL use more?
Games like Monster Hunter Wilds want over 6GB at 720p with everything on low.
It goes up from there, more so if you turn on frame gen (which uses more VRAM, and the game wants you to use it to hit 60 fps at 1080p upscaled).
And that's JUST the games... your VRAM is also used by other stuff in the background.
GDDR6 is around $2 per GB, so going from 8GB to 16GB adds less than $20 in cost, on a product that is literally $400+.
Nvidia has just gotten so far ahead that they treat the low end like crap, because it makes people buy a more expensive card with more VRAM.
And that's ignoring how badly they gimp the memory bus on the 4060 (which you will likely also see on the 5060).
AMD's raster performance and VRAM are better than Nvidia's, and the only reason they aren't the better choice is that they are much worse the moment anything needs ray tracing. If AMD's RT can get to around the 4070's level, they have nothing to fear from Nvidia at the low end.