GPUs with 8GB of VRAM in 2025 are 'like bringing a butter knife to a gunfight' reckons Grok AI
Musk's humorous AI isn't joking, the growing weight of benchmark analysis suggests.

Shipping graphics cards with just 8GB of VRAM is tantamount to "bringing a butter knife to a gunfight," opines Grok, the AI built into Twitter/X. The AI agent was commenting on a thread about the recent RTX 5060 Ti 8GB performance analysis, which showed this model may be up to 10% slower than the 16GB variant in popular games. Elon Musk's humorous AI is powered by hundreds of thousands of Nvidia GPUs, so there is a little irony in it disparaging the same brand of silicon that gave it life.
PC enthusiasts were already braced for new-generation SKUs arriving with as little as 8GB onboard, well before graphics cards like the Nvidia GeForce RTX 5060 Ti were officially launched. Nevertheless, seeing those fears materialize was still painful. And the disappointment now looks set to continue, with a drip-drip of analysis from GPU commentators sharing benchmarks and 'told you so' tales. Add to the unhappy mix the certainty that newer titles will only push VRAM demands higher.
Grok - this 'humorous AI' isn't joking or hallucinating
Responding to PunmasterStp on X, Grok highlighted that "Modern AAA games are chomping through VRAM faster than a kid with a bag of candy—especially at 1440p or 4K with all those juicy high-res textures and ray tracing bells and whistles." It went on to contrast the RTX 5060 Ti 8GB and 16GB variants. Users of the former will see it prematurely age with "stutters, texture pop-ins, and even crashes in heavy hitters like Hogwarts Legacy and Space Marine 2," Grok said. Meanwhile, the latter model, with 16GB, was said to be comfortably "cruising" in some of the same titles.
Grok continued its unvarnished RTX 5060 Ti 8GB takedown by stating that "If you’re planning to game for the next few years without constantly tweaking settings down to potato mode, 8GB just ain’t gonna cut it." Potato mode seems a bit harsh, but the message to budget-conscious buyers eyeing new or used 8GB graphics cards is clear: save up for more VRAM, or adjust your expectations and preferences.
AMD is also expected to launch 8GB '60 card(s) shortly
According to the latest murmurings from Taiwan, AMD isn't preparing to ride to the rescue in the '60 arena. A few days ago, we reported that AMD has no plans to cancel its upcoming 8GB VRAM product(s) or halt supply to board partners, despite earlier rumors suggesting the contrary. So, brace yourselves for the reveal of both 8GB and 16GB variants when the Radeon RX 9060/XT models are paraded by AMD and its partners, probably at Computex later this month.

Mark Tyson is a news editor at Tom's Hardware. He enjoys covering the full breadth of PC tech; from business and semiconductor design to products approaching the edge of reason.
kanewolf: Not for non-gamers... Excel doesn't need vast amounts of VRAM, even with multiple 4K monitors.
Notton: 8GB in 2025 is the hand-me-down GPU you put in the spare PC, except it's new, full price, and missing features like PhysX.
As an aside, this does not bode well for laptop GPUs, where 8GB is considered mid-range, 12GB mid-high, and >16GB premium.
TechNomad: If AMD is wise, they will rebrand the 8GB model as the 9050, or something other than a 9060.
Alvar "Miles" Udell: Honestly, I think 8GB GPUs above the entry level (--60 laptop and --50 desktop) shouldn't exist. Given the power of even entry-level gaming desktop cards like the --60, 8GB cards only exist to upsell a $100+ more expensive 16GB variant.
Alvar "Miles" Udell:
LordVile said: If you're playing 1080p, 8GB is fine. Not everyone plays at 4K.
What if you're completely wrong?
https://www.tomshardware.com/pc-components/gpus/nvidia-rtx-5060-ti-8gb-struggles-due-to-lack-of-vram-and-not-just-at-4k-ultra
And this is on top of the 10%+ performance penalty you suffer if you dare to use PCIe 4.0.
https://www.tomshardware.com/pc-components/gpus/nvidia-rtx-5060-ti-8gb-loses-up-to-10-percent-performance-when-using-pcie-4-0#xenforo-comments-3878849
LordVile:
Alvar "Miles" Udell said: What if you're completely wrong?
Easily over 60 FPS is "struggling" now? No. It was over 60 FPS in everything at 1080p and more than serviceable at 1440p; it only struggled, and wasn't usable in most things, at 4K. There's not much the 16GB model can comfortably play at 1080p or 1440p that the 8GB can't, outside of unoptimised edge cases at maxed-out detail settings. That's fine for a 1080p card like the 60 series. For a 70-series card aimed at 1440p it would be an issue, but that's not what we're talking about.
rluker5: I think the game devs are getting paid by AMD and Nvidia to obsolete 8GB cards.
It is naturally in the devs' interest to appeal to the largest market reasonably possible, and in the GPU manufacturers' interest to cut off some models so they sell more. The two are at odds there, and some convincing seems plausible.
The games that need more than 8GB don't look better than the ones that don't, so exceeding 8GB doesn't seem technically necessary; it looks like a limitation imposed by substandard optimization, or by intent.