
Why You Shouldn’t Buy Nvidia’s RTX 20-Series Graphics Cards (Yet)

Originally published August 22, 2018

Nvidia’s new GeForce RTX 20-series graphics cards were just announced, but there are a few solid reasons you shouldn’t jump on the ray-tracing train and purchase one of the new Turing-based GPUs. At least not yet.

High Pricing (For Now)

We’ve already discussed possible reasons why Nvidia’s 20-series GeForce pricing is exorbitant. The top-end 2080 Ti Founders Edition fetches a cool $1,200, which is $500 more than the GTX 1080 Ti cost at launch. Even with partners selling the RTX 2080 Ti, 2080, and 2070 cards at suggested MSRPs of $999, $699, and $499, respectively, the Turing-based RTX graphics cards launch considerably higher than their previous-generation counterparts (increases of $300, $100, and $50 over the GTX 1080 Ti, 1080, and 1070, respectively).

Gamers aren’t all that excited to pay these prices for the company’s most powerful 20-series GeForce offerings, and only the 2070’s bump seems like a reasonable generational price increase among the new Turing-based graphics cards.

Nvidia GeForce Model                           | XX70 | XX80 | XX80 Ti
GTX 10-Series Starting MSRP                    | $450 | $600 | $700
RTX 20-Series Starting MSRP (Reference Design) | $500 | $700 | $1000
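For those keeping score, the generational increases in the table work out as follows (a quick sketch using the reference-design MSRPs listed above; the percentages are our arithmetic, not Nvidia's figures):

```python
# Launch MSRPs (USD, reference designs) from the table above.
gtx_10 = {"xx70": 450, "xx80": 600, "xx80 Ti": 700}
rtx_20 = {"xx70": 500, "xx80": 700, "xx80 Ti": 1000}

for model in gtx_10:
    increase = rtx_20[model] - gtx_10[model]
    pct = 100 * increase / gtx_10[model]
    print(f"{model}: +${increase} ({pct:.0f}% over the 10-series)")
# xx70: +$50 (11% over the 10-series)
# xx80: +$100 (17% over the 10-series)
# xx80 Ti: +$300 (43% over the 10-series)
```

That 43 percent jump at the top of the stack is what has gamers balking.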

Comparing prices to the previous generation is one thing (it doesn’t look good from that perspective), but the most important factor for those considering a 20-series GPU purchase will undoubtedly be performance. Nvidia’s claim that the RTX 2070 performs better than the high-end Titan Xp in games is enticing, but there’s no proof yet.

No Gaming Benchmarks (Yet)

The vast majority of Nvidia’s GeForce RTX performance claims focus on its ray-tracing acumen. The only metrics available are those you can surmise from the provided specifications (core and memory clock speeds, memory bandwidth), and those don’t give us any idea of what the cards can actually do across a variety of gaming workloads. Although the cards were demoed at various venues at Gamescom, the settings used and the framerate performance are still a mystery.

Until we can properly examine generational performance data (GTX 1080 Ti vs. RTX 2080 Ti, etc.), especially in games that don’t support ray tracing, those with 10-series graphics cards should hold off to see how the new hardware stacks up against their current GPUs before forking over the cash. Especially because...

Ray Tracing Isn’t A Thing (Yet)

The GeForce RTX lineup marks a new day for Nvidia’s ray tracing technology, first introduced with the Volta-based Titan V, bringing “the holy grail” of light rendering techniques to the mainstream market. However, this momentous innovation relies on developer participation. As with Nvidia’s PhysX and GameWorks effects, game designers have to invest in the ecosystem and code their work for the RTX ray-tracing tech.

Looking at PhysX as an example, Wikipedia lists 77 titles with hardware support for that feature, going back to 2005. While that’s no small list, it’s certainly a minority of the high-profile games released over the last 13 years. So there’s a good chance that, going forward, there will be plenty of titles you’ll want to play that don’t support real-time ray tracing.

With the exception of a few announced titles and expected participants (22 in total), there aren’t many games in the pipeline (and none released to date) that would take advantage of the GeForce RTX graphics cards’ Tensor and RT cores (which, again, likely add a considerable premium to the product). Purchasing an expensive GPU with groundbreaking ray tracing performance may not be worth it for someone whose favorite game doesn’t support it.

Let's also not forget that gaming feature adoption often tends to be driven by console hardware and titles. No current console is powerful enough to do real-time ray tracing, and that isn't likely to change anytime soon. That's especially true since, until the Nvidia-powered Nintendo Switch, AMD graphics powered all the current gaming consoles. And the Tegra X1 inside Nintendo's console--a 15-watt chip that will be four years old early next year--certainly isn't up to the ray-tracing task. For real-time ray tracing to truly become ubiquitous, it needs to be embraced first by AMD, and then by future consoles.

At least for now, Nvidia is the only horse in the race for ray tracing-capable consumer GPUs, and its eyes are on the future with the new GeForce RTX graphics cards. However, for many gamers, the future isn’t right now. It could be soon, if more AAA game developers take advantage of the ray tracing capabilities of the Turing architecture. But only time will tell how common the feature will be come 2019 and beyond.

Pre-Orders Are Already Out of Stock (Mostly)

Even if you are itching to get one of the new Turing-based 20-series graphics cards right now, you’ll be hard pressed to find any of the RTX 2080 Ti variants in stock as of this writing. Online suppliers quickly sold out their pre-orders, and the top-end cards started appearing from third-party sellers at substantially higher prices. There are still options for the 2080 out there, but selection is quickly dwindling and many of the overclocked models are already gone. Pricing for most of the available cards sits far above Nvidia's recommended MSRP, but a few are at or below the reference price (see Zotac and Gigabyte).

Until the official release (September 20) or another pre-order run becomes available, it may be tough to find a GeForce RTX graphics card at or near MSRP. That's OK though. Unless your current card isn't capable of playing the games you like at the settings you crave, we'd recommend waiting on Nvidia's new 20-series cards--at least until we see full reviews and benchmarks.

If you already have a 10-series card that's adequate, it may make sense to skip this card generation entirely and see what ray tracing and Tensor Core support look like by the time the 30-series line arrives. By then, it will also be clear what kind of response AMD can muster to battle Nvidia's Turing juggernaut. Intel might have something serious to say about high-end gaming by then as well.

Derek Forrest is a Contributing Writer for Tom's Hardware US. He writes hardware news and reviews gaming desktops and laptops.