Nvidia RTX 5070 Founders Edition Rasterization Gaming Performance
We divide gaming performance into two categories: traditional rasterization games and ray-tracing games. We benchmark each game using four different test settings: 1080p medium, 1080p ultra, 1440p ultra, and 4K ultra.
For the RTX 5070, we'd rate the 1440p ultra results as the most important. While the card can at times handle 4K ultra, and 1080p ultra is also an option, the sweet spot tends to be 1440p. That resolution also functions as a preview of sorts for 4K with quality mode upscaling, while 1080p serves as a partial look at 4K with performance mode upscaling. There's overhead involved with DLSS upscaling (particularly the new transformer model), but trying to test every setting possible with our collection of games is simply too time-consuming.
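If you're curious about the render-resolution math behind that claim, here's a rough sketch. It assumes DLSS quality mode renders at roughly two-thirds of the output resolution per axis and performance mode at one-half; the function and values are purely illustrative, not anything from our test scripts, and the fixed cost of the upscaling pass itself isn't modeled.

```python
# Rough sketch: internal render resolutions for DLSS-style upscaling at 4K output.
# Assumes quality mode uses a ~2/3 per-axis scale and performance mode uses 1/2;
# the overhead of the upscaling pass itself is ignored here.
def internal_resolution(output_w: int, output_h: int, scale: float) -> tuple[int, int]:
    """Return the approximate internal render resolution for a per-axis scale factor."""
    return round(output_w * scale), round(output_h * scale)

print(internal_resolution(3840, 2160, 2 / 3))  # quality mode: (2560, 1440)
print(internal_resolution(3840, 2160, 1 / 2))  # performance mode: (1920, 1080)
```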
The RTX 5070 should handle 1440p fine for the most part, though it may struggle with ray tracing games. 4K will typically need some upscaling help to hit smooth performance levels (60-ish FPS or more). We also have the overall performance geomean, the rasterization geomean, and the ray tracing geomean. We'll put each group of charts in the order of 1440p, 4K, 1080p ultra, and 1080p medium.
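For those wondering how the aggregate numbers are built, here's a minimal sketch of a geometric mean over per-game framerates. The FPS values are hypothetical and this isn't our actual charting pipeline; the point is that the geomean multiplies results together and takes the nth root, so no single outlier game dominates the overall score.

```python
import math

def geomean(fps_values: list[float]) -> float:
    """Geometric mean of per-game average framerates; outliers don't dominate the result."""
    return math.exp(sum(math.log(v) for v in fps_values) / len(fps_values))

# Hypothetical per-game FPS results for one card at one resolution.
print(round(geomean([142.0, 88.5, 61.2, 103.7]), 1))
```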
Let's start with the rasterization suite of 16 games, as that's arguably still the most useful measurement of gaming performance. Plenty of games that have ray tracing support end up running so poorly that it's more of a feature checkbox than something useful.
We'll provide limited to no commentary on most of the individual game charts, as the numbers speak for themselves. The Geomean charts will be the main focus, since those provide the big picture of how the new RTX 5070 stacks up to other GPUs.
The most important comparison point for the RTX 5070 will be against the prior generation RTX 4070. We could have included the RTX 4070 Super as well, but that's not too far off the performance of the RTX 4070 Ti, and we ran out of time to test additional GPUs. We'll also want to compare the 5070 to the new RTX 5070 Ti. The 5070 is supposed to cost $200 less, or 27% less (though retail pricing is likely to be higher for both GPUs for quite some time), so if it's only 15% slower, that would be a win of sorts. Seeing how the 5070 stacks up against the RX 7900 XT will have to suffice for the time being, and we'll see about updating the charts here to include the RX 9070 and 9070 XT in the coming days.
What does the overall picture look like? Keeping in mind that most of the GPUs in our charts cost more than the RTX 5070, it does reasonably well. It's 19% faster than the 4070 at 1440p, and that increases to a 22% lead at 4K. The lead drops to 16% at 1080p ultra and 14% at 1080p medium, where CPU bottlenecks potentially become a factor.
Compared to the RTX 5070 Ti, the 5070 runs 20% slower at 1440p and 24% slower at 4K. That's almost directly proportional to the theoretical 27% lower price, so performance per dollar works out roughly the same on both cards, and paying more for the 5070 Ti delivers a near-linear return. We also like the fact that the 5070 Ti has 16GB of VRAM — 12GB on a $550 graphics card definitely feels inadequate in 2025. But of course, the current retail prices on the 5070 Ti are much higher than the suggested $749.
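To put that proportionality claim in concrete terms, here's a quick back-of-the-envelope check using the MSRPs and the relative performance figures above. Street prices will obviously change the math, and the snippet is just an illustration rather than anything from our benchmark tooling.

```python
# Performance per dollar of the RTX 5070 relative to the RTX 5070 Ti, at MSRP.
msrp = {"RTX 5070": 549, "RTX 5070 Ti": 749}
# 5070 performance relative to the 5070 Ti, per the geomean results above.
relative_perf = {"1440p ultra": 0.80, "4K ultra": 0.76}

price_ratio = msrp["RTX 5070"] / msrp["RTX 5070 Ti"]  # ~0.73
for res, perf in relative_perf.items():
    # Prints ~1.09x at 1440p and ~1.04x at 4K: roughly the same value per dollar.
    print(f"{res}: {perf / price_ratio:.2f}x the performance per dollar of the 5070 Ti")
```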
What about AMD's comparable (sort of) GPU, the RX 7900 XT? Note that we're picking the 7900 XT as it should be relatively close to the upcoming RX 9070 in performance. (Maybe — we'll have final numbers tomorrow.) It would also be useful to look at the 7900 GRE, or at least it would have been before supplies dried up and prices headed north. Unfortunately, we ran out of time for testing, so we'll have to look at the 7900 GRE some other time. By our numbers, the 5070 ends up 11–13% slower than the 7900 XT, or alternatively, it's about 17% faster than the 7800 XT.
Below are the 16 rasterization game results, in alphabetical order, with short notes on the testing where something worth pointing out is present.
Assassin's Creed Mirage uses the Ubisoft Anvil engine and DirectX 12. It's also an AMD-promoted game, though these days, that doesn't necessarily mean it always runs better on AMD GPUs. It could be CPU optimizations for Ryzen, or more often, it just means a game has FSR2 or FSR3 support — FSR2 in this case. It also supports DLSS and XeSS upscaling.
Baldur's Gate 3 is our sole DirectX 11 holdout — it also supports Vulkan, but that performed worse on the GPUs we checked, so we opted to stick with DX11. Built on Larian Studios' Divinity Engine, it's a top-down perspective game, which is a nice change of pace from the many first-person games in our test suite. The faster GPUs are hitting CPU bottlenecks in this game.
Black Myth: Wukong is one of the newer games in our test suite. It's built on Unreal Engine 5, which supports full ray tracing as a high-end option, but we opted to test using pure rasterization mode. Full RT may look a bit nicer, but the performance hit is quite severe. (Check our linked article for our initial launch benchmarks if you want to see how it runs with full RT enabled. We've got supplemental testing coming as well.)
Dragon Age: The Veilguard uses the Frostbite engine and runs via the DX12 API. It's one of the newest games in our test suite, having launched this past Halloween. It's been received quite well, though, and in terms of visuals, I'd put it right up there with Unreal Engine 5 games — without some of the LOD pop-in that happens so frequently with UE5.
Final Fantasy XVI came out for the PS5 in 2023, but it only recently saw a Windows release. It's also either incredibly demanding or quite poorly optimized (or both), but it does tend to be very GPU limited. Our test sequence consists of running a set path around the town of Lost Wing.
We've been using Flight Simulator 2020 for several years, and there's a new release below. But it's so new that we also wanted to keep the original around a bit longer as a point of reference. We've switched to using the 'beta' (eternal beta) DX12 path for our testing now, as it's required for DLSS frame generation, even if it runs a bit slower on Nvidia GPUs.
Flight Simulator 2024 is the latest release of the storied franchise, and it's even more demanding than the above 2020 release — with some differences in what sort of hardware it seems to like best. Where the 2020 version really appreciated AMD's X3D processors, the 2024 release tends to be more forgiving to Intel CPUs, thanks to improved DirectX 12 code (DX11 is no longer supported).
God of War Ragnarök released for the PlayStation two years ago and only recently saw a Windows version. It's AMD promoted, but it also supports DLSS and XeSS alongside FSR3. We run around the village of Svartalfheim, which is one of the most demanding areas in the game that we've encountered.
Hogwarts Legacy came out in early 2023 and uses Unreal Engine 4. Like so many Unreal Engine games, it can look quite nice but also has some performance issues with certain settings. Ray tracing, in particular, can bloat memory use, tank framerates, and cause hitching, so we've opted to test without ray tracing. (At maximum RT settings, the 9800X3D CPU ends up getting only around 60 FPS, even at 1080p with upscaling!) We may replace this one in the coming days.
Horizon Forbidden West is another two-year-old PlayStation port, using the Decima engine. The graphics are good, though I've heard at least a few people think it looks worse than its predecessor — excessive blurriness being a key complaint. But after using Horizon Zero Dawn for a few years, it felt like a good time to replace it.
The Last of Us, Part 1 is another PlayStation port, though it's been out on PC for about 20 months now. It's also an AMD-promoted game and really hits the VRAM hard at higher-quality settings. Cards with 12GB or more memory usually do fine, and the RTX 5070 lands about where expected.
A Plague Tale: Requiem uses the Zouna engine and runs on the DirectX 12 API. It's an Nvidia-promoted game that supports DLSS 3, but neither FSR nor XeSS. (It was one of the first DLSS 3-enabled games as well.) It has RT effects, but only for shadows, so enabling RT doesn't really improve the look of the game while still tanking performance.
Stalker 2 is another Unreal Engine 5 game, but without any hardware ray tracing support — UE5's Lumen system does "software RT" that's basically just fancy rasterization as far as the visuals are concerned, though it's still quite taxing. VRAM can also be a serious problem when trying to run the epic preset, with 8GB cards struggling at most resolutions. There's also quite a bit of microstuttering in Stalker 2, and it tends to be more CPU limited than other recent games.
Star Wars Outlaws uses the Snowdrop engine, and we wanted to include a mix of options. It also has a bunch of RT options that we leave off for our tests. As with several other games, turning on maximum RT settings in Outlaws tends to result in a less than ideal gaming experience, with a lot of stuttering and hitching even on the fastest cards.
Starfield uses the Creation Engine 2, an updated version of the Bethesda engine that previously powered the Fallout and Elder Scrolls games. It's another fairly demanding game, and we run around the city of Akila, one of the more taxing locations in the game. It's a bit more CPU limited, particularly at lower resolutions.
Wrapping things up, Warhammer 40,000: Space Marine 2 is yet another AMD-promoted game. It runs on the Swarm engine and uses DirectX 12, without any support for ray tracing hardware. We use a sequence from the introduction, which is generally less demanding than the various missions you get to later in the game but has the advantage of being repeatable and not having enemies everywhere. Curiously, the RTX 40-series cards are able to hit much higher performance at 1080p than the 50-series and AMD cards.
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
- Thunder64: This thing is getting blasted everywhere else but here it is 4 stars? What a joke. Not to mention the 50 series is probably the worst GPU launch ever.
- JarredWaltonGPU:
logainofhades said: "Yea it's basically a 4070s at best."
Which, sadly, has a going price of basically $1000 or so new, or you can take your chances with eBay where prices over the past 30 days are averaging $789.55. Not that I expect the 5070 to be any better in the near term. Minor gains are the new status quo, so 20% faster for nominally the same price as the outgoing generation isn't bad.
- JarredWaltonGPU:
Thunder64 said: "This thing is getting blasted everywhere else but here it is 4 stars? What a joke. Not to mention the 50 series is probably the worst GPU launch ever."
I would say the entire 30-series in late 2020 throughout 2021 was, so far, worse than what we've had from the 50-series. RTX 3080 selling for $2000–$2500? RTX 3090 going for up to $4500? Yeah. And you know what? None of that was the fault of Nvidia or AMD.
The current supply restrictions are much more in Nvidia's control, because it's deciding to prioritize AI over consumer. But I can't fault a company for choosing to do more of the thing that accounted for 88% of its revenue last year.
Is four stars too high? 🤷♂️ That's based on the theoretical MSRP, because GOK what the actual prices are going to be throughout 2025! On paper, everything looks decent. In practice, everything is fubar — and I mean that about all GPUs right now. So writing emotionally vapid comments blaming Nvidia for lack of stock just isn't something I'm going to bother doing. Yes, the supply situation sucks right now. Prices suck right now. You can't buy these at $549 right now (unless you win the lottery). But if you could buy one at that price? Sure, it's a 4-star card, maybe 3.5-star. And getting bent out of shape about a half a star difference of opinion isn't worth the effort.
Put another way: Read the review, look at all the pretty charts, decide for yourself how good/bad/whatever the card is. But don't get hung up on one number that tries (and always fails) to encapsulate way too much information.
- oofdragon: LMAO decent price!!!! And he even omitted direct comparison with the 4070 Super!!!!! Hahaha what a joke, this is the most n greed shill website in the whole world
throwback when this same guy said 4070>6950 at same price 😂 this is comedy. Can't wait to see the 9070 "review" tomorrow where he will try and fail to make it look bad compared to this failure
- artk2219:
Thunder64 said: "This thing is getting blasted everywhere else but here it is 4 stars? What a joke. Not to mention the 50 series is probably the worst GPU launch ever."
I get where you're coming from, and if the 9000 series had launched first, I would have some real issues with that score. But the 9000 series hasn't launched yet, the market is a mess with pricing all over the place, and the RTX 4070 Super and 7900 GRE basically no longer exist in retail. Given the space this card has launched into, if it can be had at MSRP, it's appropriate. Do I love it? No. But looking at it outside of a bubble, until there are more competing products, it's not the worst thing. It could definitely use more VRAM though.
As for the worst GPU launch ever, nah, we tend to forget just how bad the GeForce FX 5000 and GTX 400 launches were. I'm tempted to throw the Radeon HD 2000 series in there, but they at least typically made it through their warranty period before they would outright die. This could not be said for the flagships from those other two series; the HD 2000 series was just hot, loud, and not very competitive. That said, is this the worst launch in 15 years? Undoubtedly.
- baboma:
>This thing is getting blasted everywhere else but here it is 4 stars?
No surprise. 5070 is getting special attention because of Huang's "5070 > 4090" CES blurb that had the cognoscenti gnashing their teeth. The throng is itching for payback, and this is their chance.
>What a joke.
Yes, it's a joke that people are crying about overpriced GPUs, when the price of everything else had just jumped 25% overnight.
>Not to mention the 50 series is probably the worst GPU launch ever.
Famous last words.
- LolaGT: This is the first review I've read, and I'd have to say that was an unexpectedly poor result.
Leaving out the 4070S was on purpose (probably on urging from someone who provided the hardware for testing, we can guess who), no doubt because that was what it needed to stand up to and be compared with, and I knew I was not alone in seeing that omission as glaringly telling.
I'm not sure it really matters, because there will not be any real availability of note probably until the 5070S is close to release.
MSRP? Haha, that's the real joke.
- DRagor:
btmedic04 said: "Ah, more vaporware with fake frames and fake msrps. Pass"
You forgot about fake ROPs.