God of War — the 2018 PS4 release, at least — is finally making its way to PC. Along with visual enhancements, it also has DLSS and FSR support. We've grabbed some of the best graphics cards and tested performance to see how it runs. We've also put together a video showcasing the image quality of the various DLSS and FSR settings using Nvidia's ICAT utility.
For newcomers to the series (like me!), there's a long history that I'm missing. I've certainly seen images and videos of Kratos over the years, but other than the fact that he battles big monsters, I couldn't tell you much. So far, that hasn't really detracted from my enjoyment of the game. Reviews of the original PS4 version were extremely favorable (94 average at Metacritic), and the PC release has been equally well-received (93 average). I've heard the PS4 release could struggle at times with performance, but the PC release doesn't seem to have any such problems — with the appropriate settings and hardware, at least.
Intel Core i9-9900K
MSI MEG Z390 Ace
Corsair 2x16GB DDR4-3600 CL16
XPG SX8200 Pro 2TB
Seasonic Focus 850 Platinum
Thermaltake Toughpower GF1 1000W
Corsair Hydro H150i Pro RGB
Phanteks Enthoo Pro M
Windows 10 Pro 21H2
Nvidia GPUs: 497.29 drivers
AMD GPUs: 22.1.1 drivers
For our look at the launch performance of the Windows release of God of War, I grabbed a selection of the latest GPUs from AMD and Nvidia, plus a couple of previous-gen cards, and ran some benchmarks at 4K with ultra settings. That's definitely going to be too taxing for some of the slower GPUs, but the game also supports DLSS and FidelityFX Super Resolution (FSR), and I tested all the available settings for those on each of the cards as well. I'll get into image quality below, but just be warned that using the DLSS ultra performance or FSR performance modes can result in a visible loss in fidelity, though with a decent boost to performance.
Running at lower resolutions would also help quite a bit, though the Windows release at present only supports windowed and borderless windowed modes. There's no exclusive fullscreen mode, which is unfortunate and makes running at other resolutions more time-consuming for testing purposes. Because of that, I skipped changing the desktop resolution and just focused on the 4K results. The scaling shown in the following charts should also apply to 1440p and 1080p resolutions, though CPU limits might become more of a factor.
One final note is that Nvidia provided early access to God of War and said the current 497.29 drivers would suffice, which was a bit strange as they're not specifically listed as being Game Ready. AMD's 22.1.1 drivers do list God of War. The game also ships with a recent version of the DLSS 2.x library, which incorporates the latest updates.
God of War PC Performance
If we're willing to round up, four of the GPUs managed an average performance of 60 fps while running God of War at 4K native with ultra settings. Minimums do dip below that mark, and the major battles can drop things even further, but this is as taxing as God of War gets. Basically, you'll need an RTX 3070 Ti or RX 6800 XT to hit 60 fps without any help from FSR or DLSS.
We can also see indications of the Nvidia-centric optimizations, or perhaps it's just that the game uses a DX11 rendering engine and Nvidia tends to be better about driver optimizations for DX11. Either way, where the RX 6800 XT typically ends up just a few percent slower than the RTX 3080 Ti at 4K ultra in our GPU benchmarks hierarchy, here it trails by 25%, barely edging past the RTX 3070 Ti. Note that testing at 4K does put more strain on AMD's Infinity Cache, which is part of the reason for the worse showing by AMD's RDNA2 GPUs here.
The AMD deficit extends to previous generation cards as well. The RTX 2060 nominally competes with the RX 5600 XT, and in our GPU hierarchy was 2% slower overall at 4K ultra. However, in God of War, the RTX 2060 was 8% faster than the RX 5600 XT. The same goes for the RTX 3060 and RX 6600 XT. They're tied in our GPU benchmarks using a test suite of nine games, but the RTX 3060 was 15% faster in God of War.
Enabling the various FSR modes doesn't radically alter the standings. In fact, the only change in rankings is that the RX 6800 XT eventually falls behind the RTX 3070 Ti in the charts due to a significant drop in minimum fps. Of course, that may simply be an anomaly, and we'll try to retest the card and also look at the RX 6900 XT and RX 6800 if time permits, but at least FSR doesn't appear to favor AMD's GPUs in any noticeable way.
Looking at the relative gains versus native rendering, FSR ultra quality improved performance by 26% on average, quality mode provided a 41% boost, balanced was good for a 53% increase in framerates, and performance mode yielded a 67% improvement.
Looking at image quality, there's almost no perceptible loss from the ultra quality and quality FSR modes, at least in this game. In contrast, balanced and particularly performance mode do show a more noticeable degradation. Note that this is only if you're targeting a 4K output result, as we've seen in the past that FSR and DLSS both tend to do better with 4K than 1080p.
In God of War, performance scaling from the DLSS modes was relatively close to what we saw from FSR. DLSS quality mode improved framerates by 28% on average, balanced mode gave a 38% increase, performance mode boosted fps by 48%, and ultra performance mode provided a 63% increase. However, the 9x upscaling factor used in ultra performance mode definitely results in visible image quality degradation, so we would avoid using that mode whenever possible.
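That "9x" figure comes from ultra performance mode's 3.0x per-axis scale factor (one ninth the pixels). As a quick sketch, here are the internal render resolutions the various modes imply at a 4K output, using the per-axis scale factors AMD and Nvidia publish for FSR 1.0 and DLSS 2.x — the in-game implementation may round the exact pixel counts slightly differently:

```python
# Internal render resolutions implied by the published per-axis scale
# factors for FSR 1.0 and DLSS 2.x quality modes. These factors come
# from AMD's and Nvidia's documentation; exact in-game values may vary.
OUTPUT = (3840, 2160)  # 4K target resolution

SCALE_FACTORS = {
    "FSR Ultra Quality": 1.3,
    "FSR Quality": 1.5,
    "FSR Balanced": 1.7,
    "FSR Performance": 2.0,
    "DLSS Quality": 1.5,
    "DLSS Balanced": 1.72,
    "DLSS Performance": 2.0,
    "DLSS Ultra Performance": 3.0,
}

def render_resolution(output, factor):
    """Divide each axis by the scale factor, rounding to whole pixels."""
    return tuple(round(axis / factor) for axis in output)

for mode, factor in SCALE_FACTORS.items():
    w, h = render_resolution(OUTPUT, factor)
    fraction = (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{mode:>24}: {w}x{h} ({fraction:.0%} of output pixels)")
```

DLSS ultra performance works out to a 1280x720 internal render at 4K, which lines up with the visible fidelity loss noted above, while DLSS/FSR quality modes render at 2560x1440.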
It's also worth pointing out that DLSS tends to benefit the slower GPUs like the RTX 2060, RTX 3060, and RTX 3060 Ti more than it does the RTX 3070 Ti, RTX 3080 Ti, and RTX 3090. Memory bandwidth and (to a lesser extent) capacity are contributing factors, and the 3070 Ti, 3080 Ti, and 3090 all have bandwidth to spare, so reducing the render resolution helps them less. For example, DLSS quality mode provides an average improvement of 33% on the three slower RTX cards we tested, but only 25% on the three faster cards. The same can be said for FSR: slower GPUs often benefit more from the reduced workload.
God of War PC Image Quality and Settings
Trying to provide real-world comparisons of image quality can be difficult. Screenshots don't always convey the user experience, and videos don't always match up. More importantly, video compression algorithms (especially those on YouTube) can munge the quality and obscure differences. Still, we've provided both just to show how DLSS and FSR affect the final rendered output.
The above video uses Nvidia's ICAT utility with video results captured via GeForce Experience on the RTX 3090 — the card shouldn't matter other than affecting the fps counter. While the original captures were all at 50Mbps, the YouTube video downgrades that to just 16Mbps, so keep that in mind. As we've hinted at already, there's not a massive difference between native rendering and the higher quality DLSS and FSR modes, at least when targeting 4K. We could play God of War quite happily using just about any of the modes, with DLSS ultra performance being the only real exception. There's certainly a loss in fidelity with the DLSS performance, FSR performance, and FSR balanced modes, but it's not terrible — at least in this particular game.
DLSS Ultra Performance
FSR Ultra Quality
Looking at the still screenshots more or less confirms the above impressions. Most of the DLSS and FSR modes look practically identical, and you have to look carefully to spot the differences. For example, you might notice a slight loss of detail in the twigs overlapping the tree in the middle-top area of the screen, but you likely wouldn't notice it in motion.
The one exception to this is DLSS ultra performance mode, where curiously, the heavy depth of field blurring effect on the mountainside and trees in the distance basically disappears. It's due to the way depth of field filters interact with the render resolution: starting from a 720p internal render, the filter just doesn't end up blurring things as much — the reduction in blur is also visible in the DLSS performance screenshot.
But again, the key is to see these things in motion and then decide how much the loss of detail impacts the gaming experience. DLSS ultra performance is playable, but you see some shimmering and other artifacts on grass quite a bit, even at 4K, and it becomes much more noticeable at 1080p — not that you should need ultra performance mode if you're running at 1080p.
Of course, using DLSS or FSR isn't the only way to boost performance. God of War includes four presets, along with a "custom" option that allows you to tweak the seven individual graphics settings. We tested with ultra for the above results, but dropping to the high preset boosted performance about 20%, the "original" preset was about 50% faster than ultra, and dropping to the low preset improved performance by around 60%. A reduction in shadow quality is the most visible change between the various presets, and not coincidentally, that single setting affects performance the most.
You can, of course, mix using the presets with DLSS and FSR. In our performance charts, the RX 5600 XT could only hit 47 fps using FSR performance mode. It could also reach 44 fps at 4K using the low preset, but more importantly, using the original quality preset with FSR quality mode got it to 64 fps, and the low preset combined with FSR quality mode got it to 74 fps. That's a 175% improvement in framerates if you're willing to drop the settings and enable FSR.
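As a sanity check on that 175% figure, we can back out the native baseline it implies — this assumes the percentage is measured against the RX 5600 XT's native 4K ultra result, which isn't quoted directly above:

```python
def pct_gain(new_fps, base_fps):
    """Percent improvement of new_fps over base_fps."""
    return (new_fps - base_fps) / base_fps * 100

# A 175% improvement means final fps = baseline * (1 + 1.75), so the
# 74 fps result implies a native 4K ultra baseline of roughly 27 fps.
implied_native = 74 / (1 + 175 / 100)
print(f"implied native baseline: {implied_native:.1f} fps")   # ~26.9 fps
print(f"gain over that baseline: {pct_gain(74, implied_native):.0f}%")
```

That implied baseline of roughly 27 fps is consistent with the 47 fps FSR performance-mode result, given the ~67% average uplift that mode provided.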
It's not particularly surprising that the RX 5600 XT and similar GPUs can easily break 60 fps under such constraints, of course. God of War first came out as a PS4 game, and even the more recent PS4 Pro only has a GPU that's roughly equivalent to the Radeon RX 580 — the original PS4 has a GPU that's more akin to an RX 560, which sits at rank 54 out of all the graphics cards we've tested in our GPU hierarchy. So if all you want is 1080p using the original quality preset, just about any GPU made in the past five years should suffice.
God of War for PC, Finally Where It Belongs
It's been interesting watching the change of heart from both Microsoft and Sony over the past few years. When the original Xbox came out, Microsoft started buying up gaming studios and then made the resulting games console exclusives. Sony had basically been doing the same thing. Of course, there are plenty of cross-platform games, but the assumption was that getting gamers to buy into a specific console platform would ultimately generate more money.
With Windows 10, Microsoft had a change of heart and started releasing nearly all of its first-party games on the Microsoft Store as well as the Xbox One. Actually, it took things a step further with the Play Anywhere initiative and let you buy a game once for both the Xbox One and the Microsoft Store. Clearly, that experiment worked out well, even though the Microsoft Store is one of the worst digital distribution platforms, because Microsoft has started releasing more games on other services like Steam.
It took a few years more before Sony was willing to chance doing the same thing, with Death Stranding and Horizon Zero Dawn being a couple of examples of formerly PlayStation exclusives coming to PC. And guess what? People are still more than happy to buy the consoles, and some people prefer to play games there, but there's clearly an untapped market of PC gamers. God of War is another major PlayStation franchise that's now on PC, and it looks and plays great, even more than three years after its initial release.
Now all we need is a remastering of the rest of the series for PC, similar to what Microsoft did with the Halo Master Chief Collection. We can only hope such a project will see the light of day and get the necessary attention to detail. Because, despite GPU and component shortages, the PC as a major gaming platform clearly isn't going anywhere.