CPU scaling with DLSS — investigating CPU performance in the age of upscaling
In a time of ubiquitous upscaling, your CPU can quickly become a performance bottleneck.
The components and games we chose were deliberate. Our hypothesis (if you’d like to call it that) is that we’d see performance scaling for CPUs based on the internal render resolution of a game rather than the output resolution. For example, with DLSS set to Performance mode at 4K, we’d expect to see CPU scaling similar to native 1080p. The results of our testing are a bit more nuanced, but that was the assumption going into testing.
The most important variable to control for was the GPU, and for that, we chose the RTX 4080 Super. In game testing for our CPU reviews, we normally use the RTX 5090. That’s to remove any potential barriers imposed by the GPU in games and isolate CPU performance as much as possible. That approach doesn’t work here. We’re evaluating the balance of components within a full system as it relates to total render time, and more specifically, how that balance shifts with DLSS enabled.
We must create a GPU-bound scenario to see at what point CPU scaling starts to show up. To keep testing focused, we tested the DLSS Quality and Performance modes; Ultra Performance isn't consistently offered across games with DLSS, so we left it out. With that settled, the goal is to create a GPU-bound scenario at 4K Quality mode. That translates to an internal resolution of 2560 x 1440, and given our assumptions about CPU performance above 1080p, it should bind performance completely to the GPU. Below that point, however, we expect to see some scaling. The RTX 4080 Super fits the sweet spot for these assumptions.
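The internal-resolution math above follows directly from the per-axis scale factors Nvidia publishes for each DLSS mode (roughly 66.7% for Quality, 58% for Balanced, 50% for Performance, 33.3% for Ultra Performance). A quick sketch of that mapping, assuming those published factors:

```python
# Per-axis render scale for each DLSS mode (Nvidia's published factors;
# exact values can vary slightly by game and DLSS version).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the internal render resolution for a given output resolution and DLSS mode."""
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

# 4K output at Quality mode renders internally at roughly 2560 x 1440,
# and Performance mode at 1920 x 1080, matching the scenarios tested here.
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

This is why 4K Performance is expected to show CPU scaling similar to native 1080p: the GPU is only rendering a 1920 x 1080 frame before upscaling.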
Using the RTX 5090, we saw CPU-bound performance with the lower-end chips at much lower resolutions and quality settings, creating an artificial scenario where the most powerful and expensive graphics card money can buy consistently outpaced the CPU. It basically forces us to see CPU scaling. The RTX 4080 Super creates a more realistic scenario. It’s powerful enough to create a GPU-bound performance scenario at 4K Quality, but not so powerful that it runs into a GPU bottleneck across the resolution and quality setting range.
Outside of the GPU, our test bench remained unchanged from our CPU reviews. We used Nvidia driver 591.74 (the latest at the time of testing), and we didn’t apply any DLSS overrides. There are some interesting differences between different DLSS models, which we covered in our individual results. We also kept VBS off and Resizable BAR on, keeping with how we test CPUs for reviews.
Moving on from hardware, we chose five games to test. For each chip, we tested Quality and Performance mode for DLSS at 4K, 1440p, and 1080p, and we ran the benchmark at least twice for each pass (three times in the event of performance anomalies). Although we ran multiple passes for each resolution and quality setting, we didn’t average them — each result is an actual run we recorded.
For the games themselves, each of them uses a different engine, which was important so as not to create lopsided results. In addition, we have two titles that use the now-defunct CNN DLSS model, as well as titles that use the newer Transformer model. It’s important to represent both. As we’re looking at shifts in the balance of total render time, the CNN and Transformer DLSS models have different levels of performance overhead, and therefore, represent different sizes of the total render time.
CPU Performance with DLSS in Reviews
Although this is an interesting look at how DLSS interacts with CPU performance, we didn’t do the testing solely to write this article. We also want to evaluate how important CPU scaling is with DLSS enabled in the context of our CPU reviews. After all, a lot of gamers are playing with some form of upscaling enabled, and it’s important to represent that use case, given how CPU scaling shifts.
We’re looking at adding a single game to our test suite that represents DLSS performance at 4K. The idea now is to use Cyberpunk 2077, as it’s a game that scales well on the CPU without being totally bound by the CPU. Plus, it’s still a popular game. We’d use a 4K output with DLSS set to Performance mode, as that’s Nvidia’s recommended quality setting for a 4K output, and have a set of benchmarks that we can compare to our native 1080p results. We’d use the same benchmark path, the same graphics settings, and the same test system. We’d just test at a 4K output with DLSS set to Performance mode for a 1080p internal resolution.
Although this is a good “real-world” test to simulate how you might actually play games, it’s not something we’d include in our overall average. As our test results here show, there’s overhead with DLSS that changes from game to game, so it doesn’t make sense to sway our averages. It would serve more as a touchstone of what you might expect with DLSS turned on, and how it interacts with the CPU as the resolution drops.
For now, we have some interesting takeaways overall. At 4K Quality or native 4K, your CPU plays a less significant role in performance; unless you're pairing an old or particularly weak CPU with a high-end GPU, you shouldn't run into a bottleneck. At 4K Performance, the CPU plays a more significant role, though the bottleneck is minor due to DLSS overhead and mainly shows up on lower-end chips. At 1440p with the Balanced preset or lower, your CPU plays a significant role in performance. I suspect this is the most problematic scenario for gamers: resorting to more aggressive upscaling likely won't net a major performance uplift if you're pairing an older, weaker CPU with a newer GPU.
And, of course, we see scaling with CPUs at native 1080p. With DLSS on, those differences become more pronounced as the CPU becomes a complete bottleneck on performance.

Jake Roach is the Senior CPU Analyst at Tom’s Hardware, writing reviews, news, and features about the latest consumer and workstation processors.