How Much CPU Does the GeForce RTX 3080 Need?
Breaking the bottleneck
We've just posted the GeForce RTX 3080 Founders Edition review, which now reigns — until next week's launch of the GeForce RTX 3090 — as the best graphics card for gaming and sits at the top of the GPU benchmarks hierarchy. While your graphics card is usually the biggest consideration when it comes to gaming performance, we also wanted to look at some of the best CPUs for gaming to see just how much performance you lose — or gain! — by running the RTX 3080 with something other than the Core i9-9900K we use in our current GPU testbed.
For this article, we've pulled out several of the latest AMD and Intel processors: Core i9-10900K, Core i9-9900K, Core i3-10100, Ryzen 5 3600, and Ryzen 9 3900X. Then just for good measure, we've gone old school and dug up an 'ancient' Core i7-4770K Haswell chip. This is the oldest CPU I've currently got hanging around, and it should help answer the question of just how much CPU you need to make good use of the RTX 3080.
We'll be testing at 1080p, 1440p, and 4K, at both medium and ultra presets. We said in the RTX 3080 review that it's a card primarily designed for 4K gaming, and that it starts to hit CPU bottlenecks at 1440p. That applies even more at 1080p with medium settings. Generally speaking, you don't buy a top-shelf GPU to play games at 1080p, though there are esports pros who do just that while using a 240Hz or even 360Hz monitor. Most games will struggle to hit 240 fps, and some even have fps caps of around 200 fps. But we're doing this for science!
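As a rough way to think about where those bottlenecks come from, you can model delivered performance as whichever of the CPU or GPU runs out of headroom first. The sketch below is a deliberately simplified illustration with made-up numbers, not anything our benchmark suite actually computes:

```python
# Simplified bottleneck model: delivered fps is roughly capped by the
# slower of what the CPU and GPU can each sustain on their own.
# The numbers below are hypothetical, chosen only to illustrate the idea.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Fast GPU at 1080p medium: the CPU is the limiter (CPU-bound).
print(delivered_fps(cpu_fps=140, gpu_fps=320))  # 140

# Same parts at 4K ultra: the GPU is now the limiter (GPU-bound).
print(delivered_fps(cpu_fps=140, gpu_fps=90))   # 90
```

Lowering the resolution raises the GPU-side number but leaves the CPU-side number roughly fixed, which is why 1080p medium exposes CPU differences that 4K mostly hides.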
Because we're using multiple platforms and were under time constraints, not every PC used identical hardware. In the case of the old Haswell PC, it couldn't use most of the modern hardware we install in other testbeds — the sole LGA1150 motherboard I have doesn't have an M.2 slot, and of course, it requires DDR3 memory. I also only have a single kit of DDR3-1600 CL9-9-9 memory, which is fine but certainly not the best possible RAM for such a PC. Cases, power supplies, SSDs, and other items also differ, though most of these aren't critical as far as gaming performance is concerned.
The DDR4-capable PCs were all tested with the same DDR4-3600 CL16 memory kit, while our standard GPU benchmarks use DDR4-3200 CL16 memory. We've included both sets of scores for the 9900K as yet another point of reference. Memory speed matters a bit, depending on the game, though it doesn't make a huge difference overall. In fact, at higher settings, the 'slower' DDR4-3200 kit actually came out slightly ahead, probably thanks to slightly better subtimings (the quick latency math after the spec table shows why primary timings alone don't settle this). Here are the full testbed specs:
| Platform | Z490 | Z390 | X570 | Z97 |
|---|---|---|---|---|
| CPU | Core i9-10900K, Core i3-10100 | Core i9-9900K | Ryzen 9 3900X, Ryzen 5 3600 | Core i7-4770K |
| Cooler | NZXT Kraken X63 | Corsair H150i RGB Pro | NZXT Kraken X63 | Be Quiet! Shadow Rock Slim |
| Motherboard | MSI MEG Z490 Ace | MSI MEG Z390 Ace | MSI MPG X570 Gaming Edge WiFi | Gigabyte Z97X-SOC Force |
| Memory | Corsair 2x16GB Platinum RGB DDR4-3600 CL16-18-18 | Corsair 2x16GB Platinum RGB DDR4-3600 CL16-18-18 | Corsair 2x16GB Platinum RGB DDR4-3600 CL16-18-18 | G.Skill 2x8GB Ripjaws X DDR3-1600 CL9-9-9 |
| Memory (alternate) | - | Corsair 2x16GB Vengeance LPX DDR4-3200 CL16-18-18 | - | - |
| Storage | Samsung 970 Evo 1TB | Adata XPG SX8200 Pro 2TB | Corsair MP600 2TB | Samsung 850 Evo 2TB |
| Power Supply | NZXT E850 | Seasonic Focus PX-850 | Thermaltake Grand 1000W | PC Power & Cooling 850W |
| Case | NZXT H510i | Phanteks Enthoo Pro M | XPG Battlecruiser | Some Ultra POS |
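About that memory point: raw CAS latency math actually favors the DDR4-3600 kit, which is why we chalk the DDR4-3200 kit's occasional wins up to subtimings. The quick calculation below covers first-word latency only (CL cycles at the memory clock, which is half the DDR transfer rate) and deliberately ignores subtimings:

```python
# First-word latency in nanoseconds: CL cycles at the memory clock.
# DDR transfers twice per clock, so DDR4-3600 runs an 1800MHz clock,
# and one cycle lasts 1000 / clock_mhz nanoseconds.
def cas_latency_ns(transfer_rate: int, cl: int) -> float:
    memory_clock_mhz = transfer_rate / 2
    return cl * 1000 / memory_clock_mhz

print(cas_latency_ns(3600, 16))  # ~8.9 ns for DDR4-3600 CL16
print(cas_latency_ns(3200, 16))  # 10.0 ns for DDR4-3200 CL16
```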
All of the systems are running at stock settings but with XMP memory profiles activated. However, the old Z97 board applies an all-core clock of 4.3GHz, even at stock settings. It's a bit of a quirky motherboard, and the BIOS was finicky enough that I just left it alone. Basically, even though it says 4770K, treat these results as representative of a stock-clocked Core i7-4790K. If you actually have a stock-clocked 4770K, or something like an older i7-2600K, your performance will be even lower than what we're showing here.
We're going to start with 1080p testing, which is where the choice of CPU makes the biggest difference. None of the games in our test suite are esports titles where extreme fps matters, though we do have at least one game that can break into the >360 fps range (Strange Brigade). We've long maintained that there are greatly diminishing returns going beyond 144Hz or even 120Hz displays, and a big part of that is the fact that many games simply don't scale to the framerates necessary to make the most of higher refresh rates.
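Frame times make those diminishing returns easier to see: each step up in refresh rate saves fewer milliseconds per frame. A quick back-of-the-envelope check:

```python
# Milliseconds per frame at a given framerate: 1000 ms / fps.
def frame_time_ms(fps: float) -> float:
    return 1000 / fps

for fps in (60, 120, 144, 240, 360):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# Going from 60 to 120 fps saves ~8.3 ms per frame;
# going from 240 to 360 fps saves only ~1.4 ms.
```

Okay, let's hit the charts: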
GeForce RTX 3080 FE: 1080p CPU Benchmarks
[Charts: 1080p Medium, 1080p Medium Percentiles, 1080p Ultra, 1080p Ultra Percentiles]
If you were thinking of pairing the new king of the GPUs with a Core i7-4770K (overclocked) from mid-2013, or maybe even a 3770K or 2600K, the worst-case results at 1080p should at least give you pause. There are plenty of situations where your maximum performance is nearly cut in half. With the RTX 3080, the Core i9-10900K is 70% faster than the overclocked 4770K at 1080p medium, though the gap drops to 47% at 1080p ultra.
Even stepping up to a new Core i3-10100 provides a pretty substantial boost to performance — some of that is from the newer Comet Lake architecture, but a lot of the benefit likely comes from the faster memory. The i3-10100 ends up about 25% faster overall compared to the overclocked i7-4770K.
At the other end of the spectrum, the move from Core i9-9900K to Core i9-10900K hardly matters at all. At 1080p medium, the 10900K is 4% faster across our test suite. That's not a lot, and it's not really too surprising. Very few games will leverage more than eight cores, so the majority of the improvement comes from the slightly higher clocks on the 10900K. At 1080p ultra, the two chips are basically tied. Some of this almost certainly comes down to differences in motherboard firmware, however. Even though both testbeds use MSI MEG Ace motherboards (Z390 and Z490), they're not the same. Final Fantasy XIV performs quite a bit worse with the Z490 setup, while everything else sees small to modest gains.
What about AMD's CPUs? First, let's note that the Ryzen 9 3900X only leads the Ryzen 5 3600 by a scant 7% at medium and 4% at ultra. Several games show nearly a 10% lead while others are effectively tied. That means comparisons between Intel and AMD don't change substantially when moving to a different AMD processor — the Zen 2 chips will all be within 10% of each other. And as we'll see momentarily, the AMD CPUs really scrunch together at higher resolutions.
For AMD vs. Intel, the Core i9-10900K unsurprisingly comes out on top, by a noticeable 18% at 1080p medium. Four of the games we tested give Intel more than a 20% lead, with Final Fantasy being the only result that lands in single digits. Again, we don't expect most gamers would ever pair the RTX 3080 with 1080p medium settings, but this is the worst-case scenario for CPU scaling. Shifting up to 1080p ultra, Intel's lead drops to just 8%. Far Cry 5 and Shadow of the Tomb Raider still favor Intel by over 20%, Metro Exodus shows a 15% lead, and everything else is below 10%.
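A note on how to read those 'overall' figures: when we say the 10900K leads by 18%, that's an aggregate across the whole test suite. Benchmark roundups typically use a geometric mean for this, since it keeps one outlier title from dominating the average. Here's a minimal sketch of that kind of calculation, using made-up per-game numbers rather than our actual data:

```python
from math import prod

def geomean(values):
    return prod(values) ** (1 / len(values))

# Hypothetical per-game average fps for two CPUs (illustrative only).
cpu_a = [180, 140, 210, 95]
cpu_b = [150, 130, 170, 92]

lead = geomean(cpu_a) / geomean(cpu_b) - 1
print(f"CPU A leads overall by {lead:.1%}")
```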
Finally, while we didn't do a full suite of testing with ray tracing and DLSS effects, it's important to remember that the more demanding a game becomes on your GPU — and ray tracing is definitely demanding — the less your CPU matters. Many games will still need DLSS in performance mode to stay above 60 fps with all the ray tracing effects enabled.
GeForce RTX 3080 FE: 1440p CPU Benchmarks
[Charts: 1440p Medium, 1440p Medium Percentiles, 1440p Ultra, 1440p Ultra Percentiles]
The jump to 1440p represents 78% more pixels, but performance only drops by 15% on average at medium settings, and 20% at ultra settings. In other words, 1080p was very much running into CPU bottlenecks, and shifting to a higher resolution doesn't hurt performance all that much, even with the fastest CPUs. And if you're using a slower CPU, in some cases the change in performance is negligible.
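For reference, the pixel math behind that 78% figure (and the corresponding jump to 4K, which we'll get to shortly) is simple enough to verify:

```python
# Pixels per frame at the three test resolutions.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(f"1440p vs 1080p: {pixels['1440p'] / pixels['1080p'] - 1:.0%} more pixels")  # 78%
print(f"4K vs 1440p:    {pixels['4K'] / pixels['1440p'] - 1:.0%} more pixels")     # 125%
```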
Take the i7-4770K, for example. The 1440p medium results are only 3% lower than 1080p medium, and the 1440p ultra results are 6% lower than 1080p ultra. If you plan on buying an RTX 3080 and running at 1440p ultra with an older-generation CPU, you'll lose some performance, but not that much. Well, maybe. It's worth noting that the i9-9900K with a 2080 Super is faster overall at 1440p medium and below compared to the i7-4770K with a 3080, though the 4770K does at least get a small boost to performance. Or with a 2080 Ti, the 9900K even wins at 1440p ultra.
Take just one step up the CPU ladder, like moving to a Ryzen 5 3600 or even a Core i3-10100 (which should deliver about the same level of performance as a Core i7-6700K), and there's at least some benefit to buying an RTX 3080. Still, if you're running a CPU with fewer than six cores, we'd definitely recommend upgrading your CPU, and perhaps even the rest of your PC, before taking the RTX 3080 plunge at 1440p.
GeForce RTX 3080 FE: 4K CPU Benchmarks
[Chart: 4K Medium]