After what feels like ages of waiting, Minecraft RTX is here. It was first teased at Gamescom in August 2019, and if you've wondered what exactly has taken so long, you're not alone. Our bet is that Nvidia and Microsoft have spent a lot of time trying to get Minecraft RTX running as well as possible on the widest selection of the best graphics cards before the public release. If you're thinking Minecraft with ray tracing effects tacked on shouldn't be too complex, think again.
With the beta in hand for the past week, we can say that you're going to need a powerful rig to run it well. And that's with DLSS 2.0 enabled—the Deep Learning Super Sampling algorithm that upscales lower rendering resolutions to help boost performance by 70% or more. Turn off DLSS and framerates will plummet, particularly at higher resolutions. You'll need something from the very top of the GPU hierarchy to do 1440p or 4K at 60 fps, in other words.
We've tested Minecraft RTX across all the GeForce RTX graphics cards, as well as with some of the best CPUs for gaming using the RTX 2080 Ti. We've also checked system memory to determine the minimum amount of RAM that you'll want. Not surprisingly, the GPU will be the biggest hurdle.
Ray Tracing vs. Path Tracing
To understand why Minecraft RTX is so demanding, we need to briefly describe how it differs from other RTX enabled games. Nvidia says that Minecraft RTX uses 'path tracing,' similar to what it did with Quake II RTX, whereas other games like Battlefield 5 and Metro Exodus have only used 'ray tracing.' If you fully understand the difference between ray tracing and path tracing, you probably just rolled your eyes hard. That's because Nvidia has co-opted the terms to mean something new and different. In short, Nvidia's 'path tracing' just means doing more ray tracing calculations—bouncing more rays—to provide a more accurate result.
Path tracing as used in Hollywood films typically means casting a number of rays per pixel into a scene, finding an intersection with an object and determining the base color, then randomly bouncing more rays from that point in new directions. Repeat the process for the new rays until each ray reaches a maximum depth (number of bounces) or fails to intersect with anything, then accumulate the resulting colors and average them out to get a final result.
That's the simplified version, but the important bit is that it can take thousands of rays per pixel to get an accurate result. A random sampling of rays at each bounce quickly scales the total number of rays needed in an exponential fashion. You can, however, get a 'fast' result with only a few hundred rays per pixel—this early result is usually grainy and gets refined as additional rays are calculated.
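To get a feel for how quickly the ray count balloons, here's a minimal sketch. It's an illustrative counting model only (production path tracers typically trace one continuation ray per bounce and average many samples instead), treating each intersection as spawning a fixed number of new rays up to a maximum depth:

```python
def rays_per_pixel(primary_samples, branch, max_depth):
    """Upper bound on rays traced for one pixel, assuming each
    intersection spawns 'branch' new rays up to 'max_depth' bounces.
    (A hypothetical model for illustration, not a real renderer.)"""
    # Each primary ray fans out into branch + branch^2 + ... + branch^max_depth
    per_primary = sum(branch ** depth for depth in range(1, max_depth + 1))
    return primary_samples * (1 + per_primary)

# Even modest settings explode quickly:
print(rays_per_pixel(4, 2, 4))   # 4 primaries, 2-way branching, 4 bounces -> 124
print(rays_per_pixel(16, 4, 4))  # heavier sampling -> 5456
```

Multiply that by two million pixels at 1080p, sixty times per second, and it's clear why real-time renderers settle for a handful of rays per pixel plus aggressive denoising.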
Ray tracing is similar, but doesn't have the random sampling and integral sums of a bunch of rays. Where you might need tens or even hundreds of thousands of samples per pixel to get a good 'final' rendering result with path tracing, ray tracing focuses on calculating rays at each bounce toward other objects and light sources. It's still complex, and often ray and path tracing are used as interchangeable terms for 3D rendering, but there are some technical differences and advantages to each approach.
Doing full real-time path tracing or ray tracing in a game isn't practical, especially not with more complex games like Doom Eternal or Call of Duty. Instead, games with ray tracing currently use a hybrid rendering approach. Most of the rendering work is still done via traditional rasterization, which our modern GPUs are very good at, and only certain additional effects get ray traced. Battlefield 5 and Wolfenstein Youngblood use ray tracing for reflections only; Metro Exodus uses ray tracing for global illumination; and Shadow of the Tomb Raider and Call of Duty: Modern Warfare focus their ray tracing efforts on, predictably for the former, shadows. Only two recent games have gone further. Control uses ray tracing for reflections, contact shadows and diffused lighting, while Deliver Us the Moon uses ray tracing for reflections and shadows.
Compare that with Quake II RTX and Minecraft RTX, where you get ray tracing for reflections, shadows, global illumination, refraction, ambient occlusion, emissive lighting, atmospheric effects and more. That's a lot more rays, though it's still not path tracing in the traditional sense. Unless, that is, you subscribe to the mindset that 'more rays' is synonymous with 'path tracing.'
(Side note: the SEUS PTGI tool is a different take on path tracing. It depends on Minecraft's use of voxels to help speed up what would otherwise be complex calculations, and it can run on GTX and AMD hardware. There's also the RTGI ReShade tool that uses screen space calculations and other clever tricks to approximate path tracing, but it lacks access to much of the data that would be required to do 'proper' path tracing.)
Can you run Minecraft RTX on GTX cards?
Nvidia lists a GeForce RTX card as the minimum requirement for the Minecraft RTX beta. That's not to say that you can't run the beta without an RTX card, but if you do you'll end up with regular Minecraft—just running on a beta server. If you want ray tracing, or even the new RTX resource packs, you have to have at least an RTX 2060. Sorry.
Wait, doesn't Nvidia support DXR on GTX cards? Yes, it does, and even Quake II RTX—which also uses 'path tracing' to provide enhanced visuals—will at least attempt to run on a GTX 10-series or 16-series card with at least 6GB of VRAM. However, performance in Quake II RTX is quite awful without an RTX GPU. 1280x720 on a GTX 1080 Ti or GTX 1660 Ti will net you about 25 fps, for example, compared to 85 fps on an RTX 2060. Bump the resolution to 1920x1080 and the RTX 2060 still manages 41 fps, while all the GTX cards drop to single-digit framerates. And here's the kicker: without DLSS enabled, Minecraft RTX is actually more demanding than Quake II RTX. Yeah.
So, Nvidia and Microsoft have explicitly disabled DXR support on GTX cards for the Minecraft RTX beta. Or more specifically, a check is made for an RTX GPU, and if one is not present you can't enable the ray tracing effects. That could change with the final non-beta release, and we anticipate Minecraft RTX will work on AMD's future ray tracing enabled GPUs, but that will have to wait for another day—and possibly a change in name. Again, based on what we've seen with Quake II RTX and its 'path tracing,' we imagine Minecraft RTX would run at sub-20 fps on non-RTX hardware, even at 720p.
Nvidia has previously stated that it takes approximately 10 TFLOPS of shader compute performance to do 1 Gigaray of DXR calculations on its Turing and Pascal architectures. With other games where ray tracing is only used for shadows, global illumination or reflections—but not all of them together—it's still possible to do driver-based DXR calculations on GTX cards. In Minecraft RTX, which casts many rays per pixel, software emulation of ray tracing simply cannot keep up with RTX hardware.
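As a back-of-the-envelope illustration of that rule of thumb, consider the GTX 1080 Ti's roughly 11.3 TFLOPS of FP32 compute (a published spec-sheet figure, used here as an assumption):

```python
def software_gigarays(tflops, tflops_per_gigaray=10.0):
    """Rough Gigarays/sec estimate for shader-based (software) DXR,
    using Nvidia's ~10 TFLOPS-per-Gigaray rule of thumb quoted above."""
    return tflops / tflops_per_gigaray

# GTX 1080 Ti: ~11.3 TFLOPS of FP32 compute
print(software_gigarays(11.3))  # roughly 1.1 Gigarays/sec in software
```

Compare that with the roughly 10 Gigarays/sec Nvidia advertises for the RTX 2080 Ti's dedicated RT cores, and the order-of-magnitude gap lines up neatly with the single-digit GTX framerates seen in Quake II RTX.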
When Nvidia and Microsoft say that an RTX graphics card is required, they're not just trying to force people to buy new hardware. Performance without hardware accelerated ray tracing would be a fraction of what you get with an RTX GPU. Which isn't to say that Nvidia isn't hoping to convince gamers to upgrade to an RTX card by getting the most popular PC game of all time to support ray tracing; it's just not artificially limiting access for no reason.
Minecraft RTX Testing Details
Before we get into the actual test results, there are a few things we want to make clear. First, this is a beta, with all the usual caveats. Performance metrics aren't final, there may be occasional crashes, and the laundry list of known issues is quite extensive. We won't bother listing them all here, but Nvidia sent along a list of around 70 problems that are already being tracked, and there are certain to be more. It's a bit enlightening how many things need to be fixed for a rendering approach that's supposed to be more flexible and clean. Our guess is that a lot of the issues have to do with optimizations to improve performance.
For testing, we've used our standard GPU test bed, which consists of a Core i9-9900K CPU, 32GB of DDR4-3200 memory, plenty of fast M.2 NVMe SSD storage, and all of Nvidia's RTX 20-series Founders Edition graphics cards. Note that the first generation of RTX FE cards are factory overclocked by 90 MHz, while the later RTX Super cards are all running reference clocks.
We've supplemented our GPU testing with additional tests on other CPUs, ranging from AMD's Ryzen 9 3950X down to Intel's Pentium Gold G5400—a range that should cover just about any CPU made in the past decade in terms of performance potential. All of the CPUs were tested using the RTX 2080 Ti FE. We've also tested the Core i9-9900K and 2080 Ti with 2x4GB and 2x8GB of DDR4 memory, to see if that affects performance at all. (The 2x4GB kit is only DDR4-2666, since I didn't have any faster kits of this capacity available—8GB is dead to me, and has been for several years.)
For this initial preview of Minecraft RTX beta performance, we're testing at four resolutions and three different settings. The resolutions are 1280x720, 1920x1080, 2560x1440, and 3840x2160—though the internal rendered resolution will be lower with DLSS enabled. Our settings have all of the 'fancy' options enabled, but with clouds disabled as they can obstruct the view and cause a larger change in performance. We've also set the time of day to evening, with clear skies, to get consistent results.
Beyond that, we've tested with the default options (DirectX Ray Tracing and Upscaling enabled, with a Ray Tracing Render Distance of 8 chunks), the same but with a render distance of 24 chunks, and finally back to an 8 chunk render distance but with upscaling (DLSS) disabled. Note that the default render distance in vanilla Minecraft, at least for our test level, is 64 chunks and can be set as high as 160, but with ray tracing the maximum is 24. The beta raises a warning flag if you increase the render distance above the default, though sticking with 8 chunks restricts the view quite a bit when you're outside.
The DLSS upscaling at present cannot be adjusted, other than turning it on or off. DLSS 2.0 has three presets available: performance, balanced, and quality. At resolutions of 1080p and below, Minecraft RTX uses the quality preset, which renders a bit less than half the number of pixels—44.4% to be precise, or 1280x720 upscaled to 1080p if you prefer. 1440p uses the balanced preset, which renders one third the number of pixels—around 1478x831 (give or take a few pixels). 4K meanwhile uses the performance preset, which renders one fourth the native pixel count—so 4K renders at 1920x1080 and then upscales to 3840x2160.
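Those presets work out to the internal resolutions quoted above. A quick sketch, treating the pixel-count fractions as exact (44.4% is 4/9) and scaling each axis by the square root of the fraction:

```python
import math

# DLSS 2.0 presets as pixel-count fractions of the output resolution,
# per the figures quoted above (assumed exact for illustration).
PRESETS = {"quality": 4 / 9, "balanced": 1 / 3, "performance": 1 / 4}

def internal_resolution(width, height, preset):
    """Approximate internal render resolution before DLSS upscaling."""
    scale = math.sqrt(PRESETS[preset])  # per-axis scale factor
    return round(width * scale), round(height * scale)

print(internal_resolution(1920, 1080, "quality"))      # (1280, 720)
print(internal_resolution(2560, 1440, "balanced"))     # (1478, 831)
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

The balanced result lands on the 'around 1478x831, give or take a few pixels' figure, which suggests the game really is targeting a one-third pixel budget rather than a tidy axis ratio.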
The test sequence uses the Imagination Island RTX world, with time of day set to 10000 and weather set to clear. We then take a 75 second ride on a mine cart, starting at the ice castle. This provides a consistent and repeatable workload for each GPU and CPU tested.
Minecraft RTX Graphics Card Performance
Starting with the RTX graphics cards, be prepared for some surprisingly low performance numbers. For reference, regular Minecraft without all the fancy ray tracing typically runs at hundreds of frames per second (fps), and even 4K and 60 fps with maxed out settings is possible on cards like the RTX 2060. Turn on ray tracing and it's a very different story. Let's start with the native resolution testing, without DLSS.
Um, ouch? At native 1280x720, a resolution we haven't typically used for gaming on desktops in over a decade, performance isn't too bad. Even the RTX 2060, the slowest RTX card of the bunch, chugs along at 67 fps. And let's be clear: Minecraft isn't some twitch shooter where you benefit from ultra-high framerates. No one is doing lag testing to try and prove that 240 Hz or 360 Hz monitors are better for playing Minecraft. Still, that's at an extremely low resolution.
Move up through the resolution charts and things go from okay to bad to worse. 1080p, without DLSS? Yeah, you can do that, at 30-ish fps on the RTX 2060. Or if you have an RTX 2080 Super or RTX 2080 Ti, you can play at native 1080p and break 60 fps. All that discussion earlier about how full ray tracing or path tracing involves a lot more calculations than traditional rendering? This is the proof. 1440p pushes everything below the RTX 2070 Super under 30 fps, and at native 4K not even the RTX 2080 Ti will provide a great mining or crafting experience. 22 fps? Again, ouch.
Put simply, Minecraft RTX is, if nothing else, the perfect showcase for why Nvidia needs DLSS 2.0. It makes even 4K (which is really upscaling from 1080p) viable on most of the GPUs. Sure, there are going to be some differences in image quality, but DLSS 2.0 to my eyes at least looks more than 'good enough'—and the performance boost makes it an absolute must for anyone wanting to play at something other than 720p.
Enabling DLSS at 720p may not be strictly necessary, but it still improves performance by around 40%—and that's even with a 2080 Ti! Even at this low resolution, the GPU remains the major performance bottleneck. Move to 1080p and DLSS boosts performance by about 60%, again regardless of which GPU you're using. The RTX 2060 sees the largest gain at 64%, but even the 2080 Ti still improves by 57%.
Average performance gains from DLSS at 1440p and 4K are even larger, in part because they're upscaling relatively lower resolutions. At 1440p, turning on DLSS basically doubles performance: it's 96% faster than native on the 2080 Ti, and 108% faster on a 2060. Finally, 4K DLSS—which upscales 1080p content—yields a 162% increase in framerates on the 2080 Ti, to as much as a 182% increase on the RTX 2060.
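The uplift figures are straightforward to compute from average framerates. For example, inverting the math for the one native 4K number quoted earlier (the 2080 Ti's roughly 22 fps), a 162% gain lands near 58 fps with DLSS:

```python
def pct_gain(native_fps, dlss_fps):
    """Percentage framerate uplift from enabling DLSS."""
    return round((dlss_fps / native_fps - 1) * 100)

print(round(22 * (1 + 1.62)))  # ~58 fps implied at 4K DLSS on the 2080 Ti
print(pct_gain(22, 57.6))      # recovers the 162% figure
```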
Of course, not all of the GPUs are actually hitting what we'd call 'smooth' framerates at 4K with DLSS. The 2060 just barely averages 30 fps, and you'd probably want at least a 2070 Super for 4K to hit acceptable levels of performance. But 1440p and below are viable on every GPU. For consistent 60+ fps, though, you'll need at least a 2070 Super again. 1080p DLSS at least hits 60 fps (barely on the 2060) across the range of RTX cards.
Again, DLSS 2.0 does generally look good—or the native rendering looks quite blurry, if you prefer. The ray tracing effects sometimes cause some graphical anomalies, but DLSS 2.0 in our experience actually does achieve a better overall look than native resolution in Minecraft RTX. We've compared multiple images and videos, with and without DLSS enabled, and even if we ignore the performance uplift DLSS typically looks better. Well, except for the occasional artifacts like around the lights in our test sequence; hopefully that's just a bug (one of many) that will get fixed.
One of the big problems with enabling ray tracing in Minecraft is that the default render distance is significantly lower than the traditional rendering mode. Without ray tracing, all of the GPUs we're looking at can easily handle 1080p with render distance set to the default 64 chunks. Turn on ray tracing and the ray tracing render distance defaults to 8 chunks, at which point the fog and object pop-in become extremely noticeable. Maxing out the render distance at 24 chunks helps a lot, but at the cost of performance and potentially stability. Still, it's our preferred way of playing.
The drop in performance isn't too bad, at least. Most of the RTX cards see a 15-20% drop in performance, and all of the GPUs maintain playable (40+ fps) framerates at 1440p or lower resolutions—with DLSS, of course. Since ray tracing in Minecraft is obviously more about improving visuals than keeping framerates up, a higher render distance makes sense.
60 fps at 1080p now requires at least a 2070 Super, while 1440p needs a 2080 Super or 2080 Ti. Hopefully, Nvidia's upcoming Ampere and RTX 3080 will boost ray tracing performance to make even 4K 60 fps at maximum quality viable.
Minecraft RTX CPU Performance
We've shown that faster RTX graphics cards perform better and are required for higher resolutions in Minecraft RTX, which is nothing new. What about your CPU? While there are situations where the CPU can limit performance—particularly if you're using an RTX 2080 Ti and you've maxed out the render distance—in general it's far less critical than your GPU. We've lumped all 12 test scenarios into one large gallery for this section.
We won't spend a ton of time describing the results of each individual chart. Outside of 720p, most of the results with the default 8 chunk render distance are basically tied. Yes, everything from a lowly Pentium Gold G5400 up through the Ryzen 9 3950X and Core i9-9900K ends up providing similar performance at 1080p, 1440p and 4K. At 720p, there's a 20-25% gap between the Pentium G5400 and the other CPUs, and the Ryzen 9 3950X also performs worse than expected, but there's nothing too far out of the ordinary. Certainly, you can play Minecraft RTX with a very low-end CPU if needed.
Cranking up render distance does open things up a bit more, however. At 720p with maxed out render distance, the i9-9900K is 72% faster than the G5400. It's still 68% faster at 1080p as well, and 48% faster at 1440p. By 4K, things are mostly GPU limited again, with only a 7% gap between the Pentium and the next slowest CPU. The increased render distance also helps the Ryzen 9 3950X, which ends up as the fastest AMD chip up until the 3-way tie at 4K. The Ryzen 3000 CPUs also tend to place ahead of the Core i5 and lower Intel chips, though a Core i7-9700K would probably come out ahead.
We've also included the results of testing the Core i9-9900K with 2x4GB DDR4-2666 memory instead of the 2x16GB DDR4-3200 we used for all the other tests. (And if you have less memory than that, you should upgrade that first before bothering with an RTX GPU.) Turns out, at least for our testing, even dropping to just 8GB of system RAM isn't really a problem. Your mileage may vary, naturally, depending on the map you're playing.
But really, outside of slightly lower minimum framerates in some cases, the CPU side of the story is dwarfed by the GPU requirements. Maybe faster ray tracing GPUs later this year will make the CPU a bigger factor, and further code optimizations could change the story by then.
There's so much to digest with a launch like this. Minecraft for the most part can run on everything from potatoes—along with smartphones and tablets—up through the beefiest of PCs. Cranking up the render distance in the past could cause a few oddities and put more of a load on the CPU, but given the mostly non-twitch nature of the game, extreme framerates aren't really needed. Minecraft RTX changes everything.
The lighting, reflections, and other graphical enhancements definitely make a big difference, both in performance as well as visuals. Minecraft has never looked so pretty! The core survival and exploration gameplay hasn't changed, of course, but makers who spend their time building intricate Minecraft worlds now have a host of new tools available. Those without RTX hardware will definitely feel Nvidia-green with envy for a while.
Looking at the big picture, however, the GPU hardware requirements for full ray tracing—or 'path tracing,' if you prefer Nvidia's terminology—are so high that it's tough to imagine this sort of thing taking off in the near term. Even an RTX 2080 Ti, a GPU that costs over $1,000, fails to handle native 1080p rendering at 60 fps without a severely limited render distance. DLSS isn't just a nice extra to boost performance, either: It's absolutely required on the lower RTX cards, even at 1080p.
We'll have to wait and see what AMD's upcoming Big Navi and Nvidia's Ampere do for ray tracing performance. If either one or both of those architectures pack in more ray tracing performance than the current RTX 20-series hardware, maybe we'll start to see heavier use of ray tracing outside of otherwise simplistic graphics environments. More likely, it will be another two generations of Nvidia GPUs and ray tracing tech before we'll see all the reflections, shadows, refractions, godrays and more rendered at higher framerates without the need for resolution upscaling.