Nvidia GeForce RTX 4090 Review: Queen of the Castle

Ada Lovelace delivers the goods, at a steep price.

Editor's Choice

Tom's Hardware Verdict

The RTX 4090 delivers on the technological and performance fronts, easily besting previous-generation offerings. With major enhancements to all the core hardware and significantly higher clock speeds, plus forward-looking tech like DLSS 3, the new bar has been set very high — with an equally high price tag.

Pros

  • + Fastest GPU currently available
  • + Major architectural improvements
  • + DLSS 3 addresses CPU bottlenecks
  • + Excellent for content creation
  • + AV1 support and dual NVENC

Cons

  • - Extreme pricing and power
  • - Limited gains at 1440p and lower resolutions
  • - DLSS 3 adoption will take time
  • - We need to see AMD RDNA 3
  • - The inevitable RTX 4090 Ti looms


The Nvidia GeForce RTX 4090 hype train has been building for most of 2022. After more than a year of extreme GPU prices and shortages, CEO Jensen Huang revealed key details at GTC 2022, with a price sure to make many cry out in despair. $1,599 for the top offering from Nvidia's Ada Lovelace architecture? Actually, that's only $100 more than the RTX 3090 at launch, and if the card can come anywhere near Nvidia's claims of 2x–4x the performance of an RTX 3090 Ti, there will undoubtedly be people willing to pay it. The RTX 4090 now sits atop the GPU benchmarks hierarchy throne, at least at 1440p and 4K. For anyone who's after the fastest possible GPU, never mind the price, it now ranks among the best graphics cards.

That's not to say the RTX 4090 represents a good value, though that can get a bit subjective. Looking just at the FPS delivered by the various GPUs per dollar spent, it ranks dead last out of 68 GPUs from the past decade. Except our standard ranking uses 1080p ultra performance, and the 4090 most decidedly is not a card designed to excel at 1080p. In fact, it's so fast that CPU bottlenecks are still a concern even when gaming at 1440p ultra. Look at 4K performance and factor in ray tracing, and you could argue it's possibly one of the best values — see what we mean about value being subjective?
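
To make the value math concrete: the metric is simply average frame rate divided by price. Here's a minimal sketch in Python; the frame rates are placeholders for illustration, not our benchmark results.

```python
# FPS-per-dollar value metric, as described above. The frame rates
# below are hypothetical placeholders, not measured results.
gpus = {
    # name: (average fps at a given resolution, launch price in USD)
    "RTX 4090": (140.0, 1599),
    "RTX 3090 Ti": (95.0, 1999),
    "RX 6950 XT": (90.0, 1099),
}

for name, (fps, price) in gpus.items():
    print(f"{name}: {fps / price * 1000:.1f} FPS per $1,000")
```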

Again, you'll pay dearly for the privilege of owning an RTX 4090 card, as the base model RTX 4090 Founders Edition costs $1,599 and partner cards can push the price up to $1,999. But for those who want the best, or anyone with deep enough pockets that $2,000 isn't a huge deal, this is the card you'll want to get right now, and we'd be surprised to see anything surpass it in this generation, short of a future RTX 4090 Ti. 

Current Top-Tier GPU Specifications

| Graphics Card | RTX 4090 | RTX 3090 Ti | RTX 3090 | RTX 3080 Ti | RX 6950 XT | Arc A770 16GB |
|---|---|---|---|---|---|---|
| Architecture | AD102 | GA102 | GA102 | GA102 | Navi 21 | ACM-G10 |
| Process Technology | TSMC 4N | Samsung 8N | Samsung 8N | Samsung 8N | TSMC N7 | TSMC N6 |
| Transistors (billion) | 76.3 | 28.3 | 28.3 | 28.3 | 26.8 | 21.7 |
| Die size (mm²) | 608.4 | 628.4 | 628.4 | 628.4 | 519 | 406 |
| SMs / CUs / Xe-Cores | 128 | 84 | 82 | 80 | 80 | 32 |
| GPU Shaders | 16384 | 10752 | 10496 | 10240 | 5120 | 4096 |
| Tensor Cores | 512 | 336 | 328 | 320 | N/A | 512 |
| Ray Tracing "Cores" | 128 | 84 | 82 | 80 | 80 | 32 |
| Boost Clock (MHz) | 2520 | 1860 | 1695 | 1665 | 2310 | 2100 |
| VRAM Speed (Gbps) | 21 | 21 | 19.5 | 19 | 18 | 17.5 |
| VRAM (GB) | 24 | 24 | 24 | 12 | 16 | 16 |
| VRAM Bus Width (bits) | 384 | 384 | 384 | 384 | 256 | 256 |
| L2 / Infinity Cache (MB) | 72 | 6 | 6 | 6 | 128 | 16 |
| ROPs | 176 | 112 | 112 | 112 | 128 | 128 |
| TMUs | 512 | 336 | 328 | 320 | 320 | 256 |
| TFLOPS FP32 | 82.6 | 40 | 35.6 | 34.1 | 23.7 | 17.2 |
| TFLOPS FP16 (FP8/INT8) | 661 (1321) | 160 (320) | 142 (285) | 136 (273) | 47.4 | 138 (275) |
| Bandwidth (GBps) | 1008 | 1008 | 936 | 912 | 576 | 560 |
| TDP (watts) | 450 | 450 | 350 | 350 | 335 | 225 |
| Launch Date | Oct 2022 | Mar 2022 | Sep 2020 | Jun 2021 | May 2022 | Oct 2022 |
| Launch Price | $1,599 | $1,999 | $1,499 | $1,199 | $1,099 | $349 |

Here's a look at the who's who of the extreme performance graphics card world, with the fastest cards from Nvidia, AMD, and now Intel. Obviously, Intel's Arc A770 competes on a completely different playing field, but it's still interesting to show how it stacks up on paper.

We'll simply refer you to our Nvidia Ada Lovelace architecture deep dive if you want to learn about all the new technologies and changes in the RTX 40-series. The specs table above tells you a lot of what you need to know: transistor counts have nearly tripled compared to Ampere, core counts on the RTX 4090 are 52% higher than on the RTX 3090 Ti, and GPU clock speeds are 35% faster. The GDDR6X memory is mostly unchanged, except there's now 12x more L2 cache to keep the GPU from having to request data from memory as often.

On paper, that gives the RTX 4090 just over double the compute performance of the RTX 3090 Ti, and there are definitely workloads where you'll see exactly those sorts of gains. But under the hood, there are other changes that can further widen the gap.
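
For the curious, the FP32 figures in the table fall straight out of the shader counts and clocks: each shader can retire one fused multiply-add (two floating-point operations) per clock. A quick sketch of the arithmetic:

```python
# Theoretical FP32 throughput: 2 ops (one FMA) x shaders x boost clock.
def fp32_tflops(shaders: int, boost_mhz: int) -> float:
    return 2 * shaders * boost_mhz / 1e6

rtx_4090 = fp32_tflops(16384, 2520)     # ~82.6 TFLOPS
rtx_3090_ti = fp32_tflops(10752, 1860)  # ~40.0 TFLOPS
print(f"RTX 4090 is {rtx_4090 / rtx_3090_ti:.2f}x the 3090 Ti on paper")
```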

Ray tracing once again gets a big emphasis, and three new technologies — Shader Execution Reordering (SER), Opacity Micro-Maps (OMM), and Displaced Micro-Meshes (DMM) — all offer potential improvements. However, they also require developers to use them, which means existing games and engines won't benefit until they're updated.

Deep learning and AI workloads also stand to see massive generational improvements. Ada includes the FP8 Transformer Engine from Hopper H100, along with FP8 number format support. That means double the compute per Tensor core, for algorithms that can use FP8 instead of FP16, and up to four times the number-crunching prowess of the 3090 Ti.
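
As a rough illustration of what tapping FP8 looks like in practice, here's a minimal sketch assuming NVIDIA's open-source Transformer Engine library and a recent driver stack; the layer sizes are arbitrary, and this isn't code from our testing:

```python
# Hypothetical sketch: running a linear layer's matmuls in FP8 via
# NVIDIA's Transformer Engine (pip install transformer-engine).
import torch
import transformer_engine.pytorch as te

model = te.Linear(4096, 4096, bias=True).cuda()  # arbitrary sizes
x = torch.randn(16, 4096, device="cuda")

# Matmuls inside this context run in FP8 where the hardware supports
# it, roughly doubling Tensor-core throughput versus FP16.
with te.fp8_autocast(enabled=True):
    y = model(x)
```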

One algorithm that can utilize the new Tensor cores — along with an improved Optical Flow Accelerator (OFA) — is DLSS 3. In fact, DLSS 3 requires an RTX 40-series graphics card, so earlier RTX cards won't benefit. What does DLSS 3 do? It takes the current and previously rendered frames and generates an extra in-between frame to fill the gap. In some cases, it can nearly double the performance of DLSS 2. We'll take a closer look at DLSS 3 later in this review.
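
DLSS 3's actual frame generation runs on the OFA and Tensor cores and is proprietary, but the basic idea (warp pixels partway along their motion vectors to synthesize an intermediate frame) can be sketched with classical optical flow. A loose analogy using OpenCV, emphatically not Nvidia's algorithm:

```python
# Naive optical-flow frame interpolation, a conceptual stand-in for
# DLSS 3 Frame Generation (which uses a neural network plus Ada's OFA).
import cv2
import numpy as np

def midpoint_frame(prev_bgr: np.ndarray, curr_bgr: np.ndarray) -> np.ndarray:
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    # Dense per-pixel motion from the previous frame to the current one.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Sample the current frame halfway along each motion vector to
    # approximate the in-between frame.
    return cv2.remap(curr_bgr, xs + flow[..., 0] * 0.5,
                     ys + flow[..., 1] * 0.5, cv2.INTER_LINEAR)
```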

From a professional perspective, particularly for anyone interested in deep learning, you can easily justify the cost of the RTX 4090 — time is money, and doubling or quadrupling throughput will definitely save time. Content creators will find a lot to like, and the 4090 is a quick and easy upgrade from a 3090 or 3090 Ti. We'll look at ProViz performance as well.

But what about gamers? Unlike with the RTX 3090 and 3090 Ti, Nvidia isn't going on about how the RTX 4090 is designed for professionals. Yes, it will work great for such people, but it's also part of the GeForce family, and Nvidia isn't holding back on its gaming performance claims and comparisons. Maybe the past two years of cryptocurrency mining are to blame, though GPU mining is now unprofitable, so at least gamers won't have to fight miners for cards this round.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • -Fran-
    Shouldn't this be up tomorrow?

    EDIT: Nevermind. Looks like it was today! YAY.

    Thanks for the review!

    Regards.
  • colossusrage
    Can finally put 4K 120Hz displays to good use.
  • brandonjclark
    colossusrage said:
    Can finally put 4K 120Hz displays to good use.

I'm still on a 3090, but on my 165Hz 1440p display, so it maxes most things just fine. I think I'm going to wait for the 50-series GPUs. I know this is a major bump, but dang, it's expensive! I simply can't afford to be making these kinds of investments in depreciating assets for FUN.
  • JarredWaltonGPU
    -Fran- said:
    Shouldn't this be up tomorrow?

    EDIT: Nevermind. Looks like it was today! YAY.

    Thanks for the review!

    Regards.
    Yeah, Nvidia almost always does major launches with Founders Edition reviews the day before launch, and partner card reviews the day of launch.
  • JarredWaltonGPU
    brandonjclark said:
I'm still on a 3090, but on my 165Hz 1440p display, so it maxes most things just fine. I think I'm going to wait for the 50-series GPUs. I know this is a major bump, but dang, it's expensive! I simply can't afford to be making these kinds of investments in depreciating assets for FUN.
You could still possibly get $800 for the 3090. Then it's "only" $800 to upgrade! LOL. Of course, if you sell on eBay, it's $800 minus 15%.
  • kiniku
A review like this, comparing the 4090 to an expensive sports car we should be in awe and envy of, is a bit misleading. PC gaming systems don't equate to racing on the track or even the freeway. But the way this review is worded, if you don't buy this GPU, anything "less" is a compromise. That couldn't be further from the truth. People with "big pockets" aren't fools either, except for maybe the few readers here who have convinced themselves (and posted) that they need one, or who spend everything they make on their gaming PCs. Most gamers don't want or need a 450-watt, 3-slot space heater to enjoy an immersive, solid 3D experience.
  • y2kmady
Can this be used in an Aorus B550 Pro AC?
  • spongiemaster
    kiniku said:
Most gamers don't want or need a 450-watt, 3-slot space heater to enjoy an immersive, solid 3D experience.
Congrats on stating the obvious. Most gamers have no need for a halo GPU that can be CPU limited sometimes even at 4K. A 50% performance improvement while using the same power as a 3090 Ti shows outstanding efficiency gains. Early reports are showing excellent undervolting results: a 150W decrease with only a 5% loss of performance.

    Any chance we could get some 720P benchmarks?
  • LastStanding
    the RTX 4090 still comes with three DisplayPort 1.4a outputs

    the PCIe x16 slot sticks with the PCIe 4.0 standard rather than upgrading to PCIe 5.0.

These missing components are selling points now, especially knowing Nvidia's rival(s?) support the updated standards, so, IMO, this should have been included as a "con" too.

Another thing: why would enthusiasts only value "average metrics" when averages barely tell the complete story?! They don't show a program's stability, any frame-pacing/hitching issues, etc., so that's a VERY big oversight here, IMO.

What I also find weird is the DLSS benchmarks. Why champion the increase in extra fps buuuut... never, EVER, mention DLSS's awful included sharpening pass?! 😏 What's the sense of having faster fps if the results show the imagery smeared, ghosted, and/or artefacted to hades? 🤔
  • chalabam
This card is for AI. Where are the TensorFlow and AI benchmarks?