In our Intel Arc A380 review, we introduced video encoding performance and quality testing using FFmpeg. We wanted to do the same sort of testing — and more — on the RTX 4090, though time has conspired against us.
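For readers who want to run a similar test at home, here is a minimal sketch of that workflow; the clip name, bitrate, and codec below are placeholders rather than our actual test parameters, and it assumes an FFmpeg build with both NVENC and libvmaf support.

```python
import subprocess

SOURCE = "source_clip.mp4"   # placeholder input file, not our actual test clip
ENCODED = "nvenc_test.mkv"

# Encode with NVENC (H.264 as an example) at a fixed bitrate.
# FFmpeg prints the encoding speed (fps) to stderr while it runs,
# which covers the performance half of the test.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "h264_nvenc", "-preset", "p7", "-b:v", "8M",
    ENCODED,
], check=True)

# Score the encode against the source with VMAF for the quality half
# (first input is the distorted file, second is the reference).
subprocess.run([
    "ffmpeg", "-i", ENCODED, "-i", SOURCE,
    "-lavfi", "libvmaf", "-f", "null", "-",
], check=True)
```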
To quickly recap, Nvidia's Ada Lovelace architecture includes a new 8th generation NVENC block, which adds support for AV1 encoding. On RTX 40-series cards with 12GB or more VRAM, there are also dual NVENC blocks, which can either work on separate streams or double the encoding performance of a single stream.
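As a rough illustration, and assuming an FFmpeg build new enough to expose the AV1 hardware encoder (the file names and bitrate here are made up for the example), an AV1 encode on Ada looks something like this:

```python
import subprocess

# Placeholder 4K clip; av1_nvenc needs an RTX 40-series GPU and a
# recent FFmpeg build compiled against matching NVENC SDK headers.
subprocess.run([
    "ffmpeg", "-y", "-i", "source_4k.mp4",
    "-c:v", "av1_nvenc",   # Ada's 8th-gen NVENC AV1 encoder
    "-preset", "p7",       # slowest, highest-quality NVENC preset
    "-b:v", "12M",         # example target bitrate
    "av1_test.mkv",
], check=True)
```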
The new NVENC allows for encoding at up to 8K 60 Hz, for the 0.001% of people who have such a display. No, I'm not jealous. Why do you ask? (Yes I am, and I'd love to have an 8K display for testing gaming performance as well while I'm here.) Maybe we'll even get 8K 120 Hz support in the future, using both encoders.
We'll update this page with our results once testing is complete, so stay tuned.
- MORE: Best Graphics Cards
- MORE: GPU Benchmarks and Hierarchy
- MORE: All Graphics Content
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
-Fran-: Shouldn't this be up tomorrow?
EDIT: Nevermind. Looks like it was today! YAY.
Thanks for the review!
Regards.
brandonjclark:
colossusrage said: Can finally put 4K 120Hz displays to good use.
I'm still on a 3090, but on my 165Hz 1440p display, so it maxes most things just fine. I think I'm going to wait for the 50-series GPUs. I know this is a major bump, but dang it's expensive! I simply can't afford to be making these kinds of investments in depreciating assets for FUN.
JarredWaltonGPU:
-Fran- said: Shouldn't this be up tomorrow?
Yeah, Nvidia almost always does major launches with Founders Edition reviews the day before launch, and partner card reviews the day of launch.
JarredWaltonGPU:
brandonjclark said: I'm still on a 3090, but on my 165Hz 1440p display... I know this is a major bump, but dang it's expensive!
You could still possibly get $800 for the 3090. Then it's "only" $800 to upgrade! LOL. Of course, if you sell on eBay it's $800 minus 15%.
kiniku: A review like this, comparing a 4090 to an expensive sports car we should be in awe and envy of, is a bit misleading. PC gaming systems don't equate to racing on the track, or even the freeway. But the way this review is worded, anything "less" than this GPU is a compromise. That couldn't be further from the truth. People with "big pockets" aren't fools either, except for maybe the few readers here who have convinced themselves, and posted, that they need one or spend everything they make on their gaming PCs. Most gamers don't want or need a 450-watt, three-slot space heater to enjoy an immersive, solid 3D experience.
spongiemaster:
kiniku said: Most gamers don't want or need a 450-watt, three-slot space heater to enjoy an immersive, solid 3D experience.
Congrats on stating the obvious. Most gamers have no need for a halo GPU that can be CPU-limited sometimes even at 4K. A 50% performance improvement while using the same power as a 3090 Ti shows outstanding efficiency gains. Early reports are showing excellent undervolting results: a 150W decrease with only a 5% loss in performance.
Any chance we could get some 720p benchmarks?
LastStanding:
"the RTX 4090 still comes with three DisplayPort 1.4a outputs"
"the PCIe x16 slot sticks with the PCIe 4.0 standard rather than upgrading to PCIe 5.0."
These omissions are selling points now, especially knowing NVIDIA's rival(s?) support the updated standards, so, IMO, this should have been included as a "con" too.
Another thing: why would enthusiasts only value "average" metrics when averages barely tell the complete story? They don't show a program's stability, frame-pacing or hitching issues, etc., so that's a big oversight here, IMO.
I also find the DLSS benchmarks weird. Why champion the extra fps buuuut never, EVER, mention DLSS's included awful sharpening pass?! 😏 What's the sense of having faster fps if the imagery is smeared, ghosting, and/or artifacting to hades? 🤔