We measure real-world power consumption using Powenetics testing hardware and software. We capture in-line GPU power consumption by collecting data while looping Metro Exodus (the original, not the enhanced version) and while running the FurMark stress test. Our test PC remains the same aging Core i9-9900K we've used previously, to keep results consistent.
For the RTX 4090 Founders Edition, our previous settings of 2560x1440 ultra in Metro and 1600x900 in FurMark clearly weren't pushing the GPU hard enough. Power draw was well under the rated 450W TBP, so we increased the resolution and settings to 4K Extreme for Metro and 1440p for FurMark. The following charts are intended to represent worst-case power consumption, temperatures, and so forth, so we do check other settings to ensure we're pushing the GPUs as hard as reasonably possible.
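Powenetics measures power in-line, at the hardware level, before it reaches the GPU. For readers who just want a rough software-side approximation on their own systems, here's a minimal sketch that polls the GPU's built-in telemetry via nvidia-smi; it's not our methodology, it won't capture everything an in-line meter sees, and the one-second interval and output filename are arbitrary choices.

```python
# Rough software-side approximation only: this polls the GPU's own telemetry via
# nvidia-smi rather than measuring power in-line the way Powenetics hardware does.
# The one-second polling interval and output filename are arbitrary choices.
import csv
import subprocess
import time

def log_gpu_power(duration_s=600, interval_s=1.0, out_path="gpu_power_log.csv"):
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "power_w"])
        start = time.time()
        while (elapsed := time.time() - start) < duration_s:
            result = subprocess.run(
                ["nvidia-smi", "--query-gpu=power.draw",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True)
            # First line corresponds to GPU 0; the value is reported in watts.
            power_w = float(result.stdout.strip().splitlines()[0])
            writer.writerow([round(elapsed, 1), power_w])
            time.sleep(interval_s)

if __name__ == "__main__":
    log_gpu_power()
```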
Compared to the Asus RTX 3090 Ti, the power results are very similar. That makes sense, as both cards have a 450W TBP. Considering you're getting at least 50% higher performance from the 4090 (at 4K) at essentially the same power, the core architecture is clearly quite a bit more efficient than Ampere. Both the 4090 and 3090 Ti average just under 440W in Metro Exodus, and 460W–470W in FurMark. Interestingly, the 4090's power draw does dip after about six minutes of running FurMark, but we don't see a corresponding drop in clocks or fan speed.
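To put that efficiency claim in numbers, here's a minimal sketch: the frame rates below are placeholder values chosen only to illustrate 1.5x scaling, while the ~440W figure comes from the Metro Exodus averages above.

```python
# Placeholder frame rates chosen purely to illustrate 1.5x scaling; the ~440W
# figure matches the Metro Exodus averages measured above.
def perf_per_watt(fps, watts):
    return fps / watts

fps_3090ti, watts_3090ti = 60.0, 440.0   # hypothetical 4K result, Ampere
fps_4090, watts_4090 = 90.0, 440.0       # 1.5x the frame rate at the same power

ratio = perf_per_watt(fps_4090, watts_4090) / perf_per_watt(fps_3090ti, watts_3090ti)
print(f"Ada vs. Ampere efficiency at equal power: {ratio:.2f}x")  # 1.50x
```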
GPU clocks are a massive improvement over any prior Nvidia GPU architecture. The RTX 4090 has an official boost clock of 2520 MHz, though it often exceeds that in gaming workloads. In Metro Exodus, as well as many of the other games we've checked, typical GPU clocks are in the 2750 MHz range. That's higher than even AMD's RX 6750 XT, and there are indications we'd be able to push those clocks even higher if we were willing to crank up the power limit. FurMark does drop clocks to just under 2.5 GHz, but other Nvidia GPUs throttle even harder in that test; the 3090 Ti, for example, runs at just 1.44 GHz.
We were impressed with the RTX 3090 Founders Edition at launch, as it had incredibly low noise levels and temperatures, at least while gaming. Cryptocurrency mining was another matter, and the Founders Edition cards routinely hit 110C on their GDDR6X memory before throttling kicked in. But for gaming, the 3090 was usually very cool and quiet. The 3090 Ti from Asus had to ramp up fan speed quite a bit in order to stay at its target temperature of around 65C. The 4090 Founders Edition ends up splitting the difference: slightly higher temperatures of 67C, but at a lower fan speed. FurMark results also favor the 4090 over the 3090 Ti, though that test isn't representative of very many real-world workloads.
We measure noise levels at 10cm using an SPL (sound pressure level) meter, aimed right at the GPU fans in order to minimize the impact of other noise sources like the fans on the CPU cooler. The noise floor of our test environment and equipment is around 32–33 dB(A). While gaming, the RTX 4090 plateaued at 45.0 dB(A) with a fan speed of around 40%. The Asus 3090 Ti in comparison ran at 49.1 dB(A) and a 74% fan speed, while a Sapphire RX 6950 XT measured just 37.3 dB(A).
The fans on the RTX 4090 can get louder if needed, and ramping fan speed up to 75% results in 57.2 dB(A). Hopefully the card won't ever need to run the fans that high, and with the death of GPU mining, it's far less likely to face that sort of sustained load during its lifetime.
Overall, the RTX 4090 Founders Edition is no worse than the RTX 3090 Ti from a pure power consumption point of view, and it offers far superior performance. We're also very interested in overclocking and third-party cards, both of which we'll be looking at in the near future.
One thing to note is that, while the RTX 4090's 450W power draw seems like a lot (it is), for chips it's really more about thermal density. Cooling 450W spread across a 608mm^2 die isn't that difficult. It's actually quite a bit easier than cooling 250W in a 215mm^2 chip, which is what the Alder Lake i9-12900K has to deal with. Zen 4 is potentially even worse, with a 70mm^2 CCD (core complex die) that can pull well over 140W.
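As a back-of-the-envelope sketch of that thermal density argument, using the power and die-area figures cited above (the Zen 4 CCD wattage is an estimate, not a measurement):

```python
# Back-of-the-envelope power density (W per mm^2) using the figures cited above;
# the Zen 4 CCD wattage is an estimate, not a measured value.
chips = {
    "AD102 (RTX 4090)": (450, 608),      # (watts, die area in mm^2)
    "Alder Lake i9-12900K": (250, 215),
    "Zen 4 CCD (estimate)": (140, 70),
}
for name, (watts, area_mm2) in chips.items():
    print(f"{name}: {watts / area_mm2:.2f} W/mm^2")
# AD102 works out to roughly 0.74 W/mm^2, versus about 1.16 W/mm^2 for Alder Lake
# and 2.0 W/mm^2 for a Zen 4 CCD, which is why the big GPU die is easier to cool.
```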
- MORE: Best Graphics Cards
- MORE: GPU Benchmarks and Hierarchy
- MORE: All Graphics Content