Nvidia GeForce RTX 4080 Review: More Efficient, Still Expensive

Slimmed down Ada wearing the old Titan price




We measure real-world power consumption using Powenetics testing hardware and software. We capture in-line GPU power consumption by collecting data while looping Metro Exodus (the original, not the Enhanced Edition) and while running the FurMark stress test. Our test PC remains the same Core i9-9900K system we've used previously, to keep results consistent.

For the RTX 4080 Founders Edition, we ran Metro at 3840x2160 using the Extreme preset (no ray tracing or DLSS), and we ran FurMark at 2560x1440. The following charts are intended to represent something of a worst-case scenario for power consumption, temps, etc.
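As a rough illustration of how those charts are built, here's a minimal sketch of reducing in-line power samples to the average and peak figures we report. The sample values and 100 ms interval below are made up for illustration; they aren't Powenetics' actual data format.

```python
# Summarize a stream of in-line GPU power samples (watts) into the
# average, peak, and energy figures a power chart is built from.
def summarize_power(samples_w, interval_s=0.1):
    """Return (average W, peak W, energy in Wh) for evenly spaced samples."""
    avg = sum(samples_w) / len(samples_w)
    peak = max(samples_w)
    energy_wh = sum(p * interval_s for p in samples_w) / 3600.0
    return avg, peak, energy_wh

# Illustrative samples, not real captured data
samples = [310.2, 318.7, 305.5, 322.1, 314.9]
avg, peak, wh = summarize_power(samples)
```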

The RTX 4080 Founders Edition used quite a bit less power than the rated 320W TBP. We could only break that limit with overclocking and FurMark, where we hit 332W — still less than the previous generation RTX 3080 running at stock settings. That's a nice change of pace, and the result is a card that's far more efficient than the specifications might lead you to believe.

Clock speeds were also quite impressive, averaging 2.78 GHz in Metro, though the GPU did drop to 2.3 GHz with FurMark. Meanwhile, our manual overclock pushed the GPU beyond 3.0 GHz — the first time we've ever managed that using a card's stock cooler! FurMark clocks also increased to 2.62 GHz with our overclock.

Temperatures, fan speeds, and noise levels all go hand-in-hand. Given the rather massive cooler, it's not too surprising that the RTX 4080 Founders Edition stayed in the low 60C range at stock. Our overclock includes higher fan speeds, resulting in even lower temperatures of 55C–58C. The fans only hit around 1,300 RPM at stock, but our more aggressive fan curve pushed that closer to 2,000 RPM when overclocked.

Finally, we have our noise levels, which we gather using an SPL (sound pressure level) meter placed 10cm from the card, with the mic aimed right at the GPU fan(s). This helps minimize the impact of other noise sources, like the fans on the CPU cooler. The noise floor of our test environment and equipment is around 32 dB(A).

After running a demanding game (Metro, at appropriate settings that tax the GPU), the RTX 4080 settled in at a fan speed of 38% and a noise level of 42.6 dB(A). While it's not shown in the chart, our overclocked settings resulted in a 70% fan speed and 55.5 dB(A). Finally, with a static fan speed of 75%, the 4080 Founders Edition generated 57.4 dB(A) of noise.
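Since the noise floor isn't far below some of those readings, it's worth showing how little it actually contributes. Here's a quick sketch of the standard decibel subtraction, using the 32 dB(A) floor and the 42.6 dB(A) stock reading from the text:

```python
import math

# Remove the noise-floor contribution from an SPL reading by
# subtracting sound energies (decibel levels combine logarithmically).
def subtract_noise_floor(measured_db, floor_db):
    return 10 * math.log10(10 ** (measured_db / 10) - 10 ** (floor_db / 10))

source_db = subtract_noise_floor(42.6, 32.0)  # about 42.2 dB(A)
```

In other words, the 32 dB(A) floor shifts the stock reading by only about 0.4 dB, so the reported numbers are dominated by the card itself.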

RTX 4080 Additional Power, Clock, and Temperature Testing

| Game | Setting | Avg FPS | Avg Clock (MHz) | Avg Power (W) | Avg Temp (°C) | Avg Utilization |
| --- | --- | --- | --- | --- | --- | --- |
| 13-Game Geomean | 1080p 'Ultra' | 132.2 | 2795.9 | 221.4 | 53.5 | 79.5% |
| 13-Game Geomean | 1440p 'Ultra' | 104.9 | 2786.9 | 252.8 | 56.3 | 87.2% |
| 13-Game Geomean | 4K 'Ultra' | 62.7 | 2764.1 | 289 | 59.9 | 97.0% |
| 6-Game DXR Geomean | 1080p 'Ultra' | 131.5 | 2789.5 | 265.6 | 58.4 | 88.6% |
| 6-Game DXR Geomean | 1440p 'Ultra' | 91.2 | 2775.9 | 289.8 | 59.8 | 93.1% |
| 6-Game DXR Geomean | 4K 'Ultra' | 47.1 | 2744 | 300.2 | 61.2 | 95.6% |
| Borderlands 3 | 1080p Badass | 99.4 | 2789.8 | 280.8 | 60.7 | 99.0% |
| Borderlands 3 | 1440p Badass | 59.6 | 2782.2 | 295.9 | 62.3 | 99.0% |
| Borderlands 3 | 4K Badass | 27.3 | 2766.1 | 300.5 | 62.6 | 99.0% |
| Bright Memory Infinite | 1080p Very High | 217.4 | 2790 | 275.2 | 60.3 | 92.8% |
| Bright Memory Infinite | 1440p Very High | 165.7 | 2781.4 | 304.4 | 62.5 | 98.6% |
| Bright Memory Infinite | 4K Very High | 89.1 | 2761.5 | 308.8 | 63.2 | 99.0% |
| Control | 1080p High | 152.1 | 2805 | 284.6 | 55.3 | 99.0% |
| Control | 1440p High | 93.7 | 2787.6 | 301.9 | 59.1 | 99.0% |
| Control | 4K High | 45.6 | 2778.7 | 306.8 | 59 | 99.0% |
| Cyberpunk 2077 | 1080p RT-Ultra | 98.5 | 2790 | 277.4 | 58.4 | 97.9% |
| Cyberpunk 2077 | 1440p RT-Ultra | 62.8 | 2777.2 | 296 | 55.5 | 98.3% |
| Cyberpunk 2077 | 4K RT-Ultra | 30.3 | 2691.2 | 291.9 | 58.1 | 99.0% |
| Far Cry 6 | 1080p Ultra | 152.6 | 2805.1 | 141 | 42 | 61.4% |
| Far Cry 6 | 1440p Ultra | 150.5 | 2805 | 193.4 | 46.5 | 78.5% |
| Far Cry 6 | 4K Ultra | 109 | 2803.9 | 258.4 | 50.6 | 98.8% |
| Flight Simulator | 1080p Ultra | 77.9 | 2805 | 188.6 | 50 | 56.0% |
| Flight Simulator | 1440p Ultra | 77.5 | 2805 | 191 | 50.8 | 56.3% |
| Flight Simulator | 4K Ultra | 76.1 | 2739.1 | 305.8 | 62.8 | 97.8% |
| Fortnite | 1080p Epic | 119.2 | 2790 | 256.4 | 60.5 | 90.8% |
| Fortnite | 1440p Epic | 77.7 | 2784.5 | 283.8 | 62.5 | 94.5% |
| Fortnite | 4K Epic | 38.2 | 2747.1 | 294.9 | 63.2 | 96.3% |
| Forza Horizon 5 | 1080p Extreme | 159.8 | 2805 | 152.9 | 45.3 | 79.7% |
| Forza Horizon 5 | 1440p Extreme | 151.2 | 2805 | 179.1 | 49.4 | 84.0% |
| Forza Horizon 5 | 4K Extreme | 126.5 | 2801.9 | 227.9 | 55.7 | 95.5% |
| Horizon Zero Dawn | 1080p Ultimate | 183.8 | 2804.8 | 174.9 | 48.4 | 65.9% |
| Horizon Zero Dawn | 1440p Ultimate | 176 | 2795.1 | 219.1 | 53.6 | 83.6% |
| Horizon Zero Dawn | 4K Ultimate | 114.4 | 2790.4 | 277.6 | 58.6 | 99.0% |
| Metro Exodus Enhanced | 1080p Extreme | 114.8 | 2771.9 | 290.7 | 61 | 94.5% |
| Metro Exodus Enhanced | 1440p Extreme | 91 | 2735.2 | 304.8 | 62.1 | 98.0% |
| Metro Exodus Enhanced | 4K Extreme | 52 | 2702 | 307.9 | 63.1 | 99.0% |
| Minecraft | 1080p RT 24-Blocks | 116.2 | 2790 | 216.7 | 55.1 | 62.7% |
| Minecraft | 1440p RT 24-Blocks | 83.2 | 2790 | 251.5 | 57.3 | 73.2% |
| Minecraft | 4K RT 24-Blocks | 44.4 | 2785.2 | 291.5 | 61.1 | 82.5% |
| Total War Warhammer 3 | 1080p Ultra | 139.6 | 2787.6 | 294.9 | 60.7 | 97.5% |
| Total War Warhammer 3 | 1440p Ultra | 160.9 | 2789.9 | 255.8 | 57.8 | 85.2% |
| Total War Warhammer 3 | 4K Ultra | 74.6 | 2780.4 | 302.5 | 61.8 | 98.9% |
| Watch Dogs Legion | 1080p Ultra | 129.8 | 2810 | 171 | 46.4 | 68.2% |
| Watch Dogs Legion | 1440p Ultra | 125 | 2795.1 | 229.9 | 53.4 | 86.1% |
| Watch Dogs Legion | 4K Ultra | 87.8 | 2789 | 294.4 | 60 | 99.0% |

Besides our Powenetics testing, we collect all of our frametime data using Nvidia's FrameView, which also logs clock speeds, temperatures, power, and GPU utilization (and plenty of other data as well). We've verified with our Powenetics equipment that Nvidia's GPUs report power draw that's within a few percent of the actual power use, so the above results give a wider perspective on how the card runs.
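As a hedged sketch of what that data reduction looks like, the snippet below turns a FrameView-style per-frame CSV log into the average FPS and power figures shown in the table above. The column names are assumptions for illustration; check the header row of your actual log files.

```python
import csv

# Reduce a per-frame CSV log to average FPS and average board power.
# Column names here are illustrative, not guaranteed to match FrameView's.
def summarize_log(path):
    frametimes_ms, powers_w = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frametimes_ms.append(float(row["MsBetweenPresents"]))
            powers_w.append(float(row["GPU Power(W)"]))
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    avg_power = sum(powers_w) / len(powers_w)
    return avg_fps, avg_power
```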

As we've mentioned already, Nvidia's RTX 4090 and 4080 tend to come in below their maximum rated TBP in a lot of gaming workloads. That's especially true of our 1080p results, where the average power use for the card was just 221W. With a faster CPU like a Core i9-13900K or Ryzen 9 7950X, we'd likely see higher power use from the 4080, though it's likely even an overclocked 13900K wouldn't max out its power limit at 1080p.

The table above also shows the overall power, clocks, and performance for thirteen of the games in our test suite, which gives a good sense of efficiency. Overclocking increased performance at 4K by 7%, while power use went up by 7.7%. Overclocking can often hurt efficiency considerably, but the 4080 Founders Edition handles a modest OC with only a marginal loss in performance per watt.
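To put numbers on that trade-off, here's the arithmetic, seeded with the stock 4K 13-game geomean from the table above (a back-of-the-envelope sketch, not our test tooling):

```python
# Stock 4K figures from the 13-game geomean row above.
baseline_fps, baseline_w = 62.7, 289.0

oc_fps = baseline_fps * 1.070  # +7% performance from the overclock
oc_w = baseline_w * 1.077      # +7.7% power draw

stock_eff = baseline_fps / baseline_w  # ~0.217 fps per watt
oc_eff = oc_fps / oc_w                 # ~0.215 fps per watt
change = oc_eff / stock_eff - 1        # about -0.7%
```

So performance per watt dips by well under one percent, which is why we consider this a well-behaved overclock.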

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • btmedic04
    At $1200, this card should be DOA on the market. However people will still buy them all up because of mind share. Realistically, this should be an $800-$900 gpu.
    Reply
  • Wisecracker
    Nvidia GPUs also tend to be heavily favored by professional users
    mmmmMehhhhh . . . .Vegas GPU compute on line one . . .
    AMD's new Radeon Pro driver makes Radeon Pro W6800 faster than Nvidia's RTX A5000.
    AMD Rearchitects OpenGL Driver for a 72% Performance Uplift : Read more
    My CAD does OpenGL, too
    :homer:
    Reply
  • saunupe1911
    People flocked to the 4090 as it's a monster but it would be entirely stupid to grab this card while the high end 3000s series exist along with the 4090.

    A 3080 and up will run everything at 2K...and with high refresh rates with DLSS.

    Go big or go home and let this GPU sit! Force Nvidia's hand to lower prices.

    You can't have 2 halo products when there's no demand and the previous gen still exists.
    Reply
  • Math Geek
    they'll cry foul, grumble about the price and even blame retailers for the high price. but only while sitting in line to buy one.......

    man how i wish folks could just get a grip on themselves and let these just sit on shelves for a couple months while Nvidia gets a much needed reality check. but alas they'll sell out in minutes just like always sigh
    Reply
  • chalabam
    Unfortunately the new batch of games is so politicized that it makes buying a GPU a bad investment.
    Even when they have the best graphics ever, the gameplay is not worth it.
    Reply
  • gburke
    I am one who likes to have the best to push games to the limit. And I'm usually pretty good about staying on top of current hardware. I can definitely afford it. I "clamored" to get a 3080 at launch and was lucky enough to get one at market value, beating out the dreadful scalpers. But it makes no sense this time to upgrade over last gen just for gaming. So I am sitting this one out. I would be curious to know how many others out there like me don't see the real benefit to this new generation of hardware for gaming. Honestly, 60fps at 4K on almost all my games is great for me. Not really interested in going above that.
    Reply
  • PlaneInTheSky
    Seeing how much wattage these GPUs use in a loop is interesting, but it still tells me nothing regarding real-life cost.

    Cloud gaming suddenly looks more attractive when I realize I won't need to pay to run a GPU at 300 watts.

    The running cost of a GPU should now be part of reviews imo, considering how much people in Europe, Japan, and South East Asia are now paying for electricity and how much these new GPUs consume.

    Household appliances with similar power usage usually have their running cost discussed in reviews.
    Reply
  • BaRoMeTrIc
    Math Geek said:
    they'll cry foul, grumble about the price and even blame retailers for the high price. but only while sitting in line to buy one.......

    man how i wish folks could just get a grip on themselves and let these just sit on shelves for a couple months while Nvidia gets a much needed reality check. but alas they'll sell out in minutes just like always sigh
    High end RTX cards have become status symbols amongst gamers.
    Reply
  • Tac 25
    none of my games need it, no reason to buy this thing. The Christmas money is safe.
    Reply
  • sizzling
    I’d like to see a performance per £/€/$ comparison between generations. Normally you would expect this to improve from one generation to the next but I am not seeing it. I bought my mid range 3080 at launch for £754. Seeing these are going to cost £1100-£1200 the performance per £/€/$ seems about flat on last generation. Yeah great, I can get 40-60% more performance for 50% more cost. Fairly disappointing for a new generation card. Look back at how the 3070 & 3080 smashed the performance per £/€/$ compared to a 2080Ti.
    Reply