Nvidia GeForce RTX 4080 Review: More Efficient, Still Expensive

Slimmed-down Ada wearing the old Titan price

Nvidia GeForce RTX 4080
(Image: © Tom's Hardware)

Tom's Hardware Verdict

The RTX 4080 has all the technological advancements of the Ada Lovelace architecture, with a price that's difficult to justify. It's too close to the 4090 to entice extreme performance enthusiasts, and soon it will have to contend with AMD's RX 7900 series. But plenty of people prefer Nvidia, want DLSS, and are willing to pay the piper his dues.


  • + Second-fastest GPU (for now)
  • + Much improved efficiency
  • + Excellent ray tracing performance
  • + Packs all the Ada Lovelace enhancements

  • - High price without the halo performance of the 4090
  • - Needs DLSS 3 to truly shine in gaming performance
  • - AMD's RDNA 3 could provide strong competition
  • - Lingering concerns surrounding the 16-pin connector


The Nvidia GeForce RTX 4080 is the follow-up to last month's RTX 4090 launch, now one of the best graphics cards and the top listing in our GPU benchmarks hierarchy. Of course, a bit of the shine has come off thanks to the melting 16-pin connectors. The good news: RTX 4080 uses less power, which should mean it's also less likely to funnel enough power to melt the plastic connector… maybe. The bad news: At $1,199, it's still priced out of reach for most gamers and represents a big jump in generational pricing, inheriting the RTX 3080 Ti launch price that we also felt was too high.

We already know most of what to expect from Nvidia's Ada Lovelace architecture, so the only real question now is how performance scales down to fewer GPU shaders, less memory, less cache, a narrower memory interface, etc. Let's quickly look at the specifications for a few of the top Nvidia and AMD GPUs. 

Nvidia and AMD GPU Specifications

Graphics Card             | RTX 4080  | RTX 4090   | RTX 3090 Ti | RTX 3080 Ti | RTX 3080   | RX 7900 XTX    | RX 7900 XT
Architecture              | AD103     | AD102      | GA102       | GA102       | GA102      | Navi 31        | Navi 31
Process Technology        | TSMC 4N   | TSMC 4N    | Samsung 8N  | Samsung 8N  | Samsung 8N | TSMC N5 + N6   | TSMC N5 + N6
Transistors (billion)     | 45.9      | 76.3       | 28.3        | 28.3        | 28.3       | 45.6 + 6x 2.05 | 45.6 + 5x 2.05
Die size (mm^2)           | 378.6     | 608.4      | 628.4       | 628.4       | 628.4      | 300 + 222      | 300 + 185
GPU Shaders               | 9728      | 16384      | 10752       | 10240       | 8704       | 12288          | 10752
Tensor Cores              | 304       | 512        | 336         | 320         | 272        | N/A            | N/A
Ray Tracing "Cores"       | 76        | 128        | 84          | 80          | 68         | 96             | 84
Boost Clock (MHz)         | 2505      | 2520       | 1860        | 1665        | 1710       | 2500           | 2400
VRAM Speed (Gbps)         | 22.4      | 21         | 21          | 19          | 19         | 20             | 20
VRAM (GB)                 | 16        | 24         | 24          | 12          | 10         | 24             | 20
VRAM Bus Width (bits)     | 256       | 384        | 384         | 384         | 320        | 384            | 320
L2 / Infinity Cache (MB)  | 64        | 72         | 6           | 6           | 5          | 96             | 80
TFLOPS FP32               | 48.7      | 82.6       | 40          | 34.1        | 29.8       | 61.4           | 51.6
TFLOPS FP16 (FP8/INT8)    | 390 (780) | 661 (1321) | 160 (320)   | 136 (273)   | 119 (238)  | 123 (246)      | 103 (206)
Bandwidth (GBps)          | 717       | 1008       | 1008        | 912         | 760        | 960            | 800
TBP (watts)               | 320       | 450        | 450         | 350         | 320        | 355            | 300
Launch Date               | Nov 2022  | Oct 2022   | Mar 2022    | Jun 2021    | Sep 2020   | Dec 2022       | Dec 2022
Launch Price              | $1,199    | $1,599     | $1,999      | $1,199      | $699       | $999           | $899

There's a relatively large gap between the RTX 4080 and the larger RTX 4090. You get most of an AD103 GPU — 76 of the potential 80 Streaming Multiprocessors (SMs) — but that's still 40% fewer GPU shaders and other functional units than the RTX 4090. Clock speeds are similar, but the memory interface is 33% narrower, with correspondingly less VRAM; memory bandwidth drops 29%, and the rated TBP falls from 450W to 320W. On paper, the RTX 4090 could be up to 70% faster based on the theoretical compute performance, and that's a concern.
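As a sanity check, those percentages fall straight out of the spec table above. A quick sketch (numbers copied from the table; theoretical ratios only, not a performance prediction):

```python
# RTX 4080 vs RTX 4090 figures from the spec table above.
specs = {
    "GPU shaders":       (9728, 16384),
    "VRAM bus (bits)":   (256, 384),
    "VRAM (GB)":         (16, 24),
    "Bandwidth (GBps)":  (717, 1008),
    "TBP (watts)":       (320, 450),
}

for name, (rtx4080, rtx4090) in specs.items():
    deficit = (1 - rtx4080 / rtx4090) * 100
    print(f"{name}: RTX 4080 has {deficit:.1f}% less than RTX 4090")

# Theoretical FP32 compute advantage of the 4090 (48.7 vs 82.6 TFLOPS):
advantage = 82.6 / 48.7 - 1
print(f"RTX 4090 peak FP32 advantage: {advantage:.0%}")
```

Actual gaming gaps are smaller than the 70% peak-compute figure suggests, since games rarely scale linearly with shader count.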

$1,199 is hardly affordable, so it feels like anyone even looking at the RTX 4080 should probably just save up the additional $400 for the RTX 4090 and go for broke — or melted. Then again, the RTX 4090 has been sold out at anything below $2,100 since launch, so the real-world upsell could be closer to $900, and that's far more significant.

The pricing becomes even more of a concern when we factor in AMD's Radeon RX 7900 XTX/XT cards coming next month. We now have all the pertinent details for the first cards using AMD's RDNA 3 GPU architecture, and they certainly look promising. Prices are still high, but the specs comparisons suggest AMD might be able to beat the RTX 4080 while costing at least $200–$300 less. This means, unless you absolutely refuse to consider purchasing an AMD graphics card, you should at least wait until next month to see what the red team has to offer.

However, Nvidia does have some extras that AMD is unlikely to match in the near term. For example, the deep learning and AI horsepower of the RTX 4080 far surpasses what AMD intends to offer. If we've got the figures right, AMD's FP16 and INT8 throughput will be less than a third of the RTX 4080's.
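Taking the table's figures at face value, that "less than a third" claim checks out (a rough comparison using the dense FP16 rates; sparsity doubles Nvidia's figure further):

```python
# Dense FP16 TFLOPS from the spec table above.
rtx_4080_fp16 = 390.0    # RTX 4080 (780 with sparsity)
rx_7900xtx_fp16 = 123.0  # RX 7900 XTX

ratio = rx_7900xtx_fp16 / rtx_4080_fp16
print(f"RX 7900 XTX delivers {ratio:.0%} of the RTX 4080's FP16 throughput")
```

That ratio only widens if Nvidia's sparsity feature applies to the workload in question.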

Nvidia also offers DLSS 3 courtesy of the enhanced Optical Flow Accelerator (OFA). Ten games already support the technology: Bright Memory: Infinite, Destroy All Humans! 2 - Reprobed, F.I.S.T.: Forged in Shadow Torch, F1 22, Justice, Loopmancer, Marvel’s Spider-Man Remastered, Microsoft Flight Simulator, A Plague Tale: Requiem, and Super People. In less than a month, DLSS 3 has reached roughly half as many games as AMD's FSR2 supports in total. Of course, you need an RTX 40-series GPU for DLSS 3, while FSR2 works with pretty much everything.

Nvidia GPUs also tend to be heavily favored by professional users, or at least their employers. So while true workstations will likely opt for the RTX 6000 48GB card as opposed to a GeForce RTX 40-series, there's certainly potential in picking up one or more RTX 4080 cards for AI and deep learning use. Content creators may also find something to like, though again, if you're willing to pay for a 4080, it may not be a huge step up in pricing to nab a 4090 instead.

Another piece of good news (depending on which side of the aisle you fall on, we suppose) is that GPU mining remains unprofitable. Gamers won't be able to offset the price of a new graphics card through cryptocurrency mining, but at least there should be more GPUs available for gamers. Now let's see exactly what Nvidia has to offer with its new RTX 4080.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • btmedic04
At $1,200, this card should be DOA on the market. However, people will still buy them all up because of mindshare. Realistically, this should be an $800-$900 GPU.
  • Wisecracker
    Nvidia GPUs also tend to be heavily favored by professional users
    mmmmMehhhhh . . . .Vegas GPU compute on line one . . .
    AMD's new Radeon Pro driver makes Radeon Pro W6800 faster than Nvidia's RTX A5000.
    AMD Rearchitects OpenGL Driver for a 72% Performance Uplift : Read more
    My CAD does OpenGL, too
  • saunupe1911
    People flocked to the 4090 as it's a monster but it would be entirely stupid to grab this card while the high end 3000s series exist along with the 4090.

    A 3080 and up will run everything at 2K...and with high refresh rates with DLSS.

    Go big or go home and let this GPU sit! Force Nvidia's hand to lower prices.

    You can't have 2 halo products when there's no demand and the previous gen still exist.
  • Math Geek
    they'll cry foul, grumble about the price and even blame retailers for the high price. but only while sitting in line to buy one.......

    man how i wish folks could just get a grip on themselves and let these just sit on shelves for a couple months while Nvidia gets a much needed reality check. but alas they'll sell out in minutes just like always sigh
  • chalabam
Unfortunately the new batch of games is so politicized that it makes buying a GPU a bad investment.
    Even when they have the best graphics ever, the gameplay is not worth it.
  • gburke
I am one who likes to have the best to push games to the limit. And I'm usually pretty good about staying on top of current hardware. I can definitely afford it. I "clamored" to get a 3080 at launch and was lucky enough to get one at market value, beating out the dreadful scalpers. But it makes no sense this time to upgrade over last gen just for gaming. So I am sitting this one out. I would be curious to know how many others out there are like me and don't see the real benefit of this new generation of hardware for gaming. Honestly, 60fps at 4K on almost all my games is great for me. Not really interested in going above that.
  • PlaneInTheSky
Seeing how much wattage these GPUs use in a loop is interesting, but it still tells me nothing regarding real-life cost.

    Cloud gaming suddenly looks more attractive when I realize I won't need to pay to run a GPU at 300 watt.

    The running cost of GPU should now be part of reviews imo.

    Considering how much people in Europe, Japan, and South East Asia are now paying for electricity and how much these new GPU consume.

    Household appliances with similar power usage, usually have their running cost discussed in reviews.
  • BaRoMeTrIc
    Math Geek said:
    they'll cry foul, grumble about the price and even blame retailers for the high price. but only while sitting in line to buy one.......

    man how i wish folks could just get a grip on themselves and let these just sit on shelves for a couple months while Nvidia gets a much needed reality check. but alas they'll sell out in minutes just like always sigh
    High end RTX cards have become status symbols amongst gamers.
  • Tac 25
    none of my games need it, no reason to buy this thing. The Christmas money is safe.
  • sizzling
    I’d like to see a performance per £/€/$ comparison between generations. Normally you would expect this to improve from one generation to the next but I am not seeing it. I bought my mid range 3080 at launch for £754. Seeing these are going to cost £1100-£1200 the performance per £/€/$ seems about flat on last generation. Yeah great, I can get 40-60% more performance for 50% more cost. Fairly disappointing for a new generation card. Look back at how the 3070 & 3080 smashed the performance per £/€/$ compared to a 2080Ti.