Tom's Hardware Verdict
The RTX 4080 has all the technological advancements of the Ada Lovelace architecture, with a price that's difficult to justify. It's too close to the 4090 to entice extreme performance enthusiasts, and soon it will have to contend with AMD's RX 7900 series. But plenty of people prefer Nvidia, want DLSS, and are willing to pay the piper his dues.
Pros
- Second-fastest GPU (for now)
- Much improved efficiency
- Excellent ray tracing performance
- Packs all the Ada Lovelace enhancements
Cons
- High price without the halo performance of the 4090
- Needs DLSS 3 to truly shine in gaming performance
- AMD's RDNA 3 could provide strong competition
- Lingering concerns surrounding the 16-pin connector
The Nvidia GeForce RTX 4080 is the follow-up to last month's RTX 4090 launch, a card that now sits among the best graphics cards and at the top of our GPU benchmarks hierarchy. Of course, a bit of the shine has come off thanks to melting 16-pin connectors. The good news: the RTX 4080 uses less power, which should mean it's also less likely to funnel enough power to melt the plastic connector… maybe. The bad news: at $1,199, it's still priced out of reach for most gamers and represents a big jump in generational pricing, inheriting the RTX 3080 Ti launch price that we also felt was too high.
We already know most of what to expect from Nvidia's Ada Lovelace architecture, so the only real question now is how performance scales down to fewer GPU shaders, less memory, less cache, a narrower memory interface, etc. Let's quickly look at the specifications for a few of the top Nvidia and AMD GPUs.
Graphics Card | RTX 4080 | RTX 4090 | RTX 3090 Ti | RTX 3080 Ti | RTX 3080 | RX 7900 XTX | RX 7900 XT |
---|---|---|---|---|---|---|---|
Architecture | AD103 | AD102 | GA102 | GA102 | GA102 | Navi 31 | Navi 31 |
Process Technology | TSMC 4N | TSMC 4N | Samsung 8N | Samsung 8N | Samsung 8N | TSMC N5 + N6 | TSMC N5 + N6 |
Transistors (Billion) | 45.9 | 76.3 | 28.3 | 28.3 | 28.3 | 45.6 + 6x 2.05 | 45.6 + 5x 2.05 |
Die size (mm^2) | 378.6 | 608.4 | 628.4 | 628.4 | 628.4 | 300 + 222 | 300 + 185 |
SMs | 76 | 128 | 84 | 80 | 68 | 96 | 84 |
GPU Shaders | 9728 | 16384 | 10752 | 10240 | 8704 | 12288 | 10752 |
Tensor Cores | 304 | 512 | 336 | 320 | 272 | N/A | N/A |
Ray Tracing "Cores" | 76 | 128 | 84 | 80 | 68 | 96 | 84 |
Boost Clock (MHz) | 2505 | 2520 | 1860 | 1665 | 1710 | 2500 | 2400 |
VRAM Speed (Gbps) | 22.4 | 21 | 21 | 19 | 19 | 20 | 20 |
VRAM (GB) | 16 | 24 | 24 | 12 | 10 | 24 | 20 |
VRAM Bus Width (bits) | 256 | 384 | 384 | 384 | 320 | 384 | 320 |
L2 / Infinity Cache (MB) | 64 | 72 | 6 | 6 | 5 | 96 | 80 |
ROPs | 112 | 176 | 112 | 112 | 96 | 192 | 192 |
TMUs | 304 | 512 | 336 | 320 | 272 | 384 | 336 |
TFLOPS FP32 | 48.7 | 82.6 | 40 | 34.1 | 29.8 | 61.4 | 51.6 |
TFLOPS FP16 (FP8/INT8) | 390 (780) | 661 (1321) | 160 (320) | 136 (273) | 119 (238) | 123 (246) | 103 (206) |
Bandwidth (GB/s) | 717 | 1008 | 1008 | 912 | 760 | 960 | 800 |
TBP (watts) | 320 | 450 | 450 | 350 | 320 | 355 | 300 |
Launch Date | Nov 2022 | Oct 2022 | Mar 2022 | Jun 2021 | Sep 2020 | Dec 2022 | Dec 2022 |
Launch Price | $1,199 | $1,599 | $1,999 | $1,199 | $699 | $999 | $899 |
There's a relatively large gap between the RTX 4080 and the larger RTX 4090. You get most of an AD103 GPU (76 of the potential 80 Streaming Multiprocessors), but that's still roughly 40% fewer GPU shaders and other functional units than the RTX 4090. Clock speeds are similar, but you get 33% fewer memory channels and 33% less VRAM, bandwidth drops 29% (the 4080's GDDR6X runs slightly faster per pin), and the rated TBP falls by 29%. On paper, the RTX 4090 could be up to 70% faster based on theoretical compute performance, and that's a concern.
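For reference, those theoretical figures fall straight out of the specs table. Here's a minimal sketch, assuming the standard peak-rate formulas (two FP32 ops per shader per clock; bandwidth as per-pin data rate times bus width in bytes):

```python
# Minimal sketch of how the table's theoretical numbers are derived.
# Assumes the standard peak-rate formulas; actual game performance
# scales far less cleanly than this.

def fp32_tflops(shaders: int, boost_mhz: int) -> float:
    """Peak FP32 throughput: one FMA (2 ops) per shader per clock."""
    return shaders * 2 * boost_mhz / 1e6  # MHz -> TFLOPS

def bandwidth_gb_s(vram_gbps: float, bus_bits: int) -> float:
    """Memory bandwidth: per-pin data rate times bus width in bytes."""
    return vram_gbps * bus_bits / 8

rtx_4080 = fp32_tflops(9728, 2505)    # ~48.7 TFLOPS
rtx_4090 = fp32_tflops(16384, 2520)   # ~82.6 TFLOPS
print(f"RTX 4080: {rtx_4080:.1f} TFLOPS, {bandwidth_gb_s(22.4, 256):.0f} GB/s")
print(f"RTX 4090: {rtx_4090:.1f} TFLOPS, {bandwidth_gb_s(21, 384):.0f} GB/s")
print(f"4090 on-paper advantage: {rtx_4090 / rtx_4080 - 1:.0%}")  # ~70%
```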
$1,199 is hardly affordable, so it feels like anyone even looking at the RTX 4080 should probably just save up the additional $400 for the RTX 4090 and go for broke (or melted). But the RTX 4090 has been sold out everywhere below $2,100 since launch, which means the real-world upsell could be closer to $900, and that's a far more significant gap.
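To put the upsell math in perspective, here's a quick back-of-the-envelope sketch (ours, not Nvidia's) using theoretical FP32 throughput per dollar as a crude value proxy; real game performance per dollar will differ, but the trend is instructive:

```python
# Crude value comparison: theoretical FP32 TFLOPS per $1,000.
# Prices are launch MSRPs from the specs table, plus the ~$2,100
# street price for the RTX 4090 mentioned above.
cards = {
    "RTX 3080 ($699 launch)":   (29.8, 699),
    "RTX 4080 ($1,199 launch)": (48.7, 1199),
    "RTX 4090 ($1,599 launch)": (82.6, 1599),
    "RTX 4090 ($2,100 street)": (82.6, 2100),
}
for name, (tflops, price) in cards.items():
    print(f"{name}: {tflops / price * 1000:.1f} TFLOPS per $1,000")
```

At MSRP, the 4090 actually delivers more theoretical compute per dollar than the 4080; only at street pricing do the two roughly even out, which is part of what makes the $1,199 price so awkward.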
The pricing becomes even more of a concern when we factor in AMD's Radeon RX 7900 XTX/XT cards coming next month. We now have all the pertinent details for the first cards using AMD's RDNA 3 GPU architecture, and they certainly look promising. Prices are still high, but the spec comparisons suggest AMD might be able to beat the RTX 4080 while costing $200 to $300 less. Unless you absolutely refuse to consider purchasing an AMD graphics card, you should at least wait until next month to see what the red team has to offer.
However, Nvidia does have some extras that AMD is unlikely to match in the near term. For example, the deep learning and AI horsepower of the RTX 4080 far surpasses what AMD intends to offer. If we have the figures right, AMD's FP16 and INT8 throughput will be less than a third of the RTX 4080's (123 versus 390 TFLOPS FP16, per the table above).
Nvidia also offers DLSS 3 courtesy of the enhanced Optical Flow Accelerator (OFA). Ten games already support the technology: Bright Memory: Infinite, Destroy All Humans! 2 - Reprobed, F.I.S.T.: Forged in Shadow Torch, F1 22, Justice, Loopmancer, Marvel's Spider-Man Remastered, Microsoft Flight Simulator, A Plague Tale: Requiem, and Super People. In less than a month, that's already about half as many games as currently support AMD's FSR2 technology. Of course, you need an RTX 40-series GPU for DLSS 3, while FSR2 works with pretty much everything.
Nvidia GPUs also tend to be heavily favored by professional users, or at least their employers. So while true workstations will likely opt for the RTX 6000 48GB card as opposed to a GeForce RTX 40-series, there's certainly potential in picking up one or more RTX 4080 cards for AI and deep learning use. Content creators may also find something to like, though again, if you're willing to pay for a 4080, it may not be a huge step up in pricing to nab a 4090 instead.
Another piece of good news (depending on which side of the aisle you fall on, we suppose) is that GPU mining remains unprofitable. Gamers won't be able to offset the price of a new graphics card through cryptocurrency mining, but at least there should be more cards to go around. Now let's see exactly what Nvidia has to offer with its new RTX 4080.
- MORE: Best Graphics Cards
- MORE: GPU Benchmarks and Hierarchy
- MORE: All Graphics Content
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
- btmedic04: At $1,200, this card should be DOA on the market. However, people will still buy them all up because of mindshare. Realistically, this should be an $800-$900 GPU.
- Wisecracker: "Nvidia GPUs also tend to be heavily favored by professional users." mmmmMehhhhh... Vegas GPU compute on line one... AMD's new Radeon Pro driver makes the Radeon Pro W6800 faster than Nvidia's RTX A5000. My CAD does OpenGL, too: see "AMD Rearchitects OpenGL Driver for a 72% Performance Uplift." :homer:
- saunupe1911: People flocked to the 4090 as it's a monster, but it would be entirely stupid to grab this card while the high-end 3000 series exists along with the 4090. A 3080 and up will run everything at 2K, and with high refresh rates with DLSS. Go big or go home and let this GPU sit! Force Nvidia's hand to lower prices. You can't have two halo products when there's no demand and the previous gen still exists.
- Math Geek: they'll cry foul, grumble about the price and even blame retailers for the high price. but only while sitting in line to buy one....... man how i wish folks could just get a grip on themselves and let these just sit on shelves for a couple months while Nvidia gets a much needed reality check. but alas they'll sell out in minutes just like always. sigh
- chalabam: Unfortunately the new batch of games is so politicized that it makes buying a GPU a bad investment. Even when they have the best graphics ever, the gameplay is not worth it.
- gburke: I am one who likes to have the best to push games to the limit. And I'm usually pretty good about staying on top of current hardware. I can definitely afford it. I "clamored" to get a 3080 at launch and was lucky enough to get one at market value, beating out the dreadful scalpers. But it makes no sense this time to upgrade over last gen just for gaming. So I am sitting this one out. I would be curious to know how many others out there are like me and don't see the real benefit of this new generation of hardware for gaming. Honestly, 60 fps at 4K on almost all my games is great for me. Not really interested in going above that.
- PlaneInTheSky: Seeing how much wattage these GPUs use in a loop is interesting, but it still tells me nothing regarding real-life cost. Cloud gaming suddenly looks more attractive when I realize I won't need to pay to run a GPU at 300 watts. The running cost of GPUs should now be part of reviews, imo, considering how much people in Europe, Japan, and Southeast Asia are now paying for electricity and how much these new GPUs consume. Household appliances with similar power usage usually have their running cost discussed in reviews.
- BaRoMeTrIc (replying to Math Geek's comment above): High-end RTX cards have become status symbols amongst gamers.
- sizzling: I'd like to see a performance per £/€/$ comparison between generations. Normally you would expect this to improve from one generation to the next, but I am not seeing it. I bought my mid-range 3080 at launch for £754. Seeing these are going to cost £1,100-£1,200, the performance per £/€/$ seems about flat on last generation. Yeah great, I can get 40-60% more performance for 50% more cost. Fairly disappointing for a new generation card. Look back at how the 3070 & 3080 smashed the performance per £/€/$ compared to a 2080 Ti.
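On the running-cost question raised in the comments: the estimate is simple arithmetic. Below is a minimal sketch with an assumed usage pattern and electricity rate; both figures are illustrative, not measured.

```python
# Rough GPU running-cost estimate: board power x hours of use x electricity
# rate. Assumes worst-case full-TBP draw while gaming; idle draw is ignored.

def yearly_cost(board_watts: float, hours_per_day: float,
                price_per_kwh: float) -> float:
    """Approximate yearly electricity cost of gaming on the card."""
    kwh_per_year = board_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Example: RTX 4080 at its 320 W TBP, 2 hours/day, 0.40/kWh (a high
# European-style rate) -- both the usage and the rate are assumptions.
print(f"{yearly_cost(320, 2, 0.40):.0f} per year")  # ~93 in local currency
```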