MSI provided its RTX 3080 12GB Suprim X for this review, which uses the same core design as other Suprim X GPUs. It's a large triple-fan, triple-slot card that measures 336x140x61mm and weighs 1901g. It includes a fair amount of RGB lighting, with two "wide V" accents between the fans, a large RGB strip along the top, and an RGB MSI Dragon logo on the metal backplate. MSI uses three custom 95mm Torx 4.0 fans in its Tri Frozr 2S cooling configuration, which feature partially integrated rims to increase static pressure and airflow. Combined with the chunky heatsink, there's plenty of cooling potential on tap.
Video outputs consist of the usual triple DisplayPort 1.4 and single HDMI 2.1. That's a bit surprising considering the high-end nature of the design, as some competing GPUs add a second HDMI 2.1 connection as a fifth output. Note also that the card runs with a boost clock of 1830MHz unless you install the MSI Dragon software and select the OC profile, which bumps the boost clock to 1845MHz. The card also requires three 8-pin PEG power connectors, and MSI lists the power consumption at 400W. That's higher than the reference 3080 Ti and 3090, largely due to the boosted GPU clocks.
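For context on the three-connector requirement, here's a quick back-of-the-envelope sketch, assuming the nominal 150W limit per 8-pin PCIe connector and 75W from the x16 slot (the exact limits MSI designs to aren't stated):

```python
# Rough power budget check for a card rated at 400W with three 8-pin connectors.
# Assumes nominal PCIe limits: 150W per 8-pin connector, 75W from the x16 slot.
EIGHT_PIN_W = 150   # nominal 8-pin PCIe power connector limit
SLOT_W = 75         # nominal PCIe x16 slot power limit
NUM_CONNECTORS = 3

available_w = NUM_CONNECTORS * EIGHT_PIN_W + SLOT_W  # 3 * 150 + 75 = 525W
rated_w = 400                                        # MSI's listed power consumption

print(f"Deliverable: {available_w}W, rated: {rated_w}W, headroom: {available_w - rated_w}W")
# Deliverable: 525W, rated: 400W, headroom: 125W
```

In other words, the three-connector configuration leaves roughly 125W of headroom above the card's rated draw, which is typical for factory-overclocked designs.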
MSI includes a few other extras with the card as well. For people who worry about the weight of the graphics card, an adjustable-height support can help out — just don't move or ship the PC anywhere with the stand installed, as it will end up banging around inside your case. There's also a decent 38x24cm mouse pad, should you need one of those. In short, the MSI Suprim X brand is the "kitchen sink" approach to graphics cards, with plenty of enthusiast-oriented features and extras.
Test Setup for GeForce RTX 3080 12GB
We've updated our GPU test PC and gaming suite for 2022. We're now using a Core i9-12900K processor, MSI Pro Z690-A DDR4 WiFi motherboard, and DDR4-3600 memory (with XMP enabled). We also upgraded to Windows 11 Pro, since it's basically required to get the most out of Alder Lake. You can see the rest of the hardware in the boxout.
Our new gaming tests consist of a "standard" suite of seven games without ray tracing enabled (even if the game supports it), and a separate "ray tracing" suite of six games that all use multiple RT effects. For this review we'll be testing at 4K, 1440p, and 1080p at "ultra" settings — which generally means maxed out settings, except without SSAA if that's an option. We also enable DLSS Quality mode in the games that support it, which includes all of the ray tracing suite and three of the games in the standard suite.
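As a rough illustration of how per-game results typically roll up into a composite chart, here's a minimal sketch using a geometric mean (a common aggregation choice so no single title dominates); the game names and FPS values below are placeholders, not measurements from this review:

```python
from math import prod

# Placeholder per-game average FPS at one resolution (illustrative only).
standard_suite_fps = {
    "Game A": 112.4, "Game B": 97.8, "Game C": 143.1, "Game D": 88.6,
    "Game E": 104.2, "Game F": 76.9, "Game G": 121.5,
}

def geometric_mean(values):
    """Geometric mean of a collection of FPS results."""
    vals = list(values)
    return prod(vals) ** (1.0 / len(vals))

print(f"Standard suite composite: {geometric_mean(standard_suite_fps.values()):.1f} fps")
```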
- MORE: Best Graphics Cards
- MORE: GPU Benchmarks and Hierarchy
- MORE: All Graphics Content
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
King_V "You know, we have a lot of GA102 chips where all 12 memory controllers are perfectly fine but where we can't hit the necessary 80 SMs for the 3080 Ti. Rather than selling these as an RTX 3080 10GB, what if we just make a 3080 12GB? Then we don't have to directly deal with the supposed $699 MSRP."
That may not be exactly what happened, but it certainly feels that way. MSI has produced a graphics card that effectively ties the RTX 3080 Ti in performance, even though it has 10 fewer SMs and needs more power to get there.
I guess that's one way to hit the price point and raw performance metrics.
10tacle: THANK YOU Jarred for including FS2020 in this hardware review! That's pretty much all I use my rig for as a hardcore flight simmer. I find the gap increase at 4K with the 3080 Ti (my GPU) over this 10GB 3080 interesting. That sim is a strange hardware-demand duck, and I'm still trying to figure out fine-tuning details some six months on from my build last August, after winning a Newegg Shuffle to get the GPU (I paid $1,399.99 for the EVGA FTW3 edition).
The interesting thing is that it is the only game (sim to me) that shows a leg-stretch gap between the lower-tiered/VRAM'd cards, including this 10GB 3080, at 4K. My guess is that's where the Ti's wider memory bus bandwidth, CUDA cores (I still call Nvidia GPU cores that), and extra 2GB of VRAM come together to show off the power of the GPU. The same can be said of the 3090 sitting at the top of the FS2020 chart at 4K.
HideOut: I hope the Intel Arc series is competitive and they put so much price pressure on Nvidia that they nearly go broke. They're doing nothing but ripping off customers these days with this kinda ridiculous trash.
watzupken: I actually see no point in paying this much for an RTX 3000 series card when we are expecting RDNA3 and the RTX 4000 series to be announced later this year. I feel Nvidia released this card with the intention of increasing prices, less to push performance. After all, between the RTX 3080 and RTX 3090, each "new" product is just marginally better than the previous one, but costs quite a bit more for a few percent improvement.
JarredWaltonGPU (replying to watzupken): Like I said in the review, Nvidia claims this was an AIB-driven release. Meaning, MSI, Asus, EVGA, Gigabyte, etc. wanted an alternative to the 3080 Ti and 3080 10GB. Probably there were chips that could be used in the "formerly Quadro" line that have similar specs of 70 SMs and a 384-bit memory interface, and Nvidia created this SKU for them to use. But that would assume there's less demand for the professional GPUs right now, as otherwise Nvidia would presumably prefer selling the chips in that more lucrative market.