After a long wait, Nvidia has announced its new GeForce RTX 30-series (codename Ampere) graphics cards. The GeForce RTX 3090 and GeForce RTX 3080 will be available on September 24 and September 17, respectively, with the GeForce RTX 3070 following in October. All three cards boast impressive specs that will vie for a spot on our Best Graphics Cards for Gaming list.
Built on a custom Samsung 8nm process, Ampere comes equipped with Nvidia's second-generation Ray Tracing cores and third-generation Tensor cores. The GeForce RTX 3090, RTX 3080 and RTX 3070 are also the first Nvidia consumer graphics cards to come with PCIe 4.0 support. We have the architectural deep dive details here.
The GeForce RTX 3090 is the behemoth of the Ampere lineup. The triple-slot graphics card measures 12.3 x 5.4 inches (313 x 138mm) and flaunts specifications that would impress even the most demanding enthusiasts. In terms of performance, Nvidia claims that the GeForce RTX 3090 is up to 50% faster than the Titan RTX.
The GeForce RTX 3090 is aimed at 8K gaming at 60 frames per second, so it comes equipped with 10,496 CUDA cores that feature a boost clock up to 1.7 GHz. There's also 24GB of 19.5 Gbps GDDR6X memory across a 384-bit memory interface. The graphics card has a 350W TDP (thermal design power) and requires two 8-pin PCIe power connectors.
Professional and hardcore enthusiasts will be delighted to know that the GeForce RTX 3090 is the only Ampere-based graphics card to support SLI through Nvidia's NVLink connector. This opens the door to pairing up two of these beasts together for an awesome compute machine. The GeForce RTX NVLink Bridge costs $79.99 and will be available on the same day as the GeForce RTX 3090.
Nvidia Ampere / RTX 30-Series Specifications
Specification | GeForce RTX 3090 | GeForce RTX 3080 | GeForce RTX 3070 | Titan RTX | GeForce RTX 2080 Ti |
---|---|---|---|---|---|
Architecture (GPU) | Ampere (GA102) | Ampere (GA102) | Ampere (GA104)* | Turing (TU102) | Turing (TU102) |
CUDA Cores | 10,496 | 8,704 | 5,888 | 4,608 | 4,352 |
RT Cores | 82 | 68 | 46 | 72 | 68 |
Tensor Cores | 328 | 272 | 184 | 576 | 544 |
Texture Units | 328 | 272 | 184 | 288 | 272 |
Base Clock Rate | 1,400 MHz | 1,440 MHz | 1,500 MHz | 1,350 MHz | 1,350 MHz |
Boost Clock Rate | 1,700 MHz | 1,710 MHz | 1,730 MHz | 1,770 MHz | 1,545 MHz |
Memory Capacity | 24GB GDDR6X | 10GB GDDR6X | 8GB GDDR6 | 24GB GDDR6 | 11GB GDDR6 |
Memory Speed | 19.5 Gbps | 19 Gbps | 14 Gbps | 14 Gbps | 14 Gbps |
Memory Bus | 384-bit | 320-bit | 256-bit | 384-bit | 352-bit |
Memory Bandwidth | 935.8 GBps | 760 GBps | 448 GBps | 672 GBps | 616 GBps |
ROPs | 96 | 88 | 64 | 96 | 88 |
L2 Cache | 6MB | 5MB | 4MB | 6MB | 5.5MB |
TDP | 350W | 320W | 220W | 280W | 250W |
Transistor Count | ? | ? | ? | 18.6 billion | 18.6 billion |
Die Size | ? | ? | ? | 754 mm² | 754 mm² |
MSRP | $1,499 | $699 | $499 | $2,499 | $999 |
*Some specifications are unconfirmed.
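The memory bandwidth figures in the table follow directly from the memory speed and bus width: bandwidth (GB/s) = effective per-pin rate (Gbps) × bus width (bits) ÷ 8. As a quick sanity check (the helper function below is our own illustration, not anything from Nvidia's materials):

```python
def memory_bandwidth_gbps(rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate times bus width in bytes."""
    return rate_gbps * bus_width_bits / 8

# RTX 3080: 19 Gbps GDDR6X on a 320-bit bus
print(memory_bandwidth_gbps(19, 320))    # 760.0 GB/s, matching the table
# RTX 3070: 14 Gbps GDDR6 on a 256-bit bus
print(memory_bandwidth_gbps(14, 256))    # 448.0 GB/s
# RTX 3090: 19.5 Gbps GDDR6X on a 384-bit bus
print(memory_bandwidth_gbps(19.5, 384))  # 936.0 GB/s
```

The nominal 19.5 Gbps rate works out to 936 GB/s for the RTX 3090; the 935.8 GB/s in the table reflects a slightly lower actual clock.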
Nvidia touts the GeForce RTX 3080 as the flagship Ampere SKU. The chipmaker is promising up to double the performance of the previous GeForce RTX 2080. According to Nvidia, the GeForce RTX 3080 is capable of delivering a steady 60 frames per second at 4K, even with ray tracing enabled.
The GeForce RTX 3080 comes in at 11.2 x 4.4 inches (285 x 112mm), sporting up to 8,704 CUDA cores that top out at 1.71GHz. The 10GB of GDDR6X memory communicates via a 320-bit memory bus. Nvidia rates the GeForce RTX 3080 for 320W, so like the GeForce RTX 3090, the graphics card still depends on a pair of 8-pin PCIe power connectors. Head to our Nvidia GeForce RTX 3080 Everything We Know post for more details.
Ultimately, the GeForce RTX 3070 will continue to be the sweet spot for gamers. The graphics card starts at $499 and delivers higher performance than last generation's flagship GeForce RTX 2080 Ti at half the price. We have the deep-dive details in our Nvidia GeForce RTX 3070: RTX 2080 Ti Performance at $499 article.
The GeForce RTX 3070 measures 9.5 x 4.4 inches (242 x 112mm) and packs 5,888 CUDA cores and 8GB of GDDR6 memory. The maximum boost clock on this model is 1.73 GHz, and the memory runs on a 256-bit interface. The GeForce RTX 3070 has a much more reasonable TDP (220W), so it requires only one 8-pin PCIe power connector to operate.
Regardless of the model, the Ampere-based graphics cards offer three DisplayPort 1.4a outputs and one HDMI 2.1 port. Gone is the VirtualLink port that debuted with Turing. It doesn't come as a huge surprise since the standard never really caught on.
Given the power requirements, Nvidia recommends a 750W power supply for the GeForce RTX 3090 and GeForce RTX 3080, while the GeForce RTX 3070 can get by with a 650W unit. Nvidia's recommendations assume a high-end system with an Intel Core i9-10900K, so you could get away with a power supply of lower capacity than suggested.
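As a rough sanity check on those recommendations, you can total the worst-case draw of the major components. The component figures below are illustrative assumptions for an RTX 3080 build, not measured values:

```python
# Illustrative worst-case system power estimate (component figures are assumptions)
GPU_TDP_W = 320          # GeForce RTX 3080 rated TDP
CPU_PEAK_W = 250         # Core i9-10900K under a heavy all-core load (assumed)
REST_OF_SYSTEM_W = 75    # motherboard, RAM, drives, fans (assumed)
PSU_CAPACITY_W = 750     # Nvidia's recommended unit

total_draw = GPU_TDP_W + CPU_PEAK_W + REST_OF_SYSTEM_W
headroom = PSU_CAPACITY_W - total_draw
print(f"Estimated peak draw: {total_draw} W, leaving {headroom} W of headroom")
```

With more typical gaming loads (where the CPU draws well under its peak), the margin grows, which is why a quality lower-capacity unit can suffice.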
Zhiye Liu is a news editor and memory reviewer at Tom’s Hardware. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.
EricLane Bummer. Was hoping for $1200, would have gotten the 3090 for $1400 but $1500 is too much for me. $1200 was too much but I was willing to bite. Now I am not sure if it is worth upgrading my 1080 Amp Extreme to 3080 or just wait until next year and see what the landscape looks like. I don't want to switch to AMD. What does everyone else think?
hannibal Wait till 2023. The raytracing gets better every year, in big jumps!
The 1080 is fast enough for normal rasterization!
Gurg The pricing for the 3080 and 3070 is a pleasant surprise. Nvidia will sell boatloads of 3070s and 3080s at those prices and performance.
QueueCumber
EricLane said: Bummer. Was hoping for $1200, would have gotten the 3090 for $1400 but $1500 is too much for me. $1200 was too much but I was willing to bite. Now I am not sure if it is worth upgrading my 1080 Amp Extreme to 3080 or just wait until next year and see what the landscape looks like. I don't want to switch to AMD. What does everyone else think?
If you're looking to do 4K at a more stable fps, then upgrade for sure. If you're looking to dip into 8K, it looks like it will be another cycle or two before that gets to a stable 120 fps. I'm running 2x 1080s in SLI, and I'm looking forward to the large bump in performance with a 3090. After that, I'll wait for whatever future card allows me to dive deep into 8K.
Chung Leong Bad news for AMD. These prices are very competitive. The fact that the 3070 was launched means the 3060 and 3050 will come sooner rather than later. AMD is going to have a rough time maintaining their position in the mid-range segment. Meanwhile, Intel is squeezing them from the bottom with better and better IGP.
bmwm3oz
EricLane said: Bummer. Was hoping for $1200, would have gotten the 3090 for $1400 but $1500 is too much for me. $1200 was too much but I was willing to bite. Now I am not sure if it is worth upgrading my 1080 Amp Extreme to 3080 or just wait until next year and see what the landscape looks like. I don't want to switch to AMD. What does everyone else think?
You were willing to spend $1,399, but $1,499 is too much? A ~7% price increase from what you expected is a deal breaker? :LOL:
chuck850 The power requirements on these are disappointing, however. There was some teaser info earlier in the year about reduced power consumption. Despite the more compact production process, the 3070 is only a few watts lower than the existing 2080 Ti (for supposedly the same performance). I was hopeful that would lead to decreased overall system heat and power (and bode well for later mobile versions without the speed reductions of the Q series).
Avro Arrow
EricLane said: Bummer. Was hoping for $1200, would have gotten the 3090 for $1400 but $1500 is too much for me. $1200 was too much but I was willing to bite. Now I am not sure if it is worth upgrading my 1080 Amp Extreme to 3080 or just wait until next year and see what the landscape looks like. I don't want to switch to AMD. What does everyone else think?
Well, for one thing, getting an RTX 3080 for $700 would far more than double the performance of your current card. The RTX 2080 Ti is ~30% faster than the GTX 1080 Ti, and the RTX 3080 is ~100% faster than the RTX 2080 Ti, so the gain you would get for $700 would be nothing short of massive. Whether or not it's worth the upgrade depends entirely on whether you're satisfied with your GTX 1080. If you are, then upgrading would be a complete waste.
The other thing is, what's wrong with AMD? I've had Radeon cards only since 2008, because they always offered me better performance at the price points I was willing to pay, and I've enjoyed them immensely. Before that, I had four straight nVidia cards, and before that I had a Cirrus Logic (pre-3D). Remember that ATi has been around far longer than nVidia, and if their products were bad, they'd be extinct.
If AMD has a better product (although I doubt it will), then only a foolish fanboy would still buy nVidia. People who have never owned a Radeon card and are afraid of them are as ridiculous as people who have never owned an AMD CPU and are afraid of them. The differences between them are only performance-based. When it comes to actual use, there's no difference between them. There's more difference between a Samsung and a Motorola Android phone than there is between a GeForce and a Radeon card.
Gurg said: The pricing for the 3080 and 3070 is a pleasant surprise. Nvidia will sell boatloads of 3070s and 3080s at those prices and performance.
Yes, they will. The only question is what retailers will do with all their leftover ~$2,000 RTX 2080 Ti cards. As soon as nVidia made their announcement, those cards immediately turned into dead stock.
Chung Leong said: Bad news for AMD. These prices are very competitive. The fact that the 3070 was launched means the 3060 and 3050 will come sooner rather than later. AMD is going to have a rough time maintaining their position in the mid-range segment. Meanwhile, Intel is squeezing them from the bottom with better and better IGP.
Intel's IGPs would only be a threat to AMD (and nVidia, for that matter) if AMD and nVidia were standing still. They're not. Intel's IGPs are, as of now, nowhere near ATi's, and for each step forward Intel takes, ATi takes an even bigger one. Hell, Intel's most powerful IGP, the Iris Plus, is listed as being 50% behind the Vega 8, which is itself weaker than the lowly GT 1030. I have serious doubts about the viability of Intel's invasion of the graphics space. We've all heard this talk from Intel before, just like we heard about nVidia wanting to make CPUs. Even Larrabee didn't go anywhere. When Intel finally releases its "Xe Graphics", nVidia will have Ampere and ATi will have RDNA2. Intel will just be left in the dust again.