AMD Radeon RX 5500 XT vs. Nvidia GeForce GTX 1660: The Battle for Mainstream Gaming Supremacy
GeForce battles Radeon in the realm of $200 gaming cards.
AMD recently launched the Radeon RX 5500 XT, which aims to capture the attention of mainstream gamers with a mid-range budget to spend on a GPU. Following our review, we already know it’s not as powerful as Nvidia’s GeForce GTX 1660, but can it offer enough value to make up for what it lacks in performance? Let’s find out.
It’s an exciting time to be in the market for a new GPU, with several new products arriving in the last 12 months. Last year at this time, gamers were eagerly awaiting the reveal of AMD’s Navi architecture and the new line of GPUs that would reinvigorate the company’s Radeon Graphics brand. And Nvidia had not yet launched the RT and Tensor core-free 16-series Turing GPUs for the mid-range market.
Today, we’ve got myriad options from team red, including the recently launched Radeon RX 5500 XT, which introduces Navi to the high-volume segment of the market, in the $200 price range. AMD is no stranger to this price point. For several generations, the Radeon team has focused its attention on mainstream cards like the RX 480, 580, and 590 (all based on the same Polaris architecture).
However, Nvidia’s GeForce GTX 1660, which hit store shelves in March of 2019, is a strong contender for your gaming money. That GPU’s successor, the GTX 1660 Super, is somewhat more expensive ($230+), but the original model is still readily available and often sells for $210 or less.
Now the question becomes: which card should you choose if you’re in the market for a new GPU for roughly $200? We compared the Radeon RX 5500 XT and the GeForce GTX 1660 in four categories -- featured technology, gaming performance, power consumption and heat output, and value proposition -- to help you decide which of these two cards is right for you.
Featured Technology
AMD’s Navi 14 XTX GPU features the same number of cores (1408) as the version of Nvidia’s Turing TU116-300-A1 chip found in the GTX 1660, although it operates at a higher frequency and delivers higher peak floating-point performance (5.2 TFLOPS vs. the GTX 1660’s 5 TFLOPS).
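Those peak figures follow from simple arithmetic: each shader core executes one fused multiply-add (two floating-point operations) per clock, so peak FP32 throughput is 2 × shader count × boost clock. A minimal sketch of that calculation, using the boost clocks from the spec table:

```python
# Peak FP32 throughput: each shader performs one FMA (2 FLOPs) per clock cycle.
def peak_fp32_tflops(shaders: int, boost_clock_mhz: float) -> float:
    return 2 * shaders * boost_clock_mhz * 1e6 / 1e12

rx_5500_xt = peak_fp32_tflops(1408, 1845)  # AMD boost rate
gtx_1660 = peak_fp32_tflops(1408, 1785)    # Nvidia boost rate

print(f"RX 5500 XT: {rx_5500_xt:.1f} TFLOPS")  # ~5.2
print(f"GTX 1660:   {gtx_1660:.1f} TFLOPS")    # ~5.0
```

With identical shader counts, the roughly 4% TFLOPS gap comes entirely from the clock-speed difference.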
AMD’s Navi 14 XTX features 6.4 billion transistors, which is 200 million fewer than the Turing TU116, but AMD crammed its chip into a much smaller package. Thanks to TSMC’s 7nm FinFET process, AMD’s GPU fits on a 158 mm² die. Turing GPUs are also manufactured on TSMC’s FinFET process, but Nvidia is working with a larger 12nm node. As a result, the TU116-300-A1 is nearly twice the size of Navi 14 XTX at 284 mm².
The RX 5500 XT also offers faster GDDR6 memory compared to the GTX 1660’s GDDR5 memory modules. Nvidia does provide a larger 192-bit memory bus versus AMD’s 128-bit configuration, but the faster memory more than makes up for the difference in bus width. Theoretically, the Navi 14 XTX has 224 GB/s of memory bandwidth to work with, compared to the 192 GB/s that TU116 offers. The Radeon RX 5500 XT is available in 4GB and 8GB configurations, but for this comparison, we’re focusing on the 8GB version. The GeForce GTX 1660 falls right in the middle, with 6GB of graphics memory.
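Those bandwidth figures come straight from bus width and memory data rate: bandwidth equals bus width in bytes multiplied by the effective transfer rate. A quick sketch, assuming the standard 14 Gbps GDDR6 on the RX 5500 XT and 8 Gbps GDDR5 on the GTX 1660 (typical rates for these cards, not listed in the table):

```python
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    # bytes per transfer (bus width / 8) times effective transfers per second
    return bus_width_bits / 8 * data_rate_gbps

rx_5500_xt = memory_bandwidth_gbs(128, 14)  # 14 Gbps GDDR6 -> 224 GB/s
gtx_1660 = memory_bandwidth_gbs(192, 8)     # 8 Gbps GDDR5  -> 192 GB/s
```

In other words, AMD’s 75% faster memory more than offsets Nvidia’s 50% wider bus.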
By the numbers, AMD’s RX 5500 XT appears to be the stronger of the two GPUs. It’s manufactured on a smaller, more efficient process, offers a higher base and boost clock speed, and has a higher theoretical performance limit. The higher capacity of faster GDDR6 memory also gives it a significant advantage.
| | Gigabyte RX 5500 XT Gaming OC 8G | Gigabyte GeForce GTX 1660 OC 6G |
| --- | --- | --- |
| Architecture (GPU) | RDNA (Navi 14 XTX) | Turing (TU116) |
| ALUs/Stream Processors | 1408 | 1408 |
| Peak FP32 Compute (Based on Typical Boost) | 5.2 TFLOPS | 5 TFLOPS |
| Tensor Cores | N/A | N/A |
| RT Cores | N/A | N/A |
| Texture Units | 88 | 88 |
| ROPs | 32 | 48 |
| Base Clock Rate | 1685 MHz | 1530 MHz |
| AMD Game Rate/Nvidia Boost | 1737 MHz | 1785 MHz |
| AMD Boost Rate | 1845 MHz | N/A |
| Memory Capacity | 8GB GDDR6 | 6GB GDDR5 |
| Memory Bus | 128-bit | 192-bit |
| Memory Bandwidth | 224 GB/s | 192 GB/s |
| L2 Cache | 2MB | 1.5MB |
| TDP | 130W | 120W |
| Transistor Count | 6.4 billion | 6.6 billion |
| Die Size | 158 mm² | 284 mm² |
AMD offers a variety of software features for the Radeon RX 5500 XT. The company’s Radeon Software allows you to monitor your GPU’s temperature, fan speed and voltage, adjust the graphics settings for your games, access performance reports and game stats, and update your GPU drivers. The Radeon Software Adrenalin 2020 edition also lets you launch games directly from the Radeon Software dashboard. It even offers integrated media capture and game streaming features, including screenshot, video, and instant GIF capture, as well as compatibility with many popular streaming services such as Twitch, YouTube, Mixer, Facebook and more.
AMD also offers a feature called Radeon Boost, which monitors your framerate and dynamically lowers the resolution of your games, when it won’t noticeably impact the visual experience, to maximize performance.
The Radeon RX 5500 XT also supports a technology called Radeon Anti-Lag, which aims to reduce input lag by managing the queue of work sent to the GPU and ensuring that the CPU can’t get too far ahead of the GPU. Also, Radeon Image Sharpening combines contrast-adaptive sharpening with GPU upscaling to produce crisper in-game visuals.
Nvidia’s GeForce Experience software is a reasonable equivalent to AMD’s Radeon Software Adrenalin. It enables you to keep your drivers up to date, offers performance-optimized game setting configurations, and allows you to launch your games from the GeForce Experience dashboard.
Nvidia’s software supports media capture and game streaming features that integrate with popular streaming platforms. The GeForce GTX 1660 takes full advantage of Nvidia’s NVENC hardware encoder technology, which lowers the CPU overhead associated with software encoding. NVENC allows the CPU to send a single set of instructions to the GPU, which the GPU uses to output to your display and broadcast without impeding performance.
GeForce Experience also includes Nvidia’s advanced screenshot capture technology, called Nvidia Ansel, which allows you to capture extremely high-resolution (up to 33x 1080p) images of supported games (for regular OS captures, see how to take screenshots in Windows). Ansel also offers free-camera capture, which allows you to snap angles that aren’t possible in-game, and the ability to capture 360-degree images. Ansel even supports RAW EXR output, which allows you to make changes to the image in post-processing, such as creating HDR-enhanced images.
Both AMD and Nvidia offer support for adaptive display technology. The Radeon RX 5500 XT supports both FreeSync and FreeSync 2, whereas the GeForce GTX 1660 is compatible with Nvidia’s more-expensive G-Sync adaptive display technology. Nvidia also supports a select few FreeSync displays. And as for performance, both AMD and Nvidia offer dynamic clock speed boosting technologies that automatically push your GPU’s frequency to maximize the capabilities of your hardware configuration.
Winner: Tie. Both AMD and Nvidia offer compelling feature sets for their budget-minded options. Each card supports automated game optimization, game streaming, and driver update software. And they both support performance optimization features that maximize your gameplay experience.
Gaming Performance
We tested 11 relatively demanding games on both GPUs at maximum and medium settings. In The Division 2, Gears of War 5, Strange Brigade, Far Cry 5, Shadow of the Tomb Raider, Final Fantasy XIV, Forza Horizon 4 and Battlefield V, both cards managed average framerates of 60 fps or better with the graphics settings maxed out.
In more demanding titles such as Borderlands 3 and Metro: Exodus, we had to drop the settings to medium to achieve the coveted bare minimum of 60 fps. Ghost Recon: Breakpoint was also very demanding: our Radeon was unable to crest 50 fps at max settings, while the GTX 1660 fell just a sliver short of 60 fps, averaging 58.7 fps.
Both cards produce acceptable performance for 1080p gaming, but at medium settings the GeForce outshined the Radeon by a small margin in all but two of our tests, and the two that AMD won were by less than a 1% margin. At max settings, the GTX 1660 often held a 5% or better lead over the RX 5500 XT.
Winner: Nvidia. Despite the RX 5500 XT’s superior on-paper specifications and equivalent software features, the GTX 1660 outperformed it in almost every test we threw at these two cards. Nvidia’s configuration appears much more capable of handling games with maximum graphics settings enabled than AMD’s. Strangely, the RX 5500 XT is a stronger competitor with medium settings, but why would you settle for medium if you don’t need to?
Power Consumption and Heat Output
On the budget end of the GPU market, power consumption shouldn’t really be much of a concern. None of the GPUs in this class draw anywhere near enough power to make a significant difference in your system’s configuration. Both AMD and Nvidia recommend at least a 450-watt power supply for their respective cards.
AMD’s Radeon RX 5500 XT carries a TDP rating of 130W, whereas Nvidia’s GeForce GTX 1660 is rated at 120W. In practice, the Radeon GPUs actually draw quite a bit less than 130 watts, except under extreme loads. We tested two examples of the RX 5500 XT, a 4GB Sapphire and an 8GB Gigabyte card, and found that, in-game, the AMD cards average close to 100W.
The Gigabyte GeForce GTX 1660 sample that we tested in March 2019 needed an average of 126 watts and peaked at 134 watts, while the Zotac model that we used in the Radeon RX 5500 XT review pulled between 110 and 120 watts in our Metro: Exodus power test.
Interestingly, the Zotac GeForce GTX 1660 peaked around 100W in our FurMark stress test, whereas the Gigabyte sample drew around 130W. Both the Gigabyte and Sapphire Radeon cards topped 120W in this test, with the Sapphire model touching 136W at its peak.
You likely won’t be concerned with the power requirements of these cards, but higher power draw does equate to higher heat output, which is probably a bigger concern.
Our Gigabyte RX 5500 XT OC 8G proved to be one of the coolest-running GPUs that we’ve seen in this segment. Following three passes of our Metro: Exodus test, the data showed that the GPU never exceeded 63C. Our tests with Gigabyte’s GeForce GTX 1660 OC 6G showed that it peaked at 68C. It’s also worth noting that the fans on the GeForce were spinning at 2,000 rpm, while the Radeon card’s spun at a leisurely 1,800 rpm.
Winner: AMD. The power requirements for either of these cards are low enough to be negligible. However, the low heat output from the Radeon RX 5500 XT, likely aided by its smaller 7nm process, gives AMD a leg up in this category.
Value Proposition
Comparing the value of the Radeon RX 5500 XT 8GB to the GeForce GTX 1660 is a somewhat challenging task. If you compare these cards at the same price, the GeForce GTX 1660 is the clear winner.
Nvidia’s offering outperforms the competition handily in gaming tests, which, if we’re honest, is kind of the point of a graphics card -- especially in the $200 range. However, AMD’s option still delivers acceptable performance and costs a few dollars less, which somewhat evens the playing field. While the Radeon may be 5% to 10% cheaper, it’s also 5% to 10% slower in many cases.
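That's why the value gap nearly closes: divide performance by price and the two cards land in roughly the same place. The numbers below are illustrative placeholders (street prices fluctuate constantly), not our benchmark data:

```python
def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    # A simple value metric: average framerate divided by street price.
    return avg_fps / price_usd

# Hypothetical example: GeForce ~5% faster, Radeon ~5% cheaper.
geforce = fps_per_dollar(63.0, 210.0)  # ~0.30 fps per dollar
radeon = fps_per_dollar(60.0, 200.0)   # ~0.30 fps per dollar
```

When the price discount roughly matches the performance deficit, fps per dollar comes out nearly identical, which is why the raw performance lead becomes the tiebreaker.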
Winner: Nvidia. We can’t call the Radeon RX 5500 XT a bad deal. It’s far and away a better card than you would have bought for the same price a year ago. However, for just a few dollars more you can get better gaming performance from a GeForce GTX 1660.
| Round | Nvidia GeForce GTX 1660 6GB | AMD Radeon RX 5500 XT 8GB |
| --- | --- | --- |
| Featured Technology | ✓ | ✓ |
| Gaming Performance | ✓ | |
| Power Consumption | | ✓ |
| Value Proposition | ✓ | |
| Total | 3 | 2 |
Bottom Line
What a difference a year can make. The last time we did a showdown of the mid-range GPUs, we put the Radeon RX 590 toe to toe with the GeForce GTX 1060 and AMD’s card was the definitive winner. This time around, the tables have turned and Nvidia has captured the mid-tier crown back from AMD.
But pricing is close and can always change. So if you’re reading this in the weeks and months after first publication (in mid-January 2020), make sure to check your favorite online store to see if the value balance has shifted -- or if something better has come along. As in most areas of PC tech, price cuts and new products are always arriving to change the battle for your hard-earned component dollars.
Kevin Carbotte is a contributing writer for Tom's Hardware who primarily covers VR and AR hardware. He has been writing for us for more than four years.
joeblowsmynose
I would have given features to AMD easily for the goodies they add to their driver suite. Overclocking GPU and VRAM, voltage adjustments on both, power state programming for both, fan curve tuning (when it works properly), Radeon Boost, etc.
AlistairAB
You tested the old 1660, and the 8GB RX 5500? The two cards nobody should buy? Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.
artk2219
AlistairAB said: "You tested the old 1660, and the 8GB RX 5500? The two cards nobody should buy? Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700."
If you plan on keeping it a few years, those extra 4GB are definitely needed. It will be a better card in the long run, as even the 6GB on the 1660 can be a bottleneck in some titles (the 6GB of VRAM is also something I'm not keen on with the new RX 5600). If you plan to upgrade every two years, meh.
cryoburner
The article said: "We can’t call the Radeon RX 5500 XT a bad deal. It’s far and away a better card than you would have bought for the same price a year ago."
That's simply not true. A year ago, you could get an RX 580 8GB for about the same price, and a 5500 XT isn't much more than 5% faster. Even 3 years ago, one could get an RX 480 8GB for around $200, which is within 15% of the performance of this card. The only notable advantage the 5500 XT has over those older models is improved power efficiency. The performance gains are relatively poor after 3 years, and even Nvidia's lineup is offering better performance for the money right now.
AlistairAB said: "What the heck? You tested the non-super 1660, and the 8GB RX 5500? The two cards nobody should buy? Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700."
Why should no one buy a 1660? Keep in mind that while the 1660 SUPER might perform around 10% faster, at current prices (online, in the US), it also typically costs at least 10% more, so both versions of the card tend to offer similar value for someone buying a graphics card as an upgrade.
And in the case of the 5500 XT, the 4GB model's limited VRAM is a notable concern, particularly since the card only uses an x8 PCIe connection, causing performance to tank more than usual when VRAM is exceeded. Neither version of the 5500 XT is all that competitively priced right now. The 4GB model needs to be priced lower than a 1650 SUPER, not higher, and the 8GB model needs to be priced lower than a 1660, which is simply a faster card. Both versions of the card should have launched for around $20 less than they did.
dorsai
Personally I would hesitate to try multiplayer gaming on either of these cards... they seem more low end than mainstream.
alextheblue
I'd like to see those cards tested battling while overclocked. IMHO the 5500 XT is a bit overpriced, but they seem to have decent headroom.
AlistairAB said: "Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700."
This has already been discussed. If you're running these kinds of settings, there are already games that are hindered by the lack of RAM. This is only going to get worse. The problem is exacerbated if you're on a PCIe 3.0 platform, since Navi 14 only has 8 lanes. But yes, if you can get a good deal on one, the Super offers superb bang for the buck.
NP
The article said: "Strangely, the RX 5500 XT is a stronger competitor with medium settings, but why would you settle for medium if you don't need to?"
I own an Nvidia card, and will probably buy another GeForce when I need to upgrade, but I still don't understand why you assume the answer would be "obviously, for no reason."
I think there is a valid reason for not settling for medium but deliberately picking it: the added frame rates allow you to perform better in any competitive online FPS you intend to play with either an RX 5500 XT or a GTX 1660-range GPU.
That either is or is not a consideration for you, but it is nevertheless among the considerations that many people would regard as of key importance when buying a new GPU.
InvalidError
The RX5500 is a pitiful upgrade over what was already available for sub-$200 three years ago. If there was true competition between AMD and Nvidia GPUs, the 4GB RX5500 would be a $120 GPU.
Also, most people who buy RX5500-class GPUs will be stuck on PCIe 3.0 x8 for the foreseeable future, where it incurs sometimes significant performance penalties, which makes it a "nope" GPU in my book.
mitch074
InvalidError said: "The RX5500 is a pitiful upgrade over what was already available for sub-$200 three years ago. If there was true competition between AMD and Nvidia GPUs, the 4GB RX5500 would be a $120 GPU. Also, most people who buy RX5500-class GPUs will be stuck on PCIe 3.0 x8 for the foreseeable future where it incurs sometimes significant performance penalties, which makes it a 'nope' GPU in my book."
True, but a couple months from now, the RX5500 4GB will be a nice pairing with B550-based motherboards on PCIe 4.0 for mainstream gaming rigs. By then its price may have dropped a bit.
In the meantime, I'll pray that my reference RX480 8GB doesn't fry; it's still going strong since I bought it in summer 2016.