The five worst AMD GPUs of all time: So bad we can't forget them
Historically awful AMD GPUs.
For AMD, it seems both its wins and its losses are big. This time we're not talking about AMD's best GPUs, but its worst ones: the slop of the crop. These aren't merely GPUs that couldn't stand up to Nvidia's best; they're the ones that failed on their own merits. Incidentally, many of AMD's poorer GPUs were also worse than Nvidia's competing cards anyway.
The field here is surprisingly competitive, and given that there are only five spots available, determining the winners (losers?) was pretty difficult. But if we had to choose five and only five, these are the ones we'd pick, judged not only as individual cards but as GPU families and series as a whole. Note that we do skew toward the modern era (i.e. DirectX 9 capable GPUs), so if you're still hot about ATI Rage GPUs from before the AMD acquisition, we're going to give those a pass.
5 — Radeon R9 290X
2012 saw AMD and Nvidia trade blows with their respective HD 7000- and GTX 600-series, and the fight was pretty even. The next year was going to see both companies launch their second generation 28nm chips, and Nvidia was up first. Its GTX Titan in February 2013 was monstrously fast and expensive at $999, while the GTX 780 was just a touch slower and more reasonably priced at $649. The ball was in AMD's court.
AMD was working on a brand-new architecture ... sort of. And that was the problem. It took its prior experience with the 28nm node, added more cores, and added a lot of power. Even so, it took the company until October to respond with the R9 290X. Though belated, the GPU was perhaps worth the wait, as it could often beat Nvidia's Titan card and was briefly the world's fastest graphics card for gaming. It also used a much smaller 440mm² die, compared to the 560mm² chip in the Titan and the 780, and it came with 4GB of VRAM. Best of all, it was just $549, undercutting Nvidia by a massive amount.
That's how the story of the 290X is usually told, anyway. The problem for the R9 290X was everything that happened immediately afterward. Nvidia did some massive price cutting just days after the 290X launched, bringing the GTX 780 down to $499. Then, the GTX 780 Ti came out in November and retook the performance crown from the 290X. In the span of two weeks, the 290X went from a first-place GPU with terrific bang for buck to a second-place GPU with decent value.
But over the long term, the 290X did not stand the test of time. In fact, the entire 200-series from AMD consisted of a lot of warmed-over GPUs — Cape Verde, Pitcairn, and Tahiti were impressive in the HD 7700/7800/7900 cards, far less so as the OEM-only 8000-series rebadge... and then AMD rebadged those again for everything below the R9 290, and added insult to injury with a third and final rebadging for the 300-series parts.
AMD took too long to get Hawaii and the R9 290X out the door and was falling behind Nvidia's cadence. Things went from bad to worse when Nvidia's next-generation GTX 900-series cards came out less than a year later, which proved devastating for AMD. The $329 GTX 970 basically matched the 290X, which was still retailing for around $500 at the time, and it did so while consuming about half as much power.
AMD would have to cut prices to stay competitive, but it was already struggling to turn a profit on the Radeon 200-series. Cutting prices meant losing money, and the company ended up posting an $80 million loss in 2013 and a $400 million loss in 2014. By contrast, Nvidia made $440 million in 2013 even with the massive Kepler dies used for the GTX 700-series.
Again, this whole era from AMD had more than its share of "bad" to go around, like the R9 Fury we'll get to in a moment. The 200-series struggled to compete, and AMD's plan for its 2015 Radeon 300-series was to take the 200-series GPUs and simply boost the clock speeds. This exacerbated another issue with the 200-series, and with the 290X especially: high power draw. The reference 290X card was almost Fermi-like in its own right, and the 390X was truly on par with Fermi.
AMD even started to hide its high power use by reporting GPU-only power (i.e. not the total board power) to make things seem less dire. All of this stood in stark contrast to Nvidia, which took the Kepler architecture from the GTX 600-series and meaningfully iterated on it with the GTX 700-series. There was no such future for the Radeon 200-series; it was the end of the road.
The 290X had its moment to be sure. For two glorious weeks, it was the Titan killer. But it and the rest of the 200-series/300-series pushed AMD close to bankruptcy, which would have almost certainly happened if it weren't for Ryzen's success. The 290X flew way too close to the sun, and AMD got burned for it. Although often celebrated as one of AMD's best GPUs, it's hard to say it did the company many favors. Still, out of AMD's bad GPUs, the 290X is far from the absolute worst.
4 — Radeon RX 7600
Normally, if we're talking about history's worst (or best) GPUs, we're going to be talking about cards that have been out for a while — GPUs where we have the benefit of hindsight and can see how the whole sequence played out. Sometimes, though, there's a GPU so disappointing that it's immediately clear it's one of the worst ever. The RX 7600, which came out just six months ago, is one of those graphics cards.
AMD has often offered the most competitive GPUs in the lower-end to midrange segment of the market, and that was especially true at the end of the last generation. Throughout 2022 and 2023, RX 6000 GPUs like the RX 6600 XT and 6650 XT sold for rock-bottom prices, and only Nvidia's RTX 3060 and Intel's Arc A750 offered a realistic alternative. All of these were aging GPUs, though, and the higher-end RX 7900 XT and 7900 XTX offered a glimpse of what a mainstream RDNA 3 card might look like.
Unfortunately, the RX 7600 features almost nothing that made the RX 7000-series attractive. It's built on the old 6nm node from TSMC, forfeiting all the gains its siblings made on the 5nm node, and it doesn't even use the chiplet design that AMD hyped up for the 7000-series (though admittedly, that was more about cutting costs than improving performance). The result was that the RX 7600 was really more of an RX 6650 XT with the RDNA 3 architecture, as it has the same number of cores, same amount of VRAM, same 128-bit memory bus, and nearly the same memory bandwidth... and basically the same performance.
In our launch review of the RX 7600, we noted that it barely managed to outperform the 6650 XT in gaming, which tracks given the tech specs. AMD essentially skipped a generational update for its midrange GPU, and thereby gave Nvidia's own RTX 4060 much more breathing room than it would have otherwise had. The 4060 is a bit faster in rasterization, significantly faster in ray tracing, much more efficient, and uses higher quality DLSS resolution upscaling instead of FSR2. Sure, the 7600 is cheaper than the 4060, but so was the 6650 XT.
Considering how well AMD did in the mainstream segment with its RX 6000-series as well as previous GPU families, the RX 7600 was and is very disappointing. It's not like the company has never replaced an old GPU with something similar or even identical — the R9 285 comes to mind as a similar story. But here AMD had a new architecture and design that delivered the same performance (outside of AI workloads like Stable Diffusion), coupled with a higher price tag.
It's not just the RX 7600, either: The RX 7000-series as a whole hasn't been as competitive as we'd like. The RX 7900 XT and XTX offer a clear performance upgrade over the previous generation parts and represent the best of the family, but the recently launched RX 7800 XT and RX 7700 XT are mostly a sidestep from the previous generation RDNA 2 parts. The RDNA 3 GPUs use slightly less power than their RDNA 2 equivalents, and you get AV1 encoding/decoding support along with DisplayPort 2.1 UHBR13.5 outputs, but those extras matter less than gaming performance.
As a side note, we did rate the 7600 decently well at 3.5 out of 5 stars, but that's relative to the options that currently exist on the market, and in retrospect we feel our original score was quite generous. It really says a lot about the current state of the desktop GPU world when even a bad card can seem almost okay.
3 — Radeon RX 6500 XT
If the RX 7000-series marked the moment where AMD fumbled the midrange segment, the RX 6000-series was when AMD absolutely dropped the ball at the low end. Ironically, this was a pretty good generation for AMD in literally every other area of the market, and the low end should have been a slam dunk for the RX 6000-series since Nvidia's RTX 30-series didn't even show up there. Yet, AMD still managed to screw it up.
Contrary to semi-popular belief, the low-end hasn't always started at ~$250 and GPUs in the $100 to $250 range haven't always sucked. The RX 460 in 2016 was pretty solid at ~$100, as was its refresh in the form of the RX 560, and the RX 5500 XT in 2019 was a little expensive at $169 for the 4GB model but overall was alright. Nvidia's GTX 950 and GTX 1050/1050 Ti were also decent budget options, back before Nvidia decided to stop making sub-$250 cards. It should have been a cakewalk for AMD with its RX 6000-series.
Unfortunately, at $199, the RX 6500 XT was already stretching it when it came to pricing, and the specs alone scream corner-cutting: 4GB of memory, a measly 64-bit memory bus, no hardware video encoding, and just four PCIe 4.0 lanes. Even the RX 460 had a 128-bit bus and eight PCIe lanes. It's funny how, six years later, those things apparently aren't necessary for a GPU selling for nearly twice as much.
AMD's design decisions really hurt the RX 6500 XT. It actually performed worse in most cases than the RX 5500 XT 4GB that it supposedly replaced, clearly trailed the RX 5500 XT 8GB, and could only tie Nvidia's aging GTX 1650 Super. Sure, the RX 6500 XT was a big improvement in efficiency over the 5500 XT, but that only put it on par with the 1650 Super.
Worst of all was the fact that the RX 6500 XT's four lanes were insufficient on systems that didn't have PCIe 4.0 — precisely the sort of PCs that might actually be looking at a budget GPU upgrade! At PCIe 3.0 speeds, the 6500 XT would see 10–25 percent of its performance vanish into thin air. Ironically, AMD's budget Ryzen 5 5500 and 4500, which launched alongside the 6500 XT, didn't even have PCIe 4.0 support, sticking to PCIe 3.0. Meanwhile, even Intel's Core i3-12100 had PCIe 4.0, so you were better off going with Intel if you wanted the 6500 XT.
The RX 6500 XT's little brother, the RX 6400, also caught some flak for similar issues, but it had at least one crucial selling point: it was small form factor friendly. The market for SFF GPUs (generally considered to be one slot and/or low-profile) hadn't really gotten any love since Nvidia's GTX 1650, but the RX 6400 at $159 offered both a low price and a newer architecture, and it didn't require an external power connector.
To be fair to AMD's engineers, the Navi 24 chip that powered the RX 6500 XT and the RX 6400 was reportedly designed for Ryzen 6000-powered laptops instead of desktops. Those four PCIe lanes would never be a problem with Ryzen 6000, which had PCIe 4.0 unlike the Ryzen 5000 APUs. But AMD chose to deploy Navi 24 in desktop GPUs, and those GPUs weren't even good with AMD's own budget chips. It's almost as if AMD didn't care.
2 — Radeon R9 Fury X
With the Radeon 200-series, AMD had technically made good gains, but that came at the expense of the company's finances. The R9 290X and other 200-series cards were being sold for way too little and ended up undermining the company's R&D budget. In 2011, AMD was investing over $350 million per quarter into R&D, but by 2015 this had declined to $240 million, spread across both GPUs and CPUs. Meanwhile, Nvidia's budget was approaching $350 million per quarter by 2015, spent solely on GPUs.
AMD made a tough choice: It decided to pool all of its GPU R&D into a brand-new flagship GPU that would launch in 2015. This meant not responding to Nvidia's GTX 900-series in 2014, instead choosing to cut prices for existing 200-series GPUs and later rebadging those 200-series GPUs as the Radeon 300-series in 2015. Meanwhile, the new flagship graphics card would incorporate cutting-edge high bandwidth memory (HBM), a custom liquid cooler, and a brand-new architecture. It would be "an overclocker's dream," as one AMD engineer put it, and when has AMD ever oversold one of its graphics cards?
The R9 Fury X didn't get off to a great start, as Nvidia preempted the launch with its GTX 980 Ti, which was essentially a cheaper GTX Titan X with half the VRAM — still 6GB, though. That meant the Fury X's $649 price tag no longer looked like a $350 discount against the $999 Titan X; instead, it was simply matching the 980 Ti. Plus, there was only one Fury X model at launch: the liquid-cooled reference design, equipped with a 120mm radiator and fan.
AMD's efforts almost paid off, as the Fury X could rival the 980 Ti and the Titan X in many games. However, the Fury X's dilemma was its 4GB of HBM: all the compute offered by its 4,096 shaders meant it was most competitive at 4K... except 4K requires lots of VRAM. The GTX 980 Ti had 6GB of good old GDDR5, and the Titan X had a whopping 12GB. Were you really going to bet on 4GB being good enough for 4K gaming when you could get a 6GB card for the same price, especially when that card had other benefits?
Speaking of price, HBM was expensive. The first-generation HBM in the R9 Fury X used four 1GB stacks of memory, and these stacks, along with the GPU, had to sit on top of a silicon interposer that linked the chips together. HBM provided a very wide 4,096-bit interface, eight times the width of the 512-bit bus on AMD's previous Hawaii GPUs (290/290X/390/390X), and boasted up to 512 GB/s of bandwidth. But Nvidia had better memory compression technology and was able to keep pace with just a 384-bit interface and 336 GB/s of bandwidth.
And that "overclocker's dream" claim? Well, out of the box the VRMs (the things that supply the graphics chip with power) hit over 100 C, making it totally impractical to overclock. Overclocking the HBM was also totally locked down, which chaffed against that "overclocker's dream" statement even more. Nvidia's GTX 980 Ti on the other hand could easily hit a 10% overclock at a minimum, and even 20% wasn't exactly rare. Also, the R9 Fury X was horribly power hungry. AMD claimed a 275W TDP, but there were plenty of workloads and games that could push it well beyond that mark — that liquid cooler wasn't just for show.
The R9 Fury X was nearly a tragedy for AMD's GPU division. It spent almost two years and hundreds of millions of dollars just to offer what was essentially a GTX 980 Ti competitor with less VRAM, worse efficiency, less overclocking headroom, and no third party models. AMD took a big gamble and it didn't even come close to paying off. With a net loss of $660 million, AMD lost even more money in 2015 than it had in 2014, making it clear that neither a refreshed Radeon 200-series nor Fury was what AMD needed.
1 — RX Vega 64
Although AMD's graphics cards had never made the company a ton of money, it did at least enjoy a period of relative success from the HD 4000-series in 2008 to the HD 7000-series in 2012. However, AMD's competitive edge began to decline beyond that point, leaving the company with financial statements in the red and second-rate technology. AMD needed a shakeup, and it started by appointing Lisa Su as CEO in 2014.
Su hadn't even been CEO for a year when the aforementioned R9 Fury X launched, and in the aftermath she decided to create a graphics-only business unit within AMD called the Radeon Technologies Group (RTG). By giving the graphics engineers a little more autonomy, the hope was that better Radeon GPUs could come out of the same budget. Raja Koduri was selected to run RTG, as he had worked at ATI before and after AMD's acquisition, and later at Apple developing its graphics hardware.
RTG's first products were the RX 480 and the wider RX 400-series based on the Polaris architecture, and they were pretty good all things considered. This got people excited for AMD's upcoming flagship, codenamed Vega, and AMD took notice. Capitalizing on this hype, on January 1, 2017 AMD released a teaser video called "After the Uprising." It featured the Vega codename, a jab at Nvidia's upcoming Volta GPU, and lots of drums. It ended with the tagline "Make. Some. Noise."
AMD fans weren't altogether happy when it was revealed that this teaser was for an architectural presentation at CES, where Koduri revealed that the product name for Vega would be... RX Vega. Wow.
There were a few mildly interesting tidbits about new features, HBM2 memory (still expensive — deja vu!), and talk of a "High-Bandwidth Cache Controller." That last one never really seemed to do much post-launch, but one thing that did stand out was a Doom (2016) demo that showed the top-end Vega chip performing just barely above the GTX 1080. That wasn't exactly a great sign when Doom was already a game that loved AMD GPUs at the time. But surely this was just down to early drivers, and more mature ones down the line would fix things, right? It wouldn't be the first time that had happened with an AMD GPU.
In a repeat of the 980 Ti and Fury X saga, Nvidia threw a curveball at AMD with its GTX 1080 Ti, which replaced the 1080 as Nvidia's flagship GPU in March 2017. The 1080 Ti delivered 30% more performance than the 1080 for $699, while the 1080 itself got a $100 price cut down to $499. So now AMD had to match an even faster GPU, one that beat RX Vega to market by five months — and Vega had only been targeting a GPU that launched over a year earlier.
Launching in August 2017, the RX Vega series debuted with the Vega 64 and Vega 56, and it was a disaster. The $499 Vega 64 could only just catch the GTX 1080 (non-Ti), and it had to consume a ton of power to do so. More power means more heat, and the Vega 64's blower-style cooler needed to get quite loud to cool down the GPU. "Make some noise" took on a new meaning after this. AMD even had a massive, triple-fan reference cooler design for its engineering samples — why it chose not to use it for the final product is a mystery.
Although there were plans to make more Vega GPUs than just the Vega 64 and the Vega 56, they never really came to fruition. The Vega architecture did make it into Ryzen APUs, or at least the name and many of the features did. It was also present in a couple of lower-end professional GPUs, server-oriented Radeon Instinct GPUs, and even an Intel CPU, but no further gaming GPUs were made.
In its bid to go three-for-three on bad luck, AMD trotted out the second-generation Vega data center chip under a new brand: the Radeon VII. It bumped the HBM2 to four stacks and 16GB, with 1 TB/s of bandwidth, and it was the first gaming GPU built on TSMC's 7nm process node. Ultimately, it was basically a worse version of the RTX 2080 with no ray tracing capability, and it had to be priced at $699 to compete. Expensive HBM got its third strike in the gaming sector.
This was all eerily similar to what happened with the R9 Fury X. Just as the Fury X was basically a worse 980 Ti, the Vega 64 was a worse 1080, while the existing RX 400-series was rebranded as the 500-series just to keep competing in the budget and midrange sectors. But at least the Fury X had matched Nvidia's flagship 980 Ti; the Vega 64 came nowhere near the GTX 1080 Ti.
The aftermath at AMD was as profound as the failure. Just after the launch of RX Vega, RTG head Raja Koduri announced he would be taking a sabbatical on account of how challenging Vega had been to work on. Before the year was up, he left AMD entirely to go work at Intel and build discrete graphics cards there instead. After leading RTG for just over two years, he was replaced by David Wang, who continues to head up AMD's GPU efforts.
After the failure of Vega, AMD stopped making flagship gaming GPUs for three whole years. Vega essentially had no future, and after Vega the entire GCN macroarchitecture that debuted with the HD 7000-series reached the end of the road. In 2019, AMD finally introduced RDNA and its RX 5000-series. It's now on RDNA 3, with RDNA 4 rumors beginning to circulate, but AMD gave up a lot of ground to Nvidia after Vega.
Dishonorable Mention: Radeon 8500 and Driver Woes
ATI Technologies started the Radeon brand in 2000 as the successor to its earlier lineup of 3D Rage cards. The brand debuted with the Radeon 7000-series, and ATI's third-generation Radeon 9000-series went on to define the direction modern GPUs would take, a once-in-a-lifetime achievement. However, it's the second-generation Radeon 8000-series GPUs that earn a dishonorable mention on this list, not because they were built poorly but because they came with awful drivers.
On paper, the Radeon 8500 and other members of the Radeon 8000 family were expected to perform pretty well against Nvidia's GeForce 3 lineup, even the flagship Ti 500. Back then, graphics cards were much simpler, and certain specifications could tell you quite a bit, so this wasn't fortune-telling or anything.
Except extensive testing showed the 8500 losing to the Ti 500 by large margins in several games, while in others it could at least manage a tie. The issue was that ATI's drivers weren't all that good, and they kneecapped the 8500 to the point where it was leaving tons of frames on the table. Driver issues arguably cost ATI the performance crown in 2001; even the notorious RX 5000-series drivers and their black screens of death were never accused of hurting performance as much as the old Radeon 8000 family's did.
The perception that ATI couldn't make good drivers persisted for years afterward, and eventually, third-party drivers for Radeon GPUs popped up in 2004. Omega Drivers were modified GPU drivers that promised superior performance and stability, though the official Radeon drivers remained the most popular option. It's unclear whether Omega Drivers were actually better than the drivers from ATI (and later AMD), as anecdotes vary, but the mere fact of their existence says a lot about what people thought of the official drivers.
Radeon drivers are definitely better today than they were back then, but as with all major chipmakers, AMD (which bought ATI in 2006) still manages to ship critical driver bugs from time to time. Just this year, there was a buggy driver that could corrupt your Windows installation, another that transformed Radeon 780M iGPUs into the slower 760M, and even a driver feature that got Counter-Strike 2 players banned.
Matthew Connatser is a freelance writer for Tom's Hardware US. He writes articles about CPUs, GPUs, SSDs, and computers in general.
gdmaclew: Is there going to be a follow-up article about The Five Worst nVidia GPUs of All Time? Seems only fair.
newtechldtech: What is the point of this article? I get it if you are warning people away from today's cards, but this is so unprofessional.
PEnns:
gdmaclew said: Is there going to be a follow-up article about The Five Worst nVidia GPUs of All Time? Seems only fair.
Don't hold your breath. And if they do that, it will be done very unwillingly, and in a much more flattering light.
PaulAlcorn:
gdmaclew said: Is there going to be a follow-up article about The Five Worst nVidia GPUs of All Time? Seems only fair.
Yes, this is part of an ongoing series of articles.
PEnns: Thank you for a very enlightening article. I am glad you reminded the people, just in case they forgot. /S
Neilbob: Not too sure I agree that the RX 7600 is one of the worst so much as a meh product with a slightly poor price and value proposition; trouble is, meh products with questionable pricing and value rather sum up the majority of the current generation. I'm certain there have been far worse options over the years (although my wrinkly old noggin can't think of them at this moment in time).
garrett040: So one day we have a video of Nvidia touting its "500" games with DLSS, then the following day we get "5 WORST AMD CARDS OF ALL TIME!" Yea, this isn't sus at all... this kind of behaviour specifically makes me WANT to get an AMD GPU next upgrade...
JarredWaltonGPU:
Neilbob said: Not too sure I agree that the RX 7600 is one of the worst so much as a meh product with a slightly poor price and value proposition... I'm certain there have been far worse options over the years.
As noted in the intro, we've trended toward newer rather than digging way back into the pre-Radeon era. We'll be doing the same for Nvidia, just like we did with both the Best AMD and Best Nvidia articles. These are a "nostalgia series" of articles, talking about some of the good and bad old days. Don't worry, the RTX 40-series will also get its fair share of derision with the next and final piece.
The RX 7600 represents the worst of the 7000-series, though the 7700 XT certainly gives it some competition, and the 7800 XT isn't riding high on the hog either. It's simply lackluster in almost every important way. DP 2.1 is a checkbox feature that has almost no practical bearing on a ~$250 graphics card — are you really going to pair it with a new $750 monitor to make use of the ultra-high bandwidth it supports? AV1 and boosted AI performance are at least something, but these are primarily gaming cards, and so a few percent improvement over the RX 6650 XT while bumping the price $20–$40 isn't a great showing.
garrett040 said: So one day we have a video of Nvidia touting its "500" games with DLSS, then the following day we get "5 WORST AMD CARDS OF ALL TIME!" Yea, this isn't sus at all...
These best/worst articles were planned weeks ago. Nvidia hitting 500+ was merely a news story, and while I did have something positive to say, if you actually read the text there's plenty of cynicism as well. We did the Best AMD GPUs before the Best Nvidia GPUs, and no one complained. Doing the Worst AMD GPUs before the Worst Nvidia GPUs just follows that pattern.
usertests: Only the price makes the RX 7600 bad, and it almost launched with a higher MSRP than it did. It's just boring otherwise.
The RX 6500 XT has truly earned its spot on the list, if not higher.