AMD Radeon RX 580 Graphics Card Price List
The Radeon RX 580 is AMD's fastest Polaris-based graphics card. Like its predecessor, the RX 580 is based on the Polaris 10 GPU code-named "Ellesmere XT," but it outpaces the previous-gen Radeon RX 480 thanks to higher base and boost clock rates. Otherwise, it's unaltered.
Polaris 10 is manufactured using GlobalFoundries' 14nm FinFET process, helping the Radeon RX 580 deliver more performance than most of AMD's 28nm GPUs at a fraction of the power budget. This also helps improve the RX 580's value proposition. Radeon RX 580s with 8GB of GDDR5 launched for around $250. Models with 4GB of memory were also available at lower prices.
Today, these cards sell for more than when they were introduced, and may continue to do so for the foreseeable future. This is due to the sharp increase in cryptocurrency mining, which Polaris-based boards are well-suited to.
AMD Radeon RX 500-Series GPUs
GPU | AMD Radeon RX 580 | AMD Radeon RX 570 | AMD Radeon RX 560 | AMD Radeon RX 550 |
Code-name | Ellesmere XT | Ellesmere | Baffin | Lexa |
Shader Units | 2304 | 2048 | 1024 | 512 |
Texture Units | 144 | 128 | 64 | 32 |
ROPs | 32 | 32 | 16 | 16 |
Transistor Count | 5.7 Billion | 5.7 Billion | 3 Billion | 2.2 Billion |
Base Clock / Boost Clock | 1257 MHz / 1340 MHz | 1168 MHz / 1244 MHz | 1175 MHz / 1275 MHz | 1100 MHz / 1183 MHz |
Memory | Up To 8GB GDDR5 @ 8 Gb/s, 256-bit | Up To 8GB GDDR5 @ 7 Gb/s, 256-bit | Up To 4GB GDDR5 @ 7 Gb/s, 128-bit | Up To 4GB GDDR5 @ 7 Gb/s, 128-bit |
TDP | 185W | 150W | 80W | 50W |
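For reference, each card's peak memory bandwidth follows directly from the memory row above: effective data rate times bus width, divided by eight bits per byte. A minimal Python sketch using only the table's figures:

```python
# Peak memory bandwidth (GB/s) = effective data rate (Gb/s per pin) * bus width (bits) / 8 (bits per byte)
cards = {
    "RX 580": (8, 256),  # 8 Gb/s GDDR5 on a 256-bit bus
    "RX 570": (7, 256),
    "RX 560": (7, 128),
    "RX 550": (7, 128),
}

for name, (data_rate_gbps, bus_width_bits) in cards.items():
    bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
    print(f"{name}: {bandwidth_gbs:.0f} GB/s")
# RX 580: 256 GB/s, RX 570: 224 GB/s, RX 560: 112 GB/s, RX 550: 112 GB/s
```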
Below is a list of all currently available RX 580 graphics cards, separated by brand:
Gigabyte
MSI
PowerColor
Sapphire
XFX
ali744husain I did a quick search on Amazon and Nowinstock and found multiple RX 580 GPUs under $300, some even at $250:
-https://www.amazon.com/gp/offer-listing/B071CMPRZZ/ref=dp_olp_new_mbc?ie=UTF8&condition=new
-https://www.amazon.com/gp/offer-listing/B06ZZGXTTK/ref=dp_olp_all_mbc?ie=UTF8&condition=all
-https://www.amazon.com/gp/offer-listing/B06Y44TWF3/ref=dp_olp_all_mbc?ie=UTF8&condition=all
-https://www.amazon.com/gp/offer-listing/B06XZRWT8D/ref=dp_olp_all_mbc?ie=UTF8&condition=all
-https://www.amazon.com/gp/offer-listing/B07115GPN7/ref=dp_olp_all_mbc?ie=UTF8&condition=all
SamSerious I wouldn't buy a 4GB graphics card for $250 or more in 2017; it's just not future proof anymore.
cryoburner
20271548 said: "I wouldn't buy a 4GB graphics card for $250 or more in 2017; it's just not future proof anymore."
I'm not sure any card at the performance level of an RX 580 or GTX 1060 could be considered all that "future proof," though. These cards are best suited to 1080p, and at that resolution 4GB is currently plenty. By the time games really start benefiting from having more than 4GB of VRAM at 1080p, the card's graphics processor probably won't be able to run those games at high settings with stable frame rates anyway. And at 1440p, these cards are already hitting the limits of what their graphics processor can do. 1440p typically only requires half a gigabyte or less of additional VRAM compared to 1080p, but the GPU has to push 78% more pixels per second, making the card's raw performance more of a concern in that case.
I would say that 4GB is still a reasonable amount of VRAM for a card in this performance range, and additional memory might not see much use unless you keep using the card until you're getting unstable frame rates at medium settings at 1080p. Moving up to a card with 6 or 8GB could still be considered reasonable if the price difference isn't too large, but these cards are already marked up higher than what they launched for, stretching people's budgets as it is. The GTX 1060 does offer 6GB for around the same price as a 4GB RX 580 now, though if "future proofing" is a major concern, the 580's superior performance in most DX12 and Vulkan titles is probably worth considering as well.
I would be more concerned about the 3GB version of the GTX 1060. With 25% less memory, it's going to be running into performance issues sooner than a card with 4GB, and having 10% of its cores disabled isn't going to help. (Quick arithmetic on those figures is sketched below.)
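Those percentages check out; here is a quick Python sketch of the arithmetic, assuming the commonly published CUDA core counts for the two GTX 1060 variants (1280 vs. 1152), which aren't stated in the comment itself:

```python
# "78% more pixels" -- 2560x1440 vs. 1920x1080
pixels_1080p = 1920 * 1080
pixels_1440p = 2560 * 1440
print(f"extra pixels at 1440p: {pixels_1440p / pixels_1080p - 1:.0%}")  # ~78%

# "25% less memory" -- a 3GB GTX 1060 vs. a 4GB card
print(f"memory deficit: {(4 - 3) / 4:.0%}")  # 25%

# "10% of its cores disabled" -- GTX 1060 3GB (1152 CUDA cores) vs. 6GB (1280 CUDA cores)
print(f"cores disabled: {(1280 - 1152) / 1280:.0%}")  # 10%
```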
redgarl Still $100 above MSRP... I guess we should forget about the Polaris cards. I need a card for 1080p gaming, so it sux.
TekUK Buying an 8GB RX 480 was the biggest mistake I ever made. And not just because I use 1080p.
I wish I would have just stayed with my R9 280X for a while longer!
penn919 At least you got to pay MSRP... I was holding on to an old HD 7950 in hopes of getting Vega; now I can't even get Polaris for a reasonable price. Still stuck with an old card, but it's still acceptable for what I play... for now.
alextheblue
20271963 said: "I'm not sure any card at the performance level of an RX 580 or GTX 1060 could be considered all that 'future proof,' though. These cards are best suited to 1080p, and at that resolution 4GB is currently plenty. By the time games really start benefiting from having more than 4GB of VRAM at 1080p, the card's graphics processor probably won't be able to run those games at high settings with stable frame rates anyway."
That depends on a lot of factors. A couple of years ago, people were saying the exact same thing about 2GB vs. 4GB buffers. In some cases they were right: budget cards with 2GB were garbage. But at that time there were mid-range or better 2GB cards with enough raw horsepower to run modded Skyrim, yet they came up short when super-high-res textures (and more varied textures) were added. Also, the Xbox One X has around the same GPU horsepower as an RX 580 (rough throughput math after this comment), and it has a good chunk of RAM dedicated to whatever game you're running. The point is that newer games intended for multiple platforms or ported to PC will increasingly use ever-larger textures. You can always turn textures down, but there again is a case of having the horsepower but not the RAM. Meaning users can claim they run the latest games just as well as any 8GB card, but neglect to mention they can't turn textures all the way up for a more beautiful environment.
That reminds me... game installs are getting massive. My next PC might need a 1TB SSD. :P
20271963 said: "I would be more concerned about the 3GB version of the GTX 1060. With 25% less memory, it's going to be running into performance issues sooner than a card with 4GB, and having 10% of its cores disabled isn't going to help."
I agree with this in theory. The reality is that many developers work with Nvidia (*cough*kickbacks*cough*) to stay under 3GB at certain settings. I suspect that many games will be configured such that presets jump straight from "3GB is good enough" to "need more than 4GB for best performance." Fine-tuning outside of presets might yield better results for 4GB cards in *some* cases where finely grained texture options are available. But even in that scenario, who does that in reviews? Presets rule the roost.
Of course, not all developers are on the take. Some remain fiercely independent, and in their titles 4GB may be enough while 3GB cards may have to drop settings.
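For context on the Xbox One X comparison above, here is a rough peak-FP32-throughput estimate in Python. The RX 580 numbers come from the spec table earlier in the article; the Xbox One X figures (2,560 shaders across 40 CUs at 1,172 MHz) are widely published console specs rather than anything from this thread, and peak TFLOPS is only a coarse proxy for real-world performance:

```python
# Rough peak FP32 throughput: shader units * 2 FLOPs per clock (fused multiply-add) * clock speed
def peak_tflops(shaders, clock_ghz):
    # shaders * 2 * clock_ghz gives GFLOPS; divide by 1000 for TFLOPS
    return shaders * 2 * clock_ghz / 1000

print(f"RX 580:     {peak_tflops(2304, 1.340):.2f} TFLOPS")  # ~6.2 TFLOPS at boost clock
print(f"Xbox One X: {peak_tflops(2560, 1.172):.2f} TFLOPS")  # ~6.0 TFLOPS
```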
torka I get that companies add their own coolers and adjust clock rates to sell various versions of these cards, but I can't be alone in thinking that they all look like garbage.
I'd rather take a slight performance and cooling hit just to have a sexy reference card.
shrapnel_indie
20271963 said: "I would be more concerned about the 3GB version of the GTX 1060. With 25% less memory, it's going to be running into performance issues sooner than a card with 4GB, and having 10% of its cores disabled isn't going to help."
The 3GB GTX 1060's crippled memory has already shown itself to be an issue:
"And in titles like Doom, Hitman, and to a lesser extent, Rise of the Tomb Raider, the 3GB GeForce takes care of itself by running out of memory, even at 1920x1080. If you aren’t careful to manage the 1060’s detail settings, Nvidia’s GeForce GTX 1050 Ti sometimes ends up faster thanks to its 4GB of GDDR5."
- AMD Radeon RX 570 4GB Review, by Igor Wallossek, Tom's Hardware, April 19, 2017
80-watt Hamster
20277785 said: "I get that companies add their own coolers and adjust clock rates to sell various versions of these cards, but I can't be alone in thinking that they all look like garbage. I'd rather take a slight performance and cooling hit just to have a sexy reference card."
Agreed, save for Sapphire. Their stuff still looks nice if you ignore the LEDs. But AMD's reference cards are not, and IIRC have never been, attractive. The classiest-looking AMD-based cards in recent memory were probably the XFX 280s.