AMD Radeon RX 560 4GB Review: 1080p Gaming On The Cheap

AMD’s Radeon RX 500-series refresh was a quick blitz of mostly recycled GPUs and iterative nomenclature. But there were a couple of developments that made the 500-series notably more competitive than the Radeon RX 400s before them.

For instance, we got a first taste of Polaris 12, code-named Lexa, in our Radeon RX 550 2GB Review. That GPU was designed to fill a gap under $100 where entry-level favorites like the Radeon R7 260 and 360 once lived. It and Nvidia’s competing GeForce GT 1030 were welcome additions to a long-neglected segment.

Just above the RX 550 in its refreshed product stack, AMD unveiled a Radeon RX 560 that also piqued our interest. Last year, we identified the Radeon RX 460 as a clear step up from previous-gen Bonaire-based cards and Nvidia’s GeForce GTX 750 Ti. Radeon RX 560 takes the same Polaris 11 GPU, enables all of its shading/texturing resources, increases its clock rate, and (theoretically) lowers its price.

If the RX 460 was already a strong option for HD gaming, we couldn’t wait to see how Radeon RX 560 improved upon it.

Meet Radeon RX 560: More Shaders; Higher Clocks

When Radeon RX 460 launched more than a year ago, we hadn’t seen a new mainstream GPU from AMD in years. Until then, everything had been repackaged first- and second-generation GCN designs. The shift to 14nm FinFET, though, inherently meant new processors, even if they shared a lot of architectural attributes with their predecessors. Now the company is massaging its first wave of Polaris GPUs to better situate them against a full portfolio of Pascal-based competition.

Compared to Polaris 10, composed of 5.7 billion transistors on a 232 mm² die, Radeon RX 560’s processor packs three billion transistors into 123 square millimeters of die space. It’s similarly based on AMD’s fourth-gen GCN architecture, but rebalanced for more power-sensitive applications.
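
As a quick sanity check on those numbers, here's a minimal sketch (plain Python, using only the transistor counts and die sizes quoted above) showing that the two chips land at essentially the same transistor density, as you'd expect from the same 14nm process:

```python
# Transistor density from the figures quoted above:
# Polaris 10: 5.7 billion transistors on 232 mm^2
# Polaris 11: 3.0 billion transistors on 123 mm^2
chips = {
    "Polaris 10": (5.7e9, 232),
    "Polaris 11": (3.0e9, 123),
}

for name, (transistors, area_mm2) in chips.items():
    density_m_per_mm2 = transistors / area_mm2 / 1e6  # millions of transistors per mm^2
    print(f"{name}: ~{density_m_per_mm2:.1f}M transistors/mm^2")
# Polaris 10: ~24.6M transistors/mm^2
# Polaris 11: ~24.4M transistors/mm^2
```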

A single Graphics Command Processor up front is still responsible for dispatching graphics queues to the Shader Engines. So too are the Asynchronous Compute Engines tasked with handling compute queues. As with Polaris 10, this chip’s command processing logic consists of four ACEs, with two Hardware Scheduler units in place for prioritized queues, temporal/spatial resource management, and offloading CPU kernel mode driver scheduling tasks. While many resources are trimmed moving from Polaris 10 to 11, this is not one of them.

Shader Engines, on the other hand, are halved—Polaris 11 gets two, compared to Polaris 10’s four. But whereas the version of Polaris 11 that went into Radeon RX 460 featured seven active Compute Units per SE, Radeon RX 560 gets a completely uncut GPU sporting 16 total CUs. Given 64 Stream processors and four texture units per CU, the math for Radeon RX 560 adds up to 1024 shaders and 64 texture units across the GPU—a ~14% increase.
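
If you'd like to see that math spelled out, here's a quick sketch (Python, using only the per-CU figures quoted above: 64 stream processors and four texture units per CU, 16 CUs for the RX 560 versus 14 for the RX 460):

```python
# Per fourth-gen GCN Compute Unit (CU)
SP_PER_CU = 64   # stream processors
TMU_PER_CU = 4   # texture units

RX560_CUS = 16   # fully enabled Polaris 11: 2 Shader Engines x 8 CUs
RX460_CUS = 14   # Radeon RX 460: 2 Shader Engines x 7 active CUs

rx560_shaders = RX560_CUS * SP_PER_CU              # 1024
rx560_tmus = RX560_CUS * TMU_PER_CU                # 64
increase = rx560_shaders / (RX460_CUS * SP_PER_CU) - 1

print(rx560_shaders, rx560_tmus, f"{increase:.1%}")  # 1024 64 14.3%
```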


|  | AMD Radeon RX 560 | Asus ROG Strix RX 560 O4GB Gaming | Nvidia GeForce GTX 1050 | Nvidia GeForce GTX 1050 Ti |
| --- | --- | --- | --- | --- |
| GPU | Polaris 11/Baffin | Polaris 11/Baffin | GP107 | GP107 |
| Shaders | 1024 | 1024 | 640 | 768 |
| Base Clock Frequency | 1175 MHz | 1221 MHz | 1354 MHz | 1290 MHz |
| Boost Clock Frequency | 1275 MHz | 1326 MHz | 1455 MHz | 1392 MHz |
| Memory Size & Type | 4GB GDDR5 | 4GB GDDR5 | 2GB GDDR5 | 4GB GDDR5 |
| Process Technology | 14nm | 14nm | 14nm | 14nm |
| Transistors | 3 Billion | 3 Billion | 3.3 Billion | 3.3 Billion |
| Texture Units | 64 | 64 | 40 | 48 |
| Texture Fillrate | 81.6 GT/s | 84.9 GT/s | 58.2 GT/s | 66.8 GT/s |
| ROPs | 16 | 16 | 32 | 32 |
| Pixel Fillrate | 20.4 GPix/s | 21.2 GPix/s | 46.6 GPix/s | 44.5 GPix/s |
| Memory Bus | 128-bit | 128-bit | 128-bit | 128-bit |
| Memory Clock Frequency | 3500 MHz | 3500 MHz | 3504 MHz | 3504 MHz |
| Memory Bandwidth | 112 GB/s | 112 GB/s | 112.1 GB/s | 112.1 GB/s |
| TDP | 80W | 80W/100W | 75W | 75W |
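
The fillrate rows above fall straight out of the clock rates and unit counts; here's a quick sketch that reproduces them (boost clocks, texture units, and ROPs taken from the table):

```python
# Texture fillrate = boost clock x texture units; pixel fillrate = boost clock x ROPs.
cards = {
    # name: (boost clock in MHz, texture units, ROPs)
    "Radeon RX 560":              (1275, 64, 16),
    "Asus ROG Strix RX 560 O4GB": (1326, 64, 16),
    "GeForce GTX 1050":           (1455, 40, 32),
    "GeForce GTX 1050 Ti":        (1392, 48, 32),
}

for name, (boost_mhz, tmus, rops) in cards.items():
    tex_gts = boost_mhz * tmus / 1000    # gigatexels per second
    pix_gpixs = boost_mhz * rops / 1000  # gigapixels per second
    print(f"{name}: {tex_gts:.1f} GT/s, {pix_gpixs:.1f} GPix/s")
```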

Two render back-ends per Shader Engine, each with four ROPs, total 16 pixels per clock, or, again, half of what you get from Radeon RX 580/570. Polaris 11’s memory bus is also cut in half to 128 bits. AMD tries to compensate somewhat with 7 Gb/s GDDR5, but even then, you’re only looking at 112 GB/s of bandwidth. This spec is unchanged from the Radeon RX 460.
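
That 112 GB/s figure is simply the 128-bit bus multiplied by GDDR5's 7 Gb/s effective per-pin data rate; for reference:

```python
bus_width_bits = 128   # Polaris 11 memory interface
data_rate_gbps = 7     # effective GDDR5 data rate per pin (Gb/s)

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8  # convert bits to bytes
print(f"{bandwidth_gbs:.0f} GB/s")  # 112 GB/s
```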

There are higher GPU clock rates to talk about, though. AMD specifies a base frequency of 1175 MHz and a Boost ceiling of 1275 MHz. The company sent our U.S. and German labs Asus’ ROG Strix Radeon RX 560 O4GB Gaming OC Edition to test, which is overclocked to a 1326 MHz Boost frequency in Gaming mode.

Interestingly, while the Radeon RX 460 was officially rated under 75W (the most a PCIe x16 slot can supply on its own), opening the door to implementations without auxiliary power, all of the boards we tested had six-pin connectors. For its higher-clocked Radeon RX 560, AMD cites a typical board power of 80W. And yet, we’ve already seen versions with no power connector and higher-than-reference clock rates. Our Asus cards do, however, come equipped with six-pin inputs.

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

Comment from the forums
  • firerod1
    Cut the price of the 560 by $20 and then it will work.
  • RomeoReject
    Cutting it by $20 would make it a $100 card. They'd likely be losing money at that price point.
  • firerod1
    Anonymous said:
    Cutting it by $20 would make it a $100 card. They'd likely be losing money at that price point.


    I meant this card, since it's priced like a 1050 Ti while offering 1050 performance.
  • cryoburner
    Quote:
    ...we couldn’t wait to see how Radeon RX 560 improved upon it.


    Is that why you waited almost half a year to review the card? :3
  • shrapnel_indie
    Anonymous said:
    Quote:
    ...we couldn’t wait to see how Radeon RX 560 improved upon it.


    Is that why you waited almost half a year to review the card? :3


    Did you read the review?

    At the beginning of the conclusion:
    Quote:
    The pace at which new hardware hit our lab this summer meant we couldn’t review all of AMD’s Radeon RX 500-series cards consecutively.
  • Wisecracker
    4GB on the Radeon RX 560 = "Mining Card"

    The minimal arch (even with the extra CUs) can't use 4GB for gaming like the big brother 570. The 2GB RX 560 even trades blows with its 4GB twin, along with the 2GB GTX 1050, at the $110-$120 price point for the gamer bunch.

    Leave the RX 560 4GB for the "Entrepreneurial Capitalist" crowd ...
  • bit_user
    I think your power dissipation for the 1050 Ti is wrong. While I'm sure some OC'd models use more, there are 1050 Tis with a 75 W TDP.

    Also, I wish the RX 560 came in a low-profile version, like the RX 460 did (and the GTX 1050 Ti does). This excludes it from certain applications. It's the most raw compute available at that price & power dissipation.
  • senzffm123
    Correct, I got one of those 1050 Tis with a 75 W TDP in my rig; it doesn't have a power connector either. Hell of a card!
  • turkey3_scratch
    I bought my RX 460 for $120 back in the day (well, not that far back). I remember there were some for $90, too. This seems like just an RX 460. Well, it basically is an RX 460.
  • jdwii
    Man, AMD, what is up with your GPU division? For the first time ever you're letting Nvidia walk all over you in performance per dollar, performance per watt, and overall performance. This is very sad.

    Whatever AMD is doing with their architecture and leadership in the GPU division needs to change. I can't think of a single time, two years ago or earlier, when Nvidia offered a better value.
  • nukedathlonman
    I flashed my XFX RX-460 to an RX-560 (no issues doing this simple BIOS flash) and have no complaints about it. It performs well (60 FPS is all I aim for, given my older 60Hz non-FreeSync display) in gaming at 1920x1200 at high (Deus Ex: Mankind Divided) or medium-high settings (GTA V), with an overclocked Phenom II X6 backing it up. The GPU is still the bottleneck, but I don't care given the system's age and how little I've spent on it.
  • nukedathlonman
    I flashed my 4GB XFX RX-460 to an RX-560 - got no complaints about either RX-460 or RX-560. I only aim for 60fps (older 60hz panel, no freesync) and get that with high or medium high settings at 1920x1200. I do like how quiet my card is.
  • cryoburner
    Anonymous said:
    Did you read the review?

    At the beginning of the conclusion:
    Quote:
    The pace at which new hardware hit our lab this summer meant we couldn’t review all of AMD’s Radeon RX 500-series cards consecutively.

    But they said they "couldn't wait" to review it, when they apparently could. : P

    And technically, the RX 560 was released in the spring, not the summer, though it's possible that they might not have got a unit in for review until a bit later. It is worth pointing out that the GT 1030 came out around the same time though, and they had no problem getting a review up for that over two and a half months ago.

    It also seems like an RX 560 review might have been worth prioritizing, in light of the fact that any higher-end cards from AMD have been priced out of the market for months due to cryptocurrency mining. Had it not been for the miners, the RX 570 would have likely been available for not much more than $150 by this point.
  • cangelini
    Anonymous said:
    I think your power dissipation for the 1050 Ti is wrong. While I'm sure some OC'd models use more, there are 1050 Tis with a 75 W TDP.

    Also, I wish the RX 560 came in a low-profile version, like the RX 460 did (and the GTX 1050 Ti does). This excludes it from certain applications. It's the most raw compute available at that price & power dissipation.


    Good catch--should be 75W. Fixed!
  • cangelini
    Anonymous said:
    Quote:
    ...we couldn’t wait to see how Radeon RX 560 improved upon it.


    Is that why you waited almost half a year to review the card? :3


    We didn't get RX 560s for our U.S. and German labs until recently--after all of the other craziness this year.
  • damric
    As someone pointed out, you can unlock the shaders on the RX 460 to make it the full equivalent of an RX 560. This was figured out when the Mac versions of the RX 460 were already shipping fully unlocked.
  • bit_user
    Anonymous said:
    Man, AMD, what is up with your GPU division? For the first time ever you're letting Nvidia walk all over you in performance per dollar, performance per watt, and overall performance. This is very sad.

    Whatever AMD is doing with their architecture and leadership in the GPU division needs to change. I can't think of a single time, two years ago or earlier, when Nvidia offered a better value.

    It's actually not bad, if you look at the benchies. Particularly in DX12 and Vulkan, it's very close to the more expensive 1050 Ti. Even beats it, in one case.

    As to your question about how this came about, the game changer seems to have come when Nvidia switched to tile-based rendering, in Maxwell (900 series). Ever since, AMD hasn't been able to catch up.
  • Cryio
    RX 560: As fast or faster than 1050.

    If we add a tessellation override of x16 or x8 in the driver: substantially faster than the 1050 across the board, and probably on the same playing field as, or faster than, the 1050 Ti.

    Conclusion: Must buy as a low-end GPU.
  • logainofhades
    It is like AMD isn't even trying, due to their focus on Ryzen. Oh well, they make miners happy I guess.
  • cryoburner
    Anonymous said:
    Man, AMD, what is up with your GPU division? For the first time ever you're letting Nvidia walk all over you in performance per dollar, performance per watt, and overall performance. This is very sad.

    Whatever AMD is doing with their architecture and leadership in the GPU division needs to change. I can't think of a single time, two years ago or earlier, when Nvidia offered a better value.

    Anonymous said:
    It is like AMD isn't even trying, due to their focus on Ryzen. Oh well, they make miners happy I guess.

    I don't think it's so much that they're not trying, it's that their cards were found to be better for cryptocurrency mining than Nvidia's, resulting in them being in short supply, and prices rose accordingly. From the launch of the RX 400 series last year, up until earlier this year, they were offering very good performance per dollar, and had compelling products readily available at the levels most people buy.

    Just six months ago, you could find plenty of RX 480s for well under $200, offering performance close to a GTX 1060 for considerably less. At times, some 4GB RX 480s even went on sale with rebates bringing them down near $150-$160, about what you would currently pay for a 1050 Ti with far less performance. The only real reason to consider a 1050 Ti then would have been if you had a pre-built system with an underpowered PSU or small form factor, since for a little more you could get an RX 470 or 480.

    They did take too long to fill in the high end of their range with Vega though. And I suspect that Vega would have been a much more impressive launch had it not been for mining messing up the market. Vega 56 and 64 might have had significantly lower official launch prices, and the cards would have probably been available for those prices, and not marked up further. Considering that their official launch price for the 8GB RX 580 was $229, it wouldn't have surprised me if Vega 56 would have been around $329 to $349, and Vega 64 around $429 to $449.

    I would definitely like to see AMD work on their efficiency though, since Radeon cards used to be quite good when it came to that, often better than Nvidia. I'd rather not have the noise and heat from a 200+ watt graphics card in my system if possible, and Nvidia currently has them beat on that. Of course, with the recent mining shortages, having better efficiency at a similar performance level could have actually made availability even worse.