AMD Radeon RX 560 4GB Review: 1080p Gaming On The Cheap

Early Verdict

Radeon RX 560 doesn't outperform its predecessor by as much as we expected. However, it's still a fairly cool, fairly quiet card that serves up modest frame rates at 1920x1080 in modern games, so long as you're willing to dial quality back to Medium presets.

Pros

  • +

    Compelling 1080p performance at reduced detail settings

  • +

    Manageable heat/power consumption

  • +

Radeon RX 560 4GB is reasonably priced if you can find a card at $120

Cons

  • -

    Comparable performance from GeForce GTX 1050 in most games

  • -

    Higher power consumption than GTX 1050


AMD Radeon RX 560 4GB: Polaris 11 Rides Again

AMD’s Radeon RX 500-series refresh was a quick blitz of mostly recycled GPUs and iterative nomenclature. But there were a couple of developments that made the 500-series notably more competitive than the Radeon RX 400s before them.

For instance, we got a first taste of Polaris 12, code-named Lexa, in our Radeon RX 550 2GB Review. That GPU was designed to fill a gap under $100 (£120) where entry-level favorites like Radeon R7 260 and 360 once lived. It, along with Nvidia’s competing GeForce GT 1030, was a welcome addition to a long-neglected segment.

Just above the RX 550 in its refreshed product stack, AMD unveiled a Radeon RX 560 that also piqued our interest. Last year, we identified the Radeon RX 460 as a clear step up from previous-gen Bonaire-based cards and Nvidia’s GeForce GTX 750 Ti. Radeon RX 560 takes the same Polaris 11 GPU, enables all of its shading/texturing resources, increases its clock rate, and (theoretically) lowers its price.

If the RX 460 was already a strong option for HD gaming, we couldn’t wait to see how Radeon RX 560 improved upon it.

Meet Radeon RX 560: More Shaders; Higher Clocks

When Radeon RX 460 launched more than a year ago, we hadn’t seen a new mainstream GPU from AMD in years. Until then, everything was repackaged first- and second-generation GCN designs. Naturally, though, the shift to 14nm FinFET inherently meant new processors, even if they shared a lot of architectural attributes with their predecessors. Now the company is massaging its first wave of Polaris GPUs to better situate them against a full portfolio of Pascal-based competition. 

Compared to Polaris 10, composed of 5.7 billion transistors on a 232 mm² die, Radeon RX 560’s processor packs 3 billion transistors into a 123 mm² die. It’s similarly based on AMD’s fourth-gen GCN architecture, but rebalanced for more power-sensitive applications.

A single Graphics Command Processor up front is still responsible for dispatching graphics queues to the Shader Engines. So too are the Asynchronous Compute Engines tasked with handling compute queues. As with Polaris 10, this chip’s command processing logic consists of four ACEs, with two Hardware Scheduler units in place for prioritized queues, temporal/spatial resource management, and offloading CPU kernel mode driver scheduling tasks. While many resources are trimmed moving from Polaris 10 to 11, this is not one of them.

Shader Engines, on the other hand, are halved: Polaris 11 gets two, compared to Polaris 10’s four. But whereas the version of Polaris 11 that went into Radeon RX 460 featured seven active Compute Units per SE, Radeon RX 560 gets a completely uncut GPU sporting 16 total CUs. Given 64 stream processors and four texture units per CU, the math for Radeon RX 560 adds up to 1024 shaders and 64 texture units across the GPU, a ~14% increase over the RX 460’s 896 shaders.
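The per-CU arithmetic works out as follows; this is an illustrative sketch (variable names are ours), using the constants quoted in this section:

```python
# Per-CU resource math for Polaris 11, per the specs cited in the text.
SHADERS_PER_CU = 64   # stream processors per Compute Unit
TEX_PER_CU = 4        # texture units per Compute Unit

cus_rx460 = 14        # 7 active CUs per Shader Engine x 2 SEs
cus_rx560 = 16        # fully enabled Polaris 11

shaders_560 = cus_rx560 * SHADERS_PER_CU   # 1024 shaders
tex_560 = cus_rx560 * TEX_PER_CU           # 64 texture units
shaders_460 = cus_rx460 * SHADERS_PER_CU   # 896 shaders on RX 460
increase = (shaders_560 - shaders_460) / shaders_460 * 100  # ~14.3%
```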

| | AMD Radeon RX 560 | Asus ROG Strix RX 560 O4GB Gaming | Nvidia GeForce GTX 1050 | Nvidia GeForce GTX 1050 Ti |
| --- | --- | --- | --- | --- |
| GPU | Polaris 11/Baffin | Polaris 11/Baffin | GP107 | GP107 |
| Shaders | 1024 | 1024 | 640 | 768 |
| Base Clock Frequency | 1175 MHz | 1221 MHz | 1354 MHz | 1290 MHz |
| Boost Clock Frequency | 1275 MHz | 1326 MHz | 1455 MHz | 1392 MHz |
| Memory Size & Type | 4GB GDDR5 | 4GB GDDR5 | 2GB GDDR5 | 4GB GDDR5 |
| Process Technology | 14nm | 14nm | 14nm | 14nm |
| Transistors | 3 billion | 3 billion | 3.3 billion | 3.3 billion |
| Texture Units | 64 | 64 | 40 | 48 |
| Texture Fillrate | 81.6 GT/s | 84.9 GT/s | 58.2 GT/s | 66.8 GT/s |
| ROPs | 16 | 16 | 32 | 32 |
| Pixel Fillrate | 20.4 GPix/s | 21.2 GPix/s | 46.6 GPix/s | 44.5 GPix/s |
| Memory Bus | 128-bit | 128-bit | 128-bit | 128-bit |
| Memory Clock Frequency | 3500 MHz | 3500 MHz | 3504 MHz | 3504 MHz |
| Memory Bandwidth | 112 GB/s | 112 GB/s | 112.1 GB/s | 112.1 GB/s |
| TDP | 80W | 80W/100W | 75W | 75W |

Two render back-ends per Shader Engine, each with four ROPs, total 16 pixels per clock, or, again, half of what you get from Radeon RX 580/570. Polaris 11’s memory bus is also cut in half to 128 bits. AMD tries to compensate somewhat with 7 Gb/s GDDR5, but even then, you’re only looking at 112 GB/s of bandwidth. This spec is unchanged from the Radeon RX 460.
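The fillrate and bandwidth figures in the table above fall out of the same kind of back-of-envelope math. A minimal sketch (variable names are ours) using the reference Radeon RX 560 numbers:

```python
# Throughput estimates derived from the reference Radeon RX 560 specs.
boost_clock_ghz = 1.275   # reference Boost clock, in GHz
texture_units = 64
rops = 16
mem_data_rate_gbps = 7    # GDDR5 effective data rate per pin, Gb/s
bus_width_bits = 128

texture_fillrate = texture_units * boost_clock_ghz   # ~81.6 GT/s
pixel_fillrate = rops * boost_clock_ghz              # ~20.4 GPix/s
bandwidth = mem_data_rate_gbps * bus_width_bits / 8  # 112 GB/s
```

The same formulas reproduce the GeForce columns if you swap in Nvidia’s unit counts and boost clocks, which is why the GTX 1050’s 32 ROPs buy it more than double the RX 560’s pixel fillrate despite near-identical memory bandwidth.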

There are higher GPU clock rates to talk about, though. AMD specifies a base frequency of 1175 MHz and a Boost ceiling of 1275 MHz. The company sent our U.S. and German labs Asus’ ROG Strix Radeon RX 560 O4GB Gaming OC Edition to test, which is overclocked to a 1326 MHz Boost frequency in Gaming mode.

Interestingly, while the Radeon RX 460 was officially rated under 75W, opening the door to implementations without auxiliary power, all of the boards we tested had six-pin connectors. For its higher-clocked Radeon RX 560, AMD cites a typical board power of 80W. And yet, we’ve already seen versions with no power connector and higher-than-reference clock rates. Our Asus cards do, however, come equipped with six-pin inputs.



  • firerod1
Cut the price of the 560 by $20 and then it will work.
    Reply
  • RomeoReject
    Cutting it by $20 would make it a $100 card. They'd likely be losing money at that price point.
    Reply
  • firerod1
    20235344 said:
    Cutting it by $20 would make it a $100 card. They'd likely be losing money at that price point.

I meant this card, since it’s at GTX 1050 Ti pricing while offering GTX 1050 performance.
    Reply
  • cryoburner
    ...we couldn’t wait to see how Radeon RX 560 improved upon it.

    Is that why you waited almost half a year to review the card? :3
    Reply
  • shrapnel_indie
    20235672 said:
    ...we couldn’t wait to see how Radeon RX 560 improved upon it.

    Is that why you waited almost half a year to review the card? :3

    Did you read the review?

    At the beginning of the conclusion:
    The pace at which new hardware hit our lab this summer meant we couldn’t review all of AMD’s Radeon RX 500-series cards consecutively.
    Reply
  • Wisecracker
    4GB on the Radeon RX 560 = "Mining Card"

    The minimal arch (even with the extra CUs) can't use 4GB for gaming like the big brother 570. The 2GB RX 560 even trades blows with its 4GB twin, along with the 2GB GTX 1050, at the $110-$120 price point for the gamer bunch.

    Leave the RX 560 4GB for the "Entrepreneurial Capitalist" crowd ...

    Reply
  • bit_user
I think your power dissipation for the 1050 Ti is wrong. While I'm sure some OC'd models use more, there are 1050 Tis with a 75W TDP.

    Also, I wish the RX 560 came in a low-profile version, like the RX 460 did (and the GTX 1050 Ti does). This excludes it from certain applications. It's the most raw compute available at that price & power dissipation.
    Reply
  • senzffm123
Correct, I got one of those 1050 Tis with a 75W TDP in my rig; it doesn't have a power connector either. Hell of a card!
    Reply
  • turkey3_scratch
I bought my RX 460 for $120 back in the day (well, not that far back). I remember there were some for $90, too. This seems like just an RX 460. Well, it basically is an RX 460.
    Reply
  • jdwii
Man, AMD, what is up with your GPU division? For the first time ever you're letting Nvidia walk all over you in performance per dollar, performance per watt, and overall performance. This is very sad.

Whatever AMD is doing with its architecture and leadership in the GPU division needs to change. I can't even think of a time, two years ago or earlier, when Nvidia offered a better value.
    Reply