AMD Radeon RX Vega 56 8GB Review

Board Layout & Components

Board Layout

AMD’s RX Vega 64, RX Vega 56, and Vega Frontier Edition employ the exact same board and components. The only differences are the package soldered to the board (this includes HBM2) and adjusted firmware. If AMD wanted to shorten Vega to the length of Radeon R9 Nano, it could.

The lack of memory modules outside of the package opened up some possibilities for AMD to get creative with its PCB. Now, the voltage regulators are found where you'd expect to find GDDR5, and we're looking at a classic 6+1-phase design for power delivery to the GPU and HBM2.

Each of the two eight-pin auxiliary power connectors is followed by its own coil, which helps smooth out voltage peaks. Interestingly, though, there are no large capacitors to be seen.

GPU Power Supply

At the center of it all is International Rectifier's IR35217, a sparsely documented dual-output multi-phase controller able to provide six phases for the GPU plus two more for other rails. A closer look reveals 12 regulator circuits, though, not just six. This is the result of doubling, which distributes the load of each phase across two regulator circuits.

Six IR3598 interleaved MOSFET drivers on the back of the board handle the doubling; these are the parts we pointed out earlier. The following video, taken at idle, shows how the PWM controller switches the load back and forth between the circuits. Running on a single phase keeps efficiency high, while alternating circuits avoids overloading any one of them for prolonged periods of time.

AMD Radeon Vega Frontier Edition - Idle - Phase Doublers
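The idle behavior in the video can be sketched as a simple round-robin model. This is purely illustrative (not AMD's firmware logic): one active phase, with an interleaved driver alternating which of its two regulator circuits carries the load.

```python
# Illustrative model of phase doubling at idle: a controller with six GPU
# phases, each split into two regulator circuits by an IR3598-style doubler.
# At idle only one phase runs, and the load toggles between its two circuits
# so neither is stressed continuously. All numbers here are assumptions.
from itertools import cycle

PHASES = 6
CIRCUITS_PER_PHASE = 2  # each doubler splits one PWM phase into two circuits

def idle_schedule(steps):
    """Yield (phase, circuit) pairs: a single active phase, circuits alternated."""
    active_phase = 0                          # idle: one phase is enough
    circuits = cycle(range(CIRCUITS_PER_PHASE))
    for _ in range(steps):
        yield (active_phase, next(circuits))

schedule = list(idle_schedule(6))
print(schedule)  # [(0, 0), (0, 1), (0, 0), (0, 1), (0, 0), (0, 1)]
```

The same structure scales up under load: more phases activate, and each still spreads its share across its pair of circuits.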

The actual voltage conversion for each of the 12 regulator circuits is handled by an IRF6811 on the high side and an IRF6894 on the low side, which also contains the necessary Schottky diode. Both are International Rectifier HEXFETs that we've seen AMD use before.

For the coils, AMD went with encapsulated ferrite-core chokes soldered to the front of the board. At 190nH, their inductance is a bit lower than the 220nH we often see.
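Why the inductance matters can be shown with the standard buck-converter ripple formula. The operating values below (12V input, ~1.0V core voltage, 300kHz per-phase switching) are assumptions for illustration; only the 190nH and 220nH figures come from the board.

```python
# Back-of-the-envelope peak-to-peak inductor ripple for one buck phase:
#   dI = (Vin - Vout) * D / (L * fsw),  with duty cycle D = Vout / Vin.
# Input voltage, output voltage, and switching frequency are assumed values.

def ripple_current(v_in, v_out, inductance_h, f_sw_hz):
    """Peak-to-peak inductor ripple current in amps."""
    duty = v_out / v_in
    return (v_in - v_out) * duty / (inductance_h * f_sw_hz)

r190 = ripple_current(12.0, 1.0, 190e-9, 300e3)
r220 = ripple_current(12.0, 1.0, 220e-9, 300e3)
print(f"{r190:.1f} A vs {r220:.1f} A")  # lower inductance -> higher ripple
```

The smaller 190nH chokes trade a little extra ripple current (absorbed across twelve interleaved circuits) for faster transient response.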

Memory Power Supply

The memory's single phase is supplied by the IR35217 as well. One phase is plenty, since on-package HBM2 needs far less power than external GDDR5 would. A CHL815 gate driver is found on the back of the board. For the voltage conversion, AMD opted for ON Semiconductor's NTMFD4C85N, a dual N-channel MOSFET that covers both the high and low sides.

It’s interesting that AMD went with flat SMD capacitors instead of can caps. Their somewhat lower capacitance is compensated for by simply running two of them in parallel on the back of the board. Spreading out the hot-spots this way also makes the thermal solution's job a little easier; waste heat is kept to a minimum, as is the cost associated with cooling.
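The parallel arrangement works because capacitance adds while equivalent series resistance (ESR) combines like parallel resistors. The component values below are illustrative, not measured from the board.

```python
# Sketch of why two flat SMD caps in parallel can stand in for one larger
# can cap. Capacitance and ESR values are assumptions for illustration.

def parallel_caps(caps_uF, esrs_mohm):
    """Return (total capacitance in uF, effective ESR in milliohms)."""
    c_total = sum(caps_uF)                        # capacitances simply add
    esr_total = 1 / sum(1 / r for r in esrs_mohm) # ESRs combine in parallel
    return c_total, esr_total

c, esr = parallel_caps([220, 220], [6, 6])
print(c, esr)  # 440 uF total, 3.0 mohm effective ESR
```

Halving the effective ESR also halves the ripple heating in each capacitor, which fits the article's point about spreading hot-spots.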

At 220nH, these coils have a slightly higher inductance. The ones corresponding to the "partial voltage" converters, which operate at a much lower frequency, are even larger at 820nH. They don’t have to deal with the same amounts of power, though.

Other Voltage Converters

Creating VDDCI isn’t a very difficult task, but it's an important one, since this rail governs the transition between the GPU's internal signal levels and the memory's. It’s essentially the I/O bus voltage between the GPU and the memory. As such, two constant sources of 1.8V and 0.8V are supplied.

Underneath the GPU, there’s an Anpec APL5620 low drop-out linear regulator, which provides the very low voltage for the phase locked loop (PLL) area.

ON Semiconductor's MC74HC238A demultiplexer drives the LED bar that shows the power supply’s load. It’s a fun gimmick, but does get annoying in a dark room at night due to its brightness.

The remaining components are the usual fare.

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

  • kjurden
    What a crock! I didn't realize that Tom's hardware pandered to the iNvidiot's. AMD VEGA GPU's have rightfully taken the performance crown!
  • rwinches
    Just when on sale Newegg and Amazon $399... Gone!
  • Martell1977
    Vega 56 vs GTX 1070, Vega goes 6-2-2 = Winner Vega!

    Good job AMD, hopefully next gen you can make more headway in power efficiency. But this is a good card, even beats the factory OC 1070.
  • Wisecracker
    Thanks for the hard work and in-depth review -- any word on Vega Nano?

    Some 'Other Guys' (Namer Gexus?) were experimenting on under-volting and clock-boosting with interesting results. It's not like you guys don't have enough to do, already, but an Under-Volt-Off Smack Down between AMD and nVidia might be fun for readers ...
  • thomas.moore.ii
    Yawn....... It's 4am here at the party.....you just now showing up Vega?
  • 10tacle
    2539190 said:
    What a crock! I didn't realize that Tom's hardware pandered to the iNvidiot's. AMD VEGA GPU's have rightfully taken the performance crown!


    Yeah Tom's Hardware does objective reviewing. If there are faults with something, they will call them out like the inferior VR performance over the 1070. This is not the National Inquirer of tech review sites like WCCTF. There are more things to consider than raw FPS performance and that's what we expect to see in an honest objective review.

    Guru3D's conclusion with caveats:

    "For PC gaming I can certainly recommend Radeon RX Vega 56. It is a proper and good performance level that it offers, priced right. It's a bit above average wattage [consumption] compared to the competitions product in the same performance bracket. However much more decent compared to Vega 64."

    Tom's conclusion with caveats:

    "Even when we compare it to EVGA’s overclocked GeForce GTX 1070 SC Gaming 8GB (there are no Founders Edition cards left to buy), Vega 56 consistently matches or beats it. [snip] But until we see some of those forward-looking features exposed for gamers to enjoy, Vega 56’s success will largely depend on its price relative to GeForce GTX 1070."

    ^^And that's the truth. If prices of the AIB cards coming are closer to the GTX 1080, then it can't be considered a better value. This is not AMD's fault of course, but that's just the reality of the situation. You can't sugar coat it, you can't hide it, and you can't spin it. Real money is real money. We've already seen this with the RX 64 prices getting close to GTX 1080 Ti territory.

    With that said, I am glad to see Nvidia get direct competition from AMD again in the high end segment since Fury even though it's a year and four months late to the party. In this case, the reference RX 56 even bests an AIB Strix GTX 1070 variant in most non-VR games. That's promising for what's going to come with their AIB variants. The question now is what's looming on the horizon in an Nvidia response with Volta. We'll find out in the coming months.
  • shrapnel_indie
    We've seen what they can do in a factory blower configuration. Are board manufacturers allowed to take 64 and 56 and do their own designs and cooling solutions, where they can potentially coax more out of it (power usage aside)? Or are they stuck with this configuration as Fury X and Fury Nano were stuck?
  • 10tacle
    No, there will be card vendors like ASUS, Gigabyte, and MSI who will have their own cooling. Here's a review of an ASUS RX 64 Strix Gaming:

    http://hexus.net/tech/reviews/graphics/109078-asus-radeon-rx-vega-64-strix-gaming/
  • pepar0
    134065 said:
    Radeon RX Vega 56 should be hitting store shelves with 3584 Stream processors and 8GB of HBM2. Should you scramble to snag yours or shop for something else? AMD Radeon RX Vega 56 8GB Review : Read more

    Will any gamers buy this card ... will any gamers GET to buy this card? Hot, hungry, noisy and expensive due to the crypto currency mining craze was not what this happy R290 owner had in mind.
  • filipcristianstroe
    LOL.. Vega 56 > 1070 but that's not what im vamped about. AMD needs to get their Mod Edit Language together. Don't you guys see the Vega 56 beats the Vega 64 in witcher 3 1440p? LOL what in the world?
  • FormatC
    2539207 said:
    This linked review was done with a not stable working 3rd party tool and the results are mostly not plausible. I tried to reproduce this values a few times and it won't work for me. It is very difficult to change Vega's voltage and to get really stable results. It is simply not my style to publish click-bait reviews instead of reproducible and serious results. Sorry for that ;)

    BTW:
    You can undervolt it a little bit, but you have also to analyze the frame times! Only fps are saying simply nothing about the picture quality. With all this ups and downs you get a horrible, micro-stuttering result. About this effect I wrote a few times.
  • TMTOWTSAC
    2539190 said:
    What a crock! I didn't realize that Tom's hardware pandered to the iNvidiot's. AMD VEGA GPU's have rightfully taken the performance crown!


    You're ignoring the Titan XP because it isn't really a consumer gaming card right? And the 1080 ti, because...reasons? There's an outside chance of taking the value crown. I'd go with that, assuming everyone in this thread who wants one is able to buy one today for <$400.
  • redgarl
    @ 10Tacle... "Yeah Tom's Hardware does objective reviewing" Just no... they are not.

    1080p benches for CPU without 1440p and 2160p counterparts just for example. This is manipulation that can drive sales.

    Guy number 1: Check benches on Toms: "Oh Ryzen sux, I am not buying it for my next 1440p system".

    Guy number 2: Check benches on kitguru: "Oh Ryzen is offering the same gaming experience at 4k than Intel... and they kick Intel butt all over the place in multi-threaded application... I am buying it for my next 1440p system".

    See, I just proved you wrong.
  • redgarl
    2020099 said:
    2539190 said:
    What a crock! I didn't realize that Tom's hardware pandered to the iNvidiot's. AMD VEGA GPU's have rightfully taken the performance crown!
    You're ignoring the Titan XP because it isn't really a consumer gaming card right? And the 1080 ti, because...reasons? There's an outside chance of taking the value crown. I'd go with that, assuming everyone in this thread who wants one is able to buy one today for <$400.


    This card is the strongest miner at the cheapest MSRP. It will sell really well... unfortunately for gamers.
  • 10tacle
    251426 said:
    @ 10Tacle... "Yeah Tom's Hardware does objective reviewing" Just no... they are not.


    I was wondering when you'd show up complaining. I guess you missed the Guru3D article of this GPU earlier this month and their generally SAME conclusions. Provide evidence to back up your assertions every time there's a single negative comment on a review of an AMD product that Tom's is biased against AMD. You cannot and you know you cannot.

    And your 1080p benchmark argument is a fail, because EVERY major tech review website uses 1080p as a CPU gaming benchmark. This is not new either. You can go back TEN YEARS on Tom's Hardware, Guru3D, and others who ran a 1280x1024 CPU gaming benchmark resolution when the high end resolutions were the 2K of the time 1600x1200 and 4K of the time 1920x1080. The other reason your 1080p argument is a fail is because there are a lot of gamers out there with 144Hz 1080p monitors. The rest of your comment is just hypothetical nonsense with no statistical data to back it up, and you know it.
  • P1nky
    Your sweet spot graph is wrong. The right vertical axis numbers are for Watt/FPS (what?!) not FPS/Watt.

    Isn't there a better way to measure the sweet spot that a "bulge"?
  • FormatC
    Label fixed :)
  • P1nky
    Check it again. It's not updated. I see that graph twice now with "FPS/Watt". Used 2 browsers.
  • caustin582
    Just increase the clock rate and pump enough power into a card until it edges out the competition in raw performance. What an elegant strategy, AMD.

    I'd be interested in seeing benchmark comparisons between a 1070 and a Vega 56 both OC'd to their max stable frequencies with the same temperature caps. Something tells me the 1070 will win by a long shot every time. I honestly wish AMD would put out something to get really excited about, but it looks like they just gave up and went with the brute force approach.
  • artk2219
    Honestly the most interesting part about Vega is that it can be an efficient architecture if you're not chasing absolute performance. This bodes well for the mobile and APU implementations of it. That power bleed when you start chasing the rabbit is interesting though, you also see it with Ryzen when it starts doing the same thing on the same process, it seems like there are definitely some tweaks that need to happen to the process and the architecture. I'm not sure if we will see a part akin to the radeon 4890 which was basically a cleaned up and tweaked 4870, but I'm hoping we see something like Thuban on Ryzens side, a cleaned up and tweaked phenom II with more cores added. Honestly though we may not see any fixes until 7nm, and this may just be a place holder for Navi (There's always next year, its like a recurring joke almost :-/).
  • 10tacle
    300537 said:
    I'd be interested in seeing benchmark comparisons between a 1070 and a Vega 56 both OC'd to their max stable frequencies with the same temperature caps. Something tells me the 1070 will win by a long shot every time. I honestly wish AMD would put out something to get really excited about, but it looks like they just gave up and went with the brute force approach.


    Well in all fairness to AMD, they don't have the resources to back development of both a new CPU and GPU simultaneously. Also remember that Nvidia is not making CPUs. They can focus solely on CPUs. Not so with AMD. AMD put most of their resources into Ryzen which was a good move. It is earning them much needed revenue.

    But lackadaisical overclocking been the general pattern of AMD GPUs for quite some time. They are not near as overclock friendly as Nvidia GPUs. When Fury X came out, it was on par with a reference GTX 980 Ti. However, it had very little overclocking headroom as it was already near maximum clocks out of the factory. An AIB vendor overclocked 980 Ti beat it, and overclocked on top of that it destroyed the Fury X.

    But if you want to get a hint of things to come, check out my Hexus link above in a previous post showing a pre-production ASUS Strix "ROG" Gaming RX 64 and comparisons with the reference RX 64:

    (Reference vs. Strix FPS at 2560x1440)
    Battlefield 1 - 71 vs. 75
    Dues Ex - 55 vs. 60
    Fallout 4 - 66 vs. 70
    Warhammer - 100 vs. 103
    Witcher 3 - 75 vs. 79

    Now the last ASUS GTX 1080 Strix tested at Guru3D got this with the only same tested game, Dues Ex (reference vs. Strix vs. overclocked Strix at 2560x1440): 69 vs. 75 vs. 81.

    Hexus tested on an overclocked 7700K overclocked to 4.6GHz and Guru3D with an overclocked i7-5960X to 4.4GHz so really there's no difference there at 1440p. My guess is the lower numbers of the pre-production RX 64 Strix was due to early drivers. But it does give you a comparative snapshot of reference vs. AIB aftermarket GPUs between Vega and Pascal - pretty close. They key as you said will be in what can the higher clocked AIB Vegas do overclocked beyond out of box.
  • king3pj
    This seriously looks like a great card. If it would have been available with third party coolers last July I would have bought one with a 1440p 144Hz Freesync monitor instead of my 1070 and 1440p 144Hz Gsync monitor.

    It wasn't a surprise that it beat the 1070 in DX 12 and Vulkan games like Battlefield 1 and Doom but what was surprising to me is that it is essentially equal to the 1070 in DX 11 games like The Witcher 3. Unfortunately it was just a year to late for me and now that I've invested in a Gsync monitor I don't see how AMD can win me back in the foreseeable future.
  • artk2219
    2292022 said:
    134065 said:
    Radeon RX Vega 56 should be hitting store shelves with 3584 Stream processors and 8GB of HBM2. Should you scramble to snag yours or shop for something else? AMD Radeon RX Vega 56 8GB Review : Read more
    Will any gamers buy this card ... will any gamers GET to buy this card? Hot, hungry, noisy and expensive due to the crypto currency mining craze was not what this happy R290 owner had in mind.


    The hardest part is finding it for a decent price, we've put up with worse in the past. Or does no one remember the gtx 480, the nvidia fx 5000 series, and radeon 2900 series, or the blowers on the x800 and x1800 / x1900 cards?