AMD Radeon R9 380X Nitro Launch Review

Power Usage Results

Measuring power consumption at idle can be difficult because the system's load changes spontaneously. For this reason, we observe over a longer period, pick the most representative two-minute interval, and average the results from that interval.
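
As a rough illustration of that procedure, the Python sketch below picks a representative two-minute window from a log of instantaneous readings. The 1 Hz sample rate, the made-up data, and the "closest to the median" selection rule are assumptions for the example, not a description of our actual measurement rig.

    from statistics import mean, median
    import random

    def representative_idle_watts(samples_w, rate_hz=1, window_s=120):
        """samples_w: chronologically ordered instantaneous power readings in watts."""
        win = int(rate_hz * window_s)
        # mean of every two-minute window across the observation period
        window_means = [mean(samples_w[i:i + win])
                        for i in range(0, len(samples_w) - win + 1)]
        # "most representative" here = window whose mean is closest to the
        # median of all window means (an assumption about the selection rule)
        target = median(window_means)
        return min(window_means, key=lambda m: abs(m - target))

    # ten minutes of fake 1 Hz idle data hovering around 13 W
    log = [13 + random.uniform(-2, 2) for _ in range(600)]
    print(round(representative_idle_watts(log), 1))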

Please note that the minimum and maximum results for the individual rails in the tables below don't necessarily add up to the overall total for that category. The reason is that the extremes on the individual rails don't always occur at the same time.
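
A toy example makes the point: the card total is tracked sample by sample, so summing each rail's individual peak overstates the real maximum whenever the rails peak at different moments. The numbers below are invented purely for illustration.

    # three invented, consecutive samples per rail (watts)
    pcie_12v = [10, 21, 8]
    mobo_12v = [15, 3, 5]
    mobo_3v3 = [1, 0, 1]

    # the card total is formed per sample, then min/max are taken from it
    totals = [a + b + c for a, b, c in zip(pcie_12v, mobo_12v, mobo_3v3)]

    print(max(pcie_12v) + max(mobo_12v) + max(mobo_3v3))  # 37 W: sum of rail maxima
    print(max(totals))                                    # 26 W: the card's actual maximum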

Idle Power Consumption

At idle, AMD's Radeon R9 380X consumes approximately 13W, and this number stays essentially the same when two monitors are connected.

Rail                  Minimum   Maximum   Average
PCIe                  0W        21W       8W
Motherboard 3.3V      0W        1W        1W
Motherboard 12V       0W        15W       5W
Graphics Card Total   1W        29W       13W

Gaming Power Consumption (Metro: Last Light Loop)

The gaming loop runs at Ultra HD, since that resolution imposes the highest power consumption. Sapphire's R9 380X comes in only six watts above MSI’s R9 380, which is a smaller delta than the performance increase of approximately nine percent might have suggested. Tonga XT turns out to be a bit more efficient across several different benchmark runs.
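
A quick back-of-the-envelope check, using the 191W average from the gaming table below and the six-watt gap quoted above (so roughly 185W for the R9 380), puts the efficiency advantage in the ballpark of five to six percent. The arithmetic, sketched here in Python, is only as good as the approximate nine percent performance estimate it leans on.

    r9_380x_watts = 191                    # average gaming draw from the table below
    r9_380_watts = r9_380x_watts - 6       # MSI R9 380, per the six-watt gap above
    perf_ratio = 1.09                      # ~9% higher performance (article estimate)

    perf_per_watt_gain = perf_ratio / (r9_380x_watts / r9_380_watts)
    print(f"{(perf_per_watt_gain - 1) * 100:.1f}% better performance per watt")  # ~5.6%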

The table paints a fairly decent picture:

Rail                  Minimum   Maximum   Average
PCIe                  73W       212W      141W
Motherboard 3.3V      1W        3W        2W
Motherboard 12V       24W       78W       48W
Graphics Card Total   102W      287W      191W

On average, the motherboard slot never hits its output ceiling, and even the spikes almost never exceed the slot's maximum. This is exactly how it should be done.
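
For reference, the check behind that statement boils down to comparing the slot rails against their budget. The PCIe CEM spec allows roughly 66W on the slot's 12V pins and about 75W combined with 3.3V; the sample values in the snippet are placeholders, not our measured trace.

    # rough slot budget per the PCIe CEM spec: ~66 W on 12 V, ~75 W combined
    SLOT_12V_LIMIT_W = 66.0
    SLOT_TOTAL_LIMIT_W = 75.0

    def slot_violations(mobo_12v_w, mobo_3v3_w):
        """Count samples where the slot exceeds its 12 V or combined budget."""
        over_12v = sum(1 for w in mobo_12v_w if w > SLOT_12V_LIMIT_W)
        over_total = sum(1 for a, b in zip(mobo_12v_w, mobo_3v3_w)
                         if a + b > SLOT_TOTAL_LIMIT_W)
        return over_12v, over_total

    # placeholder samples, not the measured trace
    print(slot_violations([48.0, 62.5, 78.0], [2.0, 2.5, 3.0]))  # -> (1, 1)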

Pulling out a snapshot of just one moment in a game shows how power is distributed across the rails:

The overall distribution can be seen here in its usual gallery format:

Full Load Power Consumption

Even when AMD's Radeon R9 380X is pushed as hard as possible via FurMark, its power supply stays stable. The new card does exhibit the same behavior we observed from the Nano, though: once it really gets going, it doesn’t stop for anything. The 252W it consumed during the torture test almost matched the 390(X).

Rail                  Minimum   Maximum   Average
PCIe                  –         244W      186W
Motherboard 3.3V      1W        2W        2W
Motherboard 12V       23W       94W       64W
Graphics Card Total   94W       340W      252W

We put together all of the individual full-load diagrams as well:

  • ingtar33
    so full tonga, release date 2015; matches full tahiti, release date 2011.

    so why did they retire tahiti 7970/280x for this? 3 generations of gpus with the same rough number scheme and same performance is sorta sad.
    Reply
  • wh3resmycar
    this has all the features, freesync, trueaudio, etc.
    Reply
  • logainofhades
Features, and the power rating is lower: Tonga is 190W vs. 250W for Tahiti.
    Reply
  • Eggz
    Seems underwhelming until you read the price. Pretty good for only $230! It's not that much slower than the 970, but it's still about $60 cheaper. Well placed.
    Reply
  • chaosmassive
    been waiting for this card review, I saw photographer fingers on silicon reflection btw !
    Reply
  • Onus
    Once again, it appears that the relevance of a card is determined by its price (i.e. price/performance, not just performance). There are no bad cards, only bad prices. That it needs two 6-pin PCIe power connections rather than the 8-pin plus 6-pin needed by the HD7970 is, however, a step in the right direction.
    Reply
  • FormatC
    I saw photographer fingers on silicon

I know, those are my fingers and my wedding ring. :P
    Call it a unique watermark. ;)
    Reply
  • psycher1
    Honestly I'm getting a bit tired of people getting so over-enthusiastic about which resolutions their cards can handle. I barely think the 970 is good enough for 1080p.

    With my 2560*1080 (to be fair, 33% more pixels) panel and a 970, I can almost never pull off ultimate graphic settings out of modern games, with the Witcher 3 only averaging about 35fps while at medium-high according to GeForce Experience.

    If this is already the case, give it a year or two. Future proofing does not mean you should need to consider sli after only 6 months and a minor display upgrade.
    Reply
  • Eggz
psycher1 said:
    Honestly I'm getting a bit tired of people getting so over-enthusiastic about which resolutions their cards can handle. I barely think the 970 is good enough for 1080p.

    With my 2560*1080 (to be fair, 33% more pixels) panel and a 970, I can almost never pull off ultimate graphic settings out of modern games, with the Witcher 3 only averaging about 35fps while at medium-high according to GeForce Experience.

    If this is already the case, give it a year or two. Future proofing does not mean you should need to consider sli after only 6 months and a minor display upgrade.

    Yeah, I definitely think that the 980 ti, Titan X, FuryX, and Fury Nano are the first cards that adequately exceed 1080p. No cards before those really allow the user to forget about graphics bottlenecks at a higher standard resolution. But even with those, 1440p is about the most you can do when your standards are that high. I consider my 780 ti almost perfect for 1080p, though it does bottleneck here and there in 1080p games. Using 4K without graphics bottlenecks is a lot further out than people realize.

    Reply
  • ByteManiak
    everyone is playing GTA V and Witcher 3 in 4K at 30 fps and i'm just sitting here struggling to get a TNT2 to run Descent 3 at 60 fps in 800x600 on a Pentium 3 machine
    Reply