Radeon R9 290X Review: AMD's Back In Ultra-High-End Gaming

PowerTune: Balancing Performance And Acoustics

The last time we went into depth on AMD’s PowerTune technology was last year, when the company introduced its Boost feature on the Radeon HD 7970 GHz Edition. Back then, we determined that the card’s base clock was fixed at 1 GHz, and overclocking consisted of moving the target on an extra P-state that held as long as you stayed under a power ceiling. All the while, though, fan speed kept climbing. Altering the fan speed through AMD’s OverDrive applet set a constant duty cycle, which wasn’t appropriate all of the time.

With its Radeon HD 7790, AMD changed PowerTune’s behavior based on additional input from a second-generation voltage regulator (VR) controller. That same functionality carries over now to R9 290X.

So, now, PowerTune takes input from thermal sensors, estimates power use in real time through activity counters, folds in telemetry data from the voltage regulator, and feeds all of that into a digital power management arbitrator. The arbitrator is programmed with the GPU’s power, thermal, and current limits. Within those bounds, it controls voltages, clock rates, and fan speeds, prioritizing maximum performance. If one of the input limits is exceeded, the arbitrator can pull back on voltage and/or frequency.
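The arbitration described above can be sketched as a simple control loop. This is a minimal illustration, not AMD's firmware: all limit values, step sizes, floors, and function names are assumptions, with only the 95-degree target and 6.25 mV granularity drawn from this article.

```python
# Hypothetical sketch of a PowerTune-style arbitrator. Limits and step sizes
# are invented for illustration; only the 95 C ceiling and 6.25 mV voltage
# granularity come from the article.

POWER_LIMIT_W = 250.0      # board power ceiling (assumed)
TEMP_LIMIT_C = 95.0        # Hawaii's stated temperature target
CURRENT_LIMIT_A = 200.0    # VR current limit (assumed)

def arbitrate(power_w, temp_c, current_a, voltage_mv, clock_mhz):
    """Return a new (voltage, clock) pair, pulling back only when a limit is hit."""
    if power_w > POWER_LIMIT_W or temp_c > TEMP_LIMIT_C or current_a > CURRENT_LIMIT_A:
        # Over a limit: step down one state, lowering clock and voltage together.
        clock_mhz = max(300, clock_mhz - 10)         # idle floor (assumed)
        voltage_mv = max(900.0, voltage_mv - 6.25)   # one serial-VID step
    else:
        # Headroom available: climb back toward the 1 GHz target.
        clock_mhz = min(1000, clock_mhz + 10)
        voltage_mv = min(1200.0, voltage_mv + 6.25)
    return voltage_mv, clock_mhz
```

The key property is that the loop always prefers the highest state the limits allow, so performance recovers as soon as the offending input drops back into bounds.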

All of this can happen very quickly thanks to the aforementioned VR controller. Previously, there was a relatively long delay between a request for higher voltage and the subsequent clock rate step. AMD’s second-gen serial VID interface is roughly two orders of magnitude faster (~10 µs rather than ~1 ms), it confirms each voltage switch, and it’s granular down to 6.25 mV steps.
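To show what 6.25 mV granularity means in practice, here is a hypothetical helper that snaps a requested voltage to the nearest step the regulator could actually deliver. Everything except the 6.25 mV figure is an assumption.

```python
# Illustrative only: snap a voltage request to the regulator's step size.
# The 6.25 mV step comes from the article; the function itself is invented.

STEP_MV = 6.25

def nearest_vid_step(target_mv):
    """Round a requested voltage (in mV) to the nearest 6.25 mV step."""
    steps = round(target_mv / STEP_MV)
    return steps * STEP_MV
```

So a request for 1206 mV would land on 1206.25 mV, within about 3 mV of the target; a coarser regulator would miss by correspondingly more.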

With the ability to define and customize power, fan speed, GPU clock (performance), and target temperature, it becomes possible to very specifically dictate how an R9 290X behaves. Fan speed is one of the most clearly affected variables. Past cards employed a fan table that correlated temperature to RPM, but failed to deliver optimal acoustics—a point I’ve mentioned more than once. Now, however, the controller is both reactive and predictive, varying acceleration based on workload and, ideally, smoothing out changes to fan speed more than before.
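A reactive-plus-predictive fan controller of the sort described above might look like the following sketch: the duty cycle reacts to a temperature error, projects the current temperature trend forward, and rate-limits its own changes so acoustics stay smooth. All gains and limits here are invented for illustration.

```python
# Hypothetical reactive + predictive fan controller. Instead of a fixed
# temperature-to-RPM table, the duty cycle tracks a projected temperature
# and rate-limits its own movement. Gains and limits are assumptions.

def next_fan_duty(duty, temp_c, temp_trend_c_per_s,
                  target_c=95.0, gain=0.5, predict_s=2.0, max_step=1.0):
    # Predict where temperature is heading, then react to the projected error.
    error = (temp_c + predict_s * temp_trend_c_per_s) - target_c
    desired = duty + gain * error
    # Rate-limit the change so the fan ramps smoothly instead of surging.
    step = max(-max_step, min(max_step, desired - duty))
    return max(0.0, min(100.0, duty + step))
```

With a rising temperature trend the controller starts accelerating before the ceiling is reached, and the rate limit prevents the abrupt RPM swings a pure lookup table produces.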

Of course, all of this intelligence is still dependent on a well-designed thermal solution able to translate R9 290X’s 1 GHz clock rate and 95-degree temperature ceiling into friendly acoustics, even under load. By default, the card wants to run as close as possible to 1 GHz, and will let Hawaii get to 95 °C in the interest of spinning the fan slowly. You can imagine that the very nastiest loads will cause the fan to ramp up and up and up as it tries to maintain 95 degrees at 1 GHz. That’d be alright for performance, but it’d probably sound pretty bad. So, AMD implements two different BIOSes on R9 290X: one called Quiet Mode, and the other dubbed Uber. The first puts a default limit of 40% duty cycle on the fan, while the second one stops at 55%.

If the card is running in Quiet mode, hits 95 degrees, and cannot hold that temperature under a 40% fan duty cycle, it starts pulling back clock rate to keep from exceeding its ceiling. Performance takes a hit in the interest of low noise. Switching to Uber mode simply gives you 15% more duty cycle before clock rates start dialing back.
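The Quiet/Uber split boils down to a small decision rule: spin the fan up toward the BIOS's cap first, and only shed clock once the card is pinned at 95 degrees with no fan headroom left. The 40% and 55% caps come from the article; the ramp step and clock floor below are assumptions.

```python
# Sketch of the dual-BIOS behavior described above. The duty-cycle caps are
# from the article; the 2% fan ramp and 727 MHz clock floor are assumptions.

FAN_CAP = {"quiet": 40.0, "uber": 55.0}

def adjust(mode, temp_c, fan_duty, clock_mhz):
    """One control step: prefer more fan, then less clock, to hold 95 C."""
    cap = FAN_CAP[mode]
    if temp_c >= 95.0:
        if fan_duty < cap:
            fan_duty = min(cap, fan_duty + 2.0)     # spin up first...
        else:
            clock_mhz = max(727, clock_mhz - 13)    # ...then shed clock
    return fan_duty, clock_mhz
```

Run step by step, this reproduces the reviewer's observation: in Quiet mode the fan pegs at 40% quickly, after which every subsequent step at 95 degrees comes out of the clock rate instead.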

I debated about where to put this graph. In one sense, it belongs with my CrossFire data, because it shows that heat hurts the way two R9 290Xes perform. But I'm putting it here because it illustrates PowerTune in action. The technology, for better or worse, forces these cards to abide by a 40% fan speed limit. So, when the GPU hits 95 degrees and can't spin its fan any faster, you have to watch the core clock melt away. The effect is even more severe with two cards next to each other (even with space between them). Hawaii is still a very fast GPU in spite of this phenomenon, but it's a shame to observe, regardless.

You’re certainly free to manually specify higher maximum fan speeds than the 40% I used, but it’s pretty telling that even AMD’s Uber mode stops at 55%. Again, we’re dealing with a reference cooler that makes a lot of noise once it gets going. I’d personally leave the card set to its Quiet firmware in my own PC.

    Top Comments
  • BigMack70
    Thank goodness AMD had some sense with the pricing. Finally, at long last, Nvidia can stop raping consumers' wallets due to lack of competition.

    This is win-win-win for everyone (except maybe Nvidia).

    Hope we never have to deal with a $1000 single GPU fiasco again. Good riddance.
    103
  • beta212
    That's incredible. Especially at high res, I wonder how they do it. But the low price alone is enough to blow the competition away. Seriously think about it, it's around half the price for higher performance!
    - AMD: We're not aiming for the ultra high end.
    I think Nvidia just got trolled.
    81
  • anxiousinfusion
    Wait the 290 X... X? is going to be $550?! Forgive me, padre for I have sinned.
    61
  • Other Comments
  • slomo4sho
    Great price point. This card has already broken world records just a few hours after release!
    31
  • esrever
    2 of these for 4k looks amazing but Im a little disappointed by the power consumption when you crank up performance.
    -20
  • aznguy0028
    I was thinking about hopping on the 7970ghz when it's on sale, but after seeing this, it's time to break apart the piggy bank for the 290x, what value!
    25
  • Benthon
    Like the conclusion said, you just can't argue about aesthetics and thermals at this price point/performance. Well done AMD, lets see team green's response! Go consumer!
    33
  • tuklap
    This is awesome for us ^_^
    21
  • Shankovich
    Wow, and it's pegged at 73% too. Even if nVidia's "780ti" beats the 290X, it probably won't beat a 290X running at full power. And if mantle does make some big performance boosts, nVidia is going to be in a really tight spot. Looking forward to what they'll do. In the mean time, loving this competition! We all win in the end.
    24
  • julianbautista87
    daaaaayyyyyuuuummmm
    14
  • Darkerson
    Good job, AMD!
    21
  • jkhoward
    I just purchased this card from Newegg.
    22
  • CaptainTom
    Wow AMD. GG! You exceeded every possible expectation! Have fun with your GTX 780 Ti Fanboys!!!!!
    16
  • ilysaml
    Time to Upgrade my HD 6950 to 290X, and my 1080P to 2500x Monitor.
    19
  • lt_dan_zsu
    I've never been more blown away by a hardware review. Never have I seen a gpu beat another gpu at just over half the cost.
    37
  • DarkForce_256
    So what's the deal with the 290?
    12
  • shin0bi272
    it always tickles the hell out of me when an amd card ties or about ties an nvidia card in one game then in an nvidia physx game the amd kicks the crap out of the nvidia card... Of course if you know why that happens in metro you know its more of a fault of the coders not physx but its still funny.
    8
  • jimmysmitty
    $550 is not bad for the fact that it beats the 780 easily and even pressures the Titan.

    Most of the higher resolution gaming wins come from the larger memory bandwidth and of course more vs the 780.

    That's a good sign. Maybe NVidia will drop prices and push this to $400-$450 and I will pick one up when there is a Vapor-X version of course,
    11
  • markbro89
    Where to buy?! Newegg still says coming soon D:
    2