
Power Consumption

ATI Radeon HD 5870: DirectX 11, Eyefinity, And Serious Speed
By Fedy Abi-Chahla

Remember when Nvidia’s GeForce GTX 200-series cards were so energy efficient at idle that the company decided to completely drop its HybridPower technology after just one generation of use? So what’s up with our power consumption measurements?

We’ve seen Windows Vista-based results that show Nvidia’s GeForce GTX 285 and GTX 260 Core 216 using 20W or so less than ATI’s Radeon HD 4870 1GB. However, in these Windows 7 tests, the Nvidia cards seem to be using as much as 15W more than the older ATI cards.

Now, we haven’t run the numbers comparing GeForce cards in Windows Vista versus Windows 7 (that’s coming), but we do know from our Intel Core i5 and Core i7 launch coverage that Windows 7 exhibits very different power consumption behavior than Vista. It’s possible that 7’s more complex desktop is taxing the Nvidia boards to a greater extent than Vista’s did, while the ATI cards aren’t seeing the same sort of power hit. Incidentally, a few days before this launch, Nvidia sent over a beta of its latest drivers, which were said to fix potential idle power issues. We tried them and saw the same results, so it’s safe to say this isn’t a driver issue.

Either way, the more significant news is that the Radeon HD 5870’s idle consumption drops an astounding 42W from last year’s Radeon HD 4870. Moreover, adding a second Radeon HD 5870 only adds another 24W of consumption at idle (and those two boards even idle below a single Radeon HD 4870).
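
As a quick sanity check on those idle deltas, the arithmetic works out as follows. Note that the baseline value in this sketch is a placeholder; only the −42W and +24W deltas come from our measurements above.

```python
# Sanity-checking the idle deltas quoted above. The HD 4870 baseline is a
# placeholder, not a measured figure; only the -42 W and +24 W deltas are
# taken from the text.
hd4870_idle = 100.0                    # hypothetical system idle with one HD 4870 (watts)

hd5870_idle = hd4870_idle - 42         # a single HD 5870 idles 42 W lower
hd5870_cf_idle = hd5870_idle + 24      # a second HD 5870 adds another 24 W at idle

# Whatever the baseline, two HD 5870s land 18 W below a single HD 4870 at idle.
print(hd5870_cf_idle - hd4870_idle)    # -18.0
```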

Firing up FurMark shows that, even at 40nm, 2.15 billion transistors still use up a lot of juice. A single Radeon HD 5870 still uses about 25W more than a Radeon HD 4890 under load, and adding a second board in CrossFire mode increases consumption by another 207W, bringing the total to 561W. Incidentally, that’s 71W more than a Radeon HD 4870 X2 and 79W less than a GeForce GTX 295. The worst power offender, though, is a pair of GeForce GTX 285s in SLI, which takes system power up to 620W.

Making It More Efficient

So just how did ATI drop the Radeon HD 5870’s idle power to 27W, down from the Radeon HD 4870’s 90W? The most obvious improvement is a reduction in idle clocks. Sitting on the Windows 7 desktop, our 5870 sample dropped to 157 MHz core and 300 MHz memory clock rates. In comparison, the Radeon HD 4870 only dropped to 500/900 MHz.

Reference Graphics Card     Idle Clocks (Core/Memory)        3D Clocks (Core/Memory)
ATI Radeon HD 5870          157/300 MHz                      850/1,200 MHz
ATI Radeon HD 4870 X2       507/500 MHz                      750/900 MHz
ATI Radeon HD 4890          240/975 MHz                      850/975 MHz
ATI Radeon HD 4870          520/900 MHz                      750/975 MHz
Nvidia GeForce GTX 295      300/100 MHz (600 MHz Shader)     576/999 MHz (1,242 MHz Shader)
Nvidia GeForce GTX 285      300/100 MHz (600 MHz Shader)     648/1,242 MHz (1,476 MHz Shader)
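
To put those clock cuts in perspective, here is a quick calculation of the percentage reduction from 3D to idle for three of the cards in the table (a straightforward derivation from the figures above, nothing more):

```python
# Percentage clock reduction from 3D to idle, per the table above (MHz).
clocks = {
    "ATI Radeon HD 5870":     {"idle": (157, 300), "load": (850, 1200)},
    "ATI Radeon HD 4870":     {"idle": (520, 900), "load": (750, 975)},
    "Nvidia GeForce GTX 285": {"idle": (300, 100), "load": (648, 1242)},
}

for card, c in clocks.items():
    core_cut = 100 * (1 - c["idle"][0] / c["load"][0])
    mem_cut = 100 * (1 - c["idle"][1] / c["load"][1])
    print(f"{card}: core -{core_cut:.0f}%, memory -{mem_cut:.0f}%")

# ATI Radeon HD 5870: core -82%, memory -75%
# ATI Radeon HD 4870: core -31%, memory -8%
# Nvidia GeForce GTX 285: core -54%, memory -92%
```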


At the other end of the spectrum, Cypress does have a higher maximum board power than its predecessor. However, ATI has implemented direct communication between the VRM and the GPU to signal an over-current state, triggering the processor to throttle down until the board is back within its power spec. We’ll discuss this more on the following page, but running two 5870s in CrossFire, we were able to trigger this dynamic protection.
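
To illustrate the general idea of that protection (this is only a conceptual sketch, not ATI's actual hardware mechanism, and the threshold, clock steps, and telemetry functions are all made-up placeholders), a current-driven throttle loop might look like this:

```python
# Conceptual sketch of VRM-driven over-current throttling, as described above.
# All names, thresholds, and clock steps are illustrative assumptions; the real
# mechanism lives in hardware/firmware, not in a Python polling loop.
import random
import time

CURRENT_LIMIT_A = 25.0                    # hypothetical board over-current threshold
CLOCK_STEPS_MHZ = [850, 600, 400, 157]    # progressively lower core clocks

def read_vrm_current() -> float:
    """Stand-in for the VRM's current-sense feedback (simulated here)."""
    return random.uniform(15.0, 30.0)

def set_core_clock(mhz: int) -> None:
    """Stand-in for the clock change the GPU would apply."""
    print(f"core clock -> {mhz} MHz")

def throttle(iterations: int = 10) -> None:
    step = 0
    for _ in range(iterations):
        amps = read_vrm_current()
        if amps > CURRENT_LIMIT_A and step < len(CLOCK_STEPS_MHZ) - 1:
            step += 1                     # over the limit: drop to the next lower clock
        elif amps < 0.9 * CURRENT_LIMIT_A and step > 0:
            step -= 1                     # back within spec: recover performance
        set_core_clock(CLOCK_STEPS_MHZ[step])
        time.sleep(0.01)

throttle()
```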

Speaking of CrossFire, when you have two 5870s running concurrently at idle, ATI says the secondary board will drop into an ultra-low-power state (purportedly sub-20W). We measured a 25W increase with a second board at idle, which is still not bad at all when you consider that a pair of 4870s would be rated at 180W.
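
Working from the rated figures quoted in this piece (27W idle for the HD 5870, 90W for the HD 4870), the back-of-the-envelope comparison behind that 180W number looks like this:

```python
# Rated idle power per card, as quoted in this article (watts).
hd5870_idle_w = 27
hd4870_idle_w = 90

# Even without crediting the secondary board's sub-20 W ultra-low-power state,
# a CrossFire pair of HD 5870s is rated far below a pair of HD 4870s at idle.
print(2 * hd5870_idle_w)   # 54
print(2 * hd4870_idle_w)   # 180 -- the figure cited above for two HD 4870s
```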

Comments
  • hispeed120, September 23, 2009 4:13 AM (+26)
    I'm. So. Excited.
  • Anonymous, September 23, 2009 4:15 AM (+9)
    Can't wait
  • crosko42, September 23, 2009 4:21 AM (+21)
    So it looks like 1 is enough for me. Don't plan on getting a 30 inch monitor any time soon.
  • jezza333, September 23, 2009 4:29 AM (+20)
    Looks like the NDA lifted at 11:00PM, as there's a load of reviews now just out. Once again it shows that AMD can produce a seriously killer card...

    Crysis 2 on an x2 of this is exactly what I'm waiting for.
  • woostar88, September 23, 2009 4:38 AM (+8)
    This is incredible at the price point.
  • tipmen, September 23, 2009 4:40 AM (+20)
    wait, wait, before I look can it play cry... HOLY SHIT?!
  • cangelini, September 23, 2009 4:43 AM (+23)
    viper666: why didn't they test it against a GTX 295 rather than 280??? It's far superior...

    Ran it against a GTX 295 and a 285 and 285s in SLI :)
  • Annisman, September 23, 2009 4:44 AM (+2)
    I refuse to buy until the 2GB versions come out, not to mention Newegg letting you buy more than 1 at a time, paper launch ftl.
  • jasperjones, September 23, 2009 4:44 AM (+15)
    Thanks for the timely review. I have to say, though, some of the technical details are beyond me. It'd be useful if you explained terms such as "VLIW architecture" or "tessellation engine".
  • megamanx00, September 23, 2009 4:48 AM (+22)
    O M F G!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    Just wish the darn thing wasn't so big, but man, what a card! Now I'm thinking about a bigger case :D
  • Annisman, September 23, 2009 4:49 AM (+17)
    Oops, who am I kidding? I just ordered 2 5870s. One Sapphire, and one HIS, seeing as how they limit you to one per customer.
  • falchard, September 23, 2009 4:54 AM (+18)
    I think most of this review has to do with how many games are optimized for nVidia. The Crytek Engine 2.0 and Source Engine are well known for heavily favoring nVidia architecture, yet they compose the bulk of the benchmarks. I think the fact that ATI can do best in these engines, when they practically have a "detect ATI, instantly nerf its performance" bias, speaks volumes about the actual card.
  • tipmen, September 23, 2009 4:56 AM (+14)
    Another thing is that the 5800x2 isn't out yet; now think of two of those bad boys in Crossfire.
  • blackbyron, September 23, 2009 4:59 AM (+6)
    Not bad for the Crysis benchmark. I really want a 5870 for my Christmas present, but damn, I also need to buy a new PSU.
  • blackbyron, September 23, 2009 5:02 AM (+12)
    In addition, I am impressed that the 5870 has better power consumption and better gaming performance compared to DX10 cards. If the card is affordable, I'd definitely buy one.
  • cangelini, September 23, 2009 5:10 AM (+3)
    jasperjones: Thanks for the timely review. I have to say, though, some of the technical details are beyond me. It'd be useful if you explained terms such as "VLIW architecture" or "tessellation engine".

    Jasper,
    TBH, the architectural details are secondary to how the card performs. However, if you'd like a better idea of what tessellation can do for you, check out the picture of the Alien on page six!