
Power Consumption And Noise

GeForce GTX 570 Review: Hitting $349 With Nvidia's GF110
By Chris Angelini

Power Consumption

My power numbers caused quite a splash in the GeForce GTX 580 review. Nvidia made changes to its power circuitry to protect against overloading the voltage regulation. Incidentally, these are changes AMD made back when it launched the Radeon HD 5870; Nvidia's leash is just a little bit tighter, likely out of necessity due to the more power-hungry GPU.

As a result, “power bugs” (AMD’s term, not mine) like FurMark result in throttling to keep from damaging the card. That was fine with me—the figure FurMark spits out isn’t particularly meaningful anyway, aside from its ramifications as an absolute worst-case. Nevertheless, there were sites out there that tinkered with the app until they found settings that’d duck in under Nvidia’s trigger. It seemed that, just because FurMark could be run, a lot of readers still thought it should be.

Logging power use in real-world games is so much more meaningful, though. It's an actual result, the Crysis to 3DMark, and it captures what we really care about: what happens when you're gaming. So we cranked up the settings in Metro 2033 yet again and put as much stress on these cards as possible in a three-run pass through the built-in benchmark. The result is a telling chart of samples taken every two seconds, plus an average power figure.

Average System Power Consumption
Nvidia GeForce GTX 570 1.25 GB: 329.78 W
Nvidia GeForce GTX 580 1.5 GB: 376.51 W
Nvidia GeForce GTX 480 1.5 GB: 385.70 W
AMD Radeon HD 5870 1 GB: 274.14 W
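If you want to reproduce this kind of number yourself, the math is nothing exotic. Here's a minimal sketch (not the actual logging tool used for this review) of pulling an average out of a wattmeter log sampled every two seconds; the CSV column names and file name are placeholders for illustration.

```python
# Minimal sketch, assuming a CSV log with one wattage sample every two seconds.
# Column names ("elapsed_s", "watts") and the file name are hypothetical.
import csv

def average_power(log_path: str) -> float:
    """Return mean system power in watts from a power-meter log."""
    watts = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            watts.append(float(row["watts"]))
    if not watts:
        raise ValueError(f"no samples found in {log_path}")
    return sum(watts) / len(watts)

if __name__ == "__main__":
    # e.g. a log captured across three passes of the Metro 2033 benchmark
    print(f"Average system power: {average_power('metro2033_gtx570.csv'):.2f} W")
```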


With GeForce GTX 580, we know that Nvidia used the GeForce GTX 480’s TDP as its ceiling, so it’s really not a surprise that the GTX 480 and GTX 580 run very close together. In fact, the GTX 580 averages 9 W less than the GTX 480 across the test (376 W system power versus 385 W). The GeForce GTX 570 drops that average to 329 W, a 47 W difference. This doesn’t exactly match Nvidia’s TDP spec, which puts the boards 25 W apart. AMD’s Radeon HD 5870 is most definitely a slower graphics card, but it also demonstrates impressive power figures, down at 274 W average system power use.
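To spell out that comparison, here's the arithmetic, with Nvidia's published board TDPs (244 W for the GTX 580, 219 W for the GTX 570) added for reference. Keep in mind that a board-level difference shows up somewhat larger at the wall because the power supply isn't 100% efficient, so the two gaps were never going to line up exactly; this is an illustrative check, not part of the test methodology.

```python
# Quick check of the measured wall-power gap against the spec gap.
# Measured averages come from the chart above; the 244 W / 219 W figures are
# Nvidia's published board TDPs, quoted here only for reference.
measured = {"GTX 580": 376.51, "GTX 570": 329.78}   # system power at the wall, W
tdp      = {"GTX 580": 244.0,  "GTX 570": 219.0}    # board TDP, W

wall_gap = measured["GTX 580"] - measured["GTX 570"]   # ~46.7 W
spec_gap = tdp["GTX 580"] - tdp["GTX 570"]             # 25.0 W
print(f"Measured gap at the wall: {wall_gap:.1f} W, TDP spec gap: {spec_gap:.1f} W")

# A board-level delta appears larger at the wall because the PSU is not 100%
# efficient (wall draw = DC draw / efficiency), which explains part, but not
# all, of the discrepancy.
```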

Noise

Another requested addition to the GTX 580 story was noise measurements. So, I fired up my Extech 407768 sound level meter, placed it 12 inches away from the back of our test machine to keep the registered output within the meter's range, and measured each of the cards from that story, plus this one.
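A note on that 12-inch distance: sound level readings depend heavily on how far the meter sits from the source, so these figures aren't directly comparable to tests run at the more common one-meter distance. As a rough free-field, point-source idealization (which a PC in a room only approximates), level falls off by 20·log10 of the distance ratio; the hypothetical numbers below show why a reading taken at 12 inches lands roughly 10 dB higher than the same system measured at one meter.

```python
# Rough idealization only: treats the test rig as a point source in a free
# field, which a PC in a room is not. The 40 dB(A) reading is hypothetical.
import math

def spl_at(spl_ref_db: float, d_ref_m: float, d_new_m: float) -> float:
    """Estimate SPL at a new distance from a reading taken at a reference distance."""
    return spl_ref_db - 20.0 * math.log10(d_new_m / d_ref_m)

reading_12in = 40.0  # example reading taken 12 inches (0.305 m) from the chassis
print(f"Estimated at 1 m: {spl_at(reading_12in, 0.305, 1.0):.1f} dB(A)")  # ~10 dB lower
```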

It's hardly a surprise to see the GeForce GTX 480 topping our load charts after half an hour of loops in 3DMark11 (I took the measurement during Game Test 2 for each contender). The GeForce GTX 470 is also expected to be one of the louder cards here, and it shows up as the third highest.

It's a little more unexpected to see the 5870 under the GTX 480, but perhaps our original launch sample from more than a year ago is getting a little long in the tooth. Also interesting is that the very hot Radeon HD 5970 and the Radeon HD 6870 appear to be about equally loud.

What I really like to see is that a pair of Radeon HD 6850 cards in CrossFire make less noise than the 6870. This was my biggest reservation in recommending a CrossFire configuration--especially one with cards sitting back to back, without a slot's worth of space between them. Nevertheless, we're seeing that AMD's latest not only put down impressive performance at a reasonable price, but they also do it elegantly (aside from the four slots that get consumed).

And the big news is Nvidia's GeForce GTX 500-series. The changes made to its heatsink, fan, and ramping algorithm make a massive difference in the GeForce GTX 580. Consequently, when the company decided to use the same solution on its GeForce GTX 570 reference design, it was only natural for the acoustics to get even more attractive.

I liken this to Toyota's JZ engine. The cast iron block was built for the twin-turbocharged 2JZ-GTE, and overbuilt for its role in the naturally aspirated 2JZ-GE. So too are the changes Nvidia made for the uncut GF110 seemingly overkill for the GTX 570. We'll take them, though, along with the cooler temps and quieter acoustics they bring. Bear in mind that this is only going to apply to the reference heatsink and fan combo, though. Should add-in board partners deviate from Nvidia's implementation to save money or differentiate in some other way, these numbers will of course change.

Comments
  • thearm, December 7, 2010 11:16 AM (score: -9)
    Grrrr... Every time I see these benchmarks, I'm hoping Nvidia has taken the lead. They'll come back. It's alllll a cycle.
  • xurwin, December 7, 2010 11:30 AM (score: 5)
    at $350 beating the 6850 in xfire? i COULD say this would be a pretty good deal, but why no 6870 in xfire? but with a narrow margin and if you need cuda. this would be a pretty sweet deal, but i'd also wait for 6900's but for now. we have a winner?
  • nevertell, December 7, 2010 11:31 AM (score: 32)
    Yay, I got highlighted !
  • verrul, December 7, 2010 11:35 AM (score: 5)
    because 2 6850s is pretty equal in price to the 570
  • sstym, December 7, 2010 11:36 AM (score: 26)
    Quote (thearm): Grrrr... Every time I see these benchmarks, I'm hoping Nvidia has taken the lead. They'll come back. It's alllll a cycle.


    There is no need to root for either one. What you really want is a healthy and competitive Nvidia to drive prices down. With Intel shutting them off the chipset market and AMD beating them on their turf with the 5XXX cards, the future looked grim for NVidia.
    It looks like they still got it, and that's what counts for consumers. Let's leave fanboyism to 12 year old console owners.
  • nevertell, December 7, 2010 11:37 AM (score: 8)
    It's disappointing to see the freaky power/temperature parameters of the card when using two different displays. I was planning on using a display setup similar to that of the test, now I am in doubt.
  • reggieray, December 7, 2010 11:46 AM (score: -2)
    I always wonder why they use the overpriced Ultimate edition of Windows? I understand the 64 bit because of memory, that is what I bought but purchased the OEM home premium and saved some cash. For games the Ultimate does no extra value to them.
    Or am I missing something?
  • reggieray, December 7, 2010 11:50 AM (score: 6)
    PS Excellent Review
  • theholylancer, December 7, 2010 11:50 AM (score: 3)
    hmmm more sexual innuendo today than usual, new GF there chris? :D 

    EDIT:

    Love this gem:
    Quote:

    Before we shift away from HAWX 2 and onto another bit of laboratory drama, let me just say that Ubisoft’s mechanism for playing this game is perhaps the most invasive I’ve ever seen. If you’re going to require your customers to log in to a service every time they play a game, at least make that service somewhat responsive. Waiting a minute to authenticate over a 24 Mb/s connection is ridiculous, as is waiting another 45 seconds once the game shuts down for a sync. Ubi’s own version of Steam, this is not.


    When a reviewer of not your game, but of some hardware using your game comments on how bad it is for the DRM, you know it's time to not do that, or get your game else where warning.
  • amk09, December 7, 2010 11:52 AM (score: 2)
    Quote (nevertell): Yay, I got highlighted!


    So you gonna buy it? Huh huh huh?
  • nevertell, December 7, 2010 11:54 AM (score: -1)
    I was planning on doing so, but I didn't get enough money from the work I was doing, so I'll stick with just a new monitor. I will definitely get a new card during the next year, but not for now :(  And by then, there might be new great cards out there.
  • lostandwandering, December 7, 2010 11:55 AM (score: 1)
    Good looking performance numbers. Will be interesting to see what this does in the real world to NVidia's pricing of the GTX 400 series.
  • SininStyle, December 7, 2010 12:02 PM (score: -5)
    Wow, no sli 460s included? Yet you include 6850s in xfire? really? *facepalm* fail
  • darkchazz, December 7, 2010 12:04 PM (score: 2)
    now I want gtx 560
  • anacandor, December 7, 2010 12:07 PM (score: 3)
    While the 5xx series is looking decent so far, it seems to me (pure speculation here) that Nvidia held back with this series and are possibly putting more resources into Kepler. I feel this because they aren't trying to kill AMD for market share, instead put up a perfectly reasonable product that neither EXCELS vastly beyond last gen, but providing enough performance to justify a new product. That said i'm looking forward to their 2011 lineup.

    Also, it would have been interesting to see Metro 2033 tested with max instead of medium settings. All the cards are able to play medium at all resolutions with no AA... push them to their limits? :) 

    Thoroughly enjoyable review though. Thanks, Chris!
  • gxpbecker, December 7, 2010 12:20 PM (score: 4)
    i LOVE seeing Nvidia and AMD trading blows back and forth. Keeps prices in check lol and gives more options for buyers!!!
  • kg2010, December 7, 2010 12:26 PM (score: 1)
    You can see how the 460's in SLI did here vs the 580
    http://benchmarkreviews.com/index.php?option=com_content&task=view&id=614&Itemid=72

    But yeah, this review NEEDS 460's 1GB in SLI to be fair, as they are definitely an alternative to a 580, even a 570. There are quite a few cards at or below $199

    Dual Hawks for $320 AFTER MIR:
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814127518

    And these cards will overclock well.
  • tronika, December 7, 2010 12:32 PM (score: 0)
    Quote (ReggieRay): I always wonder why they use the overpriced Ultimate edition of Windows? I understand the 64 bit because of memory, that is what I bought but purchased the OEM home premium and saved some cash. For games the Ultimate does no extra value to them. Or am I missing something?

    noticed that too. i really can't think of any reason other than the language support for the Tom's engineers. 99% of the gamer market would be better off with home premium 64 bit. the other 1% that actually runs and maintains a domain in their house should get professional or the bloated "ultimate". i mean, who really uses bitlocker on their gaming machine anyway? great article though! i jumped on the 580 and haven't looked back. used to run on a 5850 but now that i've seen fermi in all of its glory i'm really interested in nvidia's future "vision".
  • cangelini, December 7, 2010 12:38 PM (score: 1)
    Quote (theholylancer): hmmm more sexual innuendo today than usual, new GF there chris? EDIT: Love this gem: [...] When a reviewer of not your game, but of some hardware using your game comments on how bad it is for the DRM, you know it's time to not do that, or get your game else where warning.


    Wait, innuendo? Where? :) 