GeForce GTX 580 And GF110: The Way Nvidia Meant It To Be Played

GeForce GTX 580: Similar Dimensions, Improved Design

Change For The Better

Because Nvidia chose to spend the thermal budget it freed up on performance (more shaders, higher clocks), the GeForce GTX 580’s TDP is very close to the GeForce GTX 480’s (244 W versus 250 W). As a result, you might expect the card to run into the same heat and noise issues as its predecessor. Fortunately, while the architects were reworking the GPU itself, another team tackled the 480’s shortcomings.

Gone are the exposed heatsink, the protruding heat pipes, and the noisy fan.

A vapor chamber-based cooler enables those changes. Heat pipes disappear completely, as the copper vapor chamber has its own evaporation/condensation cycle to more effectively transfer heat to a fin array. Nvidia still uses a blower-style fan to move air across the heatsink and out the rear I/O panel, but its structure is reinforced to lower its pitch. Fan control migrates into the GPU itself, facilitating a more gradual (less noticeable) ramp up and down in response to activity.
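Nvidia hasn’t detailed the exact control algorithm, but the idea behind a gradual ramp is easy to illustrate: rather than jumping straight to whatever fan speed the current temperature calls for, the controller moves only part of the way toward that target on each update. Here is a minimal Python sketch, with the fan curve endpoints and smoothing factor entirely hypothetical:

```python
def target_duty(temp_c):
    """Hypothetical fan curve: map GPU temperature to a target duty cycle (%)."""
    if temp_c <= 40:
        return 30.0
    if temp_c >= 90:
        return 100.0
    return 30.0 + (temp_c - 40) * (100.0 - 30.0) / (90 - 40)  # linear ramp in between


def smooth_step(current_duty, temp_c, alpha=0.1):
    """Move a fraction of the way toward the target each tick, so the fan
    speeds up and slows down gradually instead of abruptly."""
    return current_duty + alpha * (target_duty(temp_c) - current_duty)


# Example: a jump from 45 C to 80 C nudges the fan up over several ticks
# rather than snapping straight to the new target speed.
duty = 35.0
for temp in (45, 80, 80, 80, 80):
    duty = smooth_step(duty, temp)
    print(f"{temp} C -> {duty:.1f}% duty")
```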

The overall result is a completely contained card that’s still hot, but nowhere near as painful to touch after a bout on the bench system. It’s remarkably quiet at idle, and much more pleasant under load. Perhaps I should repeat that for emphasis. Excessive noise was one of the strongest reasons to avoid GeForce GTX 480, and it is effectively dealt with on the GTX 580.

Aside from those tweaks, the GeForce GTX 580 is the same length as the 480 (10.5”). It still requires one eight-pin and one six-pin auxiliary PCIe power connector (Nvidia additionally recommends at least a 600 W power supply).
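That connector combination is consistent with the card’s power budget: per the PCI Express specifications, the slot itself can deliver up to 75 W, a six-pin plug another 75 W, and an eight-pin plug 150 W, for a total of 75 + 75 + 150 = 300 W available, comfortably above the 244 W TDP.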

And it’s still limited to two simultaneous display outputs via two dual-link DVI connectors and a mini-HDMI port. With AMD now able to support up to four simultaneous displays using a combination of DisplayPort, DVI, and HDMI (from a mid-range lineup, no less), Nvidia really needs to address its two-output limitation soon. That's a practical advantage on AMD's side that goes unanswered unless you buy two GeForce cards. Naturally, the GTX 580 is still a dual-slot board, too.

Monitoring Power

Protection-oriented circuitry is quickly changing the way power, temperature, and noise measurements can be made.

Back when AMD launched the Cypress GPU, it introduced a feedback mechanism to read the state of the card’s voltage regulator. If the regulator was pushed beyond its limits (even before the GPU itself overheated), the ASIC would throttle back to cut power draw. This was in addition to the thermal protection scheme that’d simply keep the chip from getting too hot. As a result, it was entirely possible to run a constant-load application like FurMark and see unrealistically low power numbers without the tell-tale high GPU temperatures to indicate that something was wrong. Fortunately, I figured out a combination of settings that’d still tax the GPU without triggering either of AMD’s mechanisms.

Power-monitoring circuitry added to GeForce GTX 580

Nvidia’s a little late to this party, but it’s implementing something similar on the GeForce GTX 580. The company claims its GPUs have had thermal monitoring for many years (though that doesn’t explain why its 196-series driver sent a number of G92-based boards to an early grave earlier this year). Now, the current and voltage of each 12 V rail are monitored as well. Should power levels exceed spec, the GPU throttles performance. More on this once we get to power testing.
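Nvidia hasn’t published the control logic, but conceptually the mechanism is simple: sample voltage and current on each 12 V input, sum P = V × I across them, and pull clocks back whenever the total exceeds the board’s limit. The Python sketch below is purely illustrative; the rail names, simulated readings, and power limit are assumptions, not Nvidia’s figures.

```python
import random

RAILS = ["pcie_slot", "six_pin", "eight_pin"]  # hypothetical 12 V inputs feeding the board
BOARD_POWER_LIMIT_W = 300.0                    # hypothetical cap, not Nvidia's published figure


def read_sensors(rail):
    """Stand-in for the board's per-rail voltage/current monitors."""
    volts = 12.0 + random.uniform(-0.2, 0.2)   # simulated 12 V rail voltage
    amps = random.uniform(4.0, 10.0)           # simulated load current
    return volts, amps


def over_power_budget():
    """Sum P = V * I across all rails and compare against the board limit."""
    total_w = sum(v * a for v, a in (read_sensors(r) for r in RAILS))
    return total_w > BOARD_POWER_LIMIT_W, total_w


throttle, watts = over_power_budget()
print(f"Board draw ~{watts:.0f} W -> {'throttle clocks' if throttle else 'run at full clocks'}")
```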

Update: Multi-Display Testing

I had a couple of requests to check back on issues that originally plagued the GeForce GTX 480, one of those being a dramatic increase in power consumption and heat with two displays attached.

According to Nvidia, the out-of-control increases being reported were rectified in the 256-series driver released earlier this year. So long as you're using two monitors with the same resolution and timing settings, you're supposedly safe. To verify, I attached a pair of Dell P2210H displays to a GeForce GTX 580 and charted out temps and power use:

Configuration | Power Consumption | Temperature
One Display (1920x1080), Idle | 190 W | 40 deg. C
Two Displays (1920x1080), Idle | 192 W | 45 deg. C
Two Displays (1 x 1920x1080, 1 x 1280x1024), Idle | 255 W | 56 deg. C

Power consumption doesn't increase much when you attach a second display running at the same resolution and timings, but the temperature does increase by five degrees.

Swapping over to a display running a different resolution, however, continues to have a profound effect on power and temperatures (Nvidia does not deny this). The jump from 192 W to 255 W and from 45 to 56 degrees is significant; that's an extra 63 W (roughly a third more power) and 11 degrees at idle. The good news is that if you're using two screens with the same resolution and timings, the latest drivers minimize the impact of utilizing both of the GeForce GTX 580's display outputs.

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • xurwin
    it's the beast (best)! No doubt Nvidia is making a way to combat the 6900s
  • KT_WASP
    The last bit of the article is the most important I think. Anyone who drops $500 on this card right now, before Cayman releases, should have their head examined. With two companies releasing so close together, it would be in a person's best interest to see what the other is bringing to the table before shelling out such a large chunk of change.

    If the 6850 and 6870 have shown one thing, it's that they are much better than the last gen in many ways (power, noise, and scaling), and Cayman is much more robust than Barts. So, before you start calling a winner here, wait and see. That is my advice.
  • awood28211
    Sound performance, but the game here seems to be... double leap-frog. You can't just release a product that competes with the competitor's current offerings; you gotta compete with what he releases next... If AMD's next offering is significantly faster than its current one, then NVIDIA will still be playing catchup.
  • Wheat_Thins
    Kinda pointless article other than the fact that the 580 offers superb performance, but until I see power and noise set in stone I honestly don't care.

    A single GPU nearly outperforming a 5970 is quite a statement. Wonder if AMD has what it takes to answer this, as the 6850 IMPO is pretty disappointing other than the price.
  • nevertell
    So it's basically what the 480 should have been. Fair enough, I'll wait for the 470 version of GF110 and buy that.
  • TheRockMonsi
    The price right now for this card is way over $500 on Newegg. For that price NVIDIA better be giving me a bj as well.
  • It'll certainly be interesting, even if I don't agree that NVIDIA is playing catchup. The 480 had its flaws, but it still was the fastest single GPU around.

    We'll see what the 69xx have to offer. NVIDIA releasing now puts somewhat of a time constraint on AMD though. If it takes them too long to get something out the door, even some people waiting now may just get the 580 for Christmas.
  • kevin1212
    Nvidia is embarrassed by the power draw of the GTX 580, haha. Improvement in performance but uses the same amount of power... still not a big enough improvement in efficiency, and no big leap in value either. AMD will wipe the floor with this card.

    By the way, I know you guys decided to drop Crysis, and I can understand that, but given that this is a high-end card, maybe you should have considered it, since frankly anyone buying a card like this would probably want it for Crysis more than anything else. A 6870 is more than enough for the others.
  • iamtheking123
    Looks to me like the 580 is somewhere between a 5870 and a 5970. Might have been more impressive if it was Q2 2010 and not Q4 2010.

    With ATI's meat-and-gravy bits of the 6000 series on the launchpad, you'd be an idiot to buy one of these at this price.
  • Blink
    In the Civ 5 benchmark, the 5970 has the worst 'Zoomed Out' fps. Strange?