
Nvidia: GeForce GTX 480 Was Designed to Run Hot

Source: Tom's Hardware US | 157 comments

Hot -- the way it's meant to be played.

The Nvidia GeForce GTX 480 is a hot card, both figuratively and literally. It features the latest and greatest from Nvidia, and it also runs quite hot when asked to push pixels. Those looking to lower their computers' energy consumption aren't likely to find their greenest solution in the GTX 480. But according to Nvidia, that's the way it's meant to be played.

Nvidia's Drew Henry updated the company blog earlier this week in response to concerns about the GeForce GTX 480's power-hungry and hot tendencies:

We wanted to let you know that we’ve also heard your concerns about GTX 480 with respect to power and heat. When you build a high performance GPU like the GTX 480 it will consume a lot of power to enable the performance and features I listed above. It was a tradeoff for us, but we wanted it to be fast. The chip is designed to run at high temperature so there is no effect on quality or longevity. We think the tradeoff is right.

The GF100 architecture is great and we think the right one for the next generation of gaming. The GTX 480 is the performance leader with the GTX 470 being a great combination of performance and price.

This thread is closed for comments
Top Comments
  • 41 Hide
    ColMirage , April 3, 2010 1:40 AM
    "It's not a bug, it's a feature!", GPU Style.
  • 40 Hide
    rigaudio , April 3, 2010 1:03 AM
    "We wanted your chips to last longer, so we made them run really really hot."
    Does not compute.
  • 38 Hide
    frostyfireball , April 3, 2010 1:01 AM
    "The chip is designed to run at high temperature so there is no effect on quality or longevity."

    sure, we'll see in a few months when people's cards start dying from the heat.
Other Comments
  • 33 Hide
    hunter315 , April 3, 2010 1:41 AM
    Good to know "Space heater" was on their feature list not their bug list, wonder if they make you pay any extra for that feature.
  • 26 Hide
    xenol , April 3, 2010 1:44 AM
    Well considering that most GPUs seem to have a meltdown temperature of 115C... I wonder what the threshold is this time.
  • 27 Hide
    scook9 , April 3, 2010 1:46 AM
    so the GPU can take it....how about the solder?!

Remember the whole G84/G86 fiasco in notebooks two summers ago?
  • 34 Hide
    flyinfinni , April 3, 2010 1:49 AM
    Ok, so they had to choose performance or low power/heat? How the heck did ATI get performance almost as good, while still using ridiculously low power and heat?
  • 30 Hide
    z0d , April 3, 2010 1:51 AM
    They should have made a better GPU cooler if it was made to be hot.
  • -8 Hide
    IzzyCraft , April 3, 2010 1:52 AM
xenol: Well considering that most GPUs seem to have a meltdown temperature of 115C... I wonder what the threshold is this time.

105C is what has been said over and over. The chip regularly seems to go into the 90-100C range depending on the case, but never really gets over 100C; the fan is pretty lenient, only really ramping up the speed when it gets past 95C.

It was the right move by Nvidia to get more performance in exchange for heat and power. First off, it's a massive chip; they aren't going to make great money on it if the card isn't expensive.

To justify the expensive card they need it to be at least competitive in a price range: the GTX 470 is just barely competitive on price, and the GTX 480 is competitive in another way, as the most powerful GPU. That way Nvidia can still turn a profit on selling the cards,

instead of selling an underpowered mess for much cheaper like ATI had to with their 2600 XT and the like, which ran hot, used a lot of power, and still didn't compete.

I do agree in thinking it was the right move. As for those saying it will die in a few months: that's unlikely. Probably sooner than most cards, sure, but most chips can last 10-20 years depending on quality, and more likely the board will fail long before the chip. So what's it to you if it only lasts 4-5 years, when you should probably trade up by then anyway?
  • 13 Hide
    mdillenbeck , April 3, 2010 1:55 AM
    Quote:
    The chip is designed to run at high temperature so there is no effect on quality or longevity. We think the tradeoff is right.


    ...because heat radiating in a case is never an issue to the surrounding components, right?

    (I, personally, like it when I have to connect an air conditioning unit directly onto my modded case - not like I have any hearing left after cranking my games loud enough to be heard.)
  • 15 Hide
    kravmaga , April 3, 2010 1:55 AM
rigaudio: "We wanted your chips to last longer, so we made them run really really hot." Does not compute.


Those are misleading quotation marks; that ain't what they said at all.
They said the high power requirements were a tradeoff for performance, and that the chip itself was designed to be able to take the heat, so its longevity won't be affected by higher temps.
  • 11 Hide
    cscott_it , April 3, 2010 1:58 AM
    Well, that's reassuring (not sarcasm).

    It's nice when ANY company comes forward and says "We know that XXXXX is above what is considered the norm (in this case heat), but we designed it to have a higher threshold and withstand prolonged use... you'll have about the same lifespan as any other card"

    Not that I think anyone was really worried, I doubt they would release it otherwise.
  • 16 Hide
    saint19 , April 3, 2010 1:59 AM
    So, he says that more heat and more power means better performance on GPU.

    Just can I ask something, Where are the more heat and more power that use the 5970, 5870 or 5850, because in the benchmarks you see a better performance against fermi with less power and heat. Do I'm wrong?
  • 24 Hide
    builderbobftw , April 3, 2010 2:01 AM
GTX 480 = FX 5800

Nuff said.

Nvidia: "Hot, slow, power hungry and overpriced, the way it's meant to be paid!"
  • 8 Hide
    a4mula , April 3, 2010 2:03 AM
Maybe this card is meant to run hot, maybe it isn't. Whatever the case might be, I do know this: every other component of the PC isn't meant to run hot. When you stick this next to your brand spanking new 930, how do you think it's going to respond? Let's see: a 50C 5870 vs a 100C 480. 50C difference; what's that going to do to every other temp in your case?
  • -1 Hide
    joytech22 , April 3, 2010 2:07 AM
saint19: So, he says that more heat and more power means better performance on GPU. Just can I ask something, Where are the more heat and more power that use the 5970, 5870 or 5850, because in the benchmarks you see a better performance against fermi with less power and heat. Do I'm wrong?


Well, you only see better performance than Fermi in most games when using the 5970; Fermi beats out the 5870 and 5850 in some/most other applications.

So it is by far the world's fastest single GPU core, not the fastest single-slot solution.

    And yes, You Am Wrong. (spelling?) i dunno.
  • 18 Hide
    kageryu , April 3, 2010 2:08 AM
    Just like how the 5xxx series are designed to be a BETTER VALUE!
  • 25 Hide
    micr0be , April 3, 2010 2:15 AM
flyinfinni: Ok, so they had to choose performance or low power/heat? How the heck did ATI get performance almost as good, while still using ridiculously low power and heat?


    there there......CUDA will fix everything.....