Nvidia: GeForce GTX 480 Was Designed to Run Hot
Hot -- the way it's meant to be played.
The Nvidia GeForce GTX 480 is a hot card, both figuratively and literally. It features the latest and greatest from Nvidia, and it also runs quite hot when asked to push pixels. Those looking to lower their computers' energy consumption aren't likely to find their greenest solution in the GTX 480. But according to Nvidia, that's the way it's meant to be played.
Nvidia's Drew Henry updated the company blog earlier this week in response to concerns about the GeForce GTX 480's power-hungry and hot tendencies:
We wanted to let you know that we’ve also heard your concerns about GTX 480 with respect to power and heat. When you build a high performance GPU like the GTX 480 it will consume a lot of power to enable the performance and features I listed above. It was a tradeoff for us, but we wanted it to be fast. The chip is designed to run at high temperature so there is no effect on quality or longevity. We think the tradeoff is right.
The GF100 architecture is great and we think the right one for the next generation of gaming. The GTX 480 is the performance leader with the GTX 470 being a great combination of performance and price.

Does not compute.
sure, we'll see in a few months when people's cards start dying from the heat.
Remember the whole G84/G86 fiasco in notebooks two summers ago?
105°C is the figure that has been stated over and over. The chip regularly runs at 90-100°C depending on the case but never really gets over 100°C. The fan is pretty lenient, only really ramping up its speed once the temperature gets past 95°C.
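The fan behavior described above could be sketched as a simple fan curve. This is purely illustrative: the percentages and the linear ramp are assumptions for the sake of the example, not Nvidia's actual firmware values; only the ~95°C ramp point and ~105°C limit come from the comment.

```python
def fan_speed_percent(temp_c: float) -> float:
    """Illustrative fan curve matching the behavior described:
    lenient below ~95C, ramping hard toward the ~105C limit.
    Baseline and ramp shape are hypothetical."""
    if temp_c < 95.0:
        return 40.0      # assumed quiet baseline speed
    if temp_c >= 105.0:
        return 100.0     # at the stated thermal limit: full speed
    # assumed linear ramp from 40% at 95C to 100% at 105C
    return 40.0 + (temp_c - 95.0) / 10.0 * 60.0

print(fan_speed_percent(90))   # → 40.0
print(fan_speed_percent(100))  # → 70.0
```

Under this sketch the card stays quiet through its normal 90-100°C range and only gets loud as it approaches the 105°C ceiling, which matches the "lenient fan" behavior the commenter reports.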
It was the right move by Nvidia to trade heat and power for more performance. First off, it's a massive chip; they aren't going to make good money on it unless the card is expensive.
To justify an expensive card, it needs to be at least competitive in a price range: the GTX 470 is just barely competitive on price, and the GTX 480 is competitive in another way, as the most powerful GPU. That way Nvidia can still turn a profit on selling the cards,
instead of selling an underpowered mess for much cheaper, like ATI had to with the 2600 XT and others that ran hot, used a lot of power, and still didn't compete.
I do agree it was the right move. As for those saying it will die in a few months: that's unlikely. It may fail sooner than most, but most chips can last 10-20 years depending on quality, and more likely the board will fail long before the chip. So what's it to you if it only lasts 4-5 years, when you should probably trade up by then anyway?
...because heat radiating in a case is never an issue to the surrounding components, right?
(I, personally, like it when I have to connect an air conditioning unit directly onto my modded case - not like I have any hearing left after cranking my games loud enough to be heard.)
Those are misleading quotation marks; that ain't what they said at all.
They said the high power requirements were a tradeoff for performance, and that the chip itself was designed to be able to take the heat, so its longevity won't be affected by higher temps.
It's nice when ANY company comes forward and says "We know that XXXXX is above what is considered the norm (in this case heat), but we designed it to have a higher threshold and withstand prolonged use... you'll have about the same lifespan as any other card"
Not that I think anyone was really worried, I doubt they would release it otherwise.
Can I just ask something: where is the extra heat and power use compared to the 5970, 5870, or 5850? In the benchmarks you see better performance against Fermi with less power and heat. Am I wrong?
Nuff said.
Nvidia: "Hot, slow, power hungry and overpriced, the way it's meant to be played!"
Well, you only see better performance than Fermi in most games when using the 5970; Fermi beats out the 5870 and 5850 in some/most other applications.
So it is by far the world's fastest single GPU core, though not the fastest single-slot solution.
And yes, You Am Wrong. (spelling?) i dunno.
there there......CUDA will fix everything.....