Ever since the release of the Fermi cards, the discussion has centered on their heat and noise. Noise I can understand, because it can be a really annoying distraction. But what's the big deal about the operating temperature of a particular GPU? If the video card is not overheating and is functioning within its normal temperature range, even into the low 90s C, what's wrong with that?
Thanks, this is something I have wondered about, since it comes up so frequently in almost any video card discussion. (I'm sure many of you remember that story of an egg getting fried on a GTX 480.)
Well, I guess as long as the GPU is able to withstand the temperatures it isn't that bad, but for some of us who live in 95-100F weather during the summer (like me), we don't need a GPU making the room hotter than it already is.
The GPU could also make the ambient temp inside the case warmer, especially since the outside of the GTX 480 is a heatsink.
The amount of heat a card gives off is directly related to the amount of power it uses. How much that matters depends on a variety of factors, like the ambient temperature of the environment the computer will be running in, whether you are using multiple cards (or may want to in the future), or whether you have a small case and/or one with poor airflow. The temperature the card runs at is a different but related topic, which can limit overclocking and sometimes affect the lifespan of the card.
The reason it's brought up is that the heat thrown off is an issue for many of us, for several reasons, and you can often get equivalent performance with less of it:
- more heat, more noise as the GPU fans spin up
- more heat, less soundproofing in most cases, increasing noise
- more heat, less CPU heat headroom in many cases, a concern for many of us
- more heat, more heat. I run two boxes at a time... it can get warm in my cubby.
- more heat, the less hidden the tower can be. Some of us actually have homes.
How hot the chip on the card gets can affect overclocking ability. Also, if the heat is dispersed within the case, everything else in the case gets hotter. Personally, I would want any video card I get to vent most of its heat out of the case. There's also the extra power consumption that goes with that heat, which runs up the power bill. In my opinion, the early Fermi cards seem like they were released before they were refined enough, in a rush to catch ATI. Then they released the GTX 460 later, which is how the cards should have functioned from the start: it was cooler, used less power, and was faster than the 465.
How hot the actual GPU gets is not a representation of the amount of heat it produces. The card may just be poorly designed to dissipate heat, or the heatsink may be rubbish. It would be interesting to get some specs on how many BTUs a GPU produces.
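The BTU figure is actually easy to estimate, since virtually all the electrical power a card draws ends up as heat: 1 watt is about 3.412 BTU/hr. Here's a rough sketch of the conversion; the 250 W figure is the GTX 480's published TDP, used here only as an illustrative input.

```python
# Sketch: estimate a GPU's heat output from its power draw.
# Nearly all electricity consumed by the card is converted to heat,
# so power draw is a good proxy. Conversion: 1 W = 3.412142 BTU/hr.

WATTS_TO_BTU_PER_HOUR = 3.412142

def heat_output_btu_per_hour(watts):
    """Approximate heat output in BTU/hr for a given power draw in watts."""
    return watts * WATTS_TO_BTU_PER_HOUR

# Example: a card at its 250 W TDP (the GTX 480's rated figure)
print(round(heat_output_btu_per_hour(250)))  # ~853 BTU/hr
```

For scale, that's roughly one sixth of a small 5,000 BTU/hr window air conditioner's cooling capacity, which is why a hot card is noticeable in a warm room.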
^ and that has other ramifications, too. For example, I recently saw a review from a guy who re-installed his 2x 480s in a new case, the Raven 1. Turns out that case moved so much air his 480s became quiet. No, they weren't drowned out by the other fans; the GPU fans never had to spin up to cool the cards.
To Iam's point - isn't total power consumed a useful surrogate? I mean, where else would the electricity go?
Anyhow, I don't think we'll see anyone arguing the early Fermis were "cool".
If anything, the heat produced by the GTX 480 has caused a lot of us to examine the finer points of case airflow and overall system design. It's commonly accepted that the 480 can be quieted down with a good cooling scheme. This article is from a guy who used the new Silverstone Raven RV02 case, which has a motherboard tray rotated 90 degrees. When fully installed, the video card's exhaust aims at the top of the case, letting heat rise naturally for a cooler overall system. http://www.hardocp.com/article/2010/09/30/my_quiet_gala...