Difference between Temperature and Heat

Greetings,

I am currently replacing my old 320mb 8800 gts, and I have a question about the amount of heat newer graphics cards produce in comparison.

My current card, even with the clocks lowered to their minimum and the fan (loudly) at 100%, idles at around 48 degrees C, and always has.

I am looking at nvidia cards that idle around 32-38 degrees c (the 560 ti, 570, 448 core etc), but that is with their fans at 30% to 40% by comparison.

My question is this: Do these newer graphics cards PRODUCE substantially less heat at idle than my old 8800 gts, or do they simply have substantially more advanced cooling systems that can expunge much more heat at lower fan speeds?

The reason this matters is that even at idle, my room temperature with my computer on is about 12 degrees F higher than the rest of the house, and I want my new card to produce less heat so that the room temperature doesn't rise.

The card I am leaning most heavily towards is an EVGA GTX 570 which I've found for $230, but if that's going to produce just as much heat, I should probably pick up the 560 Ti instead, even though it's a bit less powerful...

thoughts?
  1. The 8800 and 9800 were hot cards with relatively crappy coolers on them. I remember my 9800GT sounded like a vacuum cleaner under even the slightest graphical load, but later I changed to a passive cooler (an Accelero S2, I think) and never had any problems. So it is not that the GPU put out 'too much' heat (though it was toasty); the issue was that the heat sinks bundled with the cards sucked (and mine was a nicer eVGA card).

    Newer cards have much more effective coolers, so no matter what you go with you should not have the same issues. The real question these days is whether you want a blower fan, which exhausts the hot air out the back of the case (best for small or poorly vented cases), or an oversized cooler with large fans (much quieter, but it adds heat to the case).

    Even the 580 will idle in the upper 30s, and the 680 is even better, so what you are worried about is no longer an issue.
  2. My MSI Twin Frozr idles around 40c and I can't hear a peep out of the fans. At full blown BF3 gaming it hits 62c or so with maybe 50% fan speed (still can't hear it). Room temp in my office is about 23c.

    I can't answer your question about newer cards relative to older cards and the heat they produce. But I will add that the room temperature and the temperature inside your case have a lot of bearing. What temp is your CPU and mobo at during idle? What kind of case do you have and do you have adequate exhaust fannage?
  3. Quote:
    Do these newer graphics cards PRODUCE substantially less heat at idle than my old 8800 gts, or do they simply have substantially more advanced cooling systems that can expunge much more heat at lower fan speeds?

    Both, and the degree of each depends on the card. If you're looking for a card that will produce less heat and keep your room cooler, look at the TDP for each card. TDP is a measure of how much heat the card produces.

    Drop your TDP by 50% and your video card starts dumping 50% less heat into the environment, regardless of cooling. You could also look into exotic stuff like oil immersion, which will lock most of the heat up in the oil and only heat your room up after very long gaming sessions. Not cheap, though, and a massive pain in the ass.

    Another thing you can do is try undervolting the GPU. Power draw (and therefore heat) rises roughly with the square of the voltage, so even small decreases can result in big differences in temperatures. In all likelihood, you'll need to underclock the card as well to make it stable at lower voltages.
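A quick back-of-the-envelope sketch of that square-law point. The 147 W figure is the 8800 GTS TDP quoted elsewhere in this thread, but the 1.10 V and 1.00 V operating points are made-up illustrative values, not the card's real voltages:

```python
# Dynamic power in a chip scales roughly with frequency * voltage^2,
# so even a small undervolt cuts heat output noticeably.
def dynamic_power(base_watts, v_ratio, f_ratio=1.0):
    """Estimate power after scaling voltage and clock (P ~ f * V^2)."""
    return base_watts * f_ratio * v_ratio ** 2

# A hypothetical 147 W card undervolted from 1.10 V to 1.00 V, same clock:
print(round(dynamic_power(147, 1.00 / 1.10), 1))  # ~121.5 W, roughly 17% less heat
```

Underclocking on top of that (f_ratio below 1.0) cuts the estimate further, which is why idle power states drop both clock and voltage together.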
  4. catatafish said:
    What temp is your CPU and mobo at during idle? What kind of case do you have and do you have adequate exhaust fannage?


    My cpu (a core2duo e6600 at 1.4v oc 3.2ghz) is around 34c, case temp is 31c, HDD says it's 24c, graphics card is currently 49c with fan at 2820rpm.
  5. willard said:
    Quote:
    Do these newer graphics cards PRODUCE substantially less heat at idle than my old 8800 gts, or do they simply have substantially more advanced cooling systems that can expunge much more heat at lower fan speeds?

    Both, and the degree of each depends on the card. If you're looking for a card that will produce less heat and keep your room cooler, look at the TDP for each card. TDP is a measure of how much heat the card produces.

    Drop your TDP by 50% and your video card starts dumping 50% less heat into the environment, regardless of cooling.



    All I can find is the TDP rating for the cards (how much heat their coolers are rated for?)

    my graphics card is rated for 147, the gtx 570 is rated for 217-240?

    I think this is more an estimate of how effective the cooler is, rather than how much heat the unit actually produces. Does the fact that they put a 61% more effective cooler into the 570 mean it produces 61% more heat at load? What would this mean for idle (which is all I really care about, as I'm OK with the room being hot during a game)?

    source http://www.geeks3d.com/20090618/graphics-cards-thermal-design-power-tdp-database/
  6. misinformedman said:
    All I can find is the TDP rating for the cards (how much heat their coolers are rated for?)

    my graphics card is rated for 147, the gtx 570 is rated for 217-240?

    Nope, the TDP is for the GPU, not the cooler. The 570 really will produce that much more heat. Today's GPUs get really, really hot.

    You might want to focus your search for a low TDP card on AMD. They're typically cooler than Nvidia's offerings. You also don't want to be looking at the top cards in a line, as they'll have much higher TDPs than the mid-range cards will.

    For example, the GTX 590 has a TDP of 365 watts. The 580 is down to 244, and the 570 at 219. Getting into the mid range, the 560Ti is 170 and the 550Ti at 116.
  7. misinformedman said:
    All I can find is the TDP rating for the cards (how much heat their coolers are rated for?)

    my graphics card is rated for 147, the gtx 570 is rated for 217-240?

    I think this is more an estimate of how effective the cooler is, rather than how much heat the unit actually produces. Does the fact that they put a 61% more effective cooler into the 570 mean it produces 61% more heat at load? What would this mean for idle (which is all I really care about, as I'm OK with the room being hot during a game)?

    source http://www.geeks3d.com/20090618/graphics-cards-thermal-design-power-tdp-database/


    TDP is the metric you want to look at. The TDP rating is the maximum amount of power the card will draw under load (before any overclocking), and since essentially all of the electrical power a card draws ends up dissipated as heat, it's also a good proxy for maximum heat output.

    What you need to do is look at a card's rated TDP, then use benchmarks and reviews. Finding a card with a low TDP that also gives acceptable performance (for your needs) should show you which is most efficient and, hopefully, which will put out the least heat.

    Currently I'd say the card to go for (based on performance vs. TDP) is the HD 7850. Otherwise, if you're OK with just 'acceptable' gaming performance, the HD 7750/7770 are OK performers and should put out a lot less heat than your 8800.
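The performance-vs-TDP comparison suggested above can be sketched as a simple ranking. The TDP numbers are the ones quoted in this thread, but the frame rates are hypothetical placeholders, so treat the output as a method, not a verdict:

```python
# Rank cards by performance per watt of TDP.
# TDPs are from the thread; fps figures are made-up placeholders.
cards = {
    "GTX 570": {"tdp_w": 219, "fps": 60},
    "GTX 560 Ti": {"tdp_w": 170, "fps": 50},
    "HD 7850": {"tdp_w": 130, "fps": 58},
}

ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["fps"] / kv[1]["tdp_w"],
                reverse=True)
for name, c in ranked:
    print(f"{name}: {c['fps'] / c['tdp_w']:.3f} fps per TDP watt")
```

Plug in real benchmark numbers for the games you play and the ranking tells you which card gives the most performance for the least heat dumped into the room.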
  8. willard said:
    Nope, the TDP is for the GPU, not the cooler. The 570 really will produce that much more heat. Today's GPUs get really, really hot.

    You might want to focus your search for a low TDP card on AMD. They're typically cooler than Nvidia's offerings. You also don't want to be looking at the top cards in a line, as they'll have much higher TDPs than the mid-range cards will.

    For example, the GTX 590 has a TDP of 365 watts. The 580 is down to 244, and the 570 at 219. Getting into the mid range, the 560Ti is 170 and the 550Ti at 116.



    Yes, I see for example that the 7850, which is $250 and has almost identical performance to the 570 is rated for 10 idle and 130 max...

    I wish I could find data about what the 570 is rated idle, I'm pretty sure that 219 rating on the 570 is max (though that's still nearly twice the 7850).

    Foo. I really don't want an AMD card, as I was so very impressed with physx in batman and mafia II. I was really hoping to stay nvidia, but I just can't deal with such high idle heat production again.

    The thing is, I don't game often, but when I do I want it to be the best visuals possible (hence my enjoyment of physx). I want a card that runs cool at idle because 99% of my computer use will be with that card at idle.

    hrm...
  9. misinformedman said:
    Yes, I see for example that the 7850, which is $250 and has almost identical performance to the 570 is rated for 10 idle and 130 max...

    I wish I could find data about what the 570 is rated idle, I'm pretty sure that 219 rating on the 570 is max (though that's still nearly twice the 7850).

    Foo. I really don't want an AMD card, as I was so very impressed with physx in batman and mafia II. I was really hoping to stay nvidia, but I just can't deal with such high idle heat production again.

    The thing is, I don't game often, but when I do I want it to be the best visuals possible (hence my enjoyment of physx). I want a card that runs cool at idle because 99% of my computer use will be with that card at idle.

    hrm...

    If you can wait a few weeks, Nvidia should be releasing their competitor to the 7850/7870 soon. It should have an equally low TDP, but I doubt it'll match AMD's low idle power, as I believe that relies on an AMD specific technology. That said, most cards you buy now (AMD or NVIDIA) will have a lower idle heat output than your 8800.

    Anything in the AMD HD 5000, 6000 & 7000 and in NVIDIA 500 & 600 will have a lower idle heat output than your 8800. My previous post was mostly in reference to heat output during load, at idle most modern cards are pretty efficient.

    Check this out:
    http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/27

    The first chart shows a system's total power usage (including CPU, hard drives, etc.) with each respective card installed. Note, the 4870 way at the bottom is comparable to your 8800 in terms of idle power, so you can see how the newer cards have improved things in this aspect.
  10. Just to answer the title directly: temperature is, well, the temperature - a measurement of how hot something is. Heat, however, is a quantity of energy, and the rate at which it is given off is a power, measured in watts.

    For example a teacup of water at 100C has less heat than a 10L bucket at 40C.

    So with regards to GPUs, the temperature you read doesn't mean much in the way of heat produced. As mentioned, the TDP is what you want to look at. The Thermal Design Power is how much heat the card can produce, which is basically also the amount of electrical power the card draws.

    Newer cards are generally more efficient, so cards of similar TDP are faster in the new models.

    As far as GPU temperature goes, this is affected by the cooler and ambient temperatures both in the room and in the case. The room temp provides a baseline, and a cooler provides a delta above ambient. Generally the delta a cooler provides at a given heat input will be pretty much the same, so if your room is 5C hotter than another room, the GPU should also be 5C hotter in that room.

    Case ambient temps matter simply because it's a fairly closed environment, and if it heats up then of course the GPU is using the hotter air and so it's the same effect as having a warmer room - except that a case can easily be 40C inside or more. My fan controller has temperature probes so I put 1 at the front of my case and 1 at the back. The intake one usually reads around 26C (although my room is around 20-23. Probably a calibration issue), and the exhaust one at 28-29C under normal use. When I game, it goes up to around 38-39C exhaust, and that's with pretty good airflow. If a case has bad airflow that warm air will stagnate and warm up more and more.

    So about upgrading, if the card has a similar TDP to your 8800 then it is putting as much "warmth" into the case - however, the temperature might be lower and the fans quieter because of a better cooler design. On the other hand, you can definitely find a lower TDP card with much better performance as well which means overall a cooler case - but still, that card might have a high temperature simply due to the cooler's performance and fan speeds. The temperature doesn't really matter though, as long as it is below 90C (100C is the danger zone).
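The teacup-versus-bucket point earlier in this post can be checked with the basic relation Q = m · c · ΔT (heat energy = mass × specific heat × temperature difference). The 0.25 L teacup size and the 20 C room baseline are assumptions for illustration:

```python
# Heat is energy, not temperature: Q = m * c * dT.
C_WATER = 4186  # specific heat of water, J/(kg*K)

def heat_joules(litres, temp_c, ambient_c=20):
    """Heat stored in water relative to room temperature (1 L of water ~ 1 kg)."""
    return litres * C_WATER * (temp_c - ambient_c)

teacup = heat_joules(0.25, 100)  # 0.25 L at 100 C -> ~84 kJ
bucket = heat_joules(10, 40)     # 10 L at 40 C   -> ~837 kJ
print(bucket > teacup)  # True: the cooler bucket holds ~10x more heat
```

Same idea with GPUs: a small, hot-running chip behind a weak cooler can read a high temperature while dumping less total heat into the room than a cooler-reading card with a higher TDP.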
  11. misinformedman said:


    The reason this matters, is that even at idle, my room temperature with my computer on is about 12 degrees f higher than the rest of the house, and I want my new card to produce less heat so that the room temperature doesn't rise.




    Forgive me for straying off topic, but I can't help but wonder if your room is a closet for you to be concerned about idle heat generated by your computer. :o
  12. alrobichaud said:
    Forgive me for straying off topic, but I can't help but wonder if your room is a closet for you to be concerned about idle heat generated by your computer. :o


    Actually it's a 12x8 with an open door and open windows, and yet walking into it from the hallway it is immediately noticeably warmer. (and the thermometer in my keyboard reads 82f, vs the thermostat in the hallway reads 71f)

    The only things I have in the room are my computer (with the 8800 GTS and Core 2 Duo E6600) and my monitor (a Dell 2408WFP).

    I just want my office to be cooler than 82F when I'm browsing the internet or writing a Word document. I'm hoping that swapping to a 2500K and a new graphics card will mean less heat at idle.
  13. misinformedman said:
    Actually it's a 12x8 with an open door and open windows, and yet walking into it from the hallway it is immediately noticeably warmer. (and the thermometer in my keyboard reads 82f, vs the thermostat in the hallway reads 71f)

    The only things I have in the room are my computer (with the 8800 GTS and Core 2 Duo E6600) and my monitor (a Dell 2408WFP).

    I just want my office to be cooler than 82F when I'm browsing the internet or writing a Word document. I'm hoping that swapping to a 2500K and a new graphics card will mean less heat at idle.


    At idle we're talking about maybe 100 watts going into your room. That is not going to make much difference. The monitor probably adds 30-50 watts to that, so 150 watts. It would take a long time to heat up a room.

    Chances are the sun is your problem, not the PC. But let's assume it is the PC, you might as well just buy a room fan to blow the cooler hallway air into the room and hopefully create air flow out the windows. Otherwise maybe open another window or external door somewhere else in the home so that the air can travel through.
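Whether 150 W matters depends entirely on how fast the room sheds heat: at steady state, the room warms until heat lost through walls and doorways equals the heat coming in. A sketch of that balance, where the 25 W/K conductance is a made-up illustrative number (real rooms vary enormously with insulation and airflow):

```python
# Steady-state temperature rise from a constant heat source:
# the room warms until losses (conductance G, in watts per kelvin)
# balance the input power P, so delta_T = P / G.
def steady_state_rise_f(power_watts, conductance_w_per_k):
    delta_c = power_watts / conductance_w_per_k
    return delta_c * 9 / 5  # a temperature *difference* in C converts to F by *1.8

# 150 W of PC + monitor into a small room with an assumed G of 25 W/K:
print(round(steady_state_rise_f(150, 25), 1))  # 10.8 (degrees F above ambient)
```

With a leakier room (a larger G, e.g. open windows and a fan) the same 150 W produces a much smaller rise, which is why opinions in this thread differ about whether an idling PC can warm a room at all.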
  14. Only reason I asked is because my 'office' is a 6x8 roughed-in bathroom with no windows. It is the only room in the house that I have left to myself, and we really don't need another bathroom. With my rig dissipating the heat from two 7970s, an overclocked i7 990X and 4 monitors, my room temperature stays within 3 degrees C of when my computer is totally powered down. Keep in mind, I am in the basement off of our family room where the average temperature is 20 degrees C, and the door is always open to my 'office'. My furnace fan is usually on most of the time circulating the air in the house, which may also help to keep the temps fairly constant.
  15. If you don't mind my asking, where (state/country-wise) do you live?

    I live in Michigan, and when I used to live with my parents, I had a room which was extremely hot compared to the rest of the house in the winter; however, it was hot due to the exhaust from our furnace being run through the wall my room shared with the kitchen (my parents use a wood-burning furnace in the winter, but I think it got hot when they ran their gas furnace too). Considering it's still March, if you're watching the temp over the last several months or so, you'd have been heating your house the whole time if you're in a wintry area.

    I say this because I would be pretty shocked if your PC noticeably contributed to the temperature of an 8x12 room. You could also try turning your PC off when not in use if the temperature delta is coming from your PC (a great excuse to get an SSD :) )
  16. djscribbles said:
    If you don't mind my asking, where (state/country-wise) do you live?

    I live in Michigan, and when I used to live with my parents, I had a room which was extremely hot compared to the rest of the house in the winter; however, it was hot due to the exhaust from our furnace being run through the wall my room shared with the kitchen (my parents use a wood-burning furnace in the winter, but I think it got hot when they ran their gas furnace too). Considering it's still March, if you're watching the temp over the last several months or so, you'd have been heating your house the whole time if you're in a wintry area.

    I say this because I would be pretty shocked if your PC noticeably contributed to the temperature of an 8x12 room. You could also try turning your PC off when not in use if the temperature delta is coming from your PC (a great excuse to get an SSD :) )


    I'm in eastern MA. Do you think replacing my 10,000 RPM HDD would cut down on my room temp significantly? It says it runs at 41C without any fans on it, so I would doubt that'd contribute much. My office temp does drop to around 73F when the computer and monitor have been off overnight, which is only ~1F above the temp of the rest of the house. The computer is located about 5 feet off the ground and pointed toward an open door. I think the problem is simply that it produces so much heat.

    How difficult is it to undervolt a GPU like the EVGA 570? Since I will turn the clocks to their lowest when not gaming, undervolting would be the next logical step. I really do want the 570 over something like the 7850 because of PhysX...
  17. I don't know about Nvidia cards, but AMD will throttle voltage, GPU and memory speed down when not in use.
  18. misinformedman said:
    My office temp does drop to around 73f when the computer and monitor have been off overnight, which is only ~1f above the temp of the rest of the house.


    Overnight is key. Leave your computer on overnight one night and compare the morning temperature; that will tell you whether you can absolve it of contributing to your heat issue.