GPU vs. CPU temp?

derek2006
May 19, 2006
Hi, I have been wondering this for quite some time now. Why does a GPU running at 600 MHz, such as my X850, put out more heat than a CPU running at around 3 GHz, such as the Pentium 830? They are both built on a 90 nm process. Doesn't heat output increase with clock rate? Or is GPU architecture really that inefficient? Also, why do lower-clocked GPUs draw so much more power? Is that again down to low efficiency?
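To put rough numbers on the clock-rate question: dynamic power scales roughly as P ≈ C·V²·f, where C is the total switched capacitance, which grows with how much logic is toggling each cycle. The sketch below uses made-up figures (not real specs for the X850 or the Pentium 830) just to show that a chip clocked five times slower can still dissipate a comparable amount of power if it switches far more capacitance.

```python
# Back-of-the-envelope dynamic power comparison.
# All capacitance/voltage figures below are illustrative assumptions,
# not real specs for any particular CPU or GPU.

def dynamic_power(c_switched_nf, voltage_v, freq_mhz):
    """Rough dynamic power in watts: P ~ C * V^2 * f."""
    return c_switched_nf * 1e-9 * voltage_v ** 2 * freq_mhz * 1e6

# Hypothetical CPU: less logic switching per cycle, but a high clock.
cpu_w = dynamic_power(c_switched_nf=25, voltage_v=1.3, freq_mhz=3000)

# Hypothetical GPU: much lower clock, but far more logic toggling every cycle.
gpu_w = dynamic_power(c_switched_nf=150, voltage_v=1.2, freq_mhz=600)

print(f"CPU ~ {cpu_w:.0f} W, GPU ~ {gpu_w:.0f} W")  # roughly comparable
```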
 
GPUs do a lot more work per clock. Look at how much faster Folding runs on a GPU than on a CPU (although that is partly an optimisation issue).
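The reason clock speed alone is misleading is parallelism: peak throughput is roughly parallel units × operations per clock × clock rate. The toy comparison below uses invented unit counts, not the real figures for any specific chip, just to show how a 600 MHz part can outrun a 3 GHz one on parallel-friendly work.

```python
# Toy peak-throughput comparison: clock speed is only one factor.
# Unit counts and ops/clock are made up for illustration.

def peak_gflops(units, ops_per_clock_per_unit, clock_ghz):
    """Peak arithmetic rate = parallel units * ops per clock * clock."""
    return units * ops_per_clock_per_unit * clock_ghz

cpu = peak_gflops(units=1,  ops_per_clock_per_unit=4, clock_ghz=3.0)  # ~12 GFLOPS
gpu = peak_gflops(units=16, ops_per_clock_per_unit=8, clock_ghz=0.6)  # ~77 GFLOPS

print(f"CPU peak ~ {cpu:.0f} GFLOPS, GPU peak ~ {gpu:.0f} GFLOPS")
```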

The real question here is why GPUs can cope with higher temps than CPUs (GPUs are better in that respect). The temperature they run at is determined by the size and effectiveness of the HSF and by the amount of heat, from the power they draw, that it has to remove.

GPUs will be hotter because they are doing more work and have a less effective heat sink, so they stabilise at a higher temperature. If they could not cope with those temps, they would need a bigger HSF to draw heat away more quickly and keep the temperature at an appropriate level.
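That "stabilise at a higher temp" point can be seen with a simple thermal-resistance model: steady-state die temperature is roughly ambient plus power times the cooler's thermal resistance (°C per watt). The theta values below are assumptions for illustration, not measured specs, but they show why a smaller card cooler settles hotter even at similar power.

```python
# Steady-state die temperature with a simple thermal-resistance model:
#   T_die = T_ambient + P * theta
# theta (degrees C per watt) captures how effective the heatsink/fan is.
# The figures below are illustrative assumptions, not measured values.

def die_temp_c(ambient_c, power_w, theta_c_per_w):
    return ambient_c + power_w * theta_c_per_w

# Same case ambient, similar power, but the GPU's small cooler has a higher theta.
cpu_temp = die_temp_c(ambient_c=40, power_w=90, theta_c_per_w=0.3)  # large CPU HSF
gpu_temp = die_temp_c(ambient_c=40, power_w=70, theta_c_per_w=0.6)  # compact card cooler

print(f"CPU stabilises near {cpu_temp:.0f} C, GPU near {gpu_temp:.0f} C")
```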

I would hardly say that GPUs are 'less efficient'. In fact, as limited-purpose processors they are more efficient than CPUs, and in their ability to work at higher temps they are superior.
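"More efficient" here means more useful work per watt on the kind of parallel task a GPU is built for. Reusing the made-up toy numbers from above (again, assumptions rather than benchmarks):

```python
# Rough work-per-watt comparison using the illustrative figures above.
cpu_gflops, cpu_watts = 12, 90
gpu_gflops, gpu_watts = 77, 70

print(f"CPU: {cpu_gflops / cpu_watts:.2f} GFLOPS/W")
print(f"GPU: {gpu_gflops / gpu_watts:.2f} GFLOPS/W")
```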