Archived from groups: alt.comp.hardware.overclocking
power_ranger wrote:
> On Sat, 17 Apr 2004 13:33:58 -0500, in alt.comp.hardware.overclocking, David
> Maynard wrote:
>
>
>
>>Well, something certainly 'changed' but if the "system temperature" you
>>speak of is the motherboard's monitoring of "system temperature" then you
>>can't draw the conclusion you appear to have drawn.
>>
>>Ignoring, for the moment, airflow patterns inside the case the GPU is
>>putting out the same amount of heat flux regardless of how well you cool it
>>(or how hot it gets) so, from the internal case perspective, there is no
>>difference in the overall heat load. I.E. the average internal case
>>temperature should be unaffected and so should system temp.
>
>
> I can't agree here because my GPU, though still producing the same amount
> of heat, is not as hot as it used to be because it's cooled better. So, the
> overall heat load is lower, hence the lower case temp. It's like removing a
> big redundant heater from a house that already has a few heaters.
You're confusing temperature with heat flux, and they're not the same thing.
Heat flux is the power dissipated, whereas temperature is the result of that
heat flux flowing across the resistance to the heat flow (I.E. how good the
heatsink is); just as voltage and current are not the same thing, with
voltage being the result of how much current flows across an electrical
resistance (or vice versa, depending on how one orients the equation).
The power being dissipated has not changed, so the heat flux out of the GPU
has not changed regardless of how 'hot' the GPU is getting. How hot it gets
is determined by how much resistance your GPU cooler presents to the heat
flux, but you can't tell that by measuring how hot things around it get
because, again, the same amount of heat flux is being dissipated.
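To put the analogy in numbers (the wattage and thermal resistance figures below are made-up illustrations, not measurements of any real card), the relationship is the thermal version of Ohm's law: temperature rise = power x thermal resistance, just as V = I x R.

```python
# Thermal analogue of Ohm's law: delta-T = P * R_th, just as V = I * R.
# All numbers below are invented for illustration.

def component_temp(power_w, r_th_c_per_w, ambient_c):
    """Steady-state die temperature for a given heat flux (watts)
    and heatsink thermal resistance (C per watt)."""
    return ambient_c + power_w * r_th_c_per_w

POWER = 40.0    # GPU dissipation in watts; unchanged by swapping coolers
AMBIENT = 35.0  # case air temperature in C (assumed)

old_cooler = component_temp(POWER, 1.0, AMBIENT)  # 1.0 C/W, old cooler
new_cooler = component_temp(POWER, 0.5, AMBIENT)  # 0.5 C/W, better cooler

print(old_cooler)  # 75.0 C with the old cooler
print(new_cooler)  # 55.0 C with the new cooler
# The heat flux into the case is POWER watts in both cases, so nothing
# about the case temperature changes just because the die runs cooler.
```

The cooler swap changes only `r_th_c_per_w`, so the die temperature drops while the watts dumped into the surroundings stay identical.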
Let's take your room heater analogy. No, you have not removed a heater;
you've changed (you hope) the thermal resistance (with your new heatsink)
between it and the room it's in. I.E. imagine the heater is a constant 1
kilowatt and has a wooden box (a poor thermal conductor) around it (and we'll
presume it doesn't catch fire or self-destruct). The heating element has to
reach a certain temperature to force the heat across the thermal resistance
of the box (your old GPU cooler) and it'll get as hot as it takes to do so
(which, in the real world, probably WILL cause it to catch fire or
self-destruct from getting TOO hot). Replace the wooden box with a better
conductor, a metal box, and the thermal resistance is now lower (your new
GPU cooler), so that same 1 kilowatt heater element doesn't have to get as
hot to dissipate the energy (I.E. it runs 'cooler' with your new, lower
resistance, heatsink). Does the room get warmer or cooler?
Neither. There's still that exact same 1 kilowatt of heat being pumped into
the room just like before. The heater element isn't as 'hot' now, because
the metal box is a better thermal conductor, but there's no way to know
that from the room temperature because it is not affected. I.E. A kilowatt
into the room is a kilowatt into the room.
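Putting rough numbers on the heater-in-a-box example (the resistance figures are invented for illustration, not real properties of wood or metal):

```python
# A constant 1 kW heater in a box: the element must get hot enough to
# push 1000 W across the box's thermal resistance, but the room always
# receives the full 1000 W either way. Figures are illustrative only.

HEATER_POWER = 1000.0  # watts, constant
ROOM_TEMP = 20.0       # room temperature in C (assumed)

def element_temp(r_th_c_per_w):
    # Temperature rise across the box = P * R_th
    return ROOM_TEMP + HEATER_POWER * r_th_c_per_w

wooden_box = element_temp(0.5)   # poor conductor, like the old GPU cooler
metal_box = element_temp(0.05)   # good conductor, like the new cooler

print(wooden_box)  # 520.0 C: hot enough to be a fire risk
print(metal_box)   # 70.0 C: same power, much cooler element
# Heat delivered to the room is HEATER_POWER in both cases, so the
# room temperature cannot tell you which box is on the heater.
```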
>>If the motherboard's "system temperature" changed, as you indicate it did,
>>then it most likely means that the fan on the GPU cooler has altered the
>>internal case airflow (or something else changed even though you are
>>unaware of it;
>
>
> I think you shouldn't underestimate me at this - I know it's just because
> part of the cooling radiator is sticking out the case and that's why the
> heat surplus is flying away outside
I hadn't heard anything about an external radiator. However, that still
doesn't mean your measurement of system temperature is representative of
how well the GPU is being cooled. It's simply a variant of the altered
heating of the system temp sensor that I mentioned. I.E. I postulated the
possibility of the heat flux being directed away from the sensor, and heat
being dissipated 'outside' is certainly 'away' from the sensor.
I.E. Ok, so the heat flux, or at least some of it, is being vented to the
outside, so the 'system temp' is 'cooler'. That says nothing about how hot
the GPU has to get for the heat flux to flow through its thermal resistance,
be it to the 'outside', 'inside', or wherever.
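A rough sketch of why venting some heat outside moves the 'system temp' without saying anything about the GPU die (all figures invented for illustration):

```python
# Venting part of the GPU's heat flux outside the case lowers the heat
# load the case sensor sees, but the GPU die temperature is still set
# by its own cooler's thermal resistance. Figures are illustrative.

GPU_POWER = 40.0  # watts dissipated by the GPU (assumed)
CASE_R_TH = 0.2   # C/W from case air to room air (assumed)
ROOM = 22.0       # room temperature in C

def case_temp(heat_into_case_w):
    # Case air temperature rise over the room = P_inside * R_th(case)
    return ROOM + heat_into_case_w * CASE_R_TH

all_inside = case_temp(GPU_POWER)         # radiator entirely in the case
half_vented = case_temp(GPU_POWER * 0.5)  # half the flux dumped outside

print(all_inside)   # 30.0 C
print(half_vented)  # 26.0 C: 'system temp' drops...
# ...yet the GPU die temperature, ambient + GPU_POWER * cooler R_th,
# depends only on the cooler, not on where the case heat ends up.
```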
>>such as cable locations)
>
>
> cable locations can make a difference of 12C?
Sure they could, if cables block off airflow around the sensor (and then,
once unblocked, its reading would drop) or cause a source of hot air to blow
on the sensor.
> I'd have to wrap the
> motherboard and the CPU with them!
You obviously didn't put much thought into it.
>>so that the motherboard sensor is now being cooled better,
>>but that says nothing about how well it's cooling
>>the GPU.
>
>
> I don't see your point
The point is that a drop in the system temperature says nothing about how
well the GPU cooler is performing.
> - well of course I can't be sure how much the GPU
> has cooled down but still I can be sure that given all the remaining pieces
> of the puzzle I get the answer
Depends on which pieces of the puzzle you're talking about and what
'answer' you think you're getting.
I'm only talking about your claim to Phil that you 'know' the GPU cooler is
doing a better job than the old one 'because' your system temp went down.
While you may 'know' it's doing better from some other 'piece of the
puzzle', it sure isn't because the system temp went down.
>>And, btw, the "system temperature" being lower doesn't mean that the
>>average case temperature is any lower either. It simply means that
>>particular 'hot spot' is not as hot but it too is still putting out the
>>same heat load into the case.
>
>
> Yeah, but I don't think this hot spot suddenly decided to cool down a
> little bit just to please me.
I didn't say it did.
> There has to be a cause and effect, doesn't
> it?
And I gave you some possibilities for what the cause might be but the
specifics aren't important because that's not the topic. The point is that
a drop in system temp, for whatever reason, is not an indicator of how well
the GPU cooler is cooling the GPU.