I read that when a P4 CPU overheats it slows itself down and so produces less heat (AMD CPUs just burn). So I was wondering: at what point does having a hot CPU start affecting performance? Temperature-wise, I mean...
I benchmarked my CPU in Prime95 when it was hot (50C) and when it was cool (38C), and the timings were somewhat lower when it was cool, meaning it ran faster. (It's hard to tell from that benchmark what impact it would have on everyday performance like gaming.) So I assume there's a temperature above which the CPU doesn't perform as well.
Second question: How does the CPU slow itself down? Is it by underclocking? If it is, shouldn't I be able to see it in My Computer or in a CPU diagnostic tool?
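One way to check for underclocking yourself, on Linux at least, is to watch the reported clock speed in /proc/cpuinfo while the CPU heats up. A minimal sketch of the parsing (this is my own illustration, not from the thread, and note that if the CPU throttles by duty-cycling the clock rather than changing the multiplier, the reported MHz may not move at all):

```python
import re

def parse_cpu_mhz(cpuinfo_text):
    """Extract the reported clock (in MHz) for each logical CPU
    from the text of /proc/cpuinfo."""
    return [
        float(mhz)
        for mhz in re.findall(r"^cpu MHz\s*:\s*([0-9.]+)",
                              cpuinfo_text, re.MULTILINE)
    ]

# On a real Linux box you'd poll it in a loop, e.g.:
#   with open("/proc/cpuinfo") as f:
#       print(parse_cpu_mhz(f.read()))

# Demo on a canned snippet so the function can be tried anywhere:
sample = "processor\t: 0\ncpu MHz\t\t: 2992.518\n"
print(parse_cpu_mhz(sample))  # -> [2992.518]
```

Polling that in a loop while running Prime95 would show whether the clock actually drops under heat, or stays pinned while performance falls anyway.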
I'm a nuclear reactor cooling system programmer, if you see me running, it's probably already too late.
The video showing the Athlon burning up isn't really relevant to today's Athlons and chipsets; it's old tech. AMD has had a thermal protection method since the Palomino Athlon XP, and we're a good few cores down the line since then (T-bred A, T-bred B, Barton). Granted, the solution required some circuitry on the motherboard, so it also relied on the motherboard manufacturer, but the specs AMD gave them would work if followed properly.
The A64s, IIRC, have on-die thermal protection, so no special circuitry on the mobo is needed.
<font color=red>The preceding text is assembled from information stored in an unreliable organic storage medium. As such it may be inaccurate, incomplete, or completely wrong.</font> :wink: