So, I built a new system and OC'd to 3.6GHz (400 x 9). Everything seemed fine after some voltage tuning, but I wanted to make sure it wasn't overheating, so I took some temps while running Prime95. The results were rather confusing, at least to me. I used 3 different programs just in case, since I've heard arguments about which is better.
RealTemp - 44 C
CoreTemp - 48 C
SpeedFan - 53 C

RealTemp - 55 C
CoreTemp - 60 C
SpeedFan - 65 C
My Specs are:
G.SKILL 4GB (2 x 2GB) 240-Pin DDR2 SDRAM DDR2 800
Core 2 Duo E8400
ASUS EAH4850/HTDI/512M Radeon HD 4850
Western Digital Caviar SE16 WD6400AAKS
ASUS P5Q Deluxe LGA 775
SeaSonic S12 Energy Plus SS-550HT 550W System Builders 1pk DSP
Antec Nine Hundred
So, are any of them telling the truth? Which should I believe? Help! Thanks!
RealTemp is believed to be more accurate for 45nm chips because it uses a TjMax (maximum junction temperature) of 95, compared to most other temperature apps, which use 105. TjMax is the temperature at which the CPU will start to throttle itself (i.e., reduce speed/voltage) to prevent damage.
Temps are read from the CPU as degrees of distance below TjMax, not as absolute values. So CoreTemp and many other temp apps will give you a reading 10 C hotter than RealTemp due to the higher TjMax used in the calculation:
Actual reading from CPU = 65 (distance below TjMax)
RealTemp TjMax = 95; 95 - 65 = 30 C core temp displayed
CoreTemp TjMax = 105; 105 - 65 = 40 C core temp displayed
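The arithmetic above can be sketched in a few lines of Python. This is just an illustration of the calculation the monitoring apps perform, not their actual code; the function name and the raw sensor value of 65 are taken from the example above.

```python
# The CPU's digital thermal sensor (DTS) reports a delta: degrees
# BELOW TjMax, not an absolute temperature. Each monitoring app
# subtracts that delta from whatever TjMax value it assumes.

def displayed_temp(dts_delta, assumed_tjmax):
    """Core temp shown by a monitoring app = assumed TjMax - DTS delta."""
    return assumed_tjmax - dts_delta

dts_delta = 65  # raw sensor reading from the example: 65 below TjMax

print(displayed_temp(dts_delta, 95))   # RealTemp's assumed TjMax -> 30
print(displayed_temp(dts_delta, 105))  # CoreTemp's assumed TjMax -> 40
print(displayed_temp(dts_delta, 100))  # Intel's documented TjMax -> 35
```

Note that the same raw sensor value produces three different displayed temperatures, which is exactly why the three programs disagree by a constant offset.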
Intel has recently released a doc stating the actual TjMax for 45nm dual-cores is 100C, not 95C, so which value is more accurate is up in the air. It's up to you whether you want to trust what Intel states or the value (95) that actual testing by the author of RealTemp and other users has come up with.
Intel doc quote:
45nm Desktop Dual-Core Processors
Intel Core 2 Duo processor E8000 and E7000 series - 100°C