I've been dubious about temperature readings from CPUs and GPUs in the past but never thought much of it. Recently, though, a series of observations has me really confused.
I've got an HTPC with an 8-core AMD FX-8300, originally with an R7 240 and now upgraded to an R7 260X. Even under heavy load, like running the new Thief, the old R7's fan would spin like a buzzsaw, yet it reported a temp that never went over 70 or 80 F. The CPU fan was noisy too, so I decided to upgrade the video card to the R7 260X and add a Corsair H100 to the CPU. Now, with a different video card, it reports GPU temps of 35-40 F under load, and the CPU reports no higher than 55 F under load. It can't be over 70 in this room right now, but still, I can't imagine ANY water- or air-based cooling system being able to cool to 20 degrees below ambient.
Are the various temp sensors in a PC just inherently inaccurate? If so, why include them? Can I calibrate them in some fashion, or would it not help? I'm going to go grab one of those laser thermometers and take some notes, but before then I just wanted to see if anyone had any thoughts.
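One thing worth sanity-checking before calibrating anything is the units: most PC monitoring tools report Celsius by default, and the numbers above line up suspiciously well with that assumption. A quick sketch of the conversion arithmetic (plain Python, no hardware access, just applied to the readings quoted above):

```python
def c_to_f(celsius: float) -> float:
    """Convert a Celsius reading to Fahrenheit."""
    return celsius * 9 / 5 + 32

# If the sensors are actually reporting Celsius, the readings above
# correspond to perfectly plausible above-ambient temperatures:
for c in (35, 40, 55):
    print(f"{c} C = {c_to_f(c):.0f} F")
# 35 C = 95 F
# 40 C = 104 F
# 55 C = 131 F
```

Under that reading, a 55-degree CPU under load is well above a 70 F room, not below it.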