How can I find or make a graph of CPU/GPU Load vs Temperature?

rritoch

Reputable
Apr 6, 2014
5
0
4,510
I am working on a thermal control system that varies the core clock speed of a GPU/CPU to maintain a specific temperature. I want to include load in the calculation, but to do so I need to know the normal temperature response to load (the partial derivative of temperature with respect to load) when all other variables (core speed, fan speed, etc.) are held constant. I intend to factor load into the equations that increase clock speed, but not into the equations that reduce it. The reasoning: if load is below 100%, I don't want the system to crank the core speed up to maximum only to burn up the chip when the load spikes to 100%. I have considered using a straight-line function from idle temperature to target temperature, but it may not be the ideal solution.

T_target_applied = Target temperature to plug into existing optimization algorithm
T_idle = Idle temp
G_load = GPU Load (0 to 1)
T_target = Actual target temperature

T_target_applied = T_idle + (G_load * (T_target - T_idle))
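The straight-line version above is easy to sketch in code. This is just the formula from the post; the 40 °C idle and 80 °C target figures are made-up example values:

```python
def applied_target(t_idle, t_target, g_load):
    """Linearly interpolate the applied target temperature between
    the idle temp (at load 0) and the actual target (at load 1)."""
    return t_idle + g_load * (t_target - t_idle)

# Hypothetical values: 40 C idle, 80 C actual target.
print(applied_target(40.0, 80.0, 0.5))  # -> 60.0, halfway between idle and target
```

At 0% load the applied target collapses to the idle temperature, so the controller never raises clocks on a cool, idle chip; at 100% load it equals the real target.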

To find an ideal solution I need to see the normal temperature response of a CPU/GPU to load.
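Once you have logged (load, temperature) pairs at a fixed core and fan speed, the partial derivative you are after can be estimated as the slope of an ordinary least-squares fit. A minimal sketch; the sample data below is invented purely for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical samples taken with core and fan speed held constant.
loads = [0.0, 0.25, 0.5, 0.75, 1.0]
temps = [40.0, 50.0, 60.0, 70.0, 80.0]
a, b = fit_line(loads, temps)
print(a, b)  # b estimates dT/dLoad at these settings (here 40 C per unit load)
```

If the real response turns out to be nonlinear, the same logged data lets you check that directly and pick a better curve than the straight line.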
 

rritoch



I am referring to CPU usage percentage so .5 = 50% cpu usage, 1 = 100% cpu usage, 0 = idle, etc.

 
He wants a way to record this data; he's not asking us to generate it.

My best guess on how to do it would be to get an application like MSI Afterburner (a GPU overclocking tool that also monitors things like temperature, % usage, and clock speed) and find some way for it to export its results as raw data, then import that into an Excel sheet to turn into a graph. I don't have Afterburner or any overclocking programs on my computer, so I don't know if it can do that natively, but given that you're creating what is essentially an auto-overclock tool by the looks of it, I think you might be able to code up a solution for it.
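If you end up rolling your own logger instead of exporting from Afterburner, the recording part is simple: sample load and temperature at a fixed interval and append rows to a CSV that Excel can graph. A minimal sketch; `read_load` and `read_temp` are placeholders you would replace with real sensor reads (e.g. NVML on NVIDIA GPUs), and the dummy values they return are invented:

```python
import csv
import time

def read_load():
    """Placeholder: return current load as a fraction 0..1.
    Replace with a real sensor read for your hardware."""
    return 0.5  # dummy value for illustration

def read_temp():
    """Placeholder: return current temperature in Celsius.
    Replace with a real sensor read for your hardware."""
    return 55.0  # dummy value for illustration

def log_samples(path, n_samples, interval_s):
    """Record n_samples rows of (elapsed time, load, temp) to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "load", "temp_c"])
        start = time.time()
        for _ in range(n_samples):
            writer.writerow([round(time.time() - start, 2),
                             read_load(), read_temp()])
            time.sleep(interval_s)

log_samples("thermal_log.csv", 3, 0.01)
```

For the actual experiment you would pin core speed and fan speed, step the load (e.g. with a stress tool), and let this run long enough at each step for the temperature to settle before moving on.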