How does GPU usage affect temps and power draw?

Gamer-Potatoes

Reputable
Jan 16, 2016
As the title suggests, I would like to know how GPU/CPU usage affects both core temps and power draw.

From what I've read, at a fixed voltage the wattage and amperage should be fixed as well, and changing one would change the other. If this is correct, then what happens to the excess "power" given to the CPU/GPU? I am assuming the power draw is affected, since temps differ with fluctuating core usage even though clocks and voltage are fixed, and I have also read in some forums that 1000W of heat is always 1000W of heat.

Could someone shed some light on this please?
 

americanbrian

Distinguished
Your question is not very clear. Yes, 1000W of heat == 1000W of heat; that is just saying the same thing twice, not really a question.

Locking the voltage does not lock the frequency, so power states are used to reduce draw. Cores instantly increase frequency when needed, so unless you have purposely switched off power states, load makes a difference.

More power = more heat.

What is it you actually want to know?
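
As a rough worked illustration, using only the basic relation P = V × I with made-up numbers: at a fixed 1.2V, a card pulling 10A draws 12W, while the same card pulling 100A draws 120W. Locking the voltage does not lock the current; how many amps the chip pulls depends on how much of its logic is actually switching.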
 

Gamer-Potatoes

Reputable
Jan 16, 2016


Thanks for the speedy reply.
Maybe I made the question unclear, but I would like to know what happens to the excess power when the core is at full speed and full voltage despite low load.

E.g. if I lock my GPU at 1.2V and 1000/1500 core and memory clocks, then disable the power-saving features so that these clocks and voltages remain fixed, then looking at temps I can see that at "idle" it sits somewhere between 50-60C, whilst under full load with the same clocks and voltage temps can hit 85C. I would basically just like to know how the excess power is handled, and if the power draw really does change, how is the card drawing less power at low load than at high load despite identical clocks and voltage?

Maybe I made it unclear again, but I hope you catch my drift.

 

americanbrian

Distinguished
Well, if the core has nothing to do it will simply execute NOPs (no-operation instructions) all the time.

If you think about the electrical pathways through the chip, it makes sense that a NOP exercises very short, low-resistance paths, i.e. very little heat is produced.

When you ask the CPU to actually process something, the pathways and registers are suddenly changing state and a lot more "paths" are active. The total length the current travels is likely much longer than with NOPs, so the effective resistance is almost always much higher, and therefore much more heat is produced.

This is a very generalised picture I am trying to paint for you, but basically this is the difference that causes higher temperatures at load than at idle.
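
To put rough numbers on that picture: a common simplified model for the switching part of a chip's power is P ≈ alpha × C × V² × f, where alpha is the fraction of the logic toggling each clock (the activity factor). Below is a minimal Python sketch with made-up capacitance and activity values, not measured from any real card, just to show how power (and therefore heat) can differ hugely at the same locked voltage and clock:

def dynamic_power(alpha, c_eff=200e-9, voltage=1.2, freq=1.0e9):
    # Simplified CMOS dynamic power: alpha * C * V^2 * f.
    # c_eff (effective switched capacitance), voltage and freq are
    # made-up example values, not figures from any real GPU.
    return alpha * c_eff * voltage**2 * freq

idle = dynamic_power(alpha=0.05)  # mostly NOPs, little logic toggling
load = dynamic_power(alpha=0.60)  # heavy compute, lots of toggling

print(f"idle: {idle:.0f} W, load: {load:.0f} W")  # roughly 14 W vs 173 W

Same voltage and frequency in both cases; only the switching activity changes. The "excess" power at idle is never drawn from the wall in the first place, which is why temps drop even with clocks and voltage pinned.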

 
Solution
