Question about TDP

gtadem

Suppose two PCs with identical hardware running identical software. The only difference is that one has a CPU that is twice as strong and rated for twice the TDP. Will the stronger CPU run hotter all the time because it is stronger, or cooler most of the time since it doesn't have to work as hard to achieve the same results?
 

Eximo

If your theoretical chips double TDP to double performance, they have the same IPC and efficiency. So the more powerful chip at half power should use the same amount of energy as the weaker chip running at maximum.

In the real world there is a minimum overhead just to run the chip in the first place, so even a dual-core vs. a quad-core of the same design won't be exactly half/double the power.
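
To put rough numbers on that, here is a minimal Python sketch of the idea. The 50 W and 100 W chips and the 5 W idle overhead are made-up figures for illustration, not measurements from any real CPU.

```python
# Hypothetical numbers only: a 50 W chip vs. a 100 W chip of the same design,
# where doubling the power budget doubles performance (same efficiency).
def package_power(tdp_watts, load_fraction, idle_watts=5.0):
    """Rough model: power scales with load on top of a fixed idle overhead."""
    return idle_watts + (tdp_watts - idle_watts) * load_fraction

weak_full   = package_power(50.0, 1.0)   # weaker chip flat out
strong_half = package_power(100.0, 0.5)  # stronger chip at half load

print(f"weak @100%: {weak_full:.1f} W, strong @50%: {strong_half:.1f} W")
# Ideally both would draw ~50 W; the fixed idle overhead is why the stronger
# chip at half load ends up a little above half of its TDP in practice.
```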
 
Solution

gtadem



If one CPU were twice as strong as the other, applications wouldn't cause both CPUs to run at the same load. I'd say that's common sense, but it would be rude to do so. In my question, both scenarios I entertained were reasonable. Since they cannot both be true simultaneously, I was looking to learn which way it plays out in practice. Saying it's common sense isn't helpful, especially if what you claim to be common sense is self-contradictory.
 
If, in your theory, double TDP equates to double performance, then as said it means the same efficiency. Say a task takes n seconds to complete on the weaker chip; it will take n/2 seconds on the stronger chip if both run at the same percentage of their full speed.

Why would the stronger chip not work as hard? It takes less time, but it does equal work, for this hypothesis to make sense.

While rendering identical videos, both CPUs will run at full load to finish the task as soon as possible.
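
As a quick sanity check on that timing argument, here is a small Python sketch with invented numbers (100 units of work; a 100 W chip doing 1 unit/s vs. a 200 W chip doing 2 units/s):

```python
# Hypothetical render job: 100 units of work. The weaker chip does 1 unit/s
# at 100 W; the stronger chip does 2 units/s at 200 W (double TDP, double performance).
work_units = 100.0

weak_time   = work_units / 1.0          # 100 s on the weaker chip
strong_time = work_units / 2.0          # 50 s on the stronger chip

weak_energy   = 100.0 * weak_time       # watts * seconds = joules
strong_energy = 200.0 * strong_time

print(f"weak:   {weak_time:.0f} s, {weak_energy/1000:.1f} kJ")
print(f"strong: {strong_time:.0f} s, {strong_energy/1000:.1f} kJ")
# Same total energy (10 kJ each); the stronger chip just dissipates it
# twice as fast, over half the time.
```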
 
Not sure what you meant by that, but I'd like to add another variable.

Temperature. Since your question revolves around how hot or cool the chip gets, assume the temperature increases proportionally with load, say from 30C at 0% load to 80C at 100% load, meaning 1C for every 2% of added load. Both chips will then heat up at the same rate, since they have the same efficiency (double the power for double the performance).

Going by that, both chips would be at 55C at 50% load. So say both are assigned the same task, rendering a 100-second video, and to make it fair we limit the better chip with double the performance to 50% so that they both take equal time.

Now, the better chip at 50% will operate at 55C while the weaker chip at 100% will operate at 80C, so technically the weaker chip will be hotter. But the heat released equals the power consumed, and 50% of the double-TDP CPU's power is still equal to 100% of the weaker CPU's TDP. For example, if the weaker chip is 100W TDP and the stronger is 200W, then 50% of 200W is still 100W.

So in theory (and only in theory), both will produce the same amount of heat in the same amount of time.
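
Putting the whole example into one small Python sketch, using the assumed numbers above (a linear 30C-to-80C temperature curve, a 100 W and a 200 W chip, and the 100-second render); this is an illustration of the arithmetic, not a model of any real CPU:

```python
# Assumptions from the post: temperature rises linearly from 30C at idle to
# 80C at full load (1C per 2% load); the weaker chip is 100 W TDP, the
# stronger chip 200 W TDP with double the performance.
def temp_c(load_percent):
    return 30.0 + 0.5 * load_percent

def power_w(tdp_watts, load_percent):
    return tdp_watts * load_percent / 100.0

render_seconds = 100  # the 100-second video, with the stronger chip capped at 50%

weak   = {"temp": temp_c(100), "power": power_w(100.0, 100)}
strong = {"temp": temp_c(50),  "power": power_w(200.0, 50)}

print(f"weak   @100%: {weak['temp']:.0f}C, {weak['power']:.0f} W")
print(f"strong @ 50%: {strong['temp']:.0f}C, {strong['power']:.0f} W")
print(f"heat over {render_seconds} s: "
      f"{weak['power'] * render_seconds:.0f} J vs "
      f"{strong['power'] * render_seconds:.0f} J")
# Both dissipate 100 W for 100 s (10,000 J each), even though the capped
# stronger chip reports the lower core temperature in this simple model.
```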