Effects of TDP on Temperature and Power Consumption?

thismafiaguy

I used to have an AMD FX-8350, which has a TDP of 125W as we know it, and it ran incredibly cool at stock clocks under a Corsair H80i. However, when I switched to an Intel i5-4670K with a TDP of 84W, the load temps were 20°C higher than the FX-8350's on average. This was a while back, but I do remember adjusting for ambient temperature, and the results were from the same Corsair H80i using the same system monitoring software. After I delidded my 4670K, the load temperature dropped by about 10°C, but I still ultimately do not understand why my FX-8350 ran cooler than my 4670K, also at stock clocks and even after delidding.

As far as I understand, TDP stands for Thermal Design Power and is meant to reflect the expected heat output from a processor under typical loads. So this means that my FX-8350, with its higher TDP, should have run hotter than my 4670K, but that was not the case. Can anyone explain what could have possibly been the reason for this?

On a separate note, I see that a lot of people mistake TDP for power consumption and automatically think that a power-hungry processor will inherently produce more heat, ignoring the architectural power efficiency of the processor in question. Some rumors are suggesting that an upcoming AMD GPU will have a 300W TDP, and that has got people worried, but mostly about power consumption. We know by the definition of TDP that this actually means it will be a very hot GPU, but what do you think it will indicate about power consumption? If AMD has managed to improve the power efficiency of this GPU, is it possible for it to consume the same or less power than other GPUs with lower TDPs?
 
Solution


The way AMD CPUs measure heat is different from how Intel CPUs measure it. Furthermore, AMD CPUs don't even have an on-chip thermometer; the temperature is estimated with a mathematical algorithm. What an AMD CPU reports as its core temp is believed to be closest to the "surface" temp of the heat spreader. Of all the temps reported in an AMD system, the one believed to be closest to the actual core temp of the chip is the "socket" temp on the motherboard.

TDP is the estimated number of watts of heat energy a CPU cooler will need to be able to deal with under heavy workloads. AMD and Intel measure this differently. In fact, Intel has been accused by OEMs, AMD, and ARM of playing games with their TDP numbers (in their low-power-consumption chips such as Atoms and low-end Core Y chips) in order to make their chips appear more energy efficient. The recent Broadwell Y parts seem to continue the trend, as OEMs have come to recognize that 5W-rated heatsinks aren't good enough to keep the 5W TDP Broadwell parts cool.

That said, while TDP is a HEATSINK rating, not a power consumption rating, it usually does a pretty good job of giving the consumer a good idea of the AVERAGE power draw of a CPU/GPU (non-overclocked) under a full load. For example, Haswell Core i5 CPUs rated at 88W TDP typically clock in around 87-91W average power draw under full load.

The issue gets murkier with GPUs. GPUs actually have a much greater load "range" for their power draw; under load a GPU's power draw will vary greatly, sometimes by as much as 100W in under a second. So while the TDP of a GPU typically does a solid job of indicating how much heat energy the GPU cooler needs to move, and does a solid job of giving you a ballpark average power draw under load, it does not come close to representing the MAX power draw the GPU will pull. For example, while an R9 280X might be rated at 250W TDP, it can draw as much as 300W under full load. The same could be said of the GTX 780, which was rated with a 250W TDP but regularly hit power draw numbers around 300W under load.
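To make that average-vs-peak distinction concrete, here is a minimal sketch; the power samples and the 250W TDP figure are made up for illustration, not measurements:

```python
# Minimal sketch (all sample numbers are made up, not measurements): TDP tends to
# track the AVERAGE power draw under load, while momentary spikes can sit well above it.
power_samples_w = [238, 252, 261, 299, 245, 230, 288, 255, 242, 270]  # hypothetical 1-second readings
tdp_w = 250  # rated TDP of the hypothetical card

avg_draw = sum(power_samples_w) / len(power_samples_w)
peak_draw = max(power_samples_w)

print(f"average draw: {avg_draw:.0f}W (near the {tdp_w}W TDP)")
print(f"peak draw:    {peak_draw}W (well above the TDP)")
```

The average of the samples lands near the rated TDP, while the instantaneous peak sits well above it, which is the gap that trips people up when sizing power supplies.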

A 300W TDP isn't a big deal. The 780 Ti had a reported TDP of 250W, yet no one really noticed or cared that the card drew significantly more power from the wall than the 780; it even had the two 8-pin power plugs to draw 350+W, which is roughly where its max power draw from the wall regularly spiked under load.


As for the new 300W card? It will warm your room pretty well in the winter, but I'll tell you this: if it's 20-30% faster than the GTX 980 or Titan II, then no one will care how much of a power pig it is.
 

iamlegend

TDP = Thermal Design Power, the amount of heat (power) that a component's cooling is designed to safely absorb.
In thermodynamics the sensible-heat equation is Q = m · Cp · (T2 - T1)

Where Q is the heat energy (power is heat energy per unit of time)
m is the mass
Cp is the specific heat capacity (at constant pressure)
T2 is the final temperature
T1 is the initial temperature

This is the basic engineering thermodynamics computation; for electrical components (a CPU) the exact equation is different, but the principle is the same.

So, given the above, you will notice that there is a direct relationship between Q (heat) and temperature: as the temperature rises, the heat that must be handled also increases. However, in design work a factor of safety is usually included in the computation, typically 1.5, 2, 3, or 5 depending on the designer.

With an 88W TDP and a factor of safety of 1.5, you can go up to 132W before reaching the design limit. Running that close to the limit is unsafe, though.
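To tie the equation and the factor-of-safety point together, here is a minimal sketch in Python; the heatsink mass, temperatures, and timing are hypothetical illustrations, and only the 88W TDP and the 1.5 safety factor come from the post above:

```python
# Minimal sketch of Q = m * Cp * (T2 - T1) and the factor-of-safety idea above.
# The heatsink mass, temperatures, and timing are hypothetical; only the 88W TDP
# and the 1.5 safety factor come from the post.

def heat_energy_j(mass_kg, cp_j_per_kg_k, t1_c, t2_c):
    """Heat energy in joules needed to raise mass_kg of material from t1_c to t2_c."""
    return mass_kg * cp_j_per_kg_k * (t2_c - t1_c)

# e.g. a 0.5 kg copper heatsink (Cp of copper is about 385 J/kg·K) warming from 30°C to 70°C
q = heat_energy_j(0.5, 385, 30, 70)  # 7700 J absorbed
print(f"heat absorbed by the heatsink: {q:.0f} J")

# Power is heat energy per unit time: absorbing 7700 J over 100 s averages 77 W.
print(f"average power over 100 s: {q / 100:.0f} W")

# Factor-of-safety headroom: 88W TDP with a 1.5 safety factor gives a 132W design limit.
tdp_w, safety_factor = 88, 1.5
print(f"design limit: {tdp_w * safety_factor:.0f} W")
```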
 

thismafiaguy

Thanks guys, very good answers from both of you! It seems that it might take a bit more time before we can have truly fanless laptops that stay cool under load. I really didn't expect the 780 Ti to have a max power draw in 300W territory, so the fanboys are just fanboys after all; Nvidia GPUs definitely aren't "twice" as efficient as AMD GPUs. I really think AMD is going to give Nvidia a run for their money this year with new GPUs. As a consumer, you just gotta love competition in the market!

I have to downsize my rig to fit into a carry-on suitcase, so I'm actually a bit spooked by a high TDP. I can modify an ITX case to house a dual 140mm radiator, but even then, I'm not so sure it will be able to handle my 4670K plus a 300W TDP GPU. But here's to hoping that it all somehow works!
 


Well, it depends on what they mean by "efficient". If they are measuring TFLOPS per watt, then yes, the new Maxwell GPUs are probably 2x more efficient than 2-3 year old AMD GPUs on the GCN 1.x architecture.

If they're comparing a 170W GTX 970 against a yet-to-be-released 380X based on a 300W rumor, then yes, they're being fanboys... and even if this 300W GPU is only 20-30% faster while drawing almost 2x the power, no one will care that the Maxwell design is more power efficient, because it's 20-30% slower.

That's like saying a 5W Broadwell Y part is 3x more efficient than a 15W Haswell part when the Haswell part is 30% stronger. They aren't even in the same discussion at that point; they might as well be totally different chips in totally different markets.
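The apples-to-oranges nature of that comparison is easy to see with a quick performance-per-watt calculation; the sketch below uses made-up numbers loosely mirroring the 5W-vs-15W analogy above:

```python
# Minimal sketch of a performance-per-watt comparison. The numbers are made up,
# loosely mirroring the 5W-vs-15W analogy above, just to show how a slower chip
# can still win on "efficiency".
chips = {
    "5W low-power part":   {"perf_tflops": 1.0, "power_w": 5},
    "15W full-power part": {"perf_tflops": 1.3, "power_w": 15},  # ~30% stronger in absolute terms
}

for name, spec in chips.items():
    efficiency = spec["perf_tflops"] / spec["power_w"]  # TFLOPS per watt
    print(f"{name}: {spec['perf_tflops']} TFLOPS at {spec['power_w']}W -> {efficiency:.3f} TFLOPS/W")

# The 5W part comes out roughly 2.3x more "efficient" even though the 15W part is
# ~30% stronger in absolute terms - exactly the apples-to-oranges comparison above.
```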

If you have a 30% or greater jump in performance with the R9 380X over the best offering Nvidia has on the market, I'll guarantee that no one will care if it draws 2x the power.
 

thismafiaguy

I was referring to the endless fanboy debate between the 780 Ti and the 290X. Both were ludicrously expensive at one point, and the people who bought them seemed determined to defend their purchase to the death.

Before the Maxwell debut, I thought the 780Ti was quite a bit overpriced, and AMD definitely should have done a better job with the 290 series reference coolers. Thanks to Bitcoin mining and TSMC's inability to produce 20nm chips, it has been a pretty stale year in the GPU market. Maxwell was the only interesting thing of the year, but it didn't necessarily push the performance envelope like I was hoping. As a desktop PC gamer, I wouldn't even think about power consumption as long as I got the highest possible performance. It's good to have improved power efficiency, but that should have been a lower priority for desktop graphics.

The fanboys will always find something to argue about, meaningful or not. I just hope there will be some much-needed competition in the GPU market again soon. We need another Radeon 7970 vs. GTX 680 battle. As it is now, Nvidia doesn't even seem to be trying; just take a look at the latest information about the GTX 960: http://wccftech.com/asus-zotac-evga-gtx-960s-pictured/
 


Well, the R9 290X sort of deserved the crap it got with that horrible stock cooler AMD stuck on it; even though it drew similar power numbers under load to the Titan, it ran so f-ing hot that it throttled itself and underperformed the numbers it should have been able to put up. That whole stock cooler fiasco sort of sank the whole product lineup until Bitcoin miners lost their collective minds and started buying them by the bushel. Then, of course, the market got flooded with busted-up cards on eBay as those same people tried to scam some cash out of unsuspecting buyers by selling their used mining junk, further damaging the card's rep.

No, the R9 290X clearly deserved most of the horrible press and word of mouth it got.

That said, the 780 Ti, when seriously overclocked by an enthusiast, was a much bigger power pig than the R9 290X; heck, most enthusiasts would tell you that you weren't really overclocking your 780 Ti unless you shoved 500W down its gullet.
 
