Question about TDP

Ahmadovich

Honorable
Jan 21, 2014
Hey guys,

I wonder, how do I know the maximum total amount of power a certain processor will consume?
I asked someone, and he told me that the maximum total amount of power a CPU consumes is called its "TDP".

But then I found this Wikipedia article:
http://en.wikipedia.org/wiki/CPU_power_dissipation
and the writer says in it:
"TDP is not reflecting the actual maximum power of the processor"

So the question is:
"How do I know the maximum total amount of power a certain processor will consume?"

Thanks
 
Solution


The writer on the wiki page is right; your friend is not. TDP is the amount of heat energy your HEATSINK needs to be able to deal with... not the electricity the CPU uses.

Generally you need to do some research. Which chip are you looking at?
 
TDP means both the power consumption and (roughly) the heat output. The TDP stated on a product page indicates its maximum power consumption at stock clocks. When you're water-cooling or otherwise trying to estimate how much heat it's going to be putting out, the TDP indicates the maximum amount of heat it could put out, if all electrical energy were converted to heat (which it won't be). But it is close enough for that purpose.
 


Nope, this is wrong.

At stock clocks both Intel and AMD chips can use a LOT more watts than their TDP states. This might be true of GPUs (which tend to be pretty close to their TDP at stock under load), but it most certainly isn't true of your CPU.
 


Do you have any articles to support this?
A CPU doesn't produce movement or light.
Virtually all the energy drawn by the CPU must be converted into heat.
This should suggest that the maximum heat energy produced by the CPU (TDP) is very close to the maximum power drawn by the CPU.
Generally the power drawn will be lower than the maximum because the CPU rarely runs at 100% utilization.
Overclocking the CPU of course can cause it to exceed its rated TDP and draw more power.
If the CPU were to draw "a LOT more watts than their TDP states", where does this energy go?
 


The complete statement you have quoted from is "Both Intel and Advanced Micro Devices (AMD) have defined TDP as the maximum heat generation for thermally significant periods, while running worst-case non-synthetic workloads; thus, TDP is not reflecting the actual maximum power of the processor."

The inference is that the CPU power usage may exceed the rated TDP in synthetic workloads or for short periods of time that are not "thermally significant", meaning not long enough to generate heat in excess of the TDP.

TDP is a good indication of maximum power consumed under real world usage.
It can be used as a maximum value when calculating system power usage.

Once you have calculated maximum power draw:
- Do not exceed 80% of the rated power of the power supply
- In general most of the power drawn will be from the +12V rails, so do not exceed 80% of the combined +12V rating
- Make some allowance for motherboard, fans, hard drives and any overclocking of CPU or GPU (I use 40W as a rule of thumb for most systems)

Mechanical hard drives can draw up to 20W on startup, but this quickly drops off, and at that stage the CPU and GPU are not drawing much, so this isn't a big factor unless you have a low-powered CPU + GPU and lots of mechanical hard drives.
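
To make that arithmetic concrete, here's a minimal sketch of those rules of thumb in Python; the wattages and the min_psu_watts helper are illustrative placeholders, not figures from any vendor tool:

```python
# Rough PSU sizing from the rules of thumb above. All figures here are
# illustrative, not measured values.

def min_psu_watts(cpu_tdp, gpu_tdp, overhead=40, headroom=0.80):
    """Smallest PSU rating that keeps worst-case draw at or below
    80% of the unit's rated output."""
    peak_draw = cpu_tdp + gpu_tdp + overhead  # worst-case system draw (W)
    return peak_draw / headroom               # rated watts needed

# Example: a 95W-TDP CPU plus a 180W-TDP GPU, 40W for board/fans/drives.
print(min_psu_watts(95, 180))  # -> 393.75, so a ~450W unit has headroom
```

This sketch lumps everything together and ignores the +12V-rail split mentioned above, which you'd still want to check separately on multi-rail units.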
 


TDP is not a scientific measurement and has no concrete definition. All major manufacturers, including Intel, AMD, NVidia, IBM, Samsung, and Qualcomm, measure it slightly differently.

In general it is a rolling average of heat dissipation (energy is conserved, electricity in = heat out) over a significant period of time under marketed operating conditions. Taking the device outside of marketed operating conditions either through overclocking, tweaking power management settings, or running extreme synthetic benchmarks can push the actual power consumption over the same time window above the marketed TDP.
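
As a rough illustration of that rolling-average idea (the sample wattages and window length below are made up, not real telemetry):

```python
# Instantaneous draw can spike above TDP while the average over a
# "thermally significant" window stays at or below it.

def rolling_average(samples, window):
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

tdp = 84                                  # watts (rated)
draw = [70, 80, 110, 60, 70, 95, 55, 70]  # hypothetical 1-second readings

print(max(draw))                      # 110 -> brief spike above TDP
print(max(rolling_average(draw, 4)))  # 83.75 -> window average stays <= TDP
```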

A good example of a synthetic benchmark that will cause a microprocessor to exceed its TDP is running Intel Burn Test. Performing SIMD matrix calculations designed to stress floating point subsystems is not something that is considered to be normal usage.
 

Ahmadovich

Honorable
Jan 21, 2014


Isn't power going in = power going out as heat, as VincentP and Pinhedd say?





------------------------------

If the power consumed by the CPU doesn't actually equal the amount of heat it radiates, I'm looking to find out the amount of power my i5 4440 consumes.
 


For any practical use in calculating power usage, you can use the TDP as a maximum. Core i5 4440 TDP = 84W.
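
Plugging that 84W figure into the earlier sizing rules of thumb (the GPU and overhead numbers here are just example values):

```python
cpu_tdp, gpu_tdp, overhead = 84, 150, 40  # watts; only the 84W TDP is real
peak_draw = cpu_tdp + gpu_tdp + overhead  # 274W worst case
print(peak_draw / 0.80)                   # 342.5 -> a ~400W PSU is ample
```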
 
Solution


Yeah... it will be pretty close... the benches I've seen place it around 95W max (not sustained; sustained is pretty close to the TDP of 84W)... under full load on all cores. Understand that's at stock settings.

For the non-K chips, or K chips at stock settings, TDP isn't too far off the sustained-load power use on Intel CPUs, which was why I was asking what chip you were using.