Why does everyone say 220W TDP for FX 9370

Petabyte

Reputable
Feb 14, 2015
70
0
4,660
Just wondering why everyone bashes the 9000 series AMD CPUs for high power usage when the actual draw I'm seeing is only 76.10W. I've searched around and everyone ALWAYS refers to the 220W TDP.

J1Pthz.png
 

Justin Millard

Reputable
Nov 22, 2014
1,197
0
5,660
220W TDP means a massive cooling requirement. They are so heavily overclocked from the factory that they are very power inefficient, hot, and unreliable under load. You need a good power supply, case, and a good aftermarket water cooling solution like an H100i to keep them cool and reliable in the summer.

Here is what wikipedia has to say about TDP.
"The thermal design power (TDP), sometimes called thermal design point, is the maximum amount of heat generated by the CPU that the cooling system in a computer is required to dissipate in typical operation."

A 220W TDP means a lot of heat that the cooling has to deal with.
 

Petabyte

Reputable
Feb 14, 2015
70
0
4,660
OK, here's the result while stress testing the CPU. Again, it doesn't go much higher than 76W, topping out at 78W. So does the TDP mean that it can withstand more heat than average? I just don't understand how 78W relates to 220W. The CPU has treated me well over the past few years, mainly gaming, but I notice it gets lots of hate for its TDP, which doesn't really mean anything if it's not the power it actually uses.

Here's more from the wiki:
"The TDP is typically not the largest amount of heat the CPU could ever generate (peak power), such as by running a power virus, but rather the maximum amount of heat that it would generate when running "real applications." This ensures the computer will be able to handle essentially all applications without exceeding its thermal envelope, or requiring a cooling system for the maximum theoretical power (which would cost more but in favor of extra headroom for processing power)."


W8emM1.png
 
Many people confuse TDP with power consumption; watts can represent either. TDP (thermal design power), as mentioned above, is the amount of heat the cooler is expected to dissipate under typical loads. Compared to, say, the 88W of an Intel CPU, 220W is 2.5 times the heat to be dissipated.
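The comparison works out like this (a quick sketch; the 88W figure is assumed to be the TDP of an Intel quad core of that era, as above):

```python
# Compare TDP figures: how much more heat must the cooler handle?
amd_tdp_w = 220.0    # FX-9370 rated TDP
intel_tdp_w = 88.0   # e.g. a Haswell i7 (assumed comparison point)

ratio = amd_tdp_w / intel_tdp_w       # 2.5x the heat output
percent_more = (ratio - 1.0) * 100.0  # i.e. 150% more heat

print(f"{ratio:.1f}x the heat, i.e. {percent_more:.0f}% more to dissipate")
```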

Tests like this have been done measuring system power consumption. It's not CPU-only, but obviously a PC operates as a system, not just one part.

"When it comes to single-threaded loads, the top-end Vishera-based CPUs consume 70% more power than their Haswell-based opponents."

"The FX-9370 and FX-9590 really need 100 to 120 watts more power than the FX-8350 at full load. As a result, the overall power draw of a configuration with a top-end Vishera-based CPU may be twice that of a Haswell-based configuration that would deliver the same performance and cost the same money."

http://www.xbitlabs.com/articles/cpu/display/amd-fx-9590-9370_7.html

It's not to say they don't work, but they're highly inefficient.
 

Petabyte

Reputable
Feb 14, 2015
70
0
4,660
Hi synphul. Thanks for the reply. I notice on the power consumption graph on xbitlabs that the draw for the FX 9370 is just under twice what my CPU takes at max load at stock clocks. How could that be? It's showing 57.6W higher than mine at full load. Where are they getting 136W?

One more thing I have to point out with those graphs: how can the Intel full-system load number be lower than what the GPU (GTX 780 Ti) draws by itself?
At peak the card draws 269W, as seen here:
7fwlDV.png


This is the 780 Ti's average draw:
PSoIQ5.png


Yet the full-system draw is only 151-168W:
Qub1Gj.png


This is from

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html
 

migronesien

Honorable


By running a CPU benchmark.
 
Right, the GPU was probably idle. It was a CPU load used to test the full system running (every part needed for a functioning PC, since a CPU can't run by itself) with only the CPU being loaded. It was also measured using a power meter at the wall plug for just the tower, not the monitor and tower together. Hardware meters are more accurate than software, which attempts to read from motherboard sensors.

Another way some power consumption benchmarks test is to use a clamp amp meter over the wires leading to the CPU power plug from the PSU. Software reporting tools are mediocre most of the time and don't necessarily test everything properly or take everything into account.

When comparing to an Intel system, the same/similar hardware is used. Obviously the motherboard can't be the same since they require different sockets, but it too uses a motherboard, RAM, data drive, GPU etc., same as the AMD system. Both the AMD and Intel tests in these comparisons use a fully functioning complete tower, so it's a fair comparison pitting CPU/system against CPU/system.
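This also answers the 780 Ti question above. A rough model of what a wall-plug meter reads during a CPU-only load (all component figures here are illustrative guesses, not measurements): the GPU sits near its idle draw rather than its ~269W peak, and the meter also sees PSU conversion losses:

```python
# Rough model of wall-plug draw during a CPU-only benchmark.
# All component figures are illustrative assumptions, not measurements.
cpu_load_w = 78.0        # CPU under stress test (from the OP's reading)
gpu_idle_w = 12.0        # a GTX 780 Ti idles far below its peak (assumed)
rest_of_system_w = 35.0  # motherboard, RAM, drives, fans (assumed)
psu_efficiency = 0.85    # typical PSU efficiency at this load (assumed)

dc_draw_w = cpu_load_w + gpu_idle_w + rest_of_system_w
wall_draw_w = dc_draw_w / psu_efficiency  # meter sees AC input, not DC output

print(f"Estimated wall reading: {wall_draw_w:.0f}W")
```

With numbers in that ballpark, a full-system reading in the 150W range during a CPU test is entirely plausible even with a high-end GPU installed.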

It really depends on the personal outlook of the user as to whether it's an issue or not. It may also depend on someone's region, where energy cost has to be factored in. For instance, the cost of electricity in England runs around double per kWh what it does where I live in the U.S., so it might matter more to someone living there.
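As a rough sketch of why rates matter (the ~110W delta is the midpoint of the 100-120W range from the xbitlabs quote above; the hours and per-kWh rates are placeholder assumptions, not actual tariffs):

```python
# Extra yearly cost of roughly 110W more full-load draw.
extra_watts = 110.0   # midpoint of the 100-120W delta quoted above
hours_per_day = 4.0   # assumed daily full-load time
days_per_year = 365

extra_kwh_per_year = extra_watts / 1000.0 * hours_per_day * days_per_year

us_rate = 0.13  # $/kWh, placeholder
uk_rate = 0.26  # $/kWh equivalent, placeholder (~double, per the post)

print(f"{extra_kwh_per_year:.0f} kWh/yr extra")
print(f"US: ${extra_kwh_per_year * us_rate:.2f}/yr, "
      f"UK: ${extra_kwh_per_year * uk_rate:.2f}/yr")
```

Not a huge number for one home PC, but it scales directly with usage hours, rates, and machine count.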

Heat is another factor: AMD's 8-core CPUs produce more total heat from power consumption than Intel's quad-core CPUs do. Partly because they're less efficient per core, partly because they're running twice the processing cores. Oftentimes the AMD 8-core CPUs get a further bad rap because, when it comes to application performance, they compete closely head to head with Intel's 4-core CPUs (close enough for comparison purposes). When it takes twice the hardware (CPU cores) to achieve the same performance, that's another aspect of the inefficiency.

It doesn't mean the FX 9370 is totally incapable; it just means it consumes more power and generates more heat to do the same task(s). Whether or not that's an issue depends on the individual. For one home-use PC it may not matter much. If someone's energy rates are higher it could matter more to them. If it's a business with 50-100 PCs it could matter much more.
 