Status
Not open for further replies.

Noworldorder

Distinguished
Jan 17, 2011
956
0
19,160
This is just a rhetorical question for curiosity's sake:

Let's say that the software power meter shows that the CPU is using 80 watts and that its voltage is 1.284 V.

80 W / 1.284 V = 62.3 A

Surely it is not drawing 62.3 amps, since the CPU +12 V rail is only rated at 20 amps.

What is wrong with this math?
 
From Wiki:

A voltage regulator module, or VRM, sometimes called a PPM (processor power module), is a buck converter that provides a microprocessor with the appropriate supply voltage, converting +5 V or +12 V to the much smaller voltage required by the CPU. Some are soldered to the motherboard while others are installed in an open slot. It allows processors with different supply voltages to be mounted on the same motherboard. Most modern CPUs require less than 1.5 volts. CPU designers tend to design toward smaller CPU core voltages; lower voltages help reduce CPU power dissipation, often referred to as TDP or Thermal Design Power.

Some voltage regulators provide a fixed supply voltage to the processor, but most of them sense the required supply voltage from the processor, essentially acting as a continuously variable adjustable regulator. In particular, VRMs that are soldered to the motherboard are supposed to do the sensing, according to the Intel specification.

The correct supply voltage is communicated by the microprocessor to the VRM at startup via a number of bits called the VID (voltage identification). In particular, the VRM initially provides a standard supply voltage to the VID logic, which is the part of the processor whose only aim is to then send the VID to the VRM. When the VRM has received the VID identifying the required supply voltage, it starts acting as a voltage regulator, providing the required constant voltage supply to the processor.

Modern GPUs also use VRMs due to their need for more power and higher current.
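To illustrate the VID handshake described above, here is a minimal sketch. The voltage range and step size below are made up for illustration; real VID tables are defined in Intel's VRD specifications, not by this formula.

```python
# Hypothetical VID decoding. The range and step size here are assumed
# for illustration only; real mappings come from Intel's VRD specs.

V_MAX = 1.6      # assumed voltage at VID code 0
STEP = 0.00625   # assumed volts per VID step

def vid_to_voltage(vid: int) -> float:
    """Map an 8-bit VID code to the requested supply voltage."""
    return V_MAX - vid * STEP

# e.g. the processor sends code 0x32 and the VRM regulates to ~1.29 V
print(f"{vid_to_voltage(0x32):.4f} V")  # 1.2875 V
```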
 

Your CPU runs at a core voltage of 12 V?


Noworldorder: Yes, your CPU really does draw 60 amps. That's not coming from the +12V directly though. The 12V only has to supply a few amps to the VRMs, which then convert the higher voltage, lower current supply to a low voltage, high current supply for the CPU. Some CPUs can even pull hundreds of amps - that's why if you look at the contacts at the bottom of the CPU, a huge number of them are for the supply voltage.
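A quick check with the numbers from the original post bears this out (the ~90% VRM efficiency below is my assumption, not a measured figure):

```python
# Numbers from the original post; VRM efficiency is an assumed ~90%.
p_cpu = 80.0     # watts reported by the software power meter
v_core = 1.284   # CPU core voltage
v_rail = 12.0    # input rail feeding the VRMs
eta = 0.90       # assumed conversion efficiency

i_core = p_cpu / v_core          # current on the CPU side
i_rail = p_cpu / (eta * v_rail)  # current drawn from the +12 V rail

print(f"CPU side:  {i_core:.1f} A at {v_core} V")  # ~62.3 A
print(f"12 V side: {i_rail:.1f} A")                # ~7.4 A, well under the 20 A rating
```

So both numbers are right at the same time: 62.3 A into the CPU, but only about 7 A out of the +12 V rail.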
 

Haserath

Distinguished
Apr 13, 2010
1,377
0
19,360

There's nothing wrong with the math. The processor uses about 1.2 V instead of 12 V; there is a conversion in between to keep the CPU from dying from that much voltage. It can take the amps.

Voltage × Amperes = Power

12 V × 20 A = 240 W

1.284 V × 62.3 A ≈ 80 W

It stays within the power of the power supply. You would need 1.2 V @ 200 A to match what the 12 V rail can deliver.

An overclocked processor can easily pull 150 W as well, at higher voltages of course.

150 W ≈ 1.4 V × 107 A

Now I'm wondering: does the resistance in a processor change at all with increasing clock speeds?
 

You are confusing power with current. My OC'd Q6600 pulls 9.5 amps from the CPU power connector. Figuring 5% loss in the SMPS on the motherboard, that's 12 volts × 9.5 amps × 0.95, which equals 108 watts. It needs 1.42 volts vcore for around 76 amps.

Your 60 amp house circuit breaker is more than likely 240 volts, for 14.4 kW.

Noworldorder,
While not exact, the TDP approximates the amount of power the CPU will use at its stock frequency.

Haserath, all semiconductors are by their nature nonlinear devices. A pretty good approximation of how much more power a highly OC'd CPU will need is (higher OC'd voltage)² / (stock voltage)². Multiplying that by the TDP will give a very useful approximation of the power needed.

For example: my Q6600, 90 watt TDP, 1.2625 volt VID, needs 1.42 volts at 3.6 GHz to be stable. Measured current at stock is 8 amps, for 96 watts. Allowing for losses in the SMPS gives something very close to the TDP. At 3.6 GHz, the CPU pulls 9.5 amps.

(1.42)² / (1.2625)² = 1.26. Applied to the 8 amps measured at stock, that predicts almost exactly 10 amps, very close to the measured 9.5 amps.

Or just assume TDP times 1.25 for a highly OC'd CPU. You will be 5% - 10% high.

If you are estimating power needs, you are better off on the high side.
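Worked through in code (a minimal sketch of the scaling rule above, using the Q6600 numbers from this post; the helper name is mine, not from any library):

```python
# Voltage-squared scaling rule from the post above.
def oc_power_estimate(tdp_watts: float, v_stock: float, v_oc: float) -> float:
    """Estimate overclocked power draw as TDP * (V_oc / V_stock)^2."""
    return tdp_watts * (v_oc / v_stock) ** 2

# Q6600 example: 90 W TDP, 1.2625 V stock VID, 1.42 V at 3.6 GHz
scale = (1.42 / 1.2625) ** 2
print(f"scale factor: {scale:.2f}")                                     # ~1.27; the post rounds to 1.26
print(f"estimated power: {oc_power_estimate(90, 1.2625, 1.42):.0f} W")  # ~114 W
print(f"estimated 12 V current: {8 * scale:.1f} A")                     # 8 A stock -> ~10.1 A
```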
 

stillerfan15

Distinguished
Jun 28, 2010
582
0
18,990

Maybe a Cray supercomputer? :)
 
Transistors can handle incredibly large currents, but they always do so at low voltage. An i7 950 has a max rated voltage from Intel of 1.375 V but a rated TDP of 130 W, which means it can pull about 94.5 A through the VRMs on the motherboard. That seems like a lot, but the total power flowing through them is just a bit higher than what a big light bulb pulls. Consider that the average residential house in the US has 150 A service, but that's at 240 V; the power lines at the street run at higher voltages to get more power to you at lower current levels, just like the 12 V rail in the PSU feeding the VRMs. It sends 12 V at low current, and they drop the voltage and boost the current way up.
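The same trade-off, run through quickly (a sketch using the figures quoted above; VRM losses are ignored for simplicity):

```python
# Low voltage / high current vs. high voltage / low current,
# using the figures quoted above (VRM losses ignored).
examples = [
    ("i7 950 core supply", 1.375, 130.0 / 1.375),  # ~94.5 A for 130 W
    ("PSU +12 V rail",     12.0,  130.0 / 12.0),   # ~10.8 A for the same 130 W
    ("US house service",   240.0, 150.0),          # 150 A at 240 V
]
for name, volts, amps in examples:
    watts = volts * amps
    print(f"{name:18s} {volts:6.1f} V x {amps:6.1f} A = {watts / 1000:5.2f} kW")
```

Same power, wildly different currents, depending on the voltage it is delivered at.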
 

Noworldorder

Distinguished
Jan 17, 2011
956
0
19,160
I'm going to shock everyone, OP: an overclocked i7 CPU may create a load anywhere between 0 and 140 A!!!

While I get stoned for that answer, google for the ISL6336 datasheet.

Actually, I did download and read some of the pages. Most of it was over my head, but not all.
What I gleaned from that fascinating read (thank you!) was the importance of ambient temperature on the workload of the VRMs, and how crucial it is not to operate the motherboard at stressed levels in the summer heat. About one-fifth of a VRM is devoted just to temperature compensation, and higher temps increase its workload exponentially; ergo, a cold computer lasts longer.
 


IIRC ENIAC used around 150 kilowatts. I'm guessing since it used vacuum tubes, the plate voltage was at least 100 V, maybe 200 V :p..
 

Haserath

Distinguished
Apr 13, 2010
1,377
0
19,360

I was actually just wondering if clock speed would cause it.

Let's say you overclock using stock volts, since you usually can get a decent overclock from stock. Does this increase the resistance of the transistors? It would seem to me that it would, since the transistors would have to switch more times per second.

The only problem with this is that higher resistance at constant voltage would decrease the current and thus lower the power. But processors seem to gain a few watts with a few hundred MHz added, without voltage increases in the BIOS.
 
When people say "OC at stock volts", the CPU does actually use more volts. Stock volts for an i7 920 is 1.18 in the BIOS.

You boot the PC and run CPU-Z, and you will find it's actually using 1.01 V at the stock 2.6-2.8 GHz with Turbo on. You then decide to crank it up to 3.6 GHz, which is the limit for most samples to stay on stock volts, so you enter the BIOS, do your tweaks, and reboot. This time when you run CPU-Z you're using 1.12 V, but you're still within the stock setting of 1.18 V, and the idle temp has risen by 5°C.

So yes, you're using about 10% more volts and getting a 20%-ish OC. But the resistance has also gone up, because, as you can see, the temperature has risen too, not just because of the higher volts but also because of the higher frequencies.
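HEXiT's 10%-more-volts, 20%-more-clock example lines up with the usual dynamic-power rule of thumb, P ≈ C × V² × f (a sketch; the 130 W baseline below is an assumed figure for illustration):

```python
# Dynamic power scales roughly as P ~ C * V^2 * f.
def power_scale(v_ratio: float, f_ratio: float) -> float:
    """Relative dynamic power for given voltage and frequency ratios."""
    return v_ratio ** 2 * f_ratio

scale = power_scale(v_ratio=1.10, f_ratio=1.20)  # ~10% more volts, ~20% OC
print(f"power scale: {scale:.2f}x")              # ~1.45x
print(f"e.g. an assumed 130 W stock -> {130 * scale:.0f} W overclocked")  # ~189 W
```

So a modest-looking bump in volts and clocks raises power, and therefore temperature, noticeably, whether or not the effective resistance changes.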



 
Solution

Noworldorder

Distinguished
Jan 17, 2011
956
0
19,160
This has proven to be a fascinating discussion. The science of computing is never boring.
I gave HEXiT Best Answer in that his was the best given his ranking.

I think a fair way of rating is the Best Answer to Rank ratio.
 