How many amps does a CPU use?
Noworldorder
This is just a rhetorical question for curiosity's sake:
Let's say the software power meter shows that the CPU is using 80 watts
and that its voltage is 1.284 V.
80 / 1.284 = 62.3 amps
Surely it is not drawing 62.3 amps, since the CPU +12V rail is only rated at 20 amps.
What is wrong with this math?

From Wiki:Quote:A voltage regulator module or VRM, sometimes called PPM (processor power module), is a buck converter that provides a microprocessor the appropriate supply voltage, converting +5 V or +12 V to the much smaller voltage required by the CPU. Some are soldered to the motherboard while others are installed in an open slot. This allows processors with different supply voltages to be mounted on the same motherboard. Most modern CPUs require less than 1.5 volts. CPU designers tend to design toward smaller CPU core voltages; lower voltages help reduce CPU power dissipation, often referred to as TDP or Thermal Design Power.
Some voltage regulators provide a fixed supply voltage to the processor, but most of them sense the required supply voltage from the processor, essentially acting as a continuously variable resistor. In particular, VRMs that are soldered to the motherboard are supposed to do the sensing, according to the Intel specification.
The correct supply voltage is communicated by the microprocessor to the VRM at startup via a number of bits called VID (voltage identification). In particular, the VRM initially provides a standard supply voltage to the VID logic, which is the part of the processor whose only aim is to then send the VID to the VRM. When the VRM has received the VID identifying the required supply voltage, it starts acting as a voltage regulator, providing the required constant voltage supply to the processor.
Modern GPUs also use VRMs due to their need for more power and higher current.
jaguarskx said:80 watts / 12 volts = 6.667 amps.
Your CPU runs at a core voltage of 12 V?
Noworldorder: Yes, your CPU really does draw 60 amps. That's not coming from the +12V directly, though. The 12V only has to supply a few amps to the VRMs, which then convert the higher-voltage, lower-current supply into a low-voltage, high-current supply for the CPU. Some CPUs can even pull hundreds of amps; that's why, if you look at the contacts on the bottom of the CPU, a huge number of them are for the supply voltage.
Noworldorder said:This is just a rhetorical question for curiosity's sake:
Let's say the software power meter shows that the CPU is using 80 watts
and that its voltage is 1.284 V.
80 / 1.284 = 62.3 amps
Surely it is not drawing 62.3 amps, since the CPU +12V rail is only rated at 20 amps.
What is wrong with this math?
There's nothing wrong with the math. The processor runs at about 1.2 V instead of 12 V; there is a conversion in between to keep the CPU from dying from that much voltage. It can take the amps.
Voltage × amperes = power:
12 V × 20 A = 240 W
1.2 V × 63 A ≈ 80 W
It stays within the power the supply can deliver; you would need 1.2 V at 200 A to match the rail's full 240 W.
An overclocked processor can easily pull 150 W as well, at higher voltages of course:
150 W ≈ 1.4 V × 110 A
Now I'm wondering: does the resistance of a processor change at all with increasing clock speeds?
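The arithmetic above can be sketched in a few lines of Python; the point is that the same 80 W corresponds to very different currents depending on which side of the VRM you measure. (Values are the illustrative numbers from the thread, not measurements.)

```python
def amps(watts, volts):
    """Current in amperes for a given power and voltage (I = P / V)."""
    return watts / volts

cpu_power = 80.0   # watts reported by the software power meter
vcore = 1.284      # CPU core voltage
rail = 12.0        # PSU +12V rail feeding the VRMs

print(f"At the core: {amps(cpu_power, vcore):.1f} A")  # ~62.3 A
print(f"On the rail: {amps(cpu_power, rail):.2f} A")   # ~6.67 A
```

Both numbers describe the same 80 W of power; only the voltage at which it is delivered differs.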
topsoil said:LOL, if your CPU used 62.3 amps your house power would trip all the time. I mean, my house's main breaker is 60 amps.
You are confusing power with current. My OC'd Q6600 pulls 9.5 amps from the CPU power connector. Figuring 5% loss in the SMPS on the motherboard, that's 12 volts × 9.5 amps × 0.95 = 108 watts. At 1.42 volts vcore, that works out to around 76 amps.
Your 60 amp house breaker is more than likely on 240 volts, for 14.4 kW.
Noworldorder,
While not exact, the TDP approximates the amount of power the CPU will use at its stock frequency.
Haserath, all semiconductors are by their nature nonlinear devices. A pretty good approximation of how much more power a highly OC'd CPU will need is:
(higher OC'd voltage)^2 / (lower stock voltage)^2. Multiplying that ratio by the TDP gives a very useful approximation of the power needed.
For example, my Q6600: 90 watt TDP, 1.2625 volt VID, and it needs 1.42 volts at 3.6 GHz to be stable. Measured current at stock is 8 amps, for 96 watts; allowing for losses in the SMPS, that is something very close to the TDP. At 3.6 GHz, the CPU pulls 9.5 amps.
(1.42)^2 / (1.2625)^2 = 1.26. Multiplying the stock 8 amps by 1.26 gives almost exactly 10 amps, very close to the measured 9.5 amps.
Or just assume TDP times 1.25 for a highly OC'd CPU. You will be 5-10% high.
If you are estimating power needs, you are better off on the high side.
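The rule of thumb above (scale TDP by the square of the voltage ratio) can be sketched like this, using the Q6600 figures quoted in the post:

```python
def oc_power(tdp, v_stock, v_oc):
    """Estimate overclocked power draw as TDP * (V_oc / V_stock)^2."""
    return tdp * (v_oc / v_stock) ** 2

tdp = 90.0        # watts, stock TDP
v_stock = 1.2625  # stock VID voltage
v_oc = 1.42       # voltage needed for a stable 3.6 GHz

est = oc_power(tdp, v_stock, v_oc)
print(f"Estimated OC power: {est:.0f} W")          # ~114 W
print(f"Estimated rail current: {est / 12:.1f} A") # ~9.5 A at 12 V
```

The estimated rail current lands right on the 9.5 A measured at the CPU power connector, which is what makes this quadratic-in-voltage approximation useful despite transistors being nonlinear devices.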
Transistors can handle incredibly large currents, but they always do so at low voltage. An i7 950 has a max rated voltage from Intel of 1.375 V but a rated TDP of 130 W, which means it can pull 94.5 A through the VRMs on the motherboard. That seems like a lot, but the total power flowing through them is only a bit more than a big light bulb draws. Consider that the average residential house in the US has 150 A service, and that's at 240 V; the power lines at the street run at higher voltages to deliver more power to you at lower current, just like the 12V rail in the PSU feeding the VRMs: it sends 12 V at low current, and they drop the voltage and boost the current way up.
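The step-down described above can be sketched as a conservation-of-power calculation, using the i7 950 figures from the post. The 90% VRM efficiency here is an assumed value for illustration, not a figure from the thread:

```python
tdp = 130.0        # watts delivered to the CPU
vcore_max = 1.375  # Intel's max rated core voltage for the i7 950
rail = 12.0        # input voltage to the VRMs
efficiency = 0.90  # assumed VRM conversion efficiency (illustrative)

# Power out is (nearly) conserved, so lower voltage means higher current.
out_current = tdp / vcore_max         # current delivered to the CPU
in_current = tdp / efficiency / rail  # current drawn from the +12V rail

print(f"CPU side:  {out_current:.1f} A")  # ~94.5 A
print(f"Rail side: {in_current:.1f} A")   # ~12.0 A
```

Same power, roughly eight times the current on the low-voltage side, which is why so many CPU pins are dedicated to the supply.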

Quote:I'm going to shock everyone, OP. An i7 CPU, overclocked, may create a load ranging from 0 to 140 A!!!
While I get stoned for that answer, google the datasheet for the ISL6336.
What I gleaned from that fascinating read (thank you!) was the importance of ambient temperature on the workload of the VRMs, and how crucial it is not to operate the motherboard at stressed levels in the summer heat. About one-fifth of a VRM is devoted just to temperature compensation, and higher temps increase its workload exponentially; ergo, a cold computer lasts longer.
HEXiT said:yes! the higher the voltage the more resistance...
I was actually just wondering if clock speed alone would cause it.
Let's say you overclock using stock volts, since you can usually get a decent overclock at stock. Does this increase the resistance of the transistors? It would seem to me that it would, since the transistors have to switch more times per second.
The only problem with this idea is that higher resistance at constant voltage would decrease current and thus lower power. But processors seem to gain a few watts with a few hundred MHz added, without any voltage increase in the BIOS.
Best answer
When people say to OC at stock volts, the CPU does actually use more volts.
Stock voltage for an i7 920 is 1.18 in the BIOS.
Boot the PC and run CPU-Z and you will find it's actually using 1.01 V at the stock 2.6-2.8 GHz with turbo on.
You then decide to crank it up to 3.6 GHz, which is the limit for most samples to stay on stock volts, so you enter the BIOS, do your tweaks, and reboot.
This time when you run CPU-Z you're using 1.12 V, but you're still within the stock setting of 1.18 V,
and the idle temp has risen by 5°C.
So yes, you're using about 10% more volts and getting a 20%-ish OC,
but the resistance has also gone up, because as you can see the temperature has risen too, not just because of the higher volts but also the higher frequencies.