Home wiring vs. PSU

phujitiv

Distinguished
Oct 22, 2004
I think I may have a unique situation. Right now I am finishing the basement of our house. The walls are still open, so I can put anything in them. I already have CAT6 and coax cable run to all the rooms. Now we are adding power.

Well, I am trying to plan for the future and will also be building a couple of high-end computers for Christmas. Of course I will be getting a case that handles 2 PSUs, because I will be getting an nForce4 SLI system with RAID and the works, etc. Each power supply says it is 550W, with an input current of 12A @ 115V. I assume that means 12 amps at 115 volts AC coming from the wall socket. Two power supplies would then pull 24 amps, I think. Homes are usually wired for 20 amps per line from the circuit breaker using 12-gauge electrical wire, so 24 amps would trip the breaker on a 20-amp line to my wall socket. And I still haven't added in the monitor, printer, scanner, 6.1 speakers, etc. Those eat up amps as well. I am concerned that with just two PSUs I will max out the line to my wall socket.

The question is: does each power supply really pull a sustained 12 amps when the computer is on? I am working with someone else on my basement finishing project, and he thinks I'm going way overboard by trying to add two 20-amp lines to wall sockets for my computer. He says that, for example, a PSU rated at 480W only pulls 4 amps, because 480W / 120V = 4A (Watts / Volts = Amps). I think the 12 amps of AC power going into the PSU gets converted to 480 watts of DC power, and that a lot of power is lost in the conversion from AC to DC. I'm no expert in electricity... I just read a bunch. My friend also says that no company would make computer products requiring more power than a normal run to a socket can supply (20 amps @ 115 volts). I disagree. Power requirements are going up so fast that one PSU isn't enough... When will we be dragging an extension cord from another part of the house just to feed two PSUs without flipping the circuit breaker? Any help/thoughts/links would be helpful. Thanks!!!
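
To make the disagreement concrete, here is the math both ways, using only the numbers above (this sketch is just my attempt to lay it out; I don't know which reading is right):

```python
# Both ways we've been running the numbers (values from the labels above).

# My friend's math: divide the label wattage straight by the line voltage.
friend_amps = 480.0 / 120.0             # = 4.0 A per PSU

# My math: take the nameplate "Input Current: 12A @ 115V" at face value.
nameplate_amps = 12.0
two_psus_amps = 2 * nameplate_amps      # = 24.0 A -- more than a 20 A breaker

print(f"Friend's reading: {friend_amps:.1f} A per PSU")
print(f"Nameplate reading: {nameplate_amps:.0f} A per PSU, {two_psus_amps:.0f} A for two")
```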
 

Crashman

Polypheme
Former Staff
No. Each power supply pulls in only what it outputs plus around 20% waste (expelled as heat from the power supply). Also, 20 amps at 120V is 2400W!!!
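
Rough numbers, if you assume the 550W rating is the DC output and my ~20% figure holds:

```python
# Rough wall-side draw for one 550W PSU at full load, assuming the label
# is DC output and roughly 20% more is pulled from the wall as waste heat.
dc_output_watts = 550.0
wall_watts = dc_output_watts * 1.2      # ~660 W drawn from the socket
wall_amps = wall_watts / 115.0          # ~5.7 A per PSU, flat out
print(f"~{wall_watts:.0f} W, ~{wall_amps:.1f} A per PSU; ~{2 * wall_amps:.1f} A for two")
```

And that's the worst case with both machines maxed out; real-world draw sits well below it.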

If you really want to protect both your system and your grid, you can put in a huge UPS. Mine prevents the line from being overloaded when I power up my laser printer with a bunch of other things running, and I live in a place with horrible wiring.

Only a place as big as the internet could be home to a hero as big as Crashman!
Only a place as big as the internet could be home to an ego as large as Crashman's!
 

sparky853

Distinguished
Jun 25, 2003
OK, first off, residential receptacles are run from 15A breakers with 14 AWG wire, rated at 15A. The maximum allowable load on a 15A line is 80%, or 12A.

If you run #12 wire to a 20A breaker and put a 15A receptacle on the end, you're asking for trouble. For that, you would need a 115V 20A receptacle, which wouldn't work in your case.

Your best bet would be to run three-conductor cable to each receptacle and make them split receptacles, where each half is supplied by a separate breaker. When you do this, you have to make sure you break off the little tab on the side of the receptacle that has the power running to it.

Also, you are correct in your logic: P = I × E, or Watts = Amps × Volts, therefore Current = Watts / Volts.

Speaking from experience, you can put about eight high-end systems, complete with monitors and speakers, on a single standard household circuit before you run into problems.
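
If you want to check that claim against the formula, here's the quick math (a rough sketch, using the 80% rule from my first paragraph):

```python
# Sanity check: usable capacity of one standard circuit vs. per-system draw.
volts = 115.0
breaker_amps = 15.0
continuous_amps = breaker_amps * 0.80         # 80% rule -> 12 A
circuit_watts = continuous_amps * volts       # ~1380 W usable

systems = 8
per_system_watts = circuit_watts / systems    # ~170 W each at the wall
print(f"~{circuit_watts:.0f} W usable, ~{per_system_watts:.0f} W per system")
```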

Any more questions, just let me know.

Your Friendly Neighbourhood Electrician. :smile:

XP2800+, Abit NF7, 1GB Dual-Channel DDR333, ATI R9800PRO 128MB, TT PurePower 420W, LG DVD+-R/RW
:redface: My wife says I suffer from premature ejaculation... I don't remember suffering :wink: