Is a higher wattage power supply more likely to trip a circuit breaker?

Will1789

Commendable
Feb 15, 2016
6
0
1,510
So I'm building a new PC. However, my current PC will occasionally trip the circuit breaker for the room I have the computer in. I have noticed that a few overhead lights in other rooms are on the same circuit as my PC, so having those lights off reduces the chance of the breaker popping. These are the current specs of my computer.

CPU: i5 4690k
GPU: EVGA GTX 970 SSC (x2)
mobo: Asus Z-97a
PSU: Corsair 750 watt
16 gb 1866 MHz ram

New Build:

CPU: i7 6800k
GPU: MSI GTX 1080
mobo: MSI X-99 SLI plus
32 gb 2666 MHz DDR4 ram
PSU: ?

I understand that my new build will have a lower power consumption, but I would like to buy a higher wattage power supply so I don't have to worry about it for future upgrades. Would a higher wattage power supply make my C/B more likely to pop, or would it be less likely to pop because my computer is consuming less power?
 
First thing to clear up: your new system has higher power requirements than the old one, though not by much. Buying a "higher wattage" PSU could, in theory, trip the breaker, since it will be less efficient under the lighter load of your computer.

In your situation efficiency plays a role: you want a power supply that will be most efficient with your setup under load. For yours, a quality 550 W-650 W Platinum or Titanium unit would be best. If you overclock, the EVGA 650 P2 is great. Titanium units are very expensive; if I recall correctly, Raidmax has a good-quality unit in the 650 W range.

However there is no guarantee still that the breaker will not be tripped with a more efficient PSU - just a better chance it won't.
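To make the efficiency point above concrete, here is a rough sketch in Python. The efficiency-vs-load curve is purely illustrative (loosely Gold-tier shaped), not a spec for any particular unit; the idea is that the same DC load sits at a different point on the curve depending on the PSU's rated wattage.

```python
# Sketch: why efficiency at your actual load matters more than the label.
# The curve values are illustrative, not measurements of any real unit.

def wall_draw(dc_load_w, rated_w, curve):
    """Estimate AC draw from the wall for a given DC load.

    curve maps load fraction -> efficiency; we pick the nearest point.
    """
    frac = dc_load_w / rated_w
    nearest = min(curve, key=lambda f: abs(f - frac))
    return dc_load_w / curve[nearest]

# Hypothetical efficiency-vs-load curve for a quality unit.
curve = {0.10: 0.85, 0.20: 0.89, 0.50: 0.92, 1.00: 0.89}

# A ~400 W DC load on a 650 W unit sits near 50% load (peak efficiency)...
print(round(wall_draw(400, 650, curve)))   # ~435 W from the wall
# ...while on a 2000 W unit it sits near 20% load, slightly less efficient.
print(round(wall_draw(400, 2000, curve)))  # ~449 W from the wall
```

The difference is real but small, which matches the caveat above: a right-sized unit gives a better chance, not a guarantee.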
 

Will1789

Commendable
Feb 15, 2016
6
0
1,510


Really? I figured that going from SLI 970s to one GTX 1080 would significantly drop the power requirements; however, I haven't compared the power requirements of the CPUs. So your suggestion would actually be to buy a lower wattage PSU. I guess my new question is: what would be the most efficient wattage if I were to eventually get a second 1080 years down the road?
 
Right, it all depends on efficiency, system power consumption, and load. Right off, a higher wattage PSU won't necessarily be better or worse at this; it all depends how efficient it is at various load points.

Your system will actually be pulling a fair amount less power, since you are switching from an SLI setup to a single GPU. Honestly, for efficiency purposes, you are probably better off with a quality 650 W PSU, as was suggested above.
 

Math Geek

Titan
Ambassador
assuming the room is on a normal 20 amp circuit, i'd say you might want to look into replacing the circuit breaker for the room. they are cheap and rather easy to switch out.

but a normal 20 amp circuit should more than handle any pc unless it is drawing more than 2000w!! even a 10 amp circuit would handle 1000w easily.
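The figures above can be sanity-checked with basic mains arithmetic. This sketch assumes a 120 V North American circuit and the common 80% rule of thumb for continuous loads; check local code for your actual situation.

```python
# Sketch: how much a breaker can supply, assuming a 120 V North American
# circuit and the common 80% rule of thumb for continuous loads.

VOLTS = 120

def circuit_capacity_w(breaker_amps, continuous=True):
    watts = VOLTS * breaker_amps
    return watts * 0.8 if continuous else watts

print(circuit_capacity_w(20, continuous=False))  # 2400 W absolute limit
print(circuit_capacity_w(20))                    # 1920 W continuous
print(circuit_capacity_w(10))                    # 960 W continuous
```

Either way, a single gaming PC sits well under what a healthy 20 A circuit can deliver, which is why a tripping breaker points at the circuit's total load or its wiring rather than the PC alone.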
 


Oh sorry, my bad, I did not see that it was SLI. Did the SLI 970s trigger the breaker? Even if the 970 SLI setup's higher load put the PSU in a more efficient range, the efficiency gain would still not be enough to make its AC wattage lower than the GTX 1080's. So it would not make sense for the 1080 rig to trigger the breaker and not the 970 SLI.
 


Yea. The max a GTX 970 should pull on average is about 145 W, as Nvidia points out on its site:
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications

The most a GTX 1070 should pull is 150 W, just 5 W more.
http://www.geforce.com/hardware/10series/geforce-gtx-1070

Calculating the overall power consumption of the system is a bit tricky, as things can vary from card to card, but overall a single GTX 1070 should consume up to 140 W less power than a GTX 970 SLI configuration. If we add in the extra detail from the CPU, chipset, and RAM, we end up with Skylake being more energy efficient, using more energy-efficient RAM and probably a more energy-efficient chipset (hard to judge), which means a drop in power consumption of around 150 W is quite possible.

Overall a PSU capable of between 550 and 650 W of power is the sweet spot for your hypothetical build.

I do have to ask, though: why are you building a new system when you already have such a powerful PC? Just the circuit breaker issue? Perhaps you could run a really long extension cord from another room? That would be far more economical, even if you kept your displays in the current room.
 


Ah right, oops. Suppose that's what I get for using a TV on the other side of the living room as my display. Missed the 8. Anyway, that specific chip has an 88 W TDP, and the i7-6800K has a 140 W TDP, so at the very most there is an increase of 52 W. Not the most accurate way to figure it, but close enough. Subtract that from the 140 W decrease in power consumption from the GPUs, and we still have an 88 W drop in power consumption. Let's round it up to 90 W for the energy efficiency of DDR4. That is still a considerable drop.
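The back-of-the-envelope math above can be checked in a few lines. The numbers are the TDP figures quoted in the thread; TDP is a rough proxy for actual draw, not a measurement.

```python
# Quick check of the thread's back-of-the-envelope math, using the TDP
# figures quoted above (TDP is a rough proxy, not measured draw).

old_gpus = 2 * 145   # two GTX 970s at ~145 W each
new_gpu  = 150       # single Pascal card, per the figure quoted above
gpu_drop = old_gpus - new_gpu

old_cpu, new_cpu = 88, 140   # TDPs quoted in the post above
cpu_rise = new_cpu - old_cpu

net_drop = gpu_drop - cpu_rise
print(gpu_drop, cpu_rise, net_drop)  # 140 52 88
```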
 

Will1789

Commendable
Feb 15, 2016
6
0
1,510


While you've all given great info, you haven't exactly answered my original question, and you missed a key detail. Let's assume that my new build will consume ~500 watts. If I were to buy an 850 watt PSU with 90% efficiency, would that mean that at worst the power supply is drawing ~550 watts? Still less than my current build on my 750 watt PSU. Would that be any different with a 650 watt power supply with 90% efficiency? Or, for simplicity's sake, let's say I just used the old 750 watt power supply for the new computer. Would the ~90 watt drop create less draw on the circuit? The reason I asked this question is that I want to buy a power supply capable of handling SLI 1080s in the future. By the time I eventually get a second 1080 I likely won't live in this apartment any longer, so I'm not worried about the C/B popping at that point.
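The question in the paragraph above can be answered with arithmetic alone, assuming the 90% efficiency figure holds at the relevant load point: at the same efficiency, the draw from the wall depends only on the load, not on the PSU's rated wattage. One small correction: you divide by efficiency rather than adding 10%, so ~500 W at 90% is about 556 W from the wall, not 550 W.

```python
# At a given efficiency, AC draw from the wall depends only on the DC
# load, not on the PSU's rated wattage (850 W vs 650 W is irrelevant
# if both hit the same efficiency at that load).

def wall_draw_w(dc_load_w, efficiency):
    return dc_load_w / efficiency   # divide by efficiency; don't add 10%

print(round(wall_draw_w(500, 0.90)))  # 556 W from the wall
```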

The reason for this upgrade has nothing to do with the C/B popping; it doesn't happen that often to begin with, maybe once a week or so. I was just worried I would drastically increase the frequency by buying a higher wattage power supply. I'm upgrading for a few reasons. #1: I'm disappointed with SLI. On games with good SLI profiles it works wonderfully, but on other games, the Unreal 4 engine included, I'll get relatively poor framerates. I should mention I have a 1440p 144 Hz and a 1080p 60 Hz monitor. Also, I typically have a lot of other programs running on my other monitor, and I figured 2 extra CPU cores would free up the other 4 to worry about the game being played. On top of all this, I found someone willing to buy my current build, so: free excuse to upgrade!

Thanks for all the replies guys, I really appreciate it.
 

Faux_Grey

Honorable
Sep 1, 2012
747
1
11,360
Get a PSU that has a good inrush current rating; inrush current is the "surge" you get when you turn on your PC.

Total rated wattage has nothing to do with how much power the PSU actually pulls, as people are saying above.

I wouldn't recommend changing the breaker, as breakers are designed with the house/room wiring in mind; changing that, you risk burning your house down. :)

Failing that, invest in a small UPS.
 

Math Geek

Titan
Ambassador


that's utter nonsense!! the breakers are designed to be replaced when they fail. just replace it with one of the same amps and it's no different than changing a fuse in your car. if it's 20 amps, then replace it with a 20 amp. this will ensure the breaker is working right. if it still pops the circuit, then there is probably something wrong with the wires in the walls somewhere. that can be an expensive fix, since you have to tear the wall apart and replace whatever is messed up. but replacing a breaker is simple and does not require an electrician the way replacing wires in the walls does.
 

Faux_Grey

Honorable
Sep 1, 2012
747
1
11,360


Breakers are designed to trip after hitting a certain current.
If it tripped at a higher current (say, replacing a 10 A breaker with a 20 A one), that means the wires have more current going through them, and more current means more heat.
20 A of current through wire sized for a 10 A circuit may result in melted insulation, a short, then possibly a fire. :)

Apologies, I was under the impression you were telling us to change the breaker to one that is higher rated.
 

Math Geek

Titan
Ambassador
lol, ok i see the confusion. yeah, changing to a higher amp breaker is asking for trouble unless done by a pro, with whatever changes made to the box that need to be made.

i only suggest a new breaker of the same amps to try and fix the popping issue. that can be telling of a lot of things, including a shorting wire in the wall somewhere, which can be a fire hazard. i had the issue in my house as well. a new breaker did not fix it, and i ended up having a bare wire in the ceiling above the box that was shorting on some hvac. could have ended badly; i got lucky and it was an easy-to-reach fix. it is worth figuring out why the breaker is popping.
 
Solution

Will1789

Commendable
Feb 15, 2016
6
0
1,510
So, to follow up: I ended up purchasing everything I mentioned and went with an 850 watt EVGA Gold power supply. The circuit breaker hasn't popped once since I completed the build over a month ago. That leads me to believe there are two possible reasons why: either the drop in power consumption put less load on the breaker, or my previous power supply was faulty. Thanks again to everyone for the help.