Is EXTRA POWER WASTED?

Hello,
Wondering: if I had, say, a 400-watt PSU and my system was only pulling 300 watts, would it draw just 300 watts from the wall, or would it pull 400 and waste the other 100? Also, with a 350-watt PSU running about 8 hours a day for a year, any idea approximately how much that energy would cost (at around $0.10 per kWh)?
  1. Where would that energy go? The rating of the PSU is the maximum power it can provide, not how much power it sucks from the outlet. I don't believe you'll be wasting any power.

    AMD technology + Intel technology = Intel/AMD Pentathlon IV; the ULTIMATE PC processor
  2. Hmmmmm...

    I'm kinda torn how to answer this one, but since the last guy already gave you the correct answer, I'll go with choice two.

    Yeah, the 100 watts is just making your house hotter. You should sell that 400W PSU and get something that fits your computer better. Typical prices run around $0.06 per kilowatt-hour, so that 100 watts you're "wasting" costs you $0.06 every ten hours, or around $0.15 per day (there's a cost sketch after these answers).

    I hacked Msft, and all I got was this lousy source code.....
  3. Hello,
    So are you saying what you said is just a lie, since the right answer was already given? Or does it waste the 100 watts?
  4. lol, he was joking!!!!!! No power is wasted. Your computer will only consume the amount of power it needs!

    AMD technology + Intel technology = Intel/AMD Pentathlon IV; the ULTIMATE PC processor
  5. It's similar to a light bulb. If you swap a bulb rated at 100 watts for a 60-watt bulb, the 60-watt bulb will draw less current. If you take the bulb out of the socket altogether, no power is used at all, even though the light switch is turned on.

    Basically, the energy has to be consumed by something for it to be used at all. If there isn't hardware drawing 400 watts, then 400 watts will not pass through your power supply. As stated above, a power supply is rated by how much power it CAN supply, not how much it forces out.

    One more analogy, if I may... If you were to put a 700-watt bulb in a standard socket, you could blow a fuse or melt the wiring leading to the socket. Why? Because it's the light bulb (or your computer's hardware) that dictates how much CURRENT runs through the electrical system. Your house's wiring is only rated for a certain amount, like the power supply in your computer.

    If you're confused about current vs. voltage, think of it like a flowing river. How hard the water is being pushed, the pressure driving the flow, is like voltage. How much water actually passes a given spot each second is like current. (There's a small worked example of the current math after these answers.)

    Sorry, that was two more analogies. :)

    -- Ah sh*t! sys64738 --
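To make the cost arithmetic concrete, here is a minimal sketch in Python of the calculation being discussed. The 8 hours/day and roughly $0.10/kWh figures come from the question; the 300 W average draw is an assumed example value, since what matters is what the machine actually pulls from the wall, not the PSU's rated maximum.

# Minimal cost sketch. The 8 h/day and ~$0.10/kWh figures come from the
# question above; the 300 W average draw is an assumed example value.

def yearly_cost(avg_draw_watts: float, hours_per_day: float = 8.0,
                price_per_kwh: float = 0.10, days: int = 365) -> float:
    """Cost of running a load at its *actual* average draw from the wall."""
    kwh = (avg_draw_watts / 1000.0) * hours_per_day * days
    return kwh * price_per_kwh

# A system that really pulls 300 W costs the same whether its PSU is
# rated 350 W or 400 W:
print(f"${yearly_cost(300):.2f} per year")  # roughly $87.60

At the $0.06/kWh rate quoted in answer 2, the same 300 W for 8 hours a day works out to about $52.56 a year.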
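And a minimal sketch of the point in answer 5: the load, not the supply or the wiring, dictates how much current flows. The 120 V mains voltage here is an assumption (a typical North American value), not something stated in the thread.

# The load determines the current via I = P / V.
# The 120 V mains figure is an assumption, not from the thread.

MAINS_VOLTAGE = 120.0  # volts (assumed)

def current_drawn(power_watts: float, voltage: float = MAINS_VOLTAGE) -> float:
    """Current pulled by a load of a given power: I = P / V."""
    return power_watts / voltage

# Swap the bulb and the current changes; unscrew it and the draw is zero,
# even with the switch left on.
for load_watts in (0, 60, 100, 700):
    print(f"{load_watts:>4} W load -> {current_drawn(load_watts):5.2f} A")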