I wonder: does a power supply really always draw the specific wattage it is sold and marketed for? Example: you plug the power cable from the wall socket (220 V) into the computer's power supply (say an OCZ 750W), turn on the computer, and leave it active (no sleep or power-saving mode) for a whole year. Will the power supply always consume a steady 750 W from the wall no matter what the computer is doing, even if you connect the internal cables to just one tiny motherboard, a 400 MHz processor, and one small fan, or instead to ten big fans, five high-end HDDs, two high-end graphics cards in Crossfire, an i7 processor, and a high-end motherboard? Will it consume 750 W around the clock, every day, for a whole year regardless of what's inside the computer? Or will the total consumption vary depending on how much hardware you connect the internal cables to?
The nominal wattage is the maximum power the PSU can deliver to the components, not what it constantly draws from the wall.
On the side of the PSU you will see a table of voltages and currents. Each rail (12 V, 5 V, 3.3 V) always supplies (roughly) its stated voltage, but the listed current is the maximum that rail can supply; the actual current drawn depends on how much the connected hardware needs.
And Power = Voltage × Current, so the more current your hardware draws, the more power you will use.
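To make the per-rail arithmetic concrete, here is a small sketch. The rail voltages are the standard ATX values mentioned above; the currents are made-up example loads, not the ratings of any real PSU:

```python
# Power = Voltage x Current, per rail, summed for total DC load.
# Voltages are standard ATX rails; currents are illustrative only.
rails = {
    "12V":  (12.0, 18.0),  # (volts, amps) - CPU, GPU, fans
    "5V":   (5.0, 6.0),    # drives, USB
    "3.3V": (3.3, 4.0),    # motherboard logic
}

total_w = sum(v * i for v, i in rails.values())
for name, (v, i) in rails.items():
    print(f"{name} rail: {v * i:.1f} W")
print(f"Total DC load: {total_w:.1f} W")
```

With these example currents the total comes to about 259 W, well under a 750 W rating; the PSU would simply deliver that much and no more.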
Thank you very much :-) While we're speaking of power, what is the difference between a PSU like the one they're talking about in the linked post on this forum (an external power supply with batteries that can keep the computer running for some time if a blackout happens) and the more ordinary power supply you have inside the computer? Are both called PSUs, and why?
DarthTengil, the one that is inside your computer is a PSU (Power Supply Unit). A power supply draws only as much as the components need. If you have a computer that consumes 300 watts at idle and run it on a 1000-watt power supply, it will only draw approximately 300 watts, or however much it needs; depending on that specific power supply's efficiency, the draw from the wall can be a bit above that amount.
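The efficiency point above can be sketched with one line of arithmetic: the wall draw is the DC load divided by the efficiency. The 300 W idle load and 85% efficiency here are illustrative numbers, not the specs of any particular PSU:

```python
# Wall draw = DC load / efficiency (losses become heat in the PSU).
# 300 W and 0.85 are example figures, not measured values.
def wall_draw(dc_load_w: float, efficiency: float) -> float:
    return dc_load_w / efficiency

print(f"{wall_draw(300, 0.85):.0f} W from the outlet")  # roughly 353 W
```

So a 300 W load on an 85%-efficient supply pulls about 353 W from the socket, whether the PSU is rated 350 W or 1000 W.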
The power unit that is external with batteries is called a UPS (Uninterruptible Power Supply). This is basically a unit that you plug into a wall outlet, which charges a battery inside it. You plug your computer and monitor into the UPS, which runs from its battery while the battery is constantly being recharged from wall power. If the power goes out, your computer won't be affected or even flicker, since it's running from the battery. A normal UPS gives you a few minutes of battery power so you have time to safely shut down or hibernate your computer rather than suddenly losing power and risking damage.
Thank you all, I have learned a lot about electricity and how important a good PSU is. Best answer is: Proximon!!!! I have given you a "useful message" but cannot find the "best answer" button to click on; I hope it shows up so I can reward you :-) Now my brain is fried with resistors and voltage measurements... This thread is now solved!
If you have a 350W PSU, remove it and put in a 1000 watt PSU, all else being equal, they will draw the same power from the wall. The PSU only supplies what the components within the puter draw.
A 50 watt rated light fixture w/a 50 watt bulb draws the same electricity as a 100 watt rated fixture w/ a 50 watt bulb.
Or put another way, will Oprah weigh any more on a 300 pound scale than she would on a 200 pound scale ? ......trick answer below
only if she goes off her diet and breaks 200 pounds