
relative power consumption

Last response: in Components
May 18, 2006 6:57:47 PM

If a CPU uses between 60 and 150 watts, give or take, and a light bulb's wattage is around 85 or 100, does that mean running a couple of light bulbs uses more power than the computer? If so, who cares about saving 20 or 30 watts in a PC when having a few light bulbs on is like five times that amount anyway?
May 18, 2006 7:19:52 PM

Dude,

Once you move out of your all-utilities-paid dorm room and into a pay-your-own-utilities apartment, you will see what all the fuss is about.

Not only that, wait until you have kids and they have their own PCs running 24/7. You will definitely want to haul off and start whooping some asses.
May 18, 2006 7:51:02 PM

It also depends on where you live. In NYC you will be paying $0.19 per kWh. From what I've read, that's twice the national average.
May 19, 2006 12:30:36 AM

Uh, that didn't answer my question at all. I meant, is the PC's power draw significant relative to the other things in the house that draw power? The difference between a PC drawing 300 or 350 watts would be insignificant if having two light bulbs on alone draws 150 watts.
May 19, 2006 12:36:14 AM

I hate yellow light, so I buy fluorescent cool-white bulbs. Plus, they last much longer, which offsets the higher initial price. A 23 W fluorescent bulb is the equivalent of a 100 W incandescent bulb.

I think an 18 W fluorescent is the equivalent of a 75 W regular bulb.
May 19, 2006 2:04:56 AM

Sorry for the non-relevant comment before. If I understand correctly, you're comparing light bulb energy to computer energy (I know you said CPU, but a CPU can't run by itself).

I for one am a fan of low-energy devices, especially if they are going to be running for any length of time. I would think the CPU should be idling most of the time, unless you are folding or doing some other constant, CPU-intensive task. Depending on the components in your system, you could very well be expending energy equal to two 150-watt light bulbs, possibly even three if you have one of the high-power graphics cards.

I am not sure what answer you're wanting to read, but in most situations the two are NOT mutually exclusive. At work or school you have fluorescent lights putting out some 1,000+ watts; at home in the evening you may have a light or two on. So not only do you have the lights drawing all this power, you also have a PC pulling on the circuit. I agree that a difference of 50 watts between computers is no big deal when all this energy is needed anyway. However, 50 watts could make the difference between a lightning-fast computer and a relatively slow one.

Like I said before, I am a fan of low-energy devices, but I still want speed. AMD has a fabulous technology called Cool'n'Quiet that dynamically adjusts voltage levels, and Intel has SpeedStep. I still complain about energy use, and I would continue to do so even if I were using the equivalent of one 60-watt light.

Sorry if my blather made things worse for you.
May 19, 2006 2:28:02 AM

I'm all about using as little power as possible too. I was considering getting a mobile processor for my desktop when I upgrade. But then I was thinking: I usually have two light bulbs on over my desk when I'm at my computer anyway, so is saving a couple of watts with a mobile processor even worth it?

jaguar -- do fluorescent light bulbs work in a normal light bulb socket?
May 19, 2006 3:15:44 AM

Quote:
If a CPU uses between 60 and 150 watts, give or take, and a light bulb's wattage is around 85 or 100, does that mean running a couple of light bulbs uses more power than the computer? If so, who cares about saving 20 or 30 watts in a PC when having a few light bulbs on is like five times that amount anyway?


Imagine having a room full of servers running 24/7... Not only do businesses pay for computer power, but they pay to cool the rooms the servers are in.

It all adds up.
May 19, 2006 4:37:35 AM

Also, how often do you leave lights on 24/7, 365 days a year? If you do, I would recommend something other than 100 W light bulbs. One 100 W bulb on all month would run you about $7.20 at 10 cents per kWh. So think about it: a PC idling all month at 200 W is about $15. And that's if you don't use it at all.
Or if you have an overclocked Pentium D 805 drawing close to 500 W, you are looking at over $30 a month at load. I fold, so I also like low power. My Opteron 165 under load, with 5 HDDs and 3 120 mm fans, along with two 7800 GTXs, draws about 240 W; idling, this system draws under 180 W. That is with Cool'n'Quiet, of course, which is actually a little better than EIST, IMHO. From what I understand, the lowest EIST can drop to is 2.8 GHz, while my Opty drops down to 1 GHz. Off topic a little...
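The cost figures in the post above come straight from the kWh arithmetic. A minimal sketch, assuming the poster's 10-cents-per-kWh rate and a load that runs nonstop:

```python
# Back-of-the-envelope monthly electricity cost for a constant load.
# The $0.10/kWh rate is the poster's assumption; plug in your own.

def monthly_cost(watts, hours_per_day=24.0, days=30, rate_per_kwh=0.10):
    """Dollar cost of running a constant load for one month."""
    kwh = watts * hours_per_day * days / 1000.0
    return kwh * rate_per_kwh

bulb = monthly_cost(100)      # 100 W bulb, 24/7: about $7.20
idle_pc = monthly_cost(200)   # 200 W idling PC: about $14.40 (~$15)
hot_rig = monthly_cost(500)   # ~500 W overclocked system at load: about $36
```

The numbers line up with the post: roughly $7, $15, and "over $30" a month respectively.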
May 19, 2006 1:58:19 PM

Honestly, it's up to your wallet and personal taste. There is give and take. The processor could use less energy if you have hardware encoding devices in your computer. For example, if you game a lot and have a good graphics card, the CPU will be using less power because the graphics card will be doing most of the work. It will be the other way around if you don't have a powerful graphics card. Overall, though, the total draw comes out about the same.

Please don't take the following words the wrong way, but maybe you're focusing on something too insignificant. Ten dollars a month for computing energy is not so expensive, given that you would spend much more than that on a date or something else.

I really don't know about mobile processors in a desktop. It feels like you would have all these full-size components (full-size graphics card, high-speed hard drives, TV tuners, network cards, etc.) asking an itsy-bitsy processor to handle a great load. I think the performance would end up unbalanced somewhere and you would end up with a lot of bottlenecks.

Now, if you're buying a notebook, I am all for that. Some of the notebooks, especially Apple's MacIntels, are looking very promising to me, since they can now dual-boot Mac OS X and Windows. Plus, their energy use is less than 100 watts at max usage.

I suggest getting the most powerful processor out there that has some sort of voltage-adjustment/power-reduction scheme. That way you'll have the best of both worlds: low power when nothing intensive is going on and monstrous power when you want to party.
May 19, 2006 2:24:04 PM

Quote:

I suggest getting the most powerful processor out there that has some sort of voltage-adjustment/power-reduction scheme. That way you'll have the best of both worlds: low power when nothing intensive is going on and monstrous power when you want to party.


Actually, with the release of Conroe and AM2, CPUs do not have to be real power hogs to party. Even the current generation of Socket 939 Athlons are pretty efficient for desktop CPUs. AM2 Athlons will have lower TDPs, which should mean they will consume slightly less power than current S939s.

The GPUs, on the other hand, are out of control. nVidia is not too bad: the GeForce 7800 GTX 512MB has been measured to consume about 95 W, and the 7900 GTX 512MB consumes 5-8 W less than the 7800 GTX, according to nVidia. ATI is a different story: the X1900 XTX has been measured to consume 121 W, and from what someone told me, an X1900 XT Master Card consumes 20 W more than the X1900 XTX. Of course, overclocking these cards will increase power consumption.
May 19, 2006 2:41:05 PM

In reply to getting a mobile processor to save power: there is no magic wand that suddenly reduces power consumption with everything else remaining the same. Mobile processors use less power because they run slower. There are more details involved (speed throttling, blah blah blah), but that's it in a nutshell. If your desktop motherboard supports all the features of a mobile processor and you want to get one, do it, but just be aware that mobile processors don't draw less power because of magic. If they did, there would be no reason not to use them in all desktops.
May 19, 2006 2:44:37 PM

Thanks for all the input. So yeah, it is definitely good for businesses to be power-efficient, since they've got a million PCs running. But an efficient desktop processor would be best in a desktop, I see.

Thanks for the link to the lights, jag. I'll be picking some up soon.

And sun angel -- my dates usually don't go over $10. I've got the efficiency on those things down.
May 19, 2006 3:26:43 PM

Well, it would be, if the only thing running in a computer was the CPU, but a computer has a lot of other things running: a motherboard, graphics card, modem, sound card, hard drive, etc. Suddenly you're not talking about 60 to 150 watts, but 500 or 600 watts, perhaps more depending on the total equipment in the computer. Add to that the power losses in the power supply itself (its efficiency rating), and now you're talking about a house full of lights that are on.

So, does that mean you want the leanest-power computer around, or are you willing to pay for the electricity to power the dual graphics cards, super sound card, speakers, etc.? That's a choice only you can make, and it dictates whether you'll only be able to play games like Pac-Man or be able to play Oblivion.
May 19, 2006 4:10:27 PM

Quote:
you're not talking about 60 to 150 watts, but 500 or 600 watts, perhaps more depending on the total equipment in the computer. Add that to power losses in the power supply itself (efficiency rating), and now you're talking about a house full of lights that are on.

Actually, Choco's estimates are not far off. The power draw at the socket is far less than the PSU rating. Here is someone who measured with a Kill A Watt and got these results:

Quote:
For my work PC, which currently contains the following items:

* Athlon X2 4800+
* GeForce 6600 video
* Maxtor 300gb SATA HDD
* GeForce 5200 PCI video (for 3rd display)
* 2gb PC3200 DDR RAM
* generic DVD-ROM

The Kill-a-Watt tells me I'm pulling this much power from the wall socket:

Idle Windows desktop: 118 W
Defragmenting hard drive: 122 W
1 instance of Prime95: 147 W
2 instances of Prime95 (affinity set): 177 W
Battlefield 2 demo: 172 W

Now, that's power draw at the wall socket. About 25 percent of this energy is lost in the power supply as it converts from wall power to something the PC can use. So the actual peak power usage of my work PC is around 132 watts. And that's a fairly beefy PC, probably unrepresentative of the vast majority of current desktops.

It's amazing how much you can infer from such simple, basic data collection:

* Each "core" of the X2 4800+ consumes 30 watts
* The GeForce 6600 video card consumes 25 watts
* The 300gb SATA hard drive consumes 4 watts

So if you leave your PC on 24/7, 118 W would be a good estimate, or less if you enable the green power-saving modes, which can drop it considerably.

So the light-bulb comparison is not far off, other than the fact that the light bulb is a resistive load and the PC is a complex reactive load from an AC-network perspective. But for all intents and purposes that is not a consideration, since most power billing for a typical consumer household does not account for apparent power, only real power draw.
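For what it's worth, the inference the quoted poster is doing is just subtraction on the wall-socket readings. A sketch, using the figures copied from the quote; the 75% PSU efficiency is the poster's own assumption, not a measured value:

```python
# Kill A Watt readings at the wall, copied from the quoted post (watts).
readings_w = {
    "idle": 118,
    "prime95_x1": 147,   # one Prime95 instance -> one core loaded
    "prime95_x2": 177,   # two instances, affinity set -> both cores loaded
}

# The step from one Prime95 instance to two approximates one core's draw.
core_wall_w = readings_w["prime95_x2"] - readings_w["prime95_x1"]   # 30 W per core

# The poster assumes ~25% is lost in the PSU, so the DC-side draw is
# roughly 75% of the wall draw.
PSU_EFFICIENCY = 0.75
peak_dc_w = readings_w["prime95_x2"] * PSU_EFFICIENCY   # ~132.75 W
```

The 132.75 W figure matches the quote's "around 132 watts" of actual peak usage.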
May 19, 2006 5:36:41 PM

So if I understand you correctly, which I might not, instead of the 450-watt power supply minimum recommended to run one 512 MB 1900 XTX, much less two of them, a person should be able to get by on a power supply putting out about 200 watts.
May 19, 2006 6:33:56 PM

You will not be able to run that card on a 200 W PSU. Not a chance in hell. You really should have at least a good 400 W PSU for that video card. And yes, AMD CPUs at stock are very lean and efficient. Intel's CPUs can be; overclocking is the big issue. Overclocking can be the difference between a 250 W system under load and a 500 W system under load. The newer Intel chips are much better than the older ones; they got their act together.
May 19, 2006 6:36:17 PM

Not exactly. What he is saying is that most people way, way overestimate how much power the typical computer actually draws. I'm sure there are some that draw 500 W these days, but I'd bet 95% of computers today draw 300 W or less, probably much less.
May 19, 2006 7:21:59 PM

While the example is good for demonstrating how much power a system actually uses, it will not apply in all situations.

For example:

Under full load, the GeForce 6600 draws 28 W.
The X2 4800+ has been measured to consume 96 W under full load.

Right there is 124 W if both the CPU and GPU were stressed at full load at the same time. While playing the Battlefield 2 demo, 172 W was measured at the outlet. Assuming a 75%-efficient PSU, that means a total of 129 W is being consumed. That doesn't mean that everything else, including the hard drive, DVD drive, motherboard, etc., only consumes 5 W.

Different applications and games stress the CPU and GPU differently, along with the other components. I base my selection and recommendation of a PSU on the maximum estimated power consumption of each component; that is the worst-case scenario. Then I lower the total a little, because when you're playing a game, your hard drives will not all be reading or writing all the time. The same goes for the DVD drive, since it will be inactive most of the time.

It's better to overestimate than underestimate how powerful a PSU is required. But don't go overboard, or you will just be wasting money. As an example, I remember estimating someone's build to have a maximum load of about 400 W (a high-end gaming system), but he was insistent on buying a 700 W PSU.
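The sizing method above can be sketched in a few lines. Every wattage in the table is an illustrative guess for a hypothetical mid-2006 build, not a measured figure, and the 25% headroom factor is my own assumption, not the poster's:

```python
# Maximum estimated draw per component (illustrative guesses, in watts).
max_draw_w = {
    "cpu": 95,
    "gpu": 120,
    "motherboard_and_ram": 50,
    "two_hard_drives": 30,
    "dvd_drive": 25,
    "fans_and_misc": 20,
}

worst_case_w = sum(max_draw_w.values())   # everything peaking at once: 340 W

# Trim a little, as the post suggests: the drives won't all be busy at once.
realistic_peak_w = worst_case_w - 30      # 310 W

# Add headroom, but don't go overboard; round to the nearest 50 W PSU size.
recommended_psu_w = round(realistic_peak_w * 1.25 / 50) * 50   # 400 W
```

With these made-up numbers the method lands on a 400 W unit, in line with the thread's advice, rather than the 700 W overkill the buyer insisted on.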
May 19, 2006 7:29:19 PM

Jaguar is absolutely correct. A good 400-500 W PSU will do for just about anything. The only thing you may need more than 400 W for is overclocking a dual-core Intel CPU or trying to CrossFire two 1900 XTXs.
May 19, 2006 7:54:09 PM

Well, not really. A good analogy would be a typical car audio amplifier rated at 100 W per channel: you normally listen at 10% of the rated capacity with relatively quiet music, and your car is well insulated, so outside noise is filtered out pretty well. So a 10 W per channel amp might work (if we disregard the audio quality, since the clipping would drive most people mad).

However, there are times when you have to crank it up louder. Say your windows are down and you are on the highway; you may need 50% of the rated power output. At a tailgate party, say 75%.

In perspective, the average current draw is low because it is averaged over a long period of time, and it doesn't really account for games that draw a lot of power over a short amount of time. So it is a good thing to have the on-demand power there when you need it, considering that an X1900 XT CrossFire setup will be very demanding on your +12 V rail, and the only units that can dish that out are the 600+ W rated ones. And most PSU calculations don't have any per-rail considerations at all, only a total wattage.
May 19, 2006 8:03:27 PM

Yes, the power conclusions are definitely off. I mean, he had a Kill A Watt to measure the outlet draw, so I am surprised he didn't take a clamp-on ammeter, measure the current on every rail to get the total DC power the PSU puts out, and calculate his own efficiency rather than go by the rated specs :?
May 19, 2006 8:18:28 PM

My reply was meant to be on the sarcastic side. No, I'm not serious about a CrossFire setup taking 200 watts. I have a few computers, the simplest being able to get by on a 315-watt power supply with room to spare, even powering a big multifunction printer. On the other hand, my gaming computer draws well over 500 watts, closer to 600 at full load, so I have a 680-watt power supply in it. That way nothing gets overtaxed and I suffer no slowdowns or failures due to lack of power. Yes, I'm fully aware of the differences between low-end and high-end computers and their subsequent power needs.
May 19, 2006 9:20:16 PM

Quote:
Yes, the power conclusions are definitely off. I mean, he had a Kill A Watt to measure the outlet draw, so I am surprised he didn't take a clamp-on ammeter, measure the current on every rail to get the total DC power the PSU puts out, and calculate his own efficiency rather than go by the rated specs :?


Clamp-on meters will not measure DC, only AC.

ADDED: Nevermind, I've been proven wrong. :oops: 
May 19, 2006 10:54:34 PM

DC clamp-on ammeters actually do exist, using Hall-effect sensor technology.
Have a look...

edit: removed rudeness :twisted:
May 20, 2006 10:50:09 AM

If that is what that Kill A Watt meter is, then I stand corrected.

If that quote is directed at me, there is no cause to be rude.
May 20, 2006 11:36:40 AM

No, a Kill A Watt measures current draw from the wall socket, nothing more. All doolittle is saying is that the person in question should use a Fluke meter or a Simpson (Flukes are better, but expensive) to check it from the rails.

This is my question to doolittle, though: are you referring to the Molexes, or physically opening the PSU up?
May 20, 2006 11:56:42 AM

Quote:
are you referring to the Molexes, or physically opening the PSU up?

The Molex connectors, definitely!

A good spot would be just where the wires exit the PSU: bunch up the similar colors and get an amp reading for each voltage, then calculate watts for each to get the entire DC output.

The draw for the CD/DVD drives can be calculated the same way at the drive's Molex for the +5 and +12 rails, and from there you can calculate the motherboard-only draw. For USB/FireWire, I guess you can take readings before and after plugging in and use the difference.

And apologies to pain for the rudeness :oops: 
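The per-rail bookkeeping described above is just watts = volts times amps, summed over the rails. A minimal sketch, with made-up clamp readings standing in for real measurements:

```python
# Nominal ATX rail voltages.
rail_volts = {"+3.3V": 3.3, "+5V": 5.0, "+12V": 12.0}

# Hypothetical clamp-on ammeter readings per bundled wire color (amps).
rail_amps = {"+3.3V": 8.0, "+5V": 10.0, "+12V": 9.5}

# Power per rail, then total DC output the PSU is delivering.
rail_watts = {rail: rail_volts[rail] * rail_amps[rail] for rail in rail_volts}
total_dc_w = sum(rail_watts.values())   # 26.4 + 50.0 + 114.0 = 190.4 W

# With a Kill A Watt reading at the wall you could then estimate the PSU's
# efficiency yourself: total_dc_w / wall_watts.
```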
May 20, 2006 12:08:04 PM

Quote:
Clamp-on meters will not measure DC, only AC.


Wanna bet?

http://www.amprobe.net/amprobe/Clamp_on/ACDC_1000A.htm

I've already been corrected, so no, I don't want to bet.

Well, I wasn't piling on. At the time I replied, the other correction was not yet showing. I actually have an AC/DC unit, but not one nearly as expensive as the one I linked.
May 20, 2006 1:08:35 PM

The other correction was posted 12 hours before yours, but it's neither here nor there.

I hadn't seen any of those meters in what is obviously a very long time. The first ones were AC-only, but I should have known they had improved. :oops: