How much electricity does my computer use?


kittle

Distinguished
Dec 8, 2005
I can't give you an exact number for how much it uses.

Most of it will depend on how long your PC is running. Make sure you turn it off at night when you're not using it and when you're away during the day. Make sure you use "shutdown" and not "sleep", rather than just getting up and walking away.

The same goes for the rest of your system: speakers, monitor, etc. Especially the monitor; make sure it's also turned off when you're done using your system.
 
As you can see here, under a GPU load heavier than anything you will see in gaming, your system will draw about 286 watts from the wall (119 watts at idle), and their CPU uses a lot more power than yours.

http://www.guru3d.com/articles_pages/msi_radeon_r9_270x_gaming_review,8.html

Let's say you spend half your time gaming the most power-hungry game on the planet and half your time just browsing the web. Average power would then be about 200 watts. So:

200 W / 1,000 = 0.2 kW × 36 hours a week × 4.3 weeks per month × $0.10 per kWh ≈ $3.10 per month.

That's at the average US electric rate; it could be as high as, say, $0.22 per kWh on Long Island.

That's just the PC. The monitor will add a few watts, as will anything else that is plugged in separately, but barring a 500-watt speaker system and such, it would be tough to break $8.50 a month even on Long Island.
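That formula is easy to sandbox in Python if you want to plug in your own numbers (the wattage, hours, and rates below are just the example figures from above, not measurements):

```python
# Rough monthly electricity cost for a PC, using the example figures above.
def monthly_cost(avg_watts, hours_per_week, dollars_per_kwh, weeks_per_month=4.3):
    kwh_per_month = avg_watts / 1000 * hours_per_week * weeks_per_month
    return kwh_per_month * dollars_per_kwh

# Half gaming (~300 W) and half browsing (~100 W) averages out to ~200 W.
print(f"${monthly_cost(200, 36, 0.10):.2f}")  # ~$3.10 at the average US rate
print(f"${monthly_cost(200, 36, 0.22):.2f}")  # ~$6.81 at a Long Island rate
```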

It should be around 400-450 watts at most during heaviest use, like seven or eight 60 W light bulbs.

Actual numbers are better than my guess attempts, for sure =)
 
I found some actual numbers. Your system should use roughly:

Idle/light usage: 70 watts average

Gaming: 270 watts average

Plus the MONITOR, which might be 30 watts or more (see its manual).

That's based on the following articles: one about a 7700K APU system with no graphics card, and the second about a system with the R9 270X.

http://www.eteknix.com/complete-amd-kaveri-review-a10-7850k-a10-7700k-a8-7600/16/
http://www.guru3d.com/articles_pages/radeon_r7_260x_r9_270x_280x_review_benchmarks,10.html

So...

1) Make sure the computer is OFF, or hibernating, when not in use.

2) Here's an example calculation for a week, rounding the figures above up to 100 W and 300 W to cover the monitor: 10 hours idle + 10 hours gaming = (10 × 100 W) + (10 × 300 W) = 4,000 Wh per week.

So that's 4 kWh (kilowatt-hours) per week. If the price of electricity is 20 cents per kilowatt-hour, then your electricity for a month is...

$3.20

(Provided I made no mistakes, and using my estimated hours)
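For anyone who wants to check or tweak that estimate, here it is as a minimal Python sketch (the hours and wattages are the rounded estimates above, not measurements):

```python
# Weekly energy for the usage pattern above, then monthly cost at $0.20/kWh.
idle_hours, idle_watts = 10, 100      # idle/light use, rounded up from ~70 W
gaming_hours, gaming_watts = 10, 300  # gaming, rounded up from ~270 W

wh_per_week = idle_hours * idle_watts + gaming_hours * gaming_watts  # 4,000 Wh
kwh_per_month = wh_per_week / 1000 * 4    # ~4 weeks per month -> 16 kWh
print(f"${kwh_per_month * 0.20:.2f}")     # $3.20
```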

Update:
If you have AIR CONDITIONING running, this would change the number quite a bit, since the AC has to work harder to remove the heat the computer puts out, but I don't know how much difference that would make.
 
Solution

Johnny Jiang

Reputable
Jun 23, 2014


I live in California. I have no idea how much the electricity costs, but my bill went up like $50, and I'm sure it's not my computer, because I have this oxygen machine that's on most of the time for my grandma, who has medical problems. I think it's mostly that machine's fault.

 

USAFRet

Titan
Moderator


There is no way your new PC is costing you $50/month in electricity.
 

Johnny Jiang

Reputable
Jun 23, 2014


I know, right? She has the machine on 24/7 for her oxygen. I'm sure most of the electricity usage is coming from that.

 


Go to this link, plug the rate you get for your ZIP code and your actual hours into the formula in Post #4 (that's for the 270X), and you will get an accurate number.

I ran an electric utility for six years. We bought power for about half a cent per kWh from Niagara Falls, spent about 9 cents per kWh renting the transmission lines to get it downstate, used 1.5 cents to cover our budget, and sold it for 11 cents per kWh.
 

suah117

Commendable
Jan 16, 2017


This is really old, but it came up while I was searching. At the time of that writing, tier-4 usage in some parts of California was $0.34 per kilowatt-hour. If a PC used 200 watts and was left on 24/7, that's 4,800 Wh per day (200 W × 24 h), which is 4.8 kWh per day. 4.8 kWh per day times $0.34 per kWh is $1.63 per day, times 30 days is $48.96 per month.

What people generally don't get is that at the higher tiers of usage, one additional appliance under that old rate structure was exceedingly expensive.
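Here's that tier-4 arithmetic as a small Python check (this assumes the whole 200 W load is billed at the marginal tier-4 rate, per the example above):

```python
# Cost of an always-on 200 W load billed entirely at a top marginal tier rate.
watts, tier4_rate = 200, 0.34          # $/kWh, the old California tier-4 figure
kwh_per_day = watts * 24 / 1000        # 4.8 kWh per day
daily = kwh_per_day * tier4_rate       # ~$1.63 per day
print(f"${daily:.2f}/day, ${daily * 30:.2f}/month")  # $1.63/day, $48.96/month
```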
 

thetimtam97

Prominent
Sep 7, 2017


However, you also have to understand that the PC doesn't draw exactly 200 W from the mains. The whole point of the power supply is to step the mains voltage down to what the PC's components need; as the voltage goes down, the current goes up, so the power stays roughly the same on both sides. The only difference between what comes out of the wall and what reaches the components is the PSU's efficiency losses. So a PC whose components consume 200 W actually draws a bit more than 200 W from the wall; with a typical 85-90% efficient supply, roughly 220-235 W.
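A one-line sketch of that relationship in Python (the 90% efficiency is an assumed, illustrative figure; check your own PSU's 80 PLUS rating):

```python
# Wall draw for a given DC component load at a given PSU efficiency.
def wall_watts(dc_load_watts, efficiency=0.90):   # 0.90 is an assumed figure
    return dc_load_watts / efficiency

print(f"{wall_watts(200):.0f} W from the wall for a 200 W load")  # ~222 W
```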
 

roberts.glyn

Prominent
Nov 7, 2017
My computer is a desktop with an Intel i5 CPU, a Gigabyte motherboard, 16 GB of RAM, a 240 GB SSD, 4 TB, 3 TB, 2 TB, and 1 TB hard disks, and a Blu-ray burner, all on a 500 W power supply.
I usually run three virtual machines on it as well as the main system.
So you'd expect it to use a reasonable amount of power.

So, I put a wattmeter on it and found that the MAXIMUM power it used while starting up was 63 watts, and it's running at 48 watts while I am typing this. So installing a 1000 W power supply would seem to be a bit of overkill, even with two huge graphics cards (which I don't have).

I am in Australia, so the mains voltage is 240 V and the computer is currently drawing 0.2 A, hence 240 V × 0.2 A = 48 W.

So my 500 W power supply is overkill, and I probably could have used a 350 W supply with no issues.
I was thinking of upgrading to a 600 W or 750 W power supply before doing this test, but I am certainly NOT going to do so now.

We pay $0.34 per kWh for most of our power, and the computer is left on 24/7 and set to NOT go to sleep, since I often need to access it remotely.
So 48 W × 24 hours × 365 days / 1,000 ≈ 420 kWh used per annum, at $0.34 each ≈ $143.00 per annum.
My household runs four such computers, so our total bill is slightly less than $600 per annum, or about $50.00 per month.
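Those measurements and the annual cost are easy to verify in Python (the 48 W and $0.34/kWh are the poster's own figures):

```python
# Verify the wattmeter reading and the annual running cost above.
volts, amps = 240, 0.2
watts = volts * amps                          # 48 W measured at the wall
rate = 0.34                                   # $ per kWh

kwh_per_year = watts * 24 * 365 / 1000        # ~420 kWh per annum
print(f"${kwh_per_year * rate:.2f} per PC per year")   # ~$142.96
print(f"${kwh_per_year * rate * 4:.2f} for four PCs")  # ~$571.85, under $600
```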

So I think that MOST people can assume their computer uses around 50 W to 60 W, and probably never more than 100 W. People with gaming computers will use more power than I do, but I doubt they would use more than 200 W. Laptops use considerably less power than desktops, perhaps 20 W, but I haven't measured this.
 