Upgrade from 560 Ti --> 960 4GB or 7950. Read on before you post.

Radioactive Gamer

Reputable
Feb 11, 2016
So here is the real issue. Saving money on my power bill is important to me, but I won't cry over a few more dollars. My PC is already costing me about $30 a month at $0.14 per kWh. At first I thought I'd just get a 660 Ti or a 7870 to save money and still get some decent performance, but screw that, I want a worthy upgrade for GTA V, Rust, and a few others. So here's the question with a $200 budget: should I get the 960 4GB to save on my power bill, and would that be a worthy upgrade over a 660 Ti or 7870 at the same time? Or would my power bill not suffer much with the greatness of a 7950? Not only is the 7950 a better performer, but out of my $200 it may leave room for an 80+ PSU if I throw in a few extra bucks ;) cheat a little, my wife will not kill me too badly. My specs: FX-4350 at 4.2 GHz, not overclocked, a 2TB HDD plus a 1TB storage HDD, 8GB of Ballistix 1600 RAM, and the stock AMD cooler.
 
MATH PROBLEM:

By my calculation, that amount of power use is unlikely (see the math at the bottom).

If we assume it's closer to $10 max per month, then that's $120 per year. However, most people spend more electricity on HEAT than on air conditioning, and the HEAT OUTPUT of a computer isn't completely wasted when you are heating your house anyway. So it's hard to be exact, but I'd be surprised if it came to $100 per year, though I don't know your specifics.

*So get the GTX970 and just enjoy it. Yes, the FX-4350 will be a big bottleneck at times, but it VARIES by the game. Tomb Raider, for example, will see little CPU bottleneck, and of course buying a completely new Intel system isn't likely an option.

Other:
Math example for the worst case (again, I don't know all the details):

$30 / $0.14 per kWh = ~214 kWh per month

Assume an average of 333 W (a mix of idle and gaming, total PC + monitor draw): 214 kWh / 0.333 kW ≈ 643 hours of use

That's roughly 21 hours per day, every day?
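If it helps, here's the same worst-case arithmetic as a few lines of Python. The $0.14/kWh rate is yours; the 333 W average and 30-day month are just my assumptions:

bill = 30.00           # dollars per month, from the power bill
rate = 0.14            # dollars per kWh
avg_draw_kw = 0.333    # assumed average draw, whole PC + monitor

kwh_per_month = bill / rate                     # ~214 kWh
hours_per_month = kwh_per_month / avg_draw_kw   # ~643 hours
print(kwh_per_month, hours_per_month, hours_per_month / 30)   # ~21.5 hours per day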
 

Dugimodo

Distinguished
To cost you $30 a month you would need to be drawing 300 W, 24 hours a day. I really doubt that; what do you use the PC for that uses so much power? I have a 6700K and a GTX 980, power costs me $0.25 per kWh, and I use my PC a lot, yet it doesn't cost me as much as you say, more like $10-$20 on average. If you really want to find out the cost, invest in a watt meter and measure it; there is no other way to actually know, and they can be had cheap enough.

Also, a modern graphics card does not use much power when idling or doing undemanding tasks, regardless of what it might use at full load. Compare the idle power consumption of a mid-range card with a high-end one and you'll see there is not much difference, if there is any at all.

The only time the power consumption will be higher is when you are gaming or otherwise running the graphics hardware at high load. Unless you do that for a significant part of every day, it's unlikely a graphics card upgrade will make more than a few dollars a month of difference to the power bill, if that.

So basically what I'm saying is: get whatever graphics card you want, the extra power cost isn't likely to be much. If you really believe it will cost you too much, maybe a 750 Ti is the best choice; it's still the fastest card that doesn't require an extra power connector, I believe.

Or maybe a better way to approach it: go here http://outervision.com/power-supply-calculator and put in your system specs, then just change the graphics card and recalculate. That will show you the difference in maximum power draw and give you the worst case. Bear in mind most of the time PCs don't use maximum power.

By doing that I see a 960 uses up to 40 W less than a 560 Ti, and a 7950 uses up to 20 W less, so both are a saving and there's 20 W between them. Even if you managed to run your graphics card at 100% load for 24 hours a day, 40 W would save you about $4 a month and 20 W about $2. I really wouldn't worry about the power consumption.
 
Dugimodo,
Your math is incorrect for the monthly amount, though perhaps you didn't use the 14 c/kWh rate in your calculation. See my calculation above.

Another calculation to double check:
*Note (as said above) that the SAVINGS is based only on the DIFFERENCE in power draw between graphics cards.

I won't look up the actual value, but if we said a 50 W difference (remember it's an AVERAGE of idle/gaming) over 20 hours per week, then that's 1 kWh, which costs you 14 cents.

Put another way, you might only save about $5 to $10 per year.
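Same estimate in Python, using the assumed 50 W average difference and 20 hours a week:

rate = 0.14            # dollars per kWh
diff_kw = 0.050        # assumed average saving between cards (50 W)
hours_per_week = 20    # assumed hours of gaming per week

weekly = diff_kw * hours_per_week * rate   # 1 kWh -> $0.14 per week
print(weekly, weekly * 52)                 # ~$0.14 per week, ~$7.28 per year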
 

Dugimodo

Distinguished
The maximum power usage for a 960 is 40 W less than a 560 Ti, and a 7950 is in between, so 20 W less. That's where I got the difference from.
0.04 kW x 24 hours x 31 days x $0.14 = $4.17, which is where I got the $4 saving number from; like you say, based on the difference.

For monthly costs I took a wild stab and assumed a 300 W average:
0.3 kW x 24 x 31 x $0.14 = $31.25

Where's the math error? I calculated for a worst-case scenario; you tried to work out a more realistic actual usage. Either way, the numbers are too small to worry about.
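Plugging both of those formulas into Python, same assumptions as above ($0.14/kWh, 31-day month):

rate = 0.14                          # dollars per kWh
saving_40w = 0.04 * 24 * 31 * rate   # 40 W gap running 24/7
cost_300w = 0.30 * 24 * 31 * rate    # constant 300 W draw
print(saving_40w, cost_300w)         # ~$4.17 and ~$31.25 per month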
 

Radioactive Gamer

Reputable
Feb 11, 2016


Which is the best model of the 960 4GB to buy? MSI, EVGA, etc.?
 

clutchc

Titan
Ambassador


I bought the EVGA FTW+ for both my GTX 960 and 970 because they have the best warranty, excellent quality, and the fastest clocks out of the box (at least at that time they were the fastest).
http://www.evga.com/Products/Product.aspx?pn=02G-P4-2968-KR
The ACX 2.0 cooling is very good. In fact, the fans never run until the temp reaches ~60°C. Of course, that can be changed if you like your fans running at low RPM even when not needed.
 

Dugimodo

Distinguished
Here's one a little outside the box. If you still use old-style incandescent bulbs for lighting, switch to LED lighting and you'll save more than any difference in graphics hardware will use :) If your hot water cylinder is electric and not modern, put an insulating wrap around it and some cheap lagging on the first few feet of the hot water outlet pipes. The point is there are many ways to save some power that work better than swapping out a graphics card. Put some effort into that and maybe the wife will be happy enough not to care about the graphics card.
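Just to put a rough number on the lighting point at your 14 c/kWh (the bulb count, wattages, and hours here are made-up assumptions, not measurements):

rate = 0.14                      # dollars per kWh (your rate)
bulbs = 8                        # assumed bulbs in regular use
hours_per_day = 5                # assumed hours on per day
watts_saved = bulbs * (60 - 9)   # assumed 60 W incandescent vs 9 W LED each
monthly = watts_saved / 1000 * hours_per_day * 30 * rate
print(monthly)                   # roughly $8.50 a month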

Going back to the watt meter, it really is the best way to find out costs, and you can use it to see what anything that plugs in is costing you.
No calculations and assumptions needed, just measure it.

For most households, and assuming a heavily used gaming PC, power usage would go something along the lines of:
(highest to lowest)

Hot water
Home heating /Aircon (Assuming both hot water and heating are electric)
Electric cooking appliances
Refrigerator
Lighting (using incandescent)
PC and large appliances
Lighting (using LED)
other small appliances
 

Radioactive Gamer

Reputable
Feb 11, 2016
Actually, I got the 4GB Zotac 960, as it was on sale for $180. Quite happy with it so far.