# How much money would you be saving?

Last response: in Graphics & Displays

Let's say card A has a TDP of 151 watts, while card B has a TDP of 201 watts.

How much money would you be saving when you use card A vs card B?

I don't pay the electrical bill, but I'm trying to be considerate of my parents.


Tell us the actual cards. The TDP figures from the manufacturers are not really what the cards actually use. To be accurate, we would also need to know how many hours a day you would be gaming and how many hours a day the computer would be off entirely.

The 6870 vs 6950.

I'm not saying I'm going to buy either, just curious as to how much money I can be saving.

Hours:

PC on and not being used: anywhere between 5-40 hours.

PC on and being used for internet browsing or music (no strain on gfx card): probably 2-8 hours a day.

PC being used for gaming: around 2-4 hours a day. Maybe 6-10 hours on a really good Sunday.

The 6870 and 6950 use the same power at idle, so we can take that out of the equation. The 6950 only uses about 15 W more than the 6870 under load. You seem to be saying about 25 hours of gaming a week. Over the course of a year, using the national average for energy costs, the difference on your power bill would only be about \$4.

Best solution


Under load the HD 6950 uses about 40 W more than the HD 6870. While idling the difference is about 7 W.

http://www.xbitlabs.com/articles/graphics/display/power...

Assuming you play games 35 hours per week, every week, that works out to a total difference of 72.8 kWh in one year. If you are paying \$0.10 per kWh, that works out to \$7.28. If it's \$0.20 per kWh, then it would be \$14.56.
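The calculation above (extra watts × hours per week × 52 weeks, converted to kWh and multiplied by the electricity rate) can be sketched as a small Python helper. The function name and the specific wattage, hours, and rates are just illustrative assumptions matching the numbers in this thread:

```python
def annual_cost_difference(extra_watts, hours_per_week, dollars_per_kwh):
    """Extra cost per year of a card drawing `extra_watts` more under load.

    extra_watts      - additional power draw of the hungrier card, in watts
    hours_per_week   - hours per week spent gaming (i.e., under load)
    dollars_per_kwh  - electricity rate in dollars per kilowatt-hour
    """
    # watts * hours = watt-hours; divide by 1000 for kWh, scale to a year
    kwh_per_year = extra_watts * hours_per_week * 52 / 1000
    return kwh_per_year * dollars_per_kwh

# 40 W extra for 35 h/week, as in the post above: 72.8 kWh per year
print(round(annual_cost_difference(40, 35, 0.10), 2))  # 7.28
print(round(annual_cost_difference(40, 35, 0.20), 2))  # 14.56
```

Plugging in the first reply's numbers (15 W extra, 25 h/week) gives roughly 19.5 kWh per year, so the dollar figure depends heavily on the local rate.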

Here are two sample pages: