Solved

GTX 480 or GTX 570?

June 13, 2012 6:47:07 PM

Guys, help me out please! Which video card should I get: the GTX 480 for $209 or the GTX 570 for $259? I know the GTX 570 is a little bit faster (around 6%), runs cooler, and consumes less electricity, but is it worth the extra bucks?


June 13, 2012 7:09:42 PM

It all depends on what it's worth to you. If $50 (the cost of a video game, to put it in perspective) is worth it for a cooler-running, more efficient, slightly faster video card, then go for the 570. The 570 *should* overclock better, BTW.

Best solution

June 13, 2012 7:54:00 PM

Recon... I knew you'd be here... and I knew that would be your answer...

As far as performance goes, it's a tie.
As far as price goes, the GTX 480 is $210 while the GTX 570 is $270... a $60 difference ($50 after MIR).
It would take years to make up $60 in power savings with the GTX 570. If you live in a frigid climate and need to run the heater, definitely get the GTX 480.
June 13, 2012 8:17:34 PM

What PSU do you have?
June 13, 2012 8:22:16 PM

GTX 570 Ti 2.5GB > GTX 480 1.5GB > GTX 570 1.25GB
Performance is about the same across the board, but 1.25GB is really pushing your luck with VRAM. Considering the price, the 480 wins by a large margin, and 1.5GB is sufficient VRAM capacity for more than adequate future-proofing. The power usage difference between the 570 and the 480 is also not large enough to make up the price difference in a realistic amount of time. The 480 is probably the best option among these three cards. Make sure that you have a PSU that can handle the 480 (preferably 650W to 700W) and that the 480's cooling fans are not blocked by other devices in the case.
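For what it's worth, here is a rough sketch of the kind of headroom math behind that 650W to 700W recommendation. All wattage figures below are illustrative assumptions, not measurements of any particular system:

```python
# Rough PSU sizing estimate for a GTX 480 build (illustrative numbers only).
GPU_LOAD_W = 250    # GTX 480 board power is commonly cited around 250W
CPU_LOAD_W = 125    # assumed quad-core CPU
OTHER_W    = 75     # assumed motherboard, RAM, drives, fans

system_peak_w = GPU_LOAD_W + CPU_LOAD_W + OTHER_W

# Keep the PSU well below its rated output at peak load (~65% is a
# common rule of thumb) so it runs cool and efficient.
recommended_psu_w = system_peak_w / 0.65

print(f"Estimated peak draw: {system_peak_w} W")            # ~450 W
print(f"Suggested PSU rating: ~{recommended_psu_w:.0f} W")  # roughly 690-700 W
```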
June 13, 2012 8:42:40 PM

GTX 480 all day long! BTW, the GTX 480 overclocks better than the GTX 570 ;)
June 13, 2012 9:03:30 PM

I was about to post about how it's at least a 100W power difference, but actually the 570 is not THAT efficient. You'd be saving 100W under load by going for a 7850, though. Since I was finally curious about it, I checked: 1 kWh of electricity around me (New York City) is about $0.18, which means for every 100 hours of gaming I'd be saving $1.80. So it's only a few bucks a year of difference, at most. How disappointing!
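Just to make the arithmetic explicit, here's a minimal sketch of the calculation above (the 100W delta and $0.18/kWh rate are the assumptions from this post, not measured figures):

```python
def extra_energy_cost(power_delta_w, hours, price_per_kwh):
    """Cost of drawing power_delta_w extra watts for the given number of hours."""
    kwh = power_delta_w / 1000 * hours
    return kwh * price_per_kwh

# 100 W extra draw for 100 hours of gaming at $0.18/kWh -> $1.80
print(extra_energy_cost(100, 100, 0.18))
```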
June 13, 2012 9:34:39 PM

Actually, the 7850 saves quite a bit per year, several dozen dollars at the least. $0.18 per kWh means that if you play for 3 hours a day, five days a week, all year, you save over $140 per year just during the time spent gaming, let alone idle power savings. It's every ten hours that you'd save $1.80, not every 100 hours. The power usage difference is also greater than 100W at load. TDP is not an exact measure of power usage, and AMD tends to stay further below their TDPs than Nvidia does.

100 times 0.18 is 18, not 1.8

However, except for the two most highly factory overclocked 7850s that I know of, the GTX 480 is a little faster at stock than the 7850.
June 13, 2012 9:38:33 PM

What do you mean by that, recon-uk?
June 13, 2012 9:42:46 PM

That's true, but there are no Kepler competitors between the GT 640s and the GTX 670 right now. I never said that GCN uses less power than Kepler in gaming. GCN is far more efficient than Fermi, which is its only competitor from Nvidia at this time. Besides, the gap between Kepler FP32 and GCN isn't nearly as large as the gap between GCN and Fermi, whether GF11x parts like the 570 or especially GF10x parts like the 480.
June 13, 2012 10:47:07 PM

blazorthon said:
Actually, the 7850 saves quite a bit per year, several dozen dollars at the least. $0.18 per kWh means that if you play for 3 hours a day, five days a week, all year, you save over $140 per year just during the time spent gaming, let alone idle power savings. It's every ten hours that you'd save $1.80, not every 100 hours. The power usage difference is also greater than 100W at load. TDP is not an exact measure of power usage, and AMD tends to stay further below their TDPs than Nvidia does.

100 times 0.18 is 18, not 1.8

However, except for the two most highly factory overclocked 7850s that I know of, the GTX 480 is a little faster at stock than the 7850.


Yes, it could be more than a 100W difference; 100W is just a good estimate. E.g. this site suggests it's more like 150W: http://www.guru3d.com/article/msi-radeon-hd-7850-power-...

However, you're neglecting that it's kilowatt-hours. A 100W difference means it takes 10 hours before you hit 1 kilowatt-hour. So after 10 hours, you're at a difference of eighteen cents. If you play 100 hours, it's $1.80. 100 hours is low for a full year of gaming, but it's the right order of magnitude. Even if you game 20 hours a week on average, so you get about 1000 hours a year (~50 weeks/yr), that's an $18 difference. It'd take about the life of the card (3-4 years or so) to make up the $50 difference. This is all estimates, of course. And I think 1000 hours a year is a lot; for me, at least, I doubt I game even 10 hours a week *on average* now that I have a real job, though of course there are plenty of weeks where I meet or exceed that.
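Along the same lines, here is a quick break-even sketch using the numbers in this post (the 100W delta, $0.18/kWh rate, and $50 price gap are all assumptions carried over from the thread):

```python
def years_to_break_even(price_gap, power_delta_w, hours_per_year, price_per_kwh):
    """Years of gaming needed for electricity savings to cover a price difference."""
    annual_savings = power_delta_w / 1000 * hours_per_year * price_per_kwh
    return price_gap / annual_savings

# $50 price gap, 100 W load difference, ~1000 gaming hours/year at $0.18/kWh
print(years_to_break_even(50, 100, 1000, 0.18))  # about 2.8 years
```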
June 14, 2012 8:44:43 AM

motorneuron said:
Yes, it could be more than a 100W difference; 100W is just a good estimate. E.g. this site suggests it's more like 150W: http://www.guru3d.com/article/msi-radeon-hd-7850-power-...

However, you're neglecting that it's kilowatt-hours. A 100W difference means it takes 10 hours before you hit 1 kilowatt-hour. So after 10 hours, you're at a difference of eighteen cents. If you play 100 hours, it's $1.80. 100 hours is low for a full year of gaming, but it's the right order of magnitude. Even if you game 20 hours a week on average, so you get about 1000 hours a year (~50 weeks/yr), that's an $18 difference. It'd take about the life of the card (3-4 years or so) to make up the $50 difference. This is all estimates, of course. And I think 1000 hours a year is a lot; for me, at least, I doubt I game even 10 hours a week *on average* now that I have a real job, though of course there are plenty of weeks where I meet or exceed that.


3 hours a day, five days a week. 15 hours times 52 (about how many weeks are in a year; 52x7 is 364) is 780. 780 hours times a 150W difference is 117,000Wh, aka 117kWh. 117kWh times $0.18 is about $21. Yes, it seems that I did make a mistake in my earlier calculations. We're also neglecting the fact that unless this computer is completely off when not gaming, the 7850 is still using less power than the other graphics cards, even if the difference is smaller, for much longer periods of time.
June 14, 2012 2:51:04 PM

Best answer selected by kennyzoia.
June 14, 2012 3:57:30 PM

Just get the GTX 480; it's a great price for almost GTX 580 performance.