Best Graphics Cards For The Money: January 2012

Tags:
  • Gaming
  • Graphics Cards
  • Graphics
  • AMD
  • Components
  • Nvidia
  • GPUs
  • Buyer's Guides
Anonymous
September 14, 2014 9:30:16 PM

The new year promises availability of AMD's new Radeon HD 7970 3 GB. Also, we see a spate of surprise price shifts that change some of our most consistent recommendations. We're also getting more information about what will happen in the coming months.

Best Graphics Cards For The Money: January 2012 : Read more


September 14, 2014 9:49:39 PM

AMD rules the charts :) 
September 14, 2014 9:52:10 PM

Please update Intel graphics. They can also game, you know.
September 14, 2014 10:19:25 PM

Because the Titan Z is included in the hierarchy chart, please include it in the performance-per-dollar chart as well, so we can get an impression of the card's value.
September 14, 2014 11:07:59 PM

a little tip, never ever ever ever get AMD for Linux :) 
September 15, 2014 12:00:21 AM

"The GeForce GT 750, 740 GDDR5, and 730 DDR3 shed $5 each to $15, $95, and $65."
Where can I get these $15 gtx 750's?

[EDIT by Cleeve]
Derp! Fixed to $65 :p 
[/EDIT]
September 15, 2014 12:04:08 AM

Also, why does the R9 290X need a 500W PSU while the GTX 780 Ti needs a 600W one, when the 780 Ti uses less power?
September 15, 2014 12:18:41 AM

is this just a repost of last month? the link was already purple for me. lol

[EDIT by Cleeve]
Not a repost. We simply re-use the same URL so people don't have to change links to the article. :) 
[/EDIT]
September 15, 2014 12:35:11 AM


Quote:
a little tip, never ever ever ever get AMD for Linux :) 


I can second that for gaming at least. However, hardware-accelerated video is still pretty great under the open-source drivers. My E-350 still works perfectly at 1080p in about any format using those.

I really wish AMD got its act together on the Linux driver front, though.
September 15, 2014 12:38:28 AM

In the hierarchy chart, when a new card is added to the list, could you put it in bold? That would highlight the addition and make it easier to look up, since there are so many cards in one column.

this is just a suggestion though... thx
September 15, 2014 5:05:56 AM

Quote:
AMD rules the charts :) 


Yep, AMD did it again!
September 15, 2014 5:25:43 AM

toms my babys daddy said:
is this just a repost of last month? the link was already purple for me. lol

For quarterly roundups, they simply update the existing article and re-post it instead of creating a new article from a template and a new forum thread - see how this thread's title still says January 2012?

Many people, myself included, find this quite annoying since it makes it impossible to go back to past versions to compare them against each other for things like tracking how recommendations progressed over time unless you archive them for yourself.
September 15, 2014 5:39:42 AM

blackmagnum said:
Please update Intel graphics. They can also game, you know.

Not very well though.

But yeah, Haswell's HD4600 is up to 50% faster than Ivy's HD4000 and that should be enough to earn it a spot a few tiers higher, which should be noteworthy since it rules out some of the lowest-end and more ancient GPUs as viable "upgrades."
September 15, 2014 5:59:18 AM

This will be the third article in a row that I pose the same question. Why in the multi card section do you mention the 290 crossfire only from AMD? Is 2 x r9 280 for $400 not a good deal? What about 2 x r9 280x for $550? There are a lot of people out there that would possibly consider adding an extra one of these cards. That is the reason for that section right? So why recommend pretty much all Nvidia's SLI solutions, but only mention the r9 290 from AMD's side for crossfire?
September 15, 2014 6:59:53 AM

Several of the "gaming" laptops on the market are using the GeForce GT 750M. Could you add it to the hierarchy chart so we can see where it falls?
September 15, 2014 7:04:56 AM

Guru Meditation #00000025.65045048
You lost the GTX670 in the Hierarchy Chart. Press left mouse button to continue.
September 15, 2014 7:09:56 AM

sincreator said:
So why recommend pretty much all Nvidia's SLI solutions, but only mention the r9 290 from AMD's side for crossfire?

Because AMD's GPUs still have too many unresolved crossfire performance consistency issues so people buying from scratch are more likely to get an enjoyable playing experience out of a 290X than 2x280 for about the same price?
September 15, 2014 7:24:24 AM

InvalidError said:
sincreator said:
So why recommend pretty much all Nvidia's SLI solutions, but only mention the r9 290 from AMD's side for crossfire?

Because AMD's GPUs still have too many unresolved crossfire performance consistency issues so people buying from scratch are more likely to get an enjoyable playing experience out of a 290X than 2x280 for about the same price?


What unresolved issues? Frame pacing was the big one, but that's fixed now, AFAIK. So why mention 290 CrossFire then? So you would recommend that someone who already has an R9 280X sell it and spend another $500+ on a single-GPU card instead of picking up a second card? That is the main reason for the multiple-graphics section, is it not?

September 15, 2014 7:45:15 AM

sincreator said:
So you would recommend that someone who already has an R9 280X sell it and spend another $500+ on a single-GPU card instead of picking up a second card? That is the main reason for the multiple-graphics section, is it not?

That's why I specified FROM SCRATCH - no existing GPU to start with.

Once you are invested in one particular GPU and do not want to give it up, your upgrade path is already set regardless of where your existing GPU stands on the bang-per-buck chart.
September 15, 2014 7:54:23 AM

sincreator said:
This will be the third article in a row that I pose the same question. Why in the multi card section do you mention the 290 crossfire only from AMD? Is 2 x r9 280 for $400 not a good deal? What about 2 x r9 280x for $550? There are a lot of people out there that would possibly consider adding an extra one of these cards. That is the reason for that section right? So why recommend pretty much all Nvidia's SLI solutions, but only mention the r9 290 from AMD's side for crossfire?


Sorry, I didn't notice this question before.

I do this because AMD cards scaled very poorly until the Hawaii GCN update.

Nvidia has had frame-pacing hardware built into their GPUs for some time, but Tahiti-generation GCN parts have latency problems in multi-card configs. This is fixed in the bridge-free Hawaii (and presumably Tonga, although we haven't tested it yet) CrossFire configs.

Hope that answers your question,

- Don (Cleeve)

September 15, 2014 7:55:54 AM

InvalidError said:
sincreator said:
So you would recommend that someone who already has an R9 280X sell it and spend another $500+ on a single-GPU card instead of picking up a second card? That is the main reason for the multiple-graphics section, is it not?

That's why I specified FROM SCRATCH - no existing GPU to start with.

Once you are invested in one particular GPU and do not want to give it up, your upgrade path is already set regardless of where your existing GPU stands on the bang-per-buck chart.


Maybe you should change this statement then:

"We'll call out some of the most viable options though, mostly for folks with one of these cards already installed: two GeForce GTX 660s in SLI for $360, two GeForce GTX 760s in SLI for $450, two GeForce GTX 770s in SLI for $660, and finally, two Radeon R9 290s in CrossFire for $780."

September 15, 2014 8:13:36 AM

cleeve said:
sincreator said:
This will be the third article in a row that I pose the same question. Why in the multi card section do you mention the 290 crossfire only from AMD? Is 2 x r9 280 for $400 not a good deal? What about 2 x r9 280x for $550? There are a lot of people out there that would possibly consider adding an extra one of these cards. That is the reason for that section right? So why recommend pretty much all Nvidia's SLI solutions, but only mention the r9 290 from AMD's side for crossfire?


Sorry, I didn't notice this question before.

I do this because AMD cards scaled very poorly until the Hawaii GCN update.

Nvidia has had frame-pacing hardware built into their GPUs for some time, but Tahiti-generation GCN parts have latency problems in multi-card configs. This is fixed in the bridge-free Hawaii (and presumably Tonga, although we haven't tested it yet) CrossFire configs.

Hope that answers your question,

- Don (Cleeve)



There was a phase-2 frame-pacing fix that catered to GCN 1.0 cards a few months back, in the Catalyst 14.4 driver. I guess you guys haven't tested it yet? Here is a link to the article Guru3D did on the first frame-pacing update in the 13.8 driver: http://www.guru3d.com/articles_pages/amd_framepacing_re...

Even Hilbert says he doesn't feel bad about recommending CrossFire anymore, as long as the trend that started in 13.8 continues in future titles. Maybe now would be a good time to investigate the issue again. With prices falling on the R9 280/280X, a lot of people may be considering CrossFire, and it would be good for those buyers to know. ;) 

September 15, 2014 12:30:36 PM

I personally still feel CrossFire is not the way to go; I hate ending up with problems in the one game where you need the performance most. Not in the great graphical masterpieces, since you can generally get away with lower settings, but in the poorly optimized resource hogs, and those are exactly where CrossFire and SLI tend to fail. That's not really the fault of AMD or Nvidia, but of the game devs themselves.

If this problem didn't exist, I would have easily gone CrossFire on my 7950, as a second one is only €160. As it is, it still runs great with a pretty much maxed-out overclock, and with a good cooler it is quiet as well.
September 15, 2014 1:53:04 PM

sincreator said:

There was a phase 2 frame pacing fix that catered to GCN 1.0 cards back a few months ago in the Cat 14.4 driver. I guess you guys haven't tested it yet?


Actually, I *personally* tested it many times. But frankly, it doesn't make the problem go away to the extent that I'd recommend anything less than the Hawaii generation or better. There are still use cases where we see problems with the first GCN implementation.

[edit - bad example, older driver]
September 15, 2014 2:02:23 PM

sincreator said:
InvalidError said:
sincreator said:
So you would recommend that someone who already has an R9 280X sell it and spend another $500+ on a single-GPU card instead of picking up a second card? That is the main reason for the multiple-graphics section, is it not?

That's why I specified FROM SCRATCH - no existing GPU to start with.

Once you are invested in one particular GPU and do not want to give it up, your upgrade path is already set regardless of where your existing GPU stands on the bang-per-buck chart.


Maybe you should change this statement then:

"We'll call out some of the most viable options though, mostly for folks with one of these cards already installed: two GeForce GTX 660s in SLI for $360, two GeForce GTX 760s in SLI for $450, two GeForce GTX 770s in SLI for $660, and finally, two Radeon R9 290s in CrossFire for $780."



Invaliderror never made that statement, actually. I did. :) 

September 15, 2014 2:28:26 PM

GTX-580 is faster than GTX-760?!?!?!?!?!
September 15, 2014 3:41:52 PM

Again, why no 770?
September 15, 2014 8:11:00 PM

coolitic said:
Again, why no 770?


Again, because the 280X is only slightly slower but significantly cheaper.
September 15, 2014 8:51:11 PM

just curious but why is the GTX 770 not featured anymore?
September 15, 2014 9:35:39 PM

BlankInsanity said:
just curious but why is the GTX 770 not featured anymore?


Answered one post above your question! :) 

September 16, 2014 7:31:38 AM

Yet anything with 600 watts or more should require the power supply to be set on the 240-volt setting, given how much power they take. I know that is not "power-bill friendly," and I am sure I am going to get negative votes on my post for it.

Moderator Edit: NO! The voltage setting on a PSU is set to the input voltage of the mains AC where it is used, e.g. 120V in the USA and 240V in the UK. It has nothing to do with the output power of the PSU. If this switch is present (modern PSUs with Active PFC don't need it), setting it incorrectly can damage the PSU and/or attached components.
September 16, 2014 7:51:03 AM

Jeffrey H said:
Yet anything with 600 watts or more should require the power supply to be set on the 240-volt setting, given how much power they take. I know that is not "power-bill friendly," and I am sure I am going to get negative votes on my post for it.

If you want to set your PC's power supply on 240V input, you will need a 240V circuit going to that outlet. This is fine for countries that run regular outlets on 220-250V where you would be using the 240V input anyway but in 100-125V countries, this is generally not an option.

Most modern devices and decent quality power supplies have universal input so there is no 120/240V switch on them anymore.
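To make the distinction concrete, here is a minimal sketch of the input current a 600 W PSU draws at each mains voltage, assuming a hypothetical 85% efficiency at full load. The switch tracks the wall voltage, not the card's wattage:

```python
# The 120/240V switch selects the *input* mains voltage; the PSU's output
# wattage rating is unrelated. This shows the current drawn from the wall
# for a 600 W DC load, assuming a hypothetical 85% efficiency.
def input_current(output_watts, mains_volts, efficiency=0.85):
    """Current drawn from the wall (amps) for a given DC output load."""
    input_watts = output_watts / efficiency  # losses make input > output
    return input_watts / mains_volts

print(round(input_current(600, 120), 1))  # ~5.9 A on a 120 V circuit
print(round(input_current(600, 240), 1))  # ~2.9 A on a 240 V circuit
```

Same power delivered either way; the 240 V circuit simply carries half the current.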
September 16, 2014 8:17:45 AM

It'd be nice to know how Iris 5X00 compares. Just saying. It's been YEARS since Intel on-chip graphics have been updated.
September 16, 2014 8:52:52 AM

Not a fan of AMD processors, but I love their GPUs. Twice in a row I'm on the red team, although next time I will jump to the green team, only because I usually shift between the two when I have the chance. Speaking of chances, I hope Nvidia gives me one and prices its cards competitively. I'm not saying it has to win on price, just be more competitive.
September 16, 2014 9:07:22 AM

babernet_1 said:
It'd be nice to know how Iris 5X00 compares. Just saying. It's been YEARS since Intel on-chip graphics have been updated.

Haswell's IGP (HD4600/GT2) is just over a year old and about 50% faster than Ivy's HD4000. The GT3 almost doubles that but does not get much of a chance to shine until paired with eDRAM in its GT3e variant which is currently only available as mobile/embedded where discrete GPU is often not an option.

So aside from HD4600, there will be no Intel IGP to add until Broadwell-K or Skylake comes out late next year.
September 16, 2014 10:57:55 AM

InvalidError said:
babernet_1 said:
It'd be nice to know how Iris 5X00 compares. Just saying. It's been YEARS since Intel on-chip graphics have been updated.

Haswell's IGP (HD4600/GT2) is just over a year old and about 50% faster than Ivy's HD4000. The GT3 almost doubles that but does not get much of a chance to shine until paired with eDRAM in its GT3e variant which is currently only available as mobile/embedded where discrete GPU is often not an option.

So aside from HD4600, there will be no Intel IGP to add until Broadwell-K or Skylake comes out late next year.


Okay, but the 4200, 4400, and 4700 and more are available now.
September 16, 2014 2:00:13 PM

Dollars per performance point are:
6.20
5.00
5.42
6.00
5.36
5.59
5.50
6.09
7.06
8.06
9.67
10.00

Bit of a sweet spot at the Radeon R9 280.
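For anyone who wants to reproduce figures like these, a minimal sketch of the dollars-per-performance-point calculation. The card names, prices, and scores below are hypothetical placeholders, not values from the article:

```python
# Rough dollars-per-performance-point calculation, mirroring the list above.
# Card names, prices, and performance points here are hypothetical
# placeholders, not figures taken from the article's charts.
cards = {
    "Card A": (155, 25),  # (street price in USD, performance points)
    "Card B": (250, 50),
    "Card C": (485, 60),
}

def dollars_per_point(price, points):
    """Lower is better: how many dollars each performance point costs."""
    return round(price / points, 2)

for name, (price, points) in cards.items():
    print(f"{name}: ${dollars_per_point(price, points):.2f} per point")
```

The "sweet spot" is simply the card whose ratio is lowest before the curve climbs again at the high end.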
September 16, 2014 2:04:46 PM

Only thing I see is all the R-series card threads here at Tom's; kinda makes me lean towards Nvidia just on that point.
September 16, 2014 3:31:48 PM

Ahmadjon said:
GTX-580 is faster than GTX-760?!?!?!?!?!


Yes, pretty much. Rather much so, in fact.
GTX 580 owner here.

The base benchmark comparison I always refer to when comparing cards is:
http://www.videocardbenchmark.net/high_end_gpus.html

These numbers are baseline, no overclock, from that chart:
GeForce GTX 760: 4,981
GeForce GTX 580: 4,975
Those are Passmark video benchmark figures.

Yes, it is a gross (as in crude) approximation of a rating, but in my experience it is accurate enough to understand the difference in potential performance in any given game. Of course, one always compares against real application benchmarks where possible. That is not always possible, so I rely on Passmark quite a bit if I'm not familiar with 'hardware X'.
In fact, on my system:
Windows 7 64 | 8GB RAM, Asus MB P8Z68-V PRO Gen3
CPU: Intel i7-2600K @ 4.3 GHz o/c
GPU: Nvidia GTX 580, 1.5GB VRAM, o/c @ 925 MHz core, 2120 MHz memory, 1.115V
Main display: Dell U2410, 1920x1200

I see a video Passmark score of 6100 (at the GPU overclock above, with the CPU overclock), which is near GTX 770 (non-overclocked system) numbers on their scale.

Don't laugh. I recently found a game that, for the first time, has given me a reason to research a new card: Euro Truck Simulator 2. The game runs fine at max settings at 55 to 60 fps for me, but if I use 'scaling' in the game at 300, it drops to 30 to 55 fps depending on the scene (with no map mods installed), and my card can't really handle the far-preferred scaling setting of 400.

I researched a lot using reports on forums elsewhere, and found there was zero benefit to buying a GTX 770, as those people report pretty much the same figures. Only a GTX 780 Black or 780 Ti would offer tangible rewards for me, with this game, at the resolution I run at.

So yes, a GTX 580 is still a beast. I have zero reason to upgrade this card as of September 2014 other than that one game, ETS2. LOL. I can wait; it runs well enough for me if I put up with 200 scaling. (Actually it does okay at 300 scaling, but only if I run with no map mod installed.)

A long reply, perhaps unneeded, but it highlights that people often falsely assume that a bigger model number means better performance. (In case someone nitpicks: I did say not to rely on Passmark numbers alone; one always cross-refers to real application performance.)

I paid near 500 bucks back in the day for my GTX 580.
The only area it suffers in, from my perspective, is that it has only 1.5GB of VRAM.

Of course, if I had to replace it in an emergency with a near-performing Nvidia card, it would be a GTX 770, but I have my eyes on the cards Nvidia is expected to release later this year.
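The Passmark figures quoted above can be sanity-checked with a quick relative-performance calculation:

```python
# Quick check of the stock Passmark scores quoted in this post: the GTX 760
# and GTX 580 land within a fraction of a percent of each other.
def percent_faster(score_a, score_b):
    """How much faster score_a is than score_b, in percent."""
    return (score_a - score_b) / score_b * 100

gtx_760 = 4981  # Passmark G3D score quoted above
gtx_580 = 4975

print(round(percent_faster(gtx_760, gtx_580), 2))  # ~0.12, effectively a tie
```

At stock clocks that gap is noise; the overclock is what pushes the poster's 580 toward GTX 770 territory on the same scale.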

September 16, 2014 5:06:39 PM

Looks like we're missing the Intel 7th-gen graphics - where do these appear on the scale?
HD 5200
HD 5100
HD 5000
HD 4600
HD 4400
HD 4200
HD 2500

September 16, 2014 5:12:33 PM

Also missing the Intel 8th-gen graphics:
HD 6300
HD 6200
HD 6100
HD 6000
HD 5600
HD 5500
HD 5300
September 16, 2014 5:12:45 PM

I'm looking to buy a new PC soon and am starting to do some research on components. I'm wondering if two 770s in SLI outperform a single 780 Ti for the same money.

Thanks!
September 16, 2014 5:37:30 PM

I'm building a new PC. I'll be using it for gaming and some video editing. I want a system that can play games at max settings for the next 2 or 3 years, until my next upgrade. Which graphics card would you guys recommend?
September 16, 2014 5:41:26 PM

You updated this 3 days ago and didn't benchmark the R9 285? Why?
September 16, 2014 6:31:55 PM

yea boy~ R9 290 represent!
September 17, 2014 6:31:26 AM

I just purchased an XFX R9 280, and it runs Battlefield 4 on Ultra settings at around 50 fps. Great card for the money!
September 18, 2014 5:33:17 AM

The 580 and 660 were in the same tier for months/years, and now the 580 jumps above the 760? What?
September 18, 2014 8:57:19 PM

InvalidError said:
Jeffrey H said:
Yet anything with 600 watts or more should require the power supply to be set on the 240-volt setting, given how much power they take. I know that is not "power-bill friendly," and I am sure I am going to get negative votes on my post for it.

If you want to set your PC's power supply on 240V input, you will need a 240V circuit going to that outlet. This is fine for countries that run regular outlets on 220-250V where you would be using the 240V input anyway but in 100-125V countries, this is generally not an option.

Most modern devices and decent quality power supplies have universal input so there is no 120/240V switch on them anymore.


Well, seeing as Onus says I broke one of the rules by making a "false statement," I figure he needs to understand that higher-wattage video cards reduce energy efficiency. It would be like plugging in a microwave and having it run for a full 24 hours, using more power. That is not something I like, because it could make someone's power bill go up by $100 more a month. If you want "realistic" 3D and VR, it means a higher power bill compared to someone who is not interested in games like that. I know Onus might warn me again for breaking one of the discussion rules here, but I am someone who is both on a budget and would prefer computers not to be like microwaves.