Best Graphics Cards For The Money: January 2012
Tags:
- Gaming
- Graphics Cards
- Graphics
- AMD
- Components
- Nvidia
- GPUs
- Buyer's Guides
Anonymous
September 14, 2014 9:30:16 PM
The new year promises availability of AMD's new Radeon HD 7970 3 GB. Also, we see a spate of surprise price shifts that change some of our most consistent recommendations. We're also getting more information about what will happen in the coming months.
Best Graphics Cards For The Money: January 2012 : Read more
adamovera
Archived comments are found here: http://www.tomshardware.com/forum/id-2250433/graphics-c...
blackmagnum
September 14, 2014 9:52:10 PM
Please update Intel graphics. They can also game, you know.
Marius_Bota
September 14, 2014 10:19:25 PM
kamhagh
September 14, 2014 11:07:59 PM
toms my babys daddy
September 15, 2014 12:18:41 AM
Is this just a repost of last month? The link was already purple for me. lol
qlum
September 15, 2014 12:35:11 AM
Quote:
a little tip, never ever ever ever get AMD for Linux
I can second that, for gaming at least. However, video hardware acceleration is still pretty great under the open-source drivers; my E-350 still handles 1080p in about any format using them.
I really wish AMD got its act together on the Linux driver front, though.
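For anyone who wants to check what their open-source driver actually accelerates, here is a minimal sketch that parses VA-API's standard `vainfo` tool (from libva-utils). The "VAProfileX : VAEntrypointY" line format is the usual one, but the parsing here is illustrative only; output can vary between libva versions.

```python
# Sketch: list the codec profiles the open-source driver exposes through
# VA-API by parsing `vainfo` (libva-utils). Assumes a Linux box with
# libva-utils installed; this is a sketch, not a tested utility.
import subprocess

def vaapi_profiles():
    out = subprocess.run(["vainfo"], capture_output=True, text=True)
    pairs = []
    for line in out.stdout.splitlines():
        line = line.strip()
        # Profile lines look like: "VAProfileH264High : VAEntrypointVLD"
        if line.startswith("VAProfile") and ":" in line:
            profile, entrypoint = (part.strip() for part in line.split(":", 1))
            pairs.append((profile, entrypoint))
    return pairs

if __name__ == "__main__":
    for profile, entrypoint in vaapi_profiles():
        print(f"{profile:32s} {entrypoint}")
```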
chaosmassive
September 15, 2014 12:38:28 AM
cd000
September 15, 2014 5:05:56 AM
InvalidError
toms my babys daddy said:
Is this just a repost of last month? The link was already purple for me. lol
For quarterly roundups, they simply update the existing article and re-post it instead of creating a new article from a template and a new forum thread - see how this thread's title still says January 2012?
Many people, myself included, find this quite annoying, since it makes it impossible to go back and compare past versions against each other (for things like tracking how the recommendations progressed over time) unless you archive them yourself.
InvalidError
blackmagnum said:
Please update Intel graphics. They can also game, you know.
Not very well, though. But yeah, Haswell's HD 4600 is up to 50% faster than Ivy Bridge's HD 4000, and that should be enough to earn it a spot a few tiers higher - noteworthy, since it rules out some of the lowest-end and more ancient GPUs as viable "upgrades."
sincreator
This is the third article in a row where I pose the same question: why, in the multi-card section, do you mention only 290 CrossFire from AMD? Is 2 x R9 280 for $400 not a good deal? What about 2 x R9 280X for $550? There are a lot of people out there who would consider adding a second one of these cards; that is the reason for that section, right? So why recommend pretty much all of Nvidia's SLI solutions, but mention only the R9 290 from AMD's side for CrossFire?
hippenmoor
September 15, 2014 6:59:53 AM
DrNLS
September 15, 2014 7:04:56 AM
InvalidError
sincreator said:
So why recommend pretty much all of Nvidia's SLI solutions, but mention only the R9 290 from AMD's side for CrossFire?
Because AMD's GPUs still have too many unresolved CrossFire performance-consistency issues, so people buying from scratch are more likely to get an enjoyable playing experience out of a 290X than out of 2 x 280 for about the same price.
sincreator
InvalidError said:
Because AMD's GPUs still have too many unresolved CrossFire performance-consistency issues, so people buying from scratch are more likely to get an enjoyable playing experience out of a 290X than out of 2 x 280 for about the same price.
What unresolved issues? Frame pacing was the big one, but that's fixed now, AFAIK. So why mention 290 CrossFire at all, then? Would you recommend that someone who already has an R9 280X sell it and spend another $500+ on a single-GPU card instead of picking up a second one? That is the main reason for the multiple-graphics section, is it not?
InvalidError
sincreator said:
Would you recommend that someone who already has an R9 280X sell it and spend another $500+ on a single-GPU card instead of picking up a second one?
That's why I specified FROM SCRATCH - no existing GPU to start with.
Once you are invested in a particular GPU and do not want to give it up, your upgrade path is already set, regardless of where your existing GPU stands on the bang-per-buck chart.
cleeve
sincreator said:
So why recommend pretty much all of Nvidia's SLI solutions, but mention only the R9 290 from AMD's side for CrossFire?
Sorry, I didn't notice this question before.
I do this because AMD cards scaled very poorly until the Hawaii GCN update.
Nvidia has had frame-pacing hardware built into its GPUs for some time, but Tahiti-generation GCN parts have latency problems in multi-card configs. This is fixed in the bridge-free Hawaii (and presumably Tonga, although we haven't tested it yet) CrossFire configs.
Hope that answers your question,
- Don (Cleeve)
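For readers wondering what "frame pacing" problems look like in numbers, here is a minimal sketch of how they are typically quantified. The file name `frametimes.csv` and the one-value-per-line millisecond format are assumptions (FRAPS-style frame-time dumps are a common source); the metrics themselves are the usual ones reviewers report.

```python
# Sketch: quantify frame pacing from a per-frame time log. A well-paced
# setup has a 99th-percentile frame time close to the mean; a badly
# paced CrossFire/SLI rig shows big spikes despite a healthy average FPS.
import statistics

def pacing_report(path="frametimes.csv"):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    mean_ms = statistics.mean(frame_ms)
    p99_ms = sorted(frame_ms)[int(0.99 * (len(frame_ms) - 1))]
    # Largest jump between consecutive frames - the stutter you feel.
    worst_jump = max(abs(b - a) for a, b in zip(frame_ms, frame_ms[1:]))
    print(f"average FPS               : {1000.0 / mean_ms:6.1f}")
    print(f"mean frame time           : {mean_ms:6.2f} ms")
    print(f"99th-percentile frame time: {p99_ms:6.2f} ms")
    print(f"worst frame-to-frame jump : {worst_jump:6.2f} ms")

if __name__ == "__main__":
    pacing_report()
```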
sincreator
InvalidError said:
That's why I specified FROM SCRATCH - no existing GPU to start with.
Once you are invested in a particular GPU and do not want to give it up, your upgrade path is already set, regardless of where your existing GPU stands on the bang-per-buck chart.
Maybe you should change this statement in the article, then:
"We'll call out some of the most viable options though, mostly for folks with one of these cards already installed: two GeForce GTX 660s in SLI for $360, two GeForce GTX 760s in SLI for $450, two GeForce GTX 770s in SLI for $660, and finally, two Radeon R9 290s in CrossFire for $780."
sincreator
cleeve said:
I do this because AMD cards scaled very poorly until the Hawaii GCN update.
Nvidia has had frame-pacing hardware built into its GPUs for some time, but Tahiti-generation GCN parts have latency problems in multi-card configs.
There was a phase-2 frame-pacing fix that catered to GCN 1.0 cards a few months back, in the Catalyst 14.4 driver. I guess you guys haven't tested it yet? Here is a link to the article Guru3D did on the first frame-pacing update, in the 13.8 driver: http://www.guru3d.com/articles_pages/amd_framepacing_re...
Even Hilbert says he doesn't feel bad about recommending CrossFire anymore, as long as the trend that started in 13.8 continues in future titles. Maybe now would be a good time to investigate the issue again. With prices falling on the R9 280/280X, a lot of people may be considering CrossFire, and it would be good for those buyers to know.
qlum
September 15, 2014 12:30:36 PM
I personally still feel CrossFire is not the way to go; I hate ending up with problems in the one game where you need the performance the most. Not the great graphical masterpieces, since you can generally get away with lower settings, but the poorly optimized resource hogs - and those are exactly where CrossFire and SLI tend to fail. That's not really the fault of AMD or Nvidia, but of the game devs themselves.
If this problem didn't exist, I would have easily gone CrossFire on my 7950, as a second one is only €160. As it is, though, it still runs great with a pretty much maxed-out overclock, and with a good cooler it is quiet as well.
cleeve
sincreator said:
There was a phase-2 frame-pacing fix that catered to GCN 1.0 cards a few months back, in the Catalyst 14.4 driver. I guess you guys haven't tested it yet?
Actually, I *personally* tested it many times. But frankly, it doesn't make the problem go away to the extent that I'd recommend anything less than the Hawaii generation or better. There are still use cases where we see problems with the first GCN implementation.
[edit - bad example, older driver]
cleeve
sincreator said:
Maybe you should change this statement in the article, then:
"We'll call out some of the most viable options though, mostly for folks with one of these cards already installed: two GeForce GTX 660s in SLI for $360, two GeForce GTX 760s in SLI for $450, two GeForce GTX 770s in SLI for $660, and finally, two Radeon R9 290s in CrossFire for $780."
InvalidError never made that statement, actually. I did.
Ahmadjon
September 15, 2014 2:28:26 PM
GTX-580 is faster than GTX-760?!?!?!?!?!
coolitic
September 15, 2014 3:41:52 PM
Jeffrey H
September 16, 2014 7:31:38 AM
Yet anything with 600 watts or more should require the power supply to be set on the 240-volt setting, given how much power they draw. I know that is not "power bill friendly," and I am sure I am going to get negative votes on my post for this.
Moderator Edit: NO! The voltage setting on a PSU is set to the input voltage of the mains AC where it is used, e.g. 120V in the USA and 240V in the UK. It has nothing to do with the output power of the PSU. If this switch is present (modern PSUs with Active PFC don't need it), setting it incorrectly can damage the PSU and/or attached components.
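To put numbers behind the moderator's point, here is a worked sketch: the same DC output draws different current from the wall depending on the mains voltage, which is all that the input-voltage switch selects. The 600 W load and 85% efficiency below are illustrative assumptions, not measurements.

```python
# Worked example: the 115/230V switch selects the *input* voltage; only
# the current drawn from the wall changes between regions, not the
# PSU's output capability.
def input_current(dc_load_w, efficiency, mains_v):
    """Amps drawn from the wall for a given DC load."""
    ac_draw_w = dc_load_w / efficiency  # wall power = DC power / efficiency
    return ac_draw_w / mains_v          # I = P / V

for mains_v in (120, 240):
    amps = input_current(dc_load_w=600, efficiency=0.85, mains_v=mains_v)
    print(f"{mains_v}V mains: {amps:.1f} A from the wall")
# 120V mains: 5.9 A; 240V mains: 2.9 A - the same 600 W delivered either way.
```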
InvalidError
Jeffrey H said:
Yet anything with 600 watts or more should require the power supply to be set on the 240-volt setting, given how much power they draw.
If you want to set your PC's power supply to 240V input, you will need a 240V circuit going to that outlet. This is fine in countries that run regular outlets on 220-250V, where you would be using the 240V input anyway, but in 100-125V countries it is generally not an option.
Most modern devices and decent-quality power supplies have universal input, so there is no 120/240V switch on them anymore.
The_Icon
September 16, 2014 8:52:52 AM
Not a fan of AMD processors, but I love their GPUs. Twice in a row I went with the red team, although next time I will jump to the green team, only because I usually shift between the two when I have the chance. Speaking of chances, I hope Nvidia gives me one and prices its cards competitively. I am not saying winning on price, just being more competitive.
InvalidError
babernet_1 said:
It'd be nice to know how Iris 5X00 compares. Just saying. It's been YEARS since Intel on-chip graphics have been updated.
Haswell's IGP (HD 4600/GT2) is just over a year old and about 50% faster than Ivy Bridge's HD 4000. The GT3 almost doubles that, but it does not get much of a chance to shine until paired with eDRAM in its GT3e variant, which is currently only available in mobile/embedded parts where a discrete GPU is often not an option.
So aside from the HD 4600, there will be no Intel IGP to add until Broadwell-K or Skylake comes out late next year.
babernet_1
InvalidError said:
Haswell's IGP (HD 4600/GT2) is just over a year old and about 50% faster than Ivy Bridge's HD 4000. The GT3 almost doubles that, but it does not get much of a chance to shine until paired with eDRAM in its GT3e variant, which is currently only available in mobile/embedded parts where a discrete GPU is often not an option.
Okay, but the 4200, 4400, and 4700 and more are available now.
Oscaron
September 16, 2014 2:00:13 PM
Foozie
September 16, 2014 3:31:48 PM
Ahmadjon said:
GTX-580 is faster than GTX-760?!?!?!?!?!
Yes, pretty much. Rather much so, in fact.
GTX 580 owner here.
The base benchmark comparison I always refer to when comparing cards is http://www.videocardbenchmark.net/high_end_gpus.html
These numbers are base, no overclock, from that chart:
GeForce GTX 760: 4,981
GeForce GTX 580: 4,975
Those are PassMark video benchmark figures. Yes, it is a gross (as in crude) approximation of a rating, but in my experience it is accurate enough to understand the difference in potential performance in any given game. Of course, one compares against real application benchmarks whenever possible. That is not always possible, so I rely on PassMark quite a bit if I'm not familiar with "hardware X".
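As a sketch of the kind of back-of-the-envelope comparison described above, the two PassMark scores quoted in this post work out to a near-zero difference. The ~30% "worth it" threshold below is a rule-of-thumb assumption, not Tom's guidance.

```python
# Turn two PassMark G3D scores (the ones quoted in this post) into a
# percentage uplift, with an assumed threshold for a worthwhile upgrade.
scores = {
    "GeForce GTX 760": 4981,
    "GeForce GTX 580": 4975,
}

def uplift_pct(old_card, new_card, table=scores):
    """Percentage gain going from old_card to new_card."""
    return 100.0 * (table[new_card] - table[old_card]) / table[old_card]

gain = uplift_pct("GeForce GTX 580", "GeForce GTX 760")
print(f"GTX 580 -> GTX 760: {gain:+.1f}% on PassMark")  # about +0.1%
print("worth upgrading?", "yes" if gain >= 30 else "no")
```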
In fact, on my system:
Windows 7 64-bit | 8GB RAM, Asus P8Z68-V PRO Gen3 motherboard
CPU: Intel i7-2600K @ 4.3 GHz overclock
GPU: Nvidia GTX 580, 1.5GB VRAM, overclocked @ 925 MHz core, 2120 MHz memory, 1.115V
Main display: Dell U2410, 1920x1200
I see a video PassMark score of 6100 (at the GPU and CPU overclocks above), which is near GTX 770 (non-overclocked) numbers on their scale.
Don't laugh: I recently found a game that, for the first time, has given me a reason to research a new card - Euro Truck Simulator 2. The game runs fine at max settings at 55 to 60 FPS for me, but if I use "scaling" in the game at 300, FPS drops to 30 to 55 depending on the scene (with no map mods installed), and the far-preferred scaling setting of 400 my card can't really handle.
From reports on forums elsewhere, I found there was zero benefit in buying a GTX 770, as those people report pretty much the same figures. Only a GTX 780 Black or 780 Ti would offer tangible rewards for me, in this game at the resolution I run.
So yes, a GTX 580 is still a beast. As of September 2014 I have zero reason to upgrade this card other than that one game, ETS2. LOL. I can wait; it runs well enough for me if I put up with 200 scaling (actually it does okay at 300 scaling, but only with no map mod installed).
A long reply, perhaps unneeded, but it highlights that people often falsely assume a bigger model number means something is better. (In case someone nitpicks me: I did say not to rely on PassMark numbers alone; one always cross-refers to real application performance.)
I paid near 500 bucks back in the day for my GTX 580. The only area it suffers in, from my perspective, is that it has only 1.5GB of VRAM.
Of course, if I had to replace it in an emergency with a near-performing Nvidia card, it would be a GTX 770, but I have my eyes on the cards Nvidia is expected to release later this year.
HughMann
September 16, 2014 5:06:39 PM
HughMann
September 16, 2014 5:12:33 PM
Danizzle
September 16, 2014 5:12:45 PM
Hoyzon
September 16, 2014 5:37:30 PM
ss7900
September 16, 2014 5:41:26 PM
flawlessturd
September 16, 2014 6:31:55 PM
Oscaron
September 17, 2014 4:36:14 AM
Kannibal Kraut
September 17, 2014 6:31:26 AM
Furiano
September 18, 2014 5:33:17 AM
Jeffrey H
September 18, 2014 8:57:19 PM
InvalidError said:
If you want to set your PC's power supply to 240V input, you will need a 240V circuit going to that outlet. This is fine in countries that run regular outlets on 220-250V, where you would be using the 240V input anyway, but in 100-125V countries it is generally not an option.
Most modern devices and decent-quality power supplies have universal input, so there is no 120/240V switch on them anymore.
Well, seeing that Onus says I broke one of the rules by making a "false statement," I figure he needs to understand that higher-wattage video cards reduce energy efficiency. It would be like plugging in a microwave and leaving it running for a full 24 hours, using more power. That is not something I like, because it could make someone's power bill go up by $100 more a month. If you want "realistic" 3D and VR, it means a higher power bill compared to someone who is not interested in games like that. I know Onus might warn me again for breaking one of the discussion rules here, but I am someone who is on a budget and does not like computers being like microwaves.
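For scale, here is a quick worked estimate of what a power-hungry card actually adds to a bill. Every number in it (300 W of extra draw while gaming, 4 hours a day, $0.13/kWh) is an assumption to swap for your own values, not a figure from the article.

```python
# Sanity check of the power-bill worry above, under loudly stated
# assumptions: extra draw while gaming, daily gaming time, and tariff.
load_w = 300          # extra system draw while gaming, watts (assumption)
hours_per_day = 4     # daily gaming time (assumption)
price_per_kwh = 0.13  # electricity rate, $/kWh (assumption)

kwh_per_month = load_w / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.0f} kWh/month -> ${cost_per_month:.2f}/month")
# 36 kWh/month -> $4.68/month: nowhere near $100 under these assumptions.
```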