
Best Graphics Cards For The Money: May 2015

By Don Woligroski

Only a few weeks remain before Computex 2015 kicks off in Taipei, Taiwan. Is the gaming graphics card market poised to heat up? Based on the past month of silence from both AMD and Nvidia, it'd be fair to bet that both companies have imminent plans.

Detailed graphics card specifications and reviews are great, assuming you have the time to do the research. But at the end of the day, a gamer needs to know what the best graphics card is for their money. So, if you don’t have the time to research the benchmarks, or if you don’t feel confident enough in your ability to pick the right card, then fear not. We've compiled a simple list of the best gaming cards offered in any given price range.

May Updates:

With one month to go before Computex 2015, both AMD and Nvidia are keeping their powder dry. Neither GPU vendor had much to talk about in April, and we’re not expecting any excitement in May, either.

Presumably, enthusiasts are similarly sitting on their upgrade budgets, waiting to see what surfaces in stormy Taipei. And we don’t blame you. New high-end hardware typically compels more competitive pricing, putting pressure on existing components. Then again, we’ve also seen the run-up to a major graphics launch accompanied by cuts intended to move inventory before it’s rendered less valuable. So, how has the market changed since the last time we summarized our graphics card recommendations?

To start, availability of Nvidia’s GeForce GTX Titan X isn’t much better. The company’s own geforce.com site has the lowest price ($1000), but now stock isn't expected until May 14th. Ouch. At least there is a handful of cards on Amazon if you’re willing to pay a slight premium on this already-expensive flagship. Or maybe you’d be better off exercising a little patience. There may be something on the horizon better suited to gamers with cash to spend.

GeForce GTX 980s certainly aren’t any cheaper. If anything they cost slightly more. The GTX 970 is still mostly a $330 board, though discounts and rebates push certain models down closer to $310, pulling even with the least-expensive Radeon R9 290X cards. As with last month, it’s difficult for us to declare one offering better than the other. Their strengths and weaknesses are already well-known, and the face-off between two well-matched contenders often becomes a passion play (yes, even when one uses significantly more power). Choose a favorite based on your requirements and allegiances.

Last month, the Radeon R9 280X took a recommendation for its $240-$250 price tag, which fell closer to $230 after rebates. Now you can find the 280X for $230, and rebates nudge at least one model nearer to $200. Availability looks to be waning (not surprising given Tahiti’s age). But scoring a deal today gets you playable performance at 2560x1440.

The value looks even better considering Nvidia’s GeForce GTX 960 hasn’t budged; it continues flirting with last month’s price point. Meanwhile, the slightly slower Radeon R9 280 can be found a bit cheaper. Plus, it’s being discounted to the tune of $170 with rebates. We’re adding that board as an honorable mention.

The GeForce GTX 750 Ti marches on as a predominantly $150 card. Nvidia appears to be playing the rebate game though, knocking anywhere from $10 to $30 off its price. If you’re willing to fill out the paperwork, we like the 750 Ti at $120 to $130. With that said, we recently tapped AMD’s Radeon R9 270 for its superior performance. Our recommendation carries forward in May, since several 270s also include $20 mail-in rebates to rival the best GeForce deal.

Certain models of the Radeon R7 260X are down $10 or more, landing them around $110 (or in one case, $90 after a big mail-in rebate). That compares favorably to the slower and more expensive GeForce GTX 750. As a result, AMD keeps its recommendation.

The short of it is this: nothing happened in the last month to trigger landscape-changing movement. So, our recommendations mostly carry over unmolested. Count on next month’s update to include more exclamatory analysis.

Some Notes About Our Recommendations

A few simple guidelines to keep in mind when reading this list:

  • This list is for gamers who want to get the most for their money. If you don’t play games, the cards on this list are more expensive than what you really need. We've added a reference page at the end of the column covering integrated graphics processors, which is likely more apropos for home, office, and basic multimedia usage models.
  • Be sure to check out our new performance per dollar comparison page, where you can overlay the benchmark data we’ve generated with pricing, giving you a better idea where your ideal choice falls on the value curve. The criteria to get on this list are strictly price/performance; for a concrete sense of how that value math works, see the short sketch just after this list.
  • Recommendations for multiple video cards, such as two Radeon cards in CrossFire mode or two GeForce cards in SLI, typically require a motherboard that supports CrossFire/SLI and possibly a chassis with plenty of space to install multiple graphics cards. These setups also usually call for a beefier power supply than what a single card needs, and will almost certainly produce more heat than a single card. Keep these factors in mind when making your purchasing decision. In most cases, if we have recommended a multiple-card solution, we try to recommend a single-card honorable mention at a comparable price point for those who find multi-card setups undesirable.
  • Prices and availability change on a daily basis. We can’t base our decisions on always-changing pricing information, but we can list some good cards that you probably won’t regret buying at the price ranges we suggest, along with real-time prices for your reference.
  • The list is based on some of the best U.S. prices from online retailers. In other countries or at retail stores, your mileage will almost certainly vary.
  • These are new card prices. No used or open-box cards are in the list. While these offers might represent a good deal, it’s simply outside the scope of what we’re trying to do.
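
To make the price/performance criterion above a little more concrete, here is a minimal sketch of the kind of value math behind that comparison page, written in Python. The frame rates and prices are placeholder numbers chosen purely for illustration, not measured data; substitute current benchmark averages and street prices before reading anything into the ordering.

    # Rank cards by average FPS per dollar (a rough value-curve metric).
    # Every figure below is a placeholder for illustration, not a benchmark result.
    cards = [
        ("GeForce GTX 970",    60.0, 330.0),   # (name, average FPS, street price in USD)
        ("Radeon R9 290X",     58.0, 310.0),
        ("Radeon R9 280X",     45.0, 230.0),
        ("GeForce GTX 750 Ti", 30.0, 150.0),
    ]

    # Highest value first.
    for name, fps, price in sorted(cards, key=lambda c: c[1] / c[2], reverse=True):
        print(f"{name:<20} {fps / price:.3f} FPS per dollar")

A higher FPS-per-dollar figure puts a card further up the value curve; the recommendations in this column weigh that ratio within each price bracket.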

Comments (39)
  • adamovera, May 2, 2015 11:06 AM (score 0)
    Archived comments are found here: http://www.tomshardware.com/forum/id-2604356/graphics-cards-money-october-2014.html
  • CO builder, May 2, 2015 1:39 PM (score 5)
    We need Intel HD 4600 and HD 2500 added. This will let BIYers know how much card they'll need to see improvement over the new integrated graphics.

    Is a GT 720 or 730 worth adding when you have the IGP already?
  • Larry Litmanen, May 2, 2015 3:25 PM (score 2)
    This is how to shop for a card the RIGHT WAY.

    If you want to play big triple-A games on medium settings for the next two years, get a GTX 750 Ti (or its AMD competitor); if you want to play any game on max for the next two years, get a GTX 960 (or its AMD competitor); and if you want to play every game on max for the next three years, get a GTX 970 (or its AMD competitor) or higher.

    Indie games run on any card because they are not too demanding on graphics. This solution works for two years, and in two years or so you will need to upgrade to a new GPU.
  • Tylerr, May 2, 2015 3:47 PM (score -1)
    @Larry Litmanen

    What if I just want to be able to play any game on lower settings for the next 4-5 years? I don't really care that much about graphics as long as I'm able to play the game with a decent frame rate.

    I was able to do that with my 8800 GTS 640 until about a year or so ago, when games started taking 1GB+ of VRAM to play.
  • Gurg, May 2, 2015 6:30 PM (score 1)
    My suggestion would be to lop off the old bottom 2/3 of cards, below the 5770. Those are some very old cards. For the cards above that, I would suggest color-coding the groupings by the maximum playable resolution and settings, i.e. 4K high/medium/low, 1440 H/M/L, 1080 H/M/L. Then someone looking at the chart would know that, for a given monitor resolution and game settings, the GPU will generally meet those requirements.
  • jesh4622, May 2, 2015 8:11 PM (score 2)
    These recommendations are off. Rebates count for Nvidia's bottom line but not AMD's? If you count rebates for both brands, the 290 is going for ~$240 and the 290X is going for ~$280, while the GTX 970 starts at ~$300.
  • alextheblue, May 2, 2015 9:12 PM (score 0)
    Quote:
    We need Intel HD 4600 and HD 2500 added. This will let BIYers know how much card they'll need to see improvement over the new integrated graphics.

    Is a GT 720 or 730 worth adding when you have the IGP already?


    Depends on your needs. For example for an HTPC build the NVIDIA cards would be better, especially in the driver department but also in terms of decode offloading of newer codecs.
  • maratc, May 3, 2015 2:50 AM (score 2)
    The "GPU Hierarchy Chart" page and "Performance per Dollar" page bring the same page, which is Hierarchy chart. Performance per dollar now 404's.
  • Cryio, May 3, 2015 4:43 AM (score 4)
    You should really update the Intel GPU chart guys.
  • manhalfgod, May 3, 2015 6:09 AM (score 0)
    Newegg has way better deals than Amazon; go look and see. All those prices above are worthless.

    SAPPHIRE TRI-X OC 100361-2SR Radeon R9 290X 4GB 512-Bit GDDR5 =$269.99 After $20.00 MIR
    SAPPHIRE 100361-4L Radeon R9 290X 4GB 512-Bit GDDR5 PCI Express 3.0 Tri-X OC(UEFI) Video Card=$279.99 After $20.00 MIR
    SAPPHIRE 100362-3L Radeon R9 290 4GB 512-Bit GDDR5 CrossFireX Support Tri-X OC Version (UEFI) = $239.99 after $20.00 rebate
    http://promotions.newegg.com/NEemail/May-0-2015/MayThe4th-01/index-landing.html?utm_medium=Email&utm_source=IGNEFL050115&nm_mc=EMC-IGNEFL050115&cm_mmc=EMC-IGNEFL050115-_-EMC-050115-Index-_-E0-_-PromoWord&et_cid=17730&et_rid=2583378&et_p1=

    http://promotions.newegg.com/NEemail/Apr-0-2015/PowerIntoMay30/index-landing.html?utm_medium=Email&utm_source=IGNEFL043015&nm_mc=EMC-IGNEFL043015&cm_mmc=EMC-IGNEFL043015-_-EMC-043015-Index-_-E0-_-PromoWord&et_cid=17701&et_rid=2583378&et_p1=

    http://promotions.newegg.com/NEemail/Apr-0-2015/Top8HallofFame28/index-landing.html?utm_medium=Email&utm_source=IGNEFL042815&nm_mc=EMC-IGNEFL042815&cm_mmc=EMC-IGNEFL042815-_-EMC-042815-Index-_-E0-_-PromoWord&et_cid=17660&et_rid=2583378&et_p1=
  • johnnyb105, May 3, 2015 6:19 AM (score -1)
    If you haven't noticed by now (and I like Tom's), this is more or less an Nvidia and Intel site; AMD will always be runner-up even if they are first... I believe you call that, hmmm, what's the word???? Bias.....
  • arielmansur, May 3, 2015 6:42 AM (score 3)
    When are you going to add AMD APUs? Those integrated GPUs need to be here too! And add all the missing Intel integrated GPUs too!
  • rav_, May 3, 2015 2:34 PM (score -9)
    This is all so much garbage.


    DX12 changes EVERYTHING.

    In fact, a SINGLE AMD A6-7500K running the API Overhead benchmark under DX12 OUTPERFORMS an Intel i7-4960 with an nVidia GTX 980 running DX11.

    Using the 3DMark v1.5 API Overhead benchmark with DX12, AMD's $100 A6-7400K APU produces 4.4 MILLION draw calls.

    Using the same 3DMark benchmark in DX11, a $1700 Intel and nVidia system ONLY produces 2.2 MILLION draw calls.

    This little article here is Soooooooooo lame that it is laughable.
  • rav_, May 3, 2015 2:57 PM (score -5)
    DX12: is it the best friend that AMD has?

    About a month ago, Anandtech ran some extensive 3DMark v1.5 API Overhead benchmarks. They tested both dGPUs and integrated APUs and IGPs.

    http://bit.ly/1GCjLzU

    Here is an interesting fact.

    Using DX11 as a baseline to compare the performance delta, the following was understood.

    An Intel i7-4960 and GTX 980 can produce 2.2 MILLION draw calls running DX11.

    i7-4960 has 6 cores and 12 threads.

    Intel i7-4960 = $1200

    nVidia GTX-980 = $540

    Total = $1740

    Of course, DX11 is the API that all benchmarks have been running up until now.

    However, when you run the 3DMark API Overhead test using DX12, something interesting happens.

    AMD's A6-7400 APU can produce 4.4 million draw calls.

    AMD A6-7400 costs $90-150 depending upon outlet.

    A6-7400k has 2 cores. Hmmmmm..... 2 cores vs 6 cores? $100 vs $1200?

    Of course, when you run the same benchmark on the A6 using the DX11 API, the draw call score drops to 513,000. Compared to the Intel/nVidia system costing $1700, the justification becomes clear: you spend the money for 2.2 million draw calls, or a 4x performance increase over a $100 CPU!!!

    Seriously? $1700 just for a 400% performance increase over a $100 APU?

    Mantle and DX12 have changed the game.

    Last year the media was comparatively benching very expensive dGPU silicon just to gain a few percentage points, for a score that NOW can be achieved with a $100 AMD APU. Not ONLY achieved, but with a 100% increase in performance over the more expensive system.

    Still think DX12 will have no impact?

    Intel and nVidia have been ripping off the consumer using DX11, when a much better API (Mantle, and now DX12) makes low-priced, low-performing $100 APUs OUTPERFORM the "BEST ON THE MARKET".

    Now that the Xbox will be adopting DX12, the gain in performance will be far better than ANY combination of Intel CPU and nVidia GPU you can put together running DX11 today.

    In other words.....

    ...if you are happy and satisfied with the performance of your current DX11 $2000 gaming system, then you should be ecstatic to achieve 2x the performance with a $400 DX12 AMD gaming system.

    How does this relate to how AMD can save itself?

    1. AMD needs to change the way the media benchmarks its silicon. Right now the media benches Radeon against nVidia on an Intel platform.

    Well, if I am going to spend $1500 on a new DX12 dGPU card, then I want to know EXACTLY what CPU is best for my investment.

    In fact, if Tom's Hardware can't tell me, as a consumer, the best SYSTEM using BOTH Intel and AMD mainboards, then they should just dry up and blow away.

    Two proprietary design features of GCN give AMD Radeon APUs and dGPUs a considerable performance advantage: the Asynchronous Shader Pipeline and the Asynchronous Compute Engines. Unlike Intel CPUs, AMD CPUs can execute graphics instructions on ALL CPU cores.
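
Taking the draw-call figures quoted in this comment at face value (they are the commenter's numbers, attributed to the linked Anandtech results and not verified here), the ratios work out roughly as follows; a quick sketch in Python:

    # Ratios computed only from the figures quoted above; none are verified here.
    dx12_a6_apu = 4_400_000      # A6 APU, DX12 API Overhead result (as quoted)
    dx11_i7_gtx980 = 2_200_000   # i7 + GTX 980, DX11 (as quoted)
    dx11_a6_apu = 513_000        # A6 APU, DX11 (as quoted)

    print(dx12_a6_apu / dx11_i7_gtx980)   # ~2.0x: the DX12 APU vs. the DX11 Intel/Nvidia system
    print(dx11_i7_gtx980 / dx11_a6_apu)   # ~4.3x: the DX11-vs-DX11 gap behind the "4x over a $100 CPU" line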
  • fnmunoz9969, May 3, 2015 4:36 PM (score -4)
    I'm sorry, but the R9 290 actually beats a GTX 970 in many benchmarks, not even counting the 290X here, and won't have any issue utilizing its full memory buffer. Also, the TDP for even a reference 970 is way higher than 145W: 2x 6-pin (2x75W = 150W) plus the 75W through PCIe = 225W the card has room to draw. Personally, I've never seen my 290 pull higher than 220W, and that uses an 8-pin (150W) + a 6-pin (75W).
  • jdw_swb, May 3, 2015 7:41 PM (score 1)
    Quote:
    I'm sorry but the R9 290 actually beats a GTX 970 in many benchmarks, not even counting the 290x here, and won't have any issue utilizing its full memory buffer. Also the TDP for even a reference 970 is way higher than 145w. 2x 6 pin (2x75w = 150w) plus the 75w through pci-e = 225w the card was room to draw. Personally I've never seen my 290 pull higher than 220w and that uses 8 pin (150) + 6 pin (75w)


    Where are the many benchmarks in which the 290 beats out the GTX 970?
  • jamisont1, May 3, 2015 9:20 PM (score 0)
    Quote:
    Quote:
    I'm sorry but the R9 290 actually beats a GTX 970 in many benchmarks, not even counting the 290x here, and won't have any issue utilizing its full memory buffer. Also the TDP for even a reference 970 is way higher than 145w. 2x 6 pin (2x75w = 150w) plus the 75w through pci-e = 225w the card was room to draw. Personally I've never seen my 290 pull higher than 220w and that uses 8 pin (150) + 6 pin (75w)


    Where are the many benchmarks in which the 290 beats out the GTX 970?


    He probably saw benchmarks on certain games where AMD cards perform better, such as Evolve.
    Overall the 970 beats the 290X at FHD, they're pretty equal at QHD, and the 290X beats the 970 at UHD, but both cards simply sux at UHD.
    The best 290X cards are probably the 290X Lightning and Vapor-X OC, and top 970 cards like the HOF/FTW+/Super JetStream/G1/TOP/EXOC/ampex beat those easily.
  • fnmunoz9969, May 3, 2015 9:33 PM (score 0)
    Quote:
    Quote:
    I'm sorry but the R9 290 actually beats a GTX 970 in many benchmarks, not even counting the 290x here, and won't have any issue utilizing its full memory buffer. Also the TDP for even a reference 970 is way higher than 145w. 2x 6 pin (2x75w = 150w) plus the 75w through pci-e = 225w the card was room to draw. Personally I've never seen my 290 pull higher than 220w and that uses 8 pin (150) + 6 pin (75w)


    Where are the many benchmarks in which the 290 beats out the GTX 970?



    Here's a Futuremark Fire Strike 1.1 benchmark from just now with my Core i7-4770K @ 4.5GHz and R9 290 @ 1100/1400, compared to a "high-end PC" with a GTX Titan:

    http://www.3dmark.com/compare/fs/4740941/fs/594794

    20% faster in almost all scores

    Here's a 970; above it in the charts is a 290X with its stock 3DMark 11 score:

    http://www.futuremark.com/hardware/gpu/NVIDIA+GeForce+GTX+970/review
  • fnmunoz9969, May 3, 2015 9:43 PM (score 0)
    Quote:
    Quote:
    I'm sorry but the R9 290 actually beats a GTX 970 in many benchmarks, not even counting the 290x here, and won't have any issue utilizing its full memory buffer. Also the TDP for even a reference 970 is way higher than 145w. 2x 6 pin (2x75w = 150w) plus the 75w through pci-e = 225w the card was room to draw. Personally I've never seen my 290 pull higher than 220w and that uses 8 pin (150) + 6 pin (75w)


    Where are the many benchmarks in which the 290 beats out the GTX 970?


    Also, I meant to specify 4K benchmarks mostly; otherwise a 290 and a 970 are very close in performance. A 290X will be better.
  • jamisont1, May 3, 2015 10:12 PM (score 0)
    Quote:
    Quote:
    Quote:
    I'm sorry but the R9 290 actually beats a GTX 970 in many benchmarks, not even counting the 290x here, and won't have any issue utilizing its full memory buffer. Also the TDP for even a reference 970 is way higher than 145w. 2x 6 pin (2x75w = 150w) plus the 75w through pci-e = 225w the card was room to draw. Personally I've never seen my 290 pull higher than 220w and that uses 8 pin (150) + 6 pin (75w)


    Where are the many benchmarks in which the 290 beats out the GTX 970?


    Also I meant to specify 4k benchmarks mostly, otherwise a 290 and 970 are very close in perfomance. A 290x will be better.



    It's not hard to find benchmarks, but the results are similar to this:
    http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977-6.html


    At FHD the 970 beats the 290X, at QHD they're pretty equal, and at UHD the 290X beats the 970.
    The benchmarks you saw are usually done with reference cards, and here's the thing:
    reference 290X = 1000MHz
    reference 970 = 1178MHz (boost)

    top 290X cards like the Lightning = 1080MHz (+80MHz factory OC'ed)
    top 970 cards like the HOF = 1380MHz (+202MHz factory OC'ed)

    So even if the benchmarks for reference cards were equal, there's going to be a difference with the cards that you actually buy.

    BTW, your OC'ed 290 scored 12231.0 GS; top 970 cards score around that without an overclock,
    and hit 13000+ easily if OC'ed. I've seen scores up to the 14000s.

    http://www.3dmark.com/3dm/6850796

    970 Super Jetstream: 12496 GS, core clock 1,190 MHz, memory bus clock 1,800 MHz (not overclocked, just factory OC'ed).
    You can check its spec here:
    http://www.gpuzoo.com/GPU-Xenon/GeForce_GTX970_Super_JETSTREAM_D5_4GB.html
