Radeon 6950 vs. GeForce 560 Ti?

bgdiner

Distinguished
Nov 14, 2011
11
0
18,510
Hi guys,

I'm thinking about upgrading my video card (currently a Radeon HD 5670) to either a GeForce 560 Ti or a Radeon 6950. Both seem like pretty good deals, benchmark around the same (from what I've seen), and hit the sweet spot for my budget this holiday season. Right now the 5670 is running fine, Skyrim is smooth at mostly high settings, but I can still feel it chugging sometimes. I've got an i5 750 running at 2.67 GHz, 4 GB of DDR3 memory, the 5670, a 1 TB WD hard drive, and I believe a 650 W PSU. Most of all, I would like to play Battlefield 3 well, and I see that I don't meet the recommended requirements (a 6950 or a 560 Ti). Any opinions on the cards would be greatly appreciated.

Edit: One more thing. Is the 1GB version of either card good enough, or is the 2GB version worth the extra money? I know VRAM matters in gaming (my main use for this computer), so would it be worth the jump in price? Thanks
 

monsta

Splendid
Either card you decide on will be a good choice; they are pretty much on par with each other. From experience, the 560 Ti performs well and overclocks very well too.
The 6950 is a great card itself, and a lot of users will agree there.
It comes down to which games you play. I would have a look at the performance charts for the games you play, compare the cards, and pick the one that performs better for you.
 
Solution

jp37

Distinguished
Feb 22, 2010
6
0
18,510



For my monitor I'm just using a TV... I don't know too much about monitors, and things look good on this, so I haven't wanted to make a switch yet unless it makes a huge difference. This is the TV I'm using:

http://www.newegg.com/Product/Product.aspx?Item=N82E16889253160
TOSHIBA 32" 720p LCD HDTV 32AV502U
The max resolution listed is 1366 x 768. One question about resolutions: why is it that when I set a game like StarCraft 2 to 1920 x 1200, it seems to work? It doesn't give me any errors and fills the whole widescreen. I'm not sure if it just claims to run at that resolution but in reality does not. In any case it hasn't seemed to cause me problems, so I'm not upgrading the monitor just yet, unless someone shows me the error of my ways :)

Most of the stuff you asked for otherwise is in my original two posts, so I'll refer you to those to avoid spamming (budget, motherboard, PSU).

Thanks for the replies so far, guys, I do appreciate the help
 

sidnitzerglobin

Distinguished
Sep 23, 2008
21
0
18,510


I'd bet your output is being scaled down either at the display driver or the display itself (most likely the driver).

It's quite possible you'd get a decent bump in framerate if you run your games at the TV's native resolution. Worth a shot.
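
If you want to double-check what resolution Windows is actually sending to the TV (as opposed to what the game thinks it set), you can query the current display mode. A minimal sketch in C using the Win32 EnumDisplaySettings call, assuming Windows and the primary display:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    /* Ask Windows for the mode the primary display is currently running. */
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        /* If this still prints 1366 x 768 while the game is set higher,
           the driver is scaling the image down to the TV's native panel. */
        printf("Current output: %lu x %lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    } else {
        printf("Could not query the current display mode.\n");
    }
    return 0;
}

Compare that output with the resolution you picked in the game; if they don't match, you're paying the rendering cost of the higher resolution without actually seeing it.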
 
