
GTX 560 vs HD 6950 1GB??

Tags:
  • Graphics Cards
  • Gtx
  • Performance
  • HD
  • Graphics
  • Product
January 26, 2011 10:21:35 AM

Hi,
Everyone was waiting for the GTX 560 to release, and it's providing the performance everyone expected, but has ATI spoiled Nvidia's big launch by releasing the HD 6950 1GB, which performs better than the GTX 560?

January 26, 2011 10:25:13 AM

Hi, from what I can see in the reviews, the 560 is a really excellent buy on its own merits.

However, the 6950 performs about equally (maybe slightly better due to the increased memory capacity at higher resolutions) and is priced very competitively in light of this.

Some are saying they are 50/50 torn, others are edging towards the 6950.

At this level, both cards will be an excellent buy and you will not be disappointed by either, it's just up to you to make the choice.
January 26, 2011 10:42:34 AM

HD6950 would be better ;) 
January 26, 2011 11:36:34 AM

A 6950 flashed to a 6970 (it usually doesn't equal a full 6970, but it will unlock the dormant parts that are functional) is really a hard deal to beat. However, it depends on what your intended use is:

If you need CUDA (if you have to ask, you don't), then Nvidia.

If general gaming, then the 6950 flashed to a 6970.

If it's a specific few titles, we'd need to know the titles, as some games prefer Nvidia's architecture over AMD/ATI's (usually this mostly means playing WoW/SC2 primarily, as they prefer Nvidia).
January 26, 2011 12:25:56 PM

Both perform equally well overall so it's a tough call. Assuming your monitor is 1920 x 1080...

If you play Battlefield 2, Lost Planet 2, or BattleForge, then the GTX 560 performs better; enough to choose it over the HD 6950.

If you play Aliens vs. Predator or F1 2010, then the HD 6950 performs enough better that you should buy it instead.

If you are still undecided, then buy whichever is cheaper. You really can't lose.

Click the following for benchmarks:

http://www.xbitlabs.com/articles/video/display/geforce-...
January 26, 2011 1:32:42 PM

I may add that the 6950 produces better video quality.
January 26, 2011 1:47:00 PM

bujuki said:
I may add that the 6950 produces better video quality.


Please elaborate?
January 26, 2011 1:49:56 PM

bujuki said:
I may add that the 6950 produces better video quality.



That's funny, considering the video quality settings AMD/ATI were using to artificially inflate their FPS by LOWERING image quality.

While admittedly it was small and basically insignificant, it really did change the image, so it should have been an optimisation the user can turn on as an option, not the default.

So what I'm saying is: where's your evidence here?

Benchmarks and screenshots showing the image quality, please.
January 26, 2011 2:42:26 PM

g00fysmiley said:
That's funny, considering the video quality settings AMD/ATI were using to artificially inflate their FPS by LOWERING image quality.

While admittedly it was small and basically insignificant, it really did change the image, so it should have been an optimisation the user can turn on as an option, not the default.

So what I'm saying is: where's your evidence here?

Benchmarks and screenshots showing the image quality, please.


I don't know what is better, but I can say that this whole idea of AMD/ATI reducing quality was sensationalistic.

AMD has this setting called "Catalyst A.I." This setting has special performance boosting capabilities. At anything below "high quality" setting, it will make visual compromises in order to give better performance. This is actually a good thing, as many people need that extra performance. The compromises are usually not noticeable. If you set it to "high quality" it gives at least as good of visual quality as Nvidia does.

Because AMD sets this setting to "quality" instead of "high quality", Nvidia made claims that AMD is lowering image quality for higher performance. Most benchmark sites know about this slider, and set it to "High Quality", so everything is evenly compared.
January 26, 2011 3:02:11 PM

Yeah, like I said, it was small and basically insignificant, but the default should have been highest quality. I hate to beat a dead horse, but I'm calling a spade a spade in reference to his assertion that one has better image quality than the other, when for all intents and purposes they are indistinguishable even to the most sensitive of human eyes.
January 26, 2011 5:19:45 PM

bystander said:
I don't know what is better, but I can say that this whole idea of AMD/ATI reducing quality was sensationalistic.

AMD has this setting called "Catalyst A.I." This setting has special performance boosting capabilities. At anything below "high quality" setting, it will make visual compromises in order to give better performance. This is actually a good thing, as many people need that extra performance. The compromises are usually not noticeable. If you set it to "high quality" it gives at least as good of visual quality as Nvidia does.

Because AMD sets this setting to "quality" instead of "high quality", Nvidia made claims that AMD is lowering image quality for higher performance. Most benchmark sites know about this slider, and set it to "High Quality", so everything is evenly compared.

Actually, most if not all sites were saying that they leave things at the default settings, which is why this was seen as a "cheat" from AMD/ATi.
January 26, 2011 5:37:59 PM

AMD themselves acknowledged they were not playing fairly with the image quality hack, as they have increased the default settings on the new Catalyst 11.1 drivers. It would be nice to see reviews using those drivers at default settings.
January 26, 2011 5:59:05 PM

17seconds said:
AMD themselves acknowledged they were not playing fairly with the image quality hack, as they have increased the default settings on the new Catalyst 11.1 drivers. It would be nice to see reviews using those drivers at default settings.


I'm not sure this was a good thing for the end user. Sure, it's nice that the benchmarks can be more easily compared, but the reality is, the default compromises were good for most situations. You can't see a noticeable difference the vast majority of the time, and you gain a small FPS bonus. That's what most people would want.
January 26, 2011 6:05:00 PM

bystander said:
I'm not sure this was a good thing for the end user. Sure, it's nice that the benchmarks can be more easily compared, but the reality is, the default compromises were good for most situations. You can't see a noticeable difference the vast majority of the time, and you gain a small FPS bonus. That's what most people would want.

If you are not sure this is a good thing, then you have not been paying attention. Without posting links to the numerous review sites that have commented on this topic, the overwhelming consensus is that this was NOT a good thing for the consumer. A race to decrease image quality in order to gain benchmarking success is NOT in the best interests of the gaming community. Extrapolating from your opinion, Nvidia should then lower their default image quality settings, then AMD would lower theirs to match, then Nvidia, then AMD, then we are all looking at blocky pixels on the screen.
January 26, 2011 6:08:00 PM

bystander said:
I'm not sure this was a good thing for the end user. Sure, it's nice that the benchmarks can be more easily compared, but the reality is, the default compromises were good for most situations. You can't see a noticeable difference the vast majority of the time, and you gain a small FPS bonus. That's what most people would want.

I would have thought that most people would want to see fair benchmarks, which you won't see if the driver settings have to be changed by the individual who is doing the testing.
January 26, 2011 6:10:59 PM

Bystander, I agree that I would turn on this kind of setting for better FPS at an imperceptible image quality reduction... that said, I want to be the one to turn it on. I don't want benchmarks being skewed by this sort of thing, and if accepted it would be part of a VERY slippery slope that I for one don't want to see.
January 26, 2011 6:18:21 PM

Quote:
Mousy.. Thanks for posting that little info. I didn't know that about AMD drivers and just turned all image settings to max

If only all the review sites did that for both camps! It could make for some interesting results. :lol: 
January 26, 2011 6:35:30 PM

· Catalyst AI Texture Filtering updates

· The Quality setting has now been improved to match the High Quality setting in all respects but one; it enables an optimization that limits tri-linear anisotropic filtering to areas surrounding texture mipmap level transitions, while doing bilinear anisotropic filtering elsewhere. This optimization offers a way to improve filtering performance without visibly affecting image quality

· The Performance setting has also been updated to address comments about the sharpness of the default Quality setting causing shimmering in certain cases. It now provides a smoother filtering option that eliminates most shimmering while preserving the improved detail provided by anisotropic filtering.
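
For anyone wondering what that trilinear optimisation actually means in practice, here's a rough illustration of the idea in plain Python (my own sketch of the behaviour described above, not AMD driver code; the window width and the tex.bilinear helper are made up for the example): blend two mip levels only near the point where one level would visibly hand off to the next, and do a cheaper single-level fetch everywhere else.

# Illustrative sketch only -- not AMD's driver code. Shows the general
# "limit trilinear blending to the mip transition" idea from the notes above.
def sample_optimized(tex, u, v, lod, window=0.2):
    """tex.bilinear(u, v, level) stands in for a real bilinear fetch.
    window is a made-up knob: the width of the LOD band around the 0.5
    crossover in which the two adjacent mip levels are actually blended."""
    base = int(lod)        # lower mip level
    frac = lod - base      # how far we are toward the next level

    lo_edge = 0.5 - window / 2.0
    hi_edge = 0.5 + window / 2.0

    if frac <= lo_edge:                       # far from the transition:
        return tex.bilinear(u, v, base)       # cheap single-level fetch
    if frac >= hi_edge:
        return tex.bilinear(u, v, base + 1)

    # Near the transition: blend both levels so no hard mip seam is visible.
    w = (frac - lo_edge) / (hi_edge - lo_edge)
    a = tex.bilinear(u, v, base)
    b = tex.bilinear(u, v, base + 1)
    return tuple((1.0 - w) * x + w * y for x, y in zip(a, b))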
January 26, 2011 6:49:45 PM

Mousemonkey said:
I would have thought that most people would want to see fair benchmarks, which you won't see if the driver settings have to be changed by the individual who is doing the testing.


There is nothing stopping them from setting that to "High Quality" for the benchmark, so the cards are compared evenly, while the average user would use the "Quality" setting for better overall performance per visual quality. Benchmarkers are generally better informed and know to adjust a setting, but your average user isn't.

While this was boohooed by the benchmarkers, who had to pay attention to it, for the end user it's usually best to leave it at the old default.
January 26, 2011 6:52:27 PM

17seconds said:
If you are not sure this is a good thing, then you have not been paying attention. Without posting links to the numerous review sites that have commented on this topic, the overwhelming consensus is that this was NOT a good thing for the consumer. A race to decrease image quality in order to gain benchmarking success is NOT in the best interests of the gaming community. Extrapolating from your opinion, Nvidia should then lower their default image quality settings, then AMD would lower theirs to match, then Nvidia, then AMD, then we are all looking at blocky pixels on the screen.


Read my reply to MM's post. For comparisons, it's nice to have it set evenly, and those doing benchmarks are quite capable of doing so. The end user, on the other hand, usually needs more hand-holding and is better off with the old default.
January 26, 2011 6:55:25 PM

For consistency, reviewers have stated that they prefer to leave settings at default, which is the whole point of the controversy.
January 26, 2011 7:14:00 PM

bystander said:
There is nothing stopping them from setting that to "High Quality" for the benchmark, so the cards are compared evenly, while the average user would use the "Quality" setting for better overall performance per visual quality. Benchmarkers are generally better informed and know to adjust a setting, but your average user isn't.

While this was boohooed by the benchmarkers, who had to pay attention to it, for the end user it's usually best to leave it at the old default.


Yes, there is something stopping them; it's called "time". You obviously don't work in the industry and don't have to benchmark lots of cards by a particular deadline.
January 26, 2011 8:04:09 PM

Mousemonkey said:
Yes, there is something stopping them; it's called "time". You obviously don't work in the industry and don't have to benchmark lots of cards by a particular deadline.


1 guy doing a single benchmark can't spend 10 seconds to change a setting? I find that hard to believe.

I do see another issue they probably find more difficult: what setting is the best to compare? Nvidia doesn't have this performance-boosting setup; should it not be considered a positive feature? How do you approach it, given that it does have a visual impact, even if it's not really noticeable (which makes it hard to decide what to do with it)?

Perhaps Nvidia needs to come up with a similar feature. The concept is a good one, but because they don't both have it, it does make them harder to compare.
January 26, 2011 8:09:53 PM

bystander said:
1 guy doing a single benchmark can't spend 10 seconds to change a setting? I find that hard to believe.

I do see another issue they probably find more difficult: what setting is the best to compare? Nvidia doesn't have this performance-boosting setup; should it not be considered a positive feature? How do you approach it, given that it does have a visual impact, even if it's not really noticeable (which makes it hard to decide what to do with it)?

Perhaps Nvidia needs to come up with a similar feature. The concept is a good one, but because they don't both have it, it does make them harder to compare.

So all the websites only have a single benchmark? I think not.
January 26, 2011 8:22:49 PM

Mousemonkey said:
So all the websites only have a single benchmark? I think not.


It's a single setting that applies by default to all programs.
January 26, 2011 9:10:53 PM

bystander said:
It's a single setting that applies by default to all programs.

That's assuming that all the testing is done at the same time. In this kind of case I tend to listen to what the reviewers say rather than take any notice of your misguided beliefs.
January 26, 2011 9:46:23 PM

I am confused. Which one is better? (including physx, etc.)
January 26, 2011 9:48:03 PM

wigglerthefish said:
I am confused. Which one is better? (including physx, etc.)

If you include PhysX then the GTX 560 is the one to go for, as no hacks would be required.
January 26, 2011 10:01:15 PM

If you want PhysX, ya, go with the 560. If you don't have any games that use GPU accelerated PhysX, then I prefer the 6950 due to MLAA and an official SSAA.

You can't really go wrong either way.
January 26, 2011 10:04:38 PM

Or get a Voodoo 3 and party like it's 1999! :lol: 
January 26, 2011 11:51:09 PM

It's funny how Newegg doesn't have any reference GTX 560s on the menu.

For the MSRP of a reference GTX 560 you can get MSI's Twin Frozr II version, which performs about on par with the Radeon 6950 2GB. Seen here: http://www.guru3d.com/article/msi-n560gtx-ti-twin-froze...

For the price of a factory overclocked Radeon 6950 1GB you can get Gigabyte's GTX 560 SuperOverclock version, which stomps all over everything else. Seen here: http://www.guru3d.com/article/gigabyte-gtx-560-ti-soc-r...

It's a good time to be in the market for some new cards.
January 27, 2011 8:06:31 AM

bystander said:
Oh, btw, have you broken your piggy bank for a new set of 560's yet?

Not yet. I never like to buy at launch time because prices may come down a bit as more models become available, but I shall get a pair at some point.
January 27, 2011 8:24:35 AM

Just with regard to the 560's, what is the card length and where are the power connectors located?

I am thinking of putting one in my Micro-ATX build; however, my case will take up to 10.5" cards at a squeeze, providing the power connectors are located on the top and not the side. It's currently running a Palit 460 that fits nicely.
January 27, 2011 8:38:14 AM

Griffolion said:
Just with regard to the 560's, what is the card length and where are the power connectors located?

I am thinking of putting one in my Micro-ATX build; however, my case will take up to 10.5" cards at a squeeze, providing the power connectors are located on the top and not the side. It's currently running a Palit 460 that fits nicely.

About 9" and the power connectors are at the rear of the cards.
January 27, 2011 8:57:17 AM

Ah OK, so I'll be able to squeeze one in just about. I hope the IBs keep to that reference length, though I suspect that some may increase the length to accommodate beefier VR phases etc.

Thanks for the heads up, MM.
January 27, 2011 8:58:21 AM

NP.
January 27, 2011 9:16:03 AM

Griffolion said:
Ah OK, so I'll be able to squeeze one in just about. I hope the IBs keep to that reference length, though I suspect that some may increase the length to accommodate beefier VR phases etc.

Thanks for the heads up, MM.

It seems that some of them have the power connectors on the side of the card as well.


January 27, 2011 9:19:41 AM

That looks pretty beastly! Was that the Gigabyte one Tom's used in the review that hit 1GHz?

While this may defeat the point of the quietness of the coolers in the 5 series, I could do with one that has a blower fan rather than co-axial ones, as my case employs positive pressure cooling (two 120mm fan intakes at the front with a 15 CFM contraflow exhaust fan at the back), so I could do with some more outward-bound air rather than simply blowing it around in the case.
January 27, 2011 10:09:58 AM

Griffolion said:
That looks pretty beastly! Was that the Gigabyte one Tom's used in the review that hit 1GHz?

While this may defeat the point of the quietness of the coolers in the 5 series, I could do with one that has a blower fan rather than co-axial ones, as my case employs positive pressure cooling (two 120mm fan intakes at the front with a 15 CFM contraflow exhaust fan at the back), so I could do with some more outward-bound air rather than simply blowing it around in the case.

I'm not sure; there was a picture of a reference card, but over the last couple of days there has been a whole slew of cards showing up on different websites, so when I do come round to choosing for myself it's going to be difficult enough, and trying to point one out to someone else who is after a particular feature is going to be mind bending! :lol:
January 27, 2011 10:22:39 AM

Haha indeed, I'll keep a lookout. EVGA released a blower-fan-design 460 (the best-selling 460 in the UK, I think), so I hope they lather, rinse, repeat with the 560.

But then again I'm totally neglecting AMD; I'm sure the 6 series has some really good performance to offer. However, their cards have always been quite long, and space is an issue for my M-ATX case!
January 27, 2011 11:47:00 AM

I still have a Voodoo 3 in my Windows 98 SE rig (my high school rig, put together with money made flipping burgers) >_< I use it to play some older games (MS-DOS type stuff).

I am really liking the 560's... but since I mostly play WoW, I'm having a hard time justifying it over my SLI'd 450's... oh well... 4 months left till the student loans are paid up anyway, so I'll have to make that call then.

But to the OP: pretty much, both cards are good. If you list the games you play, we can get off the driver and image quality debate and actually try to answer your question better :D
January 27, 2011 2:26:30 PM

g00fysmiley said:
I still have a Voodoo 3 in my Windows 98 SE rig (my high school rig, put together with money made flipping burgers) >_< I use it to play some older games (MS-DOS type stuff).

I am really liking the 560's... but since I mostly play WoW, I'm having a hard time justifying it over my SLI'd 450's... oh well... 4 months left till the student loans are paid up anyway, so I'll have to make that call then.

But to the OP: pretty much, both cards are good. If you list the games you play, we can get off the driver and image quality debate and actually try to answer your question better :D


I mainly play WoW as well, but I would like to get into more FPS games like BF2, Crysis, and Just Cause. Right now I am using an 8800 GTS from way back, so I'd like to upgrade.

1. How hard is it to flash the 6950?
2. 560 or 6950. Which overall card is better for more FPS (I play on 22" now, 24" later)?
3. How hard, for a noob, would it be to overclock a Frozr to the SOC's stock numbers?
January 27, 2011 2:31:58 PM

nged72 said:
I mainly play WoW as well, but I would like to get into more FPS games like BF2, Crysis, and Just Cause. Right now I am using an 8800 GTS from way back, so I'd like to upgrade.

1. How hard is it to flash the 6950?
2. 560 or 6950. Which overall card is better for more FPS (I play on 22" now, 24" later)?
3. How hard, for a noob, would it be to overclock a Frozr to the SOC's stock numbers?


1. It's pretty easy. I did it. But the safer bet, after seeing some other results, would be to unlock the shaders, which simply requires you to back up your BIOS just in case and run a script (see the rough sketch below).
2. The 6950 would be better for most games, especially with the flash or shader unlock, but WoW heavily favors Nvidia cards. For that game the 560 would be better.
3. I wouldn't recommend overclocking unless you run into games where you can't get the performance you want. There is no reason to risk damaging or shortening the life of your card when you don't need the boost.
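
Since the "back up your BIOS and run a script" part comes up a lot, here's a rough sketch of those two steps in Python (purely illustrative; it assumes the usual atiflash command-line tool and its -s save / -p program switches, which you should verify against the tool's own usage output, and the file names are made up; follow a proper guide before flashing anything):

# Rough sketch of the two-step procedure: back up the stock BIOS first,
# then flash the modified / shader-unlocked image. The atiflash switches
# (-s = save, -p = program) are from memory -- double-check them yourself.
import subprocess

def backup_and_flash(adapter=0,
                     backup="stock_6950.rom",        # made-up file name
                     modified="unlocked_6950.rom"):  # made-up file name
    # Step 1: save the card's current BIOS so you can always flash back.
    subprocess.run(["atiflash", "-s", str(adapter), backup], check=True)
    # Step 2: program the modified (shader-unlocked) BIOS image.
    subprocess.run(["atiflash", "-p", str(adapter), modified], check=True)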
January 27, 2011 2:53:34 PM

1) Fairly easy to do; lots of guides out there.
2) 560 for WoW but 6950 in general... also 22" vs 24" doesn't matter unless the resolution changes.
3) Why overclock if it runs everything fine... when a 570/6950 starts to provide lower FPS, then overclock, and right now there isn't much that would cause this.

Also, still no word from OP rocky >_<
January 27, 2011 2:54:52 PM

g00fysmiley said:
1) Fairly easy to do; lots of guides out there.
2) 560 for WoW but 6950 in general... also 22" vs 24" doesn't matter unless the resolution changes.
3) Why overclock if it runs everything fine... when a 570/6950 starts to provide lower FPS, then overclock, and right now there isn't much that would cause this.

Also, still no word from OP rocky >_<


He has a hell of a thread to come back to!
January 27, 2011 4:15:10 PM

I'd probably only go up to 1920 x 1080 ***
January 27, 2011 4:53:59 PM

If the current 22" is 1920x1080 and the 24" will be 1920x1080, there won't be any performance difference at all.
January 27, 2011 5:12:13 PM

I know, my current 22" is 1680 x 1050 I think. But someday I'll upgrade.
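
For what it's worth, the thing that actually matters for GPU load is the pixel count, not the panel size. A quick back-of-the-envelope check (plain Python, nothing card-specific assumed):

# Rough arithmetic: how much more work 1920x1080 is than 1680x1050.
old_pixels = 1680 * 1050    # current 22" monitor
new_pixels = 1920 * 1080    # a 1080p replacement
print(old_pixels, new_pixels)                      # 1764000 2073600
print(round((new_pixels / old_pixels - 1) * 100))  # 18 (about 18% more pixels)

So the same resolution on a 24" means identical load, and moving up from 1680 x 1050 to 1920 x 1080 is roughly an 18% heavier load.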