GTX 560 vs HD 6950 1GB?

Status
Not open for further replies.

rocky41
Hi,
Everyone was waiting for the GTX 560 to release, and it's delivering the performance people expected. But has ATI spoiled Nvidia's big launch by releasing the HD 6950 1GB, which performs better than the GTX 560?
 
Solution
Even if the GTX 560 has no overclocking headroom left, an overclocked HD 6950 still can't beat a manually overclocked GTX 560 Ti (MSI GTX 560 Ti or a reference GTX 560 Ti OCed). Radeon's overclocking ability can't match NVIDIA's, and I believe the GTX 560 Ti SOC can still be pushed a bit further. Not that it's really needed: you'll get smooth frame rates (30+ FPS) in most games, and once you're above that it's just a matter of FPS bragging rights. I do suggest looking at the image quality of both cards; I think NVIDIA has the overall advantage there, and the new thermal design means less heat and noise, which is a big plus for NVIDIA. IMO a Radeon's lifespan is shorter than an NVIDIA card's, so for the long run I'd pick the GTX 560 Ti...

Griffolion
Hi, from what I can see in reviews, the 560 is a really excellent buy on its own merits.

However, the 6950 performs about as well (maybe slightly better at higher resolutions thanks to its memory capacity), and in light of that it is priced very, very competitively.

Some are saying they are 50/50 torn, others are edging towards the 6950.

At this level, both cards will be an excellent buy and you will not be disappointed by either, it's just up to you to make the choice.
 

g00fysmiley
A 6950 flashed to a 6970 (it usually doesn't equal a full 6970, but the flash will unlock the dormant parts that are still functional) is really a hard deal to beat. However, it depends on what your intended use is:

If you need CUDA (if you have to ask, you don't), then go Nvidia; see the sketch after this list for the kind of code CUDA is about.

If it's general gaming, then the 6950 flashed to a 6970.

If it's a specific few titles, we'd need to know which ones, since some games prefer Nvidia's architecture over AMD/ATI's (in practice this mostly means WoW and SC2, which favor Nvidia).
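Since CUDA keeps coming up as the deciding factor, here is a minimal, purely illustrative sketch of the sort of general-purpose GPU compute code that CUDA is for and that only runs on Nvidia hardware. Every name in it (vecAdd and the rest) is made up for the example; it isn't taken from any particular application.

```cuda
// Minimal CUDA sketch: add two large arrays element-wise on the GPU.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Kernel: one GPU thread handles one array element.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Host buffers.
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers and host-to-device copies.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %.1f (expect 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

If none of the software you actually run does this kind of thing, the point above stands: you don't need CUDA, and it shouldn't sway your choice.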
 
Both perform equally well overall so it's a tough call. Assuming your monitor is 1920 x 1080...

If you play Battlefield 2, Lost Planet 2, or BattleForge, then the GTX 560 performs better, enough so to choose it over the HD 6950.

If you play Aliens vs. Predator or F1 2010, then the HD 6950 performs enough better that you should buy it instead.

If you are still undecided, then buy whichever is cheaper. You really can't lose.

Click the following for benchmarks:

http://www.xbitlabs.com/articles/video/display/geforce-gtx-560-ti_7.html
 

g00fysmiley



That's funny, considering the video quality settings AMD/ATI were using to artificially inflate their FPS by LOWERING image quality.

While admittedly the difference was small and basically insignificant, it really did change the image, so it should have been an optimization option the user turns on, not the default.

So what I'm saying is: where's your evidence here?

Benchmarks and screenshots showing image quality, please.
 


I don't know which is better, but I can say that this whole idea of AMD/ATI reducing quality was sensationalized.

AMD has a setting called "Catalyst A.I." with special performance-boosting capabilities. At anything below the "High Quality" setting, it makes visual compromises in order to give better performance. This is actually a good thing, as many people need that extra performance, and the compromises are usually not noticeable. If you set it to "High Quality", it gives at least as good visual quality as Nvidia does.

Because AMD ships this setting at "Quality" instead of "High Quality", Nvidia claimed that AMD is lowering image quality for higher performance. Most benchmark sites know about this slider and set it to "High Quality", so everything is compared evenly.
 

g00fysmiley
Yeah, like I said, it was small and basically insignificant, but the default should have been highest quality. I hate to beat a dead horse, but I'm calling a spade a spade in reference to his assertion that one card has better image quality than the other, when for all intents and purposes they are indistinguishable even to the most sensitive of human eyes.
 

Actually, most if not all sites were saying that they leave things at the default settings, which is why this was seen as a "cheat" from AMD/ATI.
 


I'm not sure this was a good thing for the end user. Sure, it's nice that the benchmarks can be more easily compared, but the reality is that the default compromises were good for most situations: you can't see a noticeable difference the vast majority of the time, and you gain a small FPS bonus. That's what most people would want.
 

If you are not sure this is a good thing, then you have not been paying attention. Without posting links to the numerous review sites that have commented on this topic, the overwhelming consensus is that this was NOT a good thing for the consumer. A race to decrease image quality in order to gain benchmarking success is NOT in the best interests of the gaming community. Extrapolating from your opinion, Nvidia should then lower their default image quality settings, then AMD would lower theirs to match, then Nvidia, then AMD, then we are all looking at blocky pixels on the screen.
 

I would have thought that most people would want to see fair benchmarks, which you won't get if the driver settings have to be changed by whoever is doing the testing.
 

g00fysmiley
Bystander, I agree that I would turn on this kind of setting for better FPS at an imperceptible image quality reduction... that said, I want to be the one to turn it on. I don't want benchmarks being skewed by this sort of thing, and if it were accepted it would be the start of a VERY slippery slope that I, for one, don't want to see.
 
· Catalyst AI Texture Filtering updates

· The Quality setting has now been improved to match the High Quality setting in all respects but one; it enables an optimization that limits tri-linear anisotropic filtering to areas surrounding texture mipmap level transitions, while doing bilinear anisotropic filtering elsewhere. This optimization offers a way to improve filtering performance without visibly affecting image quality.

· The Performance setting has also been updated to address comments about the sharpness of the default Quality setting causing shimmering in certain cases. It now provides a smoother filtering option that eliminates most shimmering while preserving the improved detail provided by anisotropic filtering.
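To make the first bullet above concrete, here is a hedged, purely illustrative sketch of the decision it describes, in the same CUDA C++ as the earlier example. The sampling helper, the snap-band width, and the omission of the anisotropic sampling loop are all simplifications of mine, not AMD driver code: the idea is to pay for the two-mip-level trilinear blend only around each mipmap transition, and take a cheaper single-level bilinear sample everywhere else, where the blend weight is close to 0 or 1 anyway.

```cuda
// Conceptual sketch of "trilinear filtering only near mipmap transitions".
// sampleBilinear() is a hypothetical stand-in for a real bilinear texture fetch.
#include <cmath>
#include <cstdio>

__host__ __device__ float sampleBilinear(int mipLevel) {
    // Placeholder: pretend each mip level returns its own index as the texel value.
    return (float)mipLevel;
}

__host__ __device__ float sampleOptimized(float lod, float snapBand) {
    int   base = (int)floorf(lod);    // nearest lower mip level
    float frac = lod - (float)base;   // position between this level and the next (0..1)

    // Far from the level-change transition (frac near 0 or 1): one bilinear
    // sample of the dominant level is visually indistinguishable from the blend.
    if (frac < snapBand)        return sampleBilinear(base);
    if (frac > 1.0f - snapBand) return sampleBilinear(base + 1);

    // Around the transition: keep the full trilinear blend to avoid visible banding.
    return (1.0f - frac) * sampleBilinear(base) + frac * sampleBilinear(base + 1);
}

int main() {
    const float snapBand = 0.15f;  // made-up tuning knob; the driver's real threshold isn't public
    for (float lod = 2.0f; lod <= 3.0f; lod += 0.25f)
        printf("lod %.2f -> sample %.2f\n", lod, sampleOptimized(lod, snapBand));
    return 0;
}
```

In other words, the shortcut is only taken where one mip level dominates the blend anyway, which is why the difference is so hard to spot in screenshots.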
 


There is nothing stopping them from setting it to "High Quality" for the benchmark so the cards are compared evenly, while the average user would use the "Quality" setting for better overall performance per unit of visual quality. Benchmarkers are generally better informed and know to adjust the setting; your average user doesn't.

While the benchmarkers, who had to pay attention to this, made a fuss about it, for the end user it's usually best to leave it at the old default.
 


Read my reply to MM's post. For comparisons, it's nice to have the setting even, and those doing benchmarks are quite capable of making it so. The end user, on the other hand, usually needs more hand-holding and is better off with the old default.
 


Yes, there is something stopping them; it's called "time". You obviously don't work in the industry and don't have to benchmark lots of cards by a particular deadline.
 


One guy doing a single benchmark can't spend 10 seconds to change a setting? I find that hard to believe.

I do see another issue they probably find more difficult: which setting is the right one to compare? Nvidia doesn't have this performance-boosting feature, so should it not be counted as a positive? And how do you approach it, given that it does alter the output, even if the change isn't noticeable (which makes it hard to decide what to do with it)?

Perhaps Nvidia needs to come up with a similar feature. The concept is a good one, but because both vendors don't have it, it does make the cards harder to compare.
 

So all the websites only have a single benchmark? I think not.
 