Hey guys! I'm looking into getting a GTX 560 Ti or an HD 6950 1GB, any idea which would produce better image quality? If I go into Catalyst Control Center and switch everything to quality, will it match the GTX 560 Ti's quality? I really want to be able to enjoy the full experience when November hits, if you know what I mean XD
Gigabyte H55M-UD2H mobo
i3 530 at 3.4 GHz
Mushkin Enhanced DDR3 4GB (2GBx2) 1333 MHz
Sapphire HD 5770 Vapor-X at 930 core / 1350 memory (stock is 860 core / 1200 memory)
1600x900 res always
Thermaltake 500 W PSU / 43 A combined on the 12V rail
Their image quality is pretty much identical if everything is set for max. The only exception is AA, but no one in their right mind needs more than 4-8x AA anyways.
The picture quality is identical, but there's a little bit of a difference with physics (PhysX is Nvidia-only).
Do you guys mean the picture quality would be identical if both are maxed out at those settings, or at the defaults?
Pretty much? Same settings = same picture. The difference would be performance: ATI performs better in some games and worse in others compared to Nvidia. Both the 560 Ti and 6950 bench pretty much the same in BFBC2, so BF3 benches should be similar.