What dictates the image quality when I look at something on my computer, whether it's a still image, 3D, or 3D animation?
Right now I have a Hanns.G HG216D 22-inch widescreen HDMI LCD monitor, but I do plan on upgrading in the future, perhaps to a 28-inch HD LCD or above.
I'm a bit confused, 'cuz both the MSI graphics card (only 160 bucks) and the $300 graphics cards I linked in the previous thread are able to handle a game such as Modern Warfare 2 (or so I hear) with good image and performance quality...
So why do all these gamers spend $400-500 on graphics cards if a $150 graphics card can give them good enough performance and picture quality?
My goal right now is to get a good card that will let me play such games with good performance, but I also want to make sure I have very good picture quality when I use my computer for image viewing, 3D character design, and animation. I want to be able to view the finished product in its best form.
And even though the monitor I have right now is not THAT great, I do plan on upgrading it in the future.
Just wondering if I should get the cheaper graphics card or the $300 one.
Is it all about resolution? Does the higher-end graphics card allow resolutions above 1680x1050, and is resolution what makes an image clearer and sharper on screen? Does the $300 graphics card probably support higher resolutions than the $150 MSI card?
Then again, my monitor also has to be good enough to fully take advantage of it, right?
The difference between the 5770 and 5850 you linked is that the 5850 can handle monitors at higher resolutions. At 1680x1050 both cards can handle the games you mentioned just fine, but once you upgrade to the larger monitor as you stated above, the 5850 will outshine the 5770 in game performance. Make sure you have a power supply capable of running the card and a solid CPU to go with it. I would personally go for the 5850 if I were planning to upgrade my monitor at a later date.
But aside from in-game performance, if we're just talking about the quality of the image we see on screen, there's really not much difference between these two cards? Whether one is at 1680x1050 and one is at something higher?
My goal is just to be able to play games fine, but with an emphasis on doing 3D character design and animation work, and I want to see the final product in good quality.
The image would be exactly the same whether it's an ATI 5570, 5770, or 5850 (assuming you set the graphics options the same).
The difference would be the frames per second you play the game at and the monitor's native resolution. Lower-end cards would run the game at slideshow speeds, mid-level cards at playable speeds, and high-end cards will run it with ease.
Obviously, lowering the resolution and graphics settings eases the load on the video card.
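To put rough numbers on why the monitor upgrade matters so much for frame rate, here's a quick back-of-the-envelope sketch. The 2560x1600 figure is just an example of a common 30-inch panel resolution, not something stated earlier in the thread:

```python
# Rough illustration: per-frame GPU fill work scales with the number of
# pixels drawn, so a higher-resolution monitor demands more from the card.

def megapixels(width, height):
    """Pixels per frame, in millions."""
    return width * height / 1_000_000

current = megapixels(1680, 1050)   # the 22" widescreen in this thread
larger = megapixels(2560, 1600)    # example: a typical 30" monitor

print(f"1680x1050: {current:.2f} MP per frame")
print(f"2560x1600: {larger:.2f} MP per frame")
print(f"ratio: ~{larger / current:.1f}x the pixels per frame")
```

Roughly 2.3x the pixels per frame is why a card that's comfortable at 1680x1050 can struggle on a bigger screen, and why the 5850 makes more sense if a monitor upgrade is planned.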
The cards you've mentioned don't correlate to the apps you are targeting. Apps such as 3ds Max, Maya, etc. have their own render engines, which support extensive features such as z-depth, hardware occlusion, 10-bit color depth, and many more. A workstation-class GPU is required to utilize those features; gaming cards do not target such needs. In fact, those features are useless for the majority of consumers (including budding 3D design professionals).

The main target for a gaming card is to accelerate games based on Direct3D/OpenGL, and the better a card can do that, the higher its cost. The cost is also influenced by manufacturing process, availability, and demand, but that is a separate sector altogether. So, as you mentioned, in-game performance with today's titles determines the longevity of the card, since future games are going to become more complex and graphics-intensive. There lies your price-to-performance measure: a powerful card today will take a certain amount of time to require a replacement.