Questions about AA and image quality

mx597turbo

Distinguished
Jun 15, 2005
My question is: why does anyone want to use AA? I have seen it enabled on high-end cards running games at 1600x1200 and the image looks worse to me. There is no noticeable jaggedness at resolutions above 1024x768. AF adds to image quality, but I think AA takes away from the sharp look of high resolutions.

If I had to choose between 1024x768 with 4xAA or 1280x1024 with no AA, I would definitely pick the latter. What are your opinions?
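
For context, AA is just something the game asks the driver for when it creates its rendering context, which is why it shows up as a toggle alongside resolution. A minimal sketch, assuming SDL2 + OpenGL (nothing specific to the games in this thread; the window size and sample count are illustrative only), of how a game would request a 4x multisampled framebuffer:

/* Minimal sketch: requesting a 4x MSAA framebuffer with SDL2 + OpenGL. */
#include <SDL.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);

    /* Ask the driver for a multisampled default framebuffer (4x AA).
       Setting SDL_GL_MULTISAMPLEBUFFERS to 0 would give the plain,
       aliased framebuffer instead. */
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4);

    SDL_Window *win = SDL_CreateWindow("AA test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        1024, 768, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* ... render frames here ... */

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}

Whether those extra samples are worth the fillrate hit, versus spending the same performance on a higher resolution, is exactly the trade-off I'm asking about.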
 

Gary_Busey

Distinguished
Mar 21, 2006
Yeah, if you're deciding between a slide show and a playable framerate, always take the framerate. But it does noticeably change the quality of the visuals in a game. I can't use it in Oblivion though, not yet at least.
 

sojrner

Distinguished
Feb 10, 2006
IMO, it really is game-specific. The link shown above (HL2) needs AA; the Source engine (to me) is very susceptible to aliasing with all its fine lines and edges. Doom 3, on the other hand, does not show pronounced aliased lines (I don't really like the geometry of that engine, but the lighting is unequaled). Oblivion does not show as much aliasing, but it is there if you look. I do run with AA in that game, but have found "just" high res with no AA to be perfectly acceptable.

GRAW, which I love and whose look I enjoy, does not work with any AA yet, so it suffers... oh well.

It is all subjective, man; that is why it is an "option" and not a fixed parameter. ;)
 

mx597turbo

Distinguished
Jun 15, 2005
IMO the above screenshots show exactly why I wouldn't want AA turned on. All the edges and lines have a "soft" look to them. I don't like it.

It was correctly stated that this is only my opinion.

Another question... approximately how would Chronicles of Riddick perform on an AMD dual-core 3800, 7950GTX @ 1600x1200, 8xAF, no AA, and all sliders/options maxed?