Hardware vs. Application Anti-aliasing

the_stiffmeister

Distinguished
Jun 5, 2006
Probably been asked before here, but I just wondered: does it make a difference whether you enable AA within a game or within your graphics card's control panel? Is one more efficient than the other? Is one done in hardware and one in software?
Thanks.
 

Morton

Distinguished
Aug 19, 2005
Note that in your display control panel you can select 'Application Controlled' or force whichever mode you choose, regardless of the setting in the game. The control panel will normally offer more detailed options than the in-game menu.

Can you force HDR + AA on an Nvidia video card that way?
 

wavetrex

Distinguished
Jul 6, 2006
AA is driver-based; it's a function performed by the video card.

HDR is a method of rendering your game's graphics. The game must know how to do it; it's not something you can set in the driver.

What's more, until DirectX 10, which will have "guaranteed" features, the game depends on what DirectX reports the card can do. If the card (Nvidia in our case) says no, then no matter what you try, the game won't enable it.
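
If anyone is curious what that capability check looks like in practice, here is a rough Direct3D 9 sketch. It's purely illustrative: the default adapter, the X8R8G8B8 back-buffer format and 4x MSAA are example choices, not what any particular game does.

// Rough sketch (Direct3D 9): how a game asks the runtime/driver whether
// an AA (multisample) mode is supported before enabling it.
#include <d3d9.h>

bool Supports4xMsaa(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    // The driver answers here; if this call fails, the game cannot
    // enable the mode itself.
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,        // default adapter (example choice)
        D3DDEVTYPE_HAL,            // hardware device
        D3DFMT_X8R8G8B8,           // back-buffer format (example choice)
        FALSE,                     // full-screen
        D3DMULTISAMPLE_4_SAMPLES,  // the AA level the game wants (example choice)
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}

// If supported, the game requests it when creating its device, e.g.:
//   D3DPRESENT_PARAMETERS pp = {};
//   pp.MultiSampleType = D3DMULTISAMPLE_4_SAMPLES;

If that check fails, the game simply never asks for AA when it creates its device. Forcing AA in the driver control panel bypasses the application entirely, which is why it can work even for games that don't expose the option.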
 

sojrner

Distinguished
Feb 10, 2006
Another reason not to force it in the driver (instead of in-game) is that it will apply AA to everything, including the desktop. This can cause many apps to crash. One off the top of my head is UnrealEd (the editor for UT). I have used it a lot for building and such, and it does not like AA on the desktop at all: things freak out and options stop working. Other apps do similar things. That, and it heats the card up more than "normal", lol.

But if the game does not support it, like RO, then forcing it in the driver is the only option. I just recommend turning it off after the game is over. ;)