
Ghostbusters Anti-aliasing with nvidia cards. Is it implemented yet?

Last response: in Graphics & Displays
October 7, 2009 5:19:24 PM

Hello all, I did a quick forum search on this and didn't come up with anything, so I thought I'd ask:

When Ghostbusters came out several months ago, the only way to enable anti-aliasing was to manually edit a text file (in AppData, I believe). This method worked, but it slowed the framerate to unplayable levels. Since then I've heard that ATI's driver updates let you force anti-aliasing for the game via the CCC, but I never heard of such an update for Nvidia drivers. Has one of the recent Nvidia driver updates included forced anti-aliasing through the Nvidia profiles for Ghostbusters?

I'm running a GTX 260 with the 190. drivers (haven't installed the 191 update yet). I'm pretty sure my version of the game is the original release version.

Thanks ahead of time!
October 7, 2009 10:15:35 PM

I did some of my own research and realized the game uses deferred rendering, making it difficult if not impossible to apply MSAA the way many traditional PC games do. In one interview an Infernal engine developer said hardware AA would only be possible with DX11 in Ghostbusters, but for me at least, that leaves more questions than answers.
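To illustrate why deferred rendering makes MSAA awkward, here's a minimal sketch (all numbers and the toy lighting function are made up for illustration). MSAA averages sub-samples after shading; a deferred renderer writes geometry attributes to a G-buffer first, so a naive resolve averages the attributes before the (nonlinear) lighting runs, which gives a different result:

```python
def specular(n_dot_h, exponent=32):
    """Toy Blinn-Phong specular term: nonlinear in the surface attribute."""
    return max(n_dot_h, 0.0) ** exponent

# Two geometry sub-samples covering one edge pixel, with different N.H values.
samples = [0.99, 0.30]

# Forward MSAA-style resolve: shade each sub-sample, then average the colors.
forward = sum(specular(s) for s in samples) / len(samples)

# Naive deferred resolve: average the G-buffer attribute, then shade once.
deferred = specular(sum(samples) / len(samples))

# The bright highlight survives the forward resolve but is almost
# entirely lost in the naive deferred one.
print(forward, deferred)
```

This is why deferred engines historically fell back to supersampling or post-process edge filters instead of hardware MSAA.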
October 8, 2009 1:31:52 AM

I've noticed that anti-aliasing becomes less important as the resolution increases. It's kind of ironic once you understand how it works.

Anyway, when I game at 1600x1200 with no AA I can see slight jaggedness, but at resolutions above that it would disappear.

AA works by rendering the frame as if it were on a much larger monitor and then scaling it back down to the proper resolution. That seems like a VERY inefficient process to me.
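That render-big-then-shrink idea is supersampling, and it can be sketched in a few lines of Python. The `render` function below is a hypothetical stand-in for a real rasterizer; it just fills in coverage for a hard diagonal edge:

```python
def render(width, height):
    """Hypothetical renderer: a hard diagonal edge (1.0 = covered)."""
    return [[1.0 if x < y else 0.0 for x in range(width)]
            for y in range(height)]

def downsample_2x(image):
    """Box filter: average each 2x2 block into one output pixel."""
    h, w = len(image) // 2, len(image[0]) // 2
    return [[(image[2*y][2*x] + image[2*y][2*x+1] +
              image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

target_w, target_h = 4, 4
aliased = render(target_w, target_h)                       # hard 0/1 edge
smoothed = downsample_2x(render(2*target_w, 2*target_h))   # 4x the pixel work
```

The smoothed image has intermediate values (e.g. 0.25) along the edge, which is exactly the softening AA produces, at the cost of shading four times as many pixels.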

I expect to see a better solution than AA in a few years. I think some form of Ray Tracing and other tools will kill it off.
October 8, 2009 2:56:52 AM

That's the case for standard supersampling (rendering the frame at two or more times the screen resolution and scaling it down), but multisampling isn't nearly as inefficient. Ray tracing still produces aliasing.
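A back-of-the-envelope comparison shows the difference (illustrative numbers only): 4x supersampling shades every sub-sample, while 4x multisampling shades once per pixel and only stores extra depth/coverage samples, so its main cost is memory bandwidth rather than shader work.

```python
width, height, samples = 1920, 1080, 4
pixels = width * height

ssaa_shader_invocations = pixels * samples   # SSAA shades every sub-sample
msaa_shader_invocations = pixels             # MSAA shades once per pixel
msaa_framebuffer_samples = pixels * samples  # ...but still stores 4x samples

print(ssaa_shader_invocations // msaa_shader_invocations)  # SSAA does 4x the shading
```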
October 8, 2009 11:43:17 AM

And of course, we've just returned to the "Unreal Engine 3 lacks AA" discussion...

The only AA method available is supersampling, which is very inefficient. You could always try to force other methods through your driver control panel, but results vary.
October 8, 2009 2:12:11 PM

Actually, I just got to try this at home, and lo and behold, the latest Nvidia drivers CAN force MSAA, meaning no jaggies while retaining high framerates even at 8x.

While ray tracing is awesome in terms of potential quality, as a 3D animator I know it still produces aliasing. When using ray tracing in Mental Ray you have to use multiple samples per pixel (anywhere from 16 to 256 in extreme cases), combined with a multi-pixel filter (Gaussian, Lanczos, etc.) of your choice, to get the nice smooth edges we're used to in feature films and animated shows. Talk about render intensive!
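The sampling scheme described above can be sketched in a few lines. This is a hedged illustration, not Mental Ray's actual implementation: `trace` is a hypothetical stand-in for the ray tracer, and each pixel fires jittered rays whose results are combined with Gaussian weights:

```python
import math
import random

def trace(x, y):
    """Hypothetical scene: a hard vertical edge at x = 0.5."""
    return 1.0 if x < 0.5 else 0.0

def shade_pixel(px, py, samples=16, sigma=0.5):
    """Fire jittered rays inside the pixel and Gaussian-weight the results."""
    total_w = total_c = 0.0
    for _ in range(samples):
        dx, dy = random.random(), random.random()  # jitter within the pixel
        w = math.exp(-((dx - 0.5)**2 + (dy - 0.5)**2) / (2 * sigma**2))
        total_c += w * trace(px + dx, py + dy)
        total_w += w
    return total_c / total_w

random.seed(1)
result = shade_pixel(0, 0)  # an edge pixel: lands somewhere between 0 and 1
```

With only one sample per pixel the edge would come out as a hard 0-or-1 jaggy; the 16 weighted samples blend it into a fractional coverage value, which is exactly the render-intensive smoothing described above.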

As for the Unreal engine AA debacle, I'll steer clear since my knowledge there isn't nearly what you guys have. Is Ghostbusters' Infernal Engine based on it, or was it a new engine built from the ground up?