How much does antialiasing affect performance?

jayztwocentsmannn

Commendable
Jun 29, 2016
Hello,

I was just curious how much antialiasing affects game performance at a resolution of 3440x1440. Will you notice a difference running antialiasing at that resolution?

I am upgrading my system's graphics card to either a 1080 or a 1070. I know that an overclocked aftermarket 1070 can run games fairly well at 2560x1440 with mild antialiasing, but what about 3440x1440? Will I have to disable antialiasing in some demanding games? I don't want to drop below 60 fps. Will an overclocked 1070 manage that without bottlenecks, or should I just go with a 1080 and call it a day?

 
Solution
3440x1440 (21:9 ultrawide) has about 40% fewer pixels than standard 3840x2160 (4K).
An overclocked 1070 can still handle 3440x1440, but the 1080 is the better choice.
Assuming a 34" 3440x1440 monitor, the difference AA makes is not really noticeable at that pixel density.
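The pixel-count comparison above is easy to verify with quick arithmetic (a minimal sketch; the resolution figures are the ones from the thread):

```python
# Total pixels per frame for each resolution.
uwqhd = 3440 * 1440  # 21:9 ultrawide
uhd = 3840 * 2160    # 4K UHD

print(uwqhd, uhd)                              # 4953600 8294400
print(f"{1 - uwqhd / uhd:.0%} fewer pixels")   # 40% fewer pixels
```

Since shading cost scales roughly with pixel count, driving 3440x1440 is closer to 1440p than to true 4K in terms of GPU load.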

fabiodrm

Reputable
Feb 12, 2016
When running games at 4K resolution, antialiasing doesn't need to be enabled, since the visual difference will be almost nothing and you will lose some FPS with it on.

Most people I know who run games at 4K don't enable AA.
 
I wouldn't run 4K without a 1080 unless you want to compromise on graphical detail. AA, depending on the level, and especially MSAA (which still seems to produce the best image quality), can reduce framerates by 30% or more compared to not using it at all.

Fortunately, pixel density on 4K monitors reduces the need for anti-aliasing. With a 27" or 28" 4K monitor you'll be able to run with no AA or just 2x MSAA. On the other hand, if you're using a large 4K TV or monitor (I'd say 40" or greater, though it's subjective and depends on the specific monitor/TV), that pixel density advantage goes away and your need for AA increases.
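The pixel-density point can be made concrete with the standard PPI formula (diagonal resolution in pixels divided by diagonal size in inches); the screen sizes below are the ones mentioned in the thread:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))  # 27" 4K        -> 163 PPI
print(round(ppi(3840, 2160, 40)))  # 40" 4K        -> 110 PPI
print(round(ppi(3440, 1440, 34)))  # 34" ultrawide -> 110 PPI
```

A 40" 4K screen and a 34" ultrawide land at roughly the same ~110 PPI, which is why aliasing becomes visible again on large 4K displays even though the resolution is higher.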