General question about fps and AA

knobcreekman

Honorable
Jun 26, 2012
Hello, I built a new computer last week and just ordered my video card yesterday. Luckily, I found a site that tested my card on a system with the same specs as mine, which gives me a good idea of how my rig will perform. I last gamed about a decade ago, and back then it was mostly on consoles, so I am pretty ignorant about these things nowadays. My question is: what level of fps and AA makes for a decent experience? According to the tests, I should get over 61 fps with 4x AA on Battlefield 3. The intro to the test says that all of the results were obtained with the game on the highest settings unless noted otherwise. So, is 60+ fps and 4x AA enough for a good experience? My fps should be higher because I'll only be playing at 1920 x 1080. Just curious, and thanks for any insight you have. Also, any *MUST PLAY* game recommendations are welcome.
 

knobcreekman

Honorable
Jun 26, 2012



Sorry... didn't put that because I already had the hard data. I was just wondering if 60+ fps was decent or not.

Specs:

i7 3770K
16GB G.Skill RAM
MSI GeForce GTX 660 Ti PE OC
SanDisk Extreme 240GB SSD
 
Sorry m8, you're not going to get a rock-solid 60 fps with 4x AA on BF3. It will vary between 45 and 80 depending on what mode you play and what map you're on. In single player you're more likely to get close to 60 fps, but in multiplayer you more than likely won't. The reason is that single player doesn't have 63 other players running around the maps at random.
Don't get me wrong, that's a nice system that will perform very well, but you have become a victim of sales hype.
You will have to turn things like motion blur down, and maybe reduce AA to 2x, to get closer to 60 fps in multiplayer with a 660 Ti. Don't worry, you won't be losing much in visual quality. You can also add an FX injector, which enables FXAA, a less demanding form of AA; with it you can actually gain fps, turn the in-game AA off, and still gain image quality.
Anyway, enough of my blabbering. Enjoy your system and your game. Welcome back to PC gaming...
A config file creator: http://www.techpowerup.com/forums/showthread.php?t=154145 (limit your max fps to 60 and get ultra-smooth gameplay).
The FX injector: http://forums.electronicarts.co.uk/battlefield-3-pc/1454675-better-sharper-custom-fxaa-injector.html
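For reference, the config that tool generates basically boils down to a one-line user.cfg dropped in your BF3 folder. I'm going from memory on the exact command name, so double-check what the tool actually spits out:

    gametime.maxvariablefps 60

You can also type the same line into the in-game console (the ~ key) to try the cap out before saving it to the file.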
 

blakwidowrsa

Honorable
Aug 10, 2012
Hi knobcreekman, :bounce:

Depends on your genes, mate... no, seriously... and on what TYPE of monitor you use. People generally perceive somewhere around 45-70 fps with our eyes. New LCD screens (not HD TVs) usually run at 60 or 59 Hz. More advanced and professional screens can reach 75 Hz and higher.

If you have an old friend with a CRT tube TV, look slightly above the TV and you'll notice a flicker effect in your peripheral vision. Tiny cells around the edges of your retina are very sensitive to motion, so as to draw your attention to an oncoming object, for example ^^ (thanks, evolution).
But if you look straight at the TV, the flicker will seem to disappear. The TV is quickly flashing from the top-left pixel, pixel by pixel, left to right, then the row below that left to right again, until it has painted something like 320 pixels across by 240 pixels down onto the phosphor, around 60 times a second, to create the illusion of continuous motion. (For the generations that missed the cathode ray tube era of screens.)

You don't see this on LCDs, as they normally have a 5 ms response time, meaning a pixel takes about 5 ms to change from one shade to the next. A 2 ms LCD is good and less strenuous on your eyes. I remember when the very first bulky LCDs came out with 16 ms response times; it was awful for fast-paced games, as the image literally left a smear trail, almost like a motion-blur effect.
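Just to put those milliseconds next to each other, here's a quick back-of-the-envelope sketch (a throwaway Python calc using the numbers above, nothing scientific):

    # rough sketch: how long one frame stays on screen vs how long a pixel takes to change
    refresh_hz = 60
    frame_time_ms = 1000 / refresh_hz              # ~16.7 ms per frame at 60 Hz
    for response_ms in (16, 5, 2):
        share = response_ms / frame_time_ms        # fraction of each frame the pixel is still changing
        print(f"{response_ms} ms panel: pixel spends about {share:.0%} of every frame mid-transition")

On a 16 ms panel the pixel is still transitioning for almost the whole frame, which is exactly that smear trail; at 2 ms it's only a small slice of the frame.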

On the other hand, if you are aiming for prolonged gaming on a nice big 40"-or-larger HDTV, get a 200 Hz or better set like the Bravias from Sony. There are even 600 Hz sets available. The flicker effect is more noticeable on larger screens.


It's not such an easy question to answer :hello:
But now you have a broader perspective of the factors, at least a few of them anyway ^^
 

blakwidowrsa

Honorable
Aug 10, 2012
Oh and BTW, a game running at your LCD's native resolution is always best, and at that point heavy AA is mostly just overkill, as each game pixel maps directly onto one monitor pixel.

Aliasing is the sawtooth (staircase) effect you see on diagonal lines and shapes, because the screen can only approximate a smooth edge with square pixels. It was far more obvious back in the day when your monitor was maybe 1024x768 and you ran your game at 800x600, since the image was also being stretched across non-matching pixels to keep it to scale and size.

Anti-aliasing smooths that out by taking several samples for each pixel within a frame (that's where the 2x, 4x, 8x, 16x numbers come from) and blending them, so edge pixels end up with in-between shades and the final image looks smoother. All that extra sampling is why AA has a performance hit, especially on slower systems. So do you really want to pay that cost to smooth an already crisp 1920x1080 image by enabling 2x, 4x or even 8x or 16x AA?? ^^
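To show what that blending actually does, here's a toy Python sketch of the supersampling idea (several samples per pixel, averaged into one shade). The edge slope and the tiny 10x4 grid are just made-up numbers for illustration; it is not what the game's MSAA literally does under the hood:

    import numpy as np

    def coverage(width, height, samples, slope=0.3):
        # Shade each screen pixel by how many of its sample points fall below the edge y < slope * x.
        # samples=1 is "no AA" (one sample per pixel); samples=4 means 16 samples per pixel.
        img = np.zeros((height, width))
        for py in range(height):
            for px in range(width):
                hits = 0
                for sy in range(samples):
                    for sx in range(samples):
                        x = px + (sx + 0.5) / samples
                        y = py + (sy + 0.5) / samples
                        if y < slope * x:
                            hits += 1
                img[py, px] = hits / (samples * samples)   # fraction of the pixel covered by the edge
        return img

    print(np.round(coverage(10, 4, 1), 2))   # hard 0/1 values: the staircase (jaggies)
    print(np.round(coverage(10, 4, 4), 2))   # edge pixels become in-between greys: the smoothed look

With one sample per pixel every value is a hard 0 or 1, which is the staircase; with 16 samples per pixel the edge pixels come out as in-between greys, which is the smoothing, and also why each extra level of AA costs more GPU time.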
 

knobcreekman

Honorable
Jun 26, 2012
Thanks again for all of the responses. Just to be clear I wasn't trying to build a killer gaming system. Gaming was third on my list of priorities. Coding and virtualization being numbers one and two, respectively. I was just trying to put the fps numbers in perspective since I really didn't have anything to compare it to. I guess the good thing about not being a hardcore gamer all of these years is that the games look AMAZING to me now. I've only played Crysis (borrowed) and The Witcher 2 on my new system, but it looks awesome to me and silky smooth with the settings up. The settings are all the way up on Crysis, but I haven't found a place to adjust them on The Witcher 2. It only has two options for brightness and gamma I think. Anyways, again I was just looking for a frame of reference. My monitor is only a 1920 x 1080 22".

BlakwidowRSA, are you saying that AA doesn't really do anything on modern systems or that it's just not worth it?
 

blakwidowrsa

Honorable
Aug 10, 2012



As I described, it all depends on the system, the distance from the screen, etc. I use a 24" full-HD screen. If you have a 22" running full HD resolution and you sit right in front of it like I probably do ^^, your pixels are even closer together than on my 24", meaning a crisper image. I wish I could draw a picture here... but anyway, yes, AA is overkill on 24" and smaller screens at max/native resolution, as the anti-aliasing effect it creates is almost unnoticeable (if that's even a word, lol).
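Here's a quick pixel-density calc to show what I mean (a throwaway Python sketch; I'm assuming both screens are 1920x1080, the 24" being mine):

    import math

    def ppi(width_px, height_px, diagonal_inches):
        # pixels per inch = diagonal resolution in pixels / diagonal size in inches
        return math.hypot(width_px, height_px) / diagonal_inches

    for size in (22, 24):
        print(f'{size}" at 1920x1080: {ppi(1920, 1080, size):.0f} ppi')

Roughly 100 ppi on the 22" versus about 92 ppi on my 24", so the jaggies are physically smaller on yours to begin with.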

PS. I turn AA off in FPS shooters, as I can see the other players quicker by spotting smaller objects further away. Sometimes a clearer image is more desirable. Depends on the scenario.

Here's what I mean:
http://nn.wikipedia.org/wiki/Fil:Anti-aliasing.jpg
In the image, the line, dot and shape are shown with and without AA; from far away, say 10 m (30 feet), they will look almost identical.