GTX 970 - FPS in Arma 3? Steam sale 50% off

Check the SBM article that was posted yesterday. (http://www.tomshardware.com/reviews/newegg-system-builder-marathon-q3-2015-amd-mini-pc,4307.html) It shows an 860K and an i7 both paired with a GTX 970. The i7 gets 85fps at 1080p on ultra detail settings. Arma 3 is very heavy on the CPU, so an i5 might be a little behind that, but it shouldn't be more than a few frames.
 
So it's fine; however, do make sure you know how to TWEAK the game settings for the best experience.

If screen tear is an issue then I suggest using Adaptive VSYNC (forced per game). In that case:

1) Start the game and turn VSYNC OFF

2) Adjust settings based on in-game experience... and tweak until you get 60FPS (for a 60Hz monitor) at least 90% of the time.

3) NVIDIA Control Panel -> Manage 3D Settings -> Program Settings -> Add (select Arma 3) -> Vertical sync: Adaptive -> Apply

*Now what you should get is the game mostly locked to 60FPS; when the GPU can't output frames fast enough, VSYNC is automatically turned off (to avoid STUTTER from uneven frame times, since with regular VSYNC the same frame is drawn again whenever you miss the refresh window).
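To make the stutter part concrete, here's a rough Python sketch of a 60Hz display (a simplified model with made-up render times, not anything Arma-specific): with plain VSYNC, a frame that takes even 17ms gets held until the next refresh (~33ms, i.e. 30FPS for that frame), while Adaptive VSYNC just lets it through with a tear and keeps the pacing even.

```python
import math

REFRESH_MS = 1000.0 / 60.0   # ~16.7 ms between refreshes on a 60Hz monitor

def displayed_frame_times(render_times_ms, adaptive=False):
    """How long each frame actually stays on screen, in ms."""
    shown = []
    for t in render_times_ms:
        if t <= REFRESH_MS:
            shown.append(REFRESH_MS)  # made it in time: swapped at the next refresh
        elif adaptive:
            shown.append(t)           # Adaptive VSYNC: swap right away (a tear) instead of waiting
        else:
            # plain VSYNC: missed the window, so the old frame is drawn again and the
            # new one waits for a later refresh -> a 17 ms frame is shown for ~33.3 ms
            shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return [round(x, 1) for x in shown]

renders = [15, 15, 17, 15, 18, 15]                     # render times in ms; two slow frames
print(displayed_frame_times(renders))                  # [16.7, 16.7, 33.3, 16.7, 33.3, 16.7] -> visible stutter
print(displayed_frame_times(renders, adaptive=True))   # [16.7, 16.7, 17.0, 16.7, 18.0, 16.7] -> much smoother pacing
```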

If you want VSYNC OFF then simply aim for at least a 40FPS average. You'll get screen tear but you may not care.

(If you have a GSYNC monitor then still aim for about 40FPS, but you won't have screen tear of course.)
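If you want to sanity-check those targets (step 2's "60FPS at least 90% of the time", or the ~40FPS average for VSYNC OFF / GSYNC), a tiny Python sketch like the one below will do it from a frame-time capture. The file name and the one-millisecond-value-per-line format are just assumptions, so adapt the parsing to whatever your capture tool actually exports.

```python
# Summarize a frame-time capture against the targets above.
# 'frametimes.csv' and its format (one frame time in ms per line) are assumed,
# not tied to any specific tool -- adjust the parsing to match your capture.

def summarize(frame_times_ms):
    fps_per_frame = [1000.0 / t for t in frame_times_ms]
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    pct_at_60 = 100.0 * sum(f >= 60.0 for f in fps_per_frame) / len(fps_per_frame)
    print(f"Average FPS: {avg_fps:.1f}  (aim for ~40+ if running VSYNC OFF or GSYNC)")
    print(f"Frames at 60+ FPS: {pct_at_60:.0f}%  (aim for 90%+ before forcing Adaptive VSYNC)")

with open("frametimes.csv") as f:
    summarize([float(line) for line in f if line.strip()])
```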
 


I'm not sure that's a server issue. It's probably a CPU bottleneck on the client side from processing all the data for the other players. More people in the game means more CPU calculations. It also explains why single-player benchmarks are often way off from multiplayer results.
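As a toy illustration of that scaling (the numbers below are entirely made up, not measured from Arma 3): if the client spends a fixed amount of CPU time per frame plus a little extra for every other player it has to process, frame rate falls off as the server fills up even though the GPU load hasn't changed.

```python
# Toy model only -- the costs below are invented, not measured from Arma 3.
BASE_FRAME_MS = 10.0    # assumed render + local game logic per frame (empty server)
PER_PLAYER_MS = 0.25    # assumed CPU cost per frame to process one other player

def estimated_fps(other_players):
    frame_ms = BASE_FRAME_MS + PER_PLAYER_MS * other_players
    return 1000.0 / frame_ms

for n in (0, 20, 60, 100):
    print(f"{n:3d} other players -> ~{estimated_fps(n):.0f} FPS")
# 0 players looks like a single-player benchmark (~100 FPS here);
# 100 players drags the same machine down to ~29 FPS with zero change in GPU load.
```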

A server either does all those calculations for you (which is rare) or you do them on your own computer. Titanfall, for example, does its processing server-side (at least on the Xbox... not sure about PC), so the number of people on the map shouldn't affect frame rate locally.

(Unless Server A is sending the gamer more data to say the same thing than Server B is, I'm not sure how there can be a difference. I could be wrong of course, but shouldn't the data be IDENTICAL since it's all coded the same?)
 

Reaper_7799

Distinguished
I have a 4790K and a 980 Ti... there's no bottleneck on my end, and I still got 25 fps on one server and a lot with like 30-40, and that was after dropping from 4K to 1080p and trying to lower settings, but the fps stayed the same... lol. I don't know how they have them set up, but it was bad. I gave up on it a month or two ago.

You have to do a lot of digging to find a decent one with players. Maybe I'm looking in the wrong places, but it got annoying because it takes a while to load up each server (like Battlefield 4), and then you get in and everyone has 30 fps, or the only ones with good fps are playing a mode you don't want. Then you take another 5 minutes to try another one, same thing, and after 30 minutes you finally find one that has decent fps, and then everyone starts leaving because they've been playing for hours and you're back at square one.