Hi guys & gals, I thought I knew what vertical sync did, but boy was I wrong! I have an NEC MultiSync LCD 1970GX with an 8ms response time. Last night I let my friend play one of my games, and later I sat down to play Ghost Recon AW/GRAW and noticed something great: the frame rates seemed higher than normal. So I enabled Fraps, and to my confusion my frame rates went from a rock-solid 38 fps with vertical sync on to a 68 fps average, up to over 100 fps, with vertical sync off????? There was a little image tearing, but nothing to complain about given the frame rate increase. The funny thing is I thought my MX1000 laser mouse had lost its tracking sensor and gone bad. Can someone PLEASE tell me what the hell VERTICAL SYNC is or does, and what the purpose of using it is? All I know is it really kills your performance, big time! THANKS FOR ANY HELP!
PC specs: P4 3 Ghz, 1GB 533Mhz, 7800GTX 256MB, Audigy 2 ZS, one fusion reactor.
V-sync is basically there to produce a clean image, so it limits or slows the frame rate in the process. Disabling it will give you more frames per second, but the image might tear. So if you can disable vsync and see no image defects, that's good; keep it that way.
By the way, you can add another 1GB of RAM for a noticeable increase in performance in GRAW on high quality settings.
You said disabling v-sync would give me more frames per second but might tear the image; can it actually damage the LCD? Also, do you think I should spend some $$ on 1GB of extra RAM considering my PC is a year and a half old? What performance increases do you think I would notice? Frame rates, faster loading of images... etc. Also, I don't think I can match the RAM by brand, which means I would end up with two brands of RAM; is that a bad thing? Can you recommend a brand of RAM that is not too much $$? I have Dell-branded RAM. Yes, I know I should have built my own PC, damn it! Self hit, bang that head on the wall! :?
Vertical sync was a term coined for CRTs. What it really means is "don't start drawing the next frame until you've handed the current frame to the monitor."
Frames are drawn off screen (in a back buffer) and sent to the monitor top to bottom, so each tear you see is a new back buffer being flipped to the front buffer while the front buffer is still being scanned out to the monitor.
Think of it in terms of a movie projector: the front buffer is the image with the light going through it, and the back buffer is the next image to display. With a computer, the next image is being drawn right before it goes in front of the lamp.
Keep in mind that turning vsync off won't really give you smoother motion, because your monitor is the limiting factor. BUT what it will do is reduce the latency between your input and the action on screen, which is really worth it in action games.
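To put rough numbers on that latency point, here's a back-of-the-envelope sketch (in Python; the function and figures are my own illustration, assuming plain double buffering and a fixed render time, not anything measured in this thread):

```python
import math

def input_to_display_latency_ms(render_ms, refresh_hz, vsync):
    """Rough time from sampling input until the frame starts scanning out."""
    refresh_ms = 1000.0 / refresh_hz
    if not vsync:
        # The finished frame is flipped immediately, mid-scan if need be,
        # which is exactly where tearing comes from.
        return render_ms
    # With vsync the flip waits for the next vertical blank, so the delay
    # rounds up to a whole number of refresh intervals.
    return math.ceil(render_ms / refresh_ms) * refresh_ms

# A 10 ms frame on a 60 Hz panel: ~10 ms with vsync off,
# but a full ~16.7 ms refresh interval with vsync on.
```

So a fast card with vsync off genuinely does get your mouse input on screen sooner, at the cost of tearing.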
You might want to check out the GRAW section of the forums at Ghostrecon.net for some discussion of why 2GB of memory is advantageous with GRAW. Does your reticle ever wander when aiming, especially with the SCAR-L and optical scope (red dot), or the sniper rifle?
Secondly, I would advise you to get Coolbits 2.0; after installing it, find the setting for "FRAMES TO RENDER FWD" and set it to 1 (one). This will help your FPS as well.
LOL, I didn't know LCD panels had adjustable refresh rates. Another way to describe vsync: with CRT monitors, your refresh rate was adjustable from ~60-120 Hz. If you played your 3D games with vsync locked, the MAX fps you could get was equal to your monitor's refresh rate. Have your monitor refresh rate at 85 Hz? Max fps would be 85 fps, and so on.
Fellow OGR and GRAW player
Vsync matters whenever the monitor's vertical scan rate is higher or lower than the number of effective frames rendered per second. This only applies in full screen mode.
Example: your vertical scan rate is 60, meaning your refresh rate is also 60... so when the rendered output runs at a lower or higher fps, a horizontal cut effect (tear) appears between frames.
The most frequent scenario is a game (full screen). The reason fps decreases is that the video system spends extra effort synchronizing buffer swaps to the display. Ideally, if your screen refresh rate is 60, then with vsync on your game will display at most 60 fps.
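The cap can actually hurt more than it looks. Here's a small sketch of the arithmetic (my own illustration in Python, assuming plain double buffering with no triple buffering): with vsync on, each frame's time gets rounded up to the next whole refresh interval, so a card that could manage 50 fps gets pushed all the way down to 30 on a 60 Hz screen.

```python
import math

def vsync_fps(raw_fps, refresh_hz):
    """Effective fps with vsync and double buffering: each frame's time
    is rounded up to a whole number of refresh intervals."""
    frame_ms = 1000.0 / raw_fps
    refresh_ms = 1000.0 / refresh_hz
    return 1000.0 / (math.ceil(frame_ms / refresh_ms) * refresh_ms)

print(round(vsync_fps(100, 60), 1))  # 60.0 -- capped at the refresh rate
print(round(vsync_fps(50, 60), 1))   # 30.0 -- not 50, a whole step down
```

That stair-step effect (60, 30, 20, 15...) is why turning vsync off can look like such a dramatic fps jump in Fraps.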
Why do we need vsync?
Why accept the decreased fps?
Well, some games need it and some don't (of course, we are talking about Direct3D here). Some fast-paced games do need it, generally first-person games. It's really up to the gamer's preference here.
You'll find the vertical sync (vsync) setting in Catalyst under 3D settings (view all), and in Forceware under 3D settings / Advanced. You can either turn it on or off, or set it to application preference (let the app decide).
I spent the same amount of $$ on my 7800GTX 256MB card as I spent on my NEC 19" 8ms monitor with its adjustable 60-75Hz refresh rate. You're right: with v-sync off, my frame rate during inventory is as high as 253 fps, because the card is not stressed at all. About Coolbits: is it an overclocking tool? I wouldn't think of overclocking unless I had a waterblock! Or can I get away with a little overclocking?
The Half-Life 2 stress test recommended it off; well, I turned it on, then later turned it off again, and there was no tearing and the video didn't stutter as much, but the fps was the same in HL2. As I understand it now, thanks to you guys, v-sync is a latency issue between mouse input and what you see next on screen. As for GRAW, it doubled my fps because I decreased the latency between the card's buffer and the LCD, with some image tearing in GRAW, but I'd prefer more frames over just a little tearing. THANKS GUYS FOR YOUR HELP! Please continue to discuss the issue; it's nice to hear everyone's view.
"CoolBits" is an extended control application. I personally use an ATI X800PRO vid card,so I use the "ATI Tray Tools" control app. They are both similar in the flexibility in that they offer the ability to set variables not found in the standard apps. ATI Tray Tools does offer "soft overclocking" and I believe CoolBits offers that option as well. It is more important to get CoolBits for the "Frames to Render Forward" (set it to 1 (one))setting though.
Here's the important part copied/pasted from the TweakGuide:
Max Frames to Render Ahead
This tweak gained prominence with Oblivion, and is covered on the last page of my Oblivion Tweak Guide. I'll reproduce the same description here as it applies in precisely the same way: The setting in question is called 'Max Frames to Render Ahead', and it is usually hidden from the normal Nvidia Forceware Control Panel options. To enable it, I recommend using NVTweak 1.71, since NVTweak is easier to use to enable/disable this option (compared to Coolbits) and has other functionality which can come in handy. Run NVTweak and tick the 'Additional Direct3D' box, then close NVTweak. Now go into your Forceware Control Panel, and under the 'Performance & Quality Settings' item you will find a new item called 'Additional Direct3D Settings', click on it and you will see this option.
You can find out more about how to use NVTweak on this page of my Nvidia Forceware Tweak Guide. You will also find more information about the 'Max Frames to Render Ahead' option on that page, as well as other hidden options for the Forceware Control Panel. For ATI users, you can adjust the 'Flip Queue' setting which is the equivalent setting to the one above by installing and using ATI Tray Tools. The same values apply to both cards.
The amount of data produced by the GRAW engine means it can saturate your CPU and/or Video RAM with pre-rendered frames (whole screens ready to be displayed). On most systems this can result in noticeable mouse and/or keyboard lag, even when your FPS is reasonably high enough not to usually suffer from this (e.g. 25-30FPS+). By reducing the maximum number of frames to render in advance, you reduce this bottlenecking effect and hence significantly reduce or remove the mouse lag.
However, here's the important part: this is not a tweak for improving FPS as such. The general recommendation to set 'Max Frames to Render Ahead' (or 'Flip Queue') from its default of 3 down to 0 will actually reduce performance on many systems, particularly Dual Core or HyperThreading CPUs. Secondly, even on single core systems, you may notice reduced performance in certain areas. So on balance I strongly recommend setting this value to 2 to start with, and if you still have mouse lag, dropping it down to 1. Remember, not all mouse lag is due to this setting: in areas where you have very low FPS (i.e. below 10-15FPS), you will get lagginess - this occurs in virtually any game where your FPS drops down to the low teens or single digits. You will have to consider changing other settings to increase overall FPS. Also refer to the Troubleshooting Tips section for more details.
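For a feel of why the queue depth matters, here's a rough approximation (in Python; my own sketch, not from the guide): each pre-rendered frame sitting in the queue adds roughly one frame time of input lag, so a deep queue hurts far more at low fps than at high fps.

```python
def queue_lag_ms(frames_ahead, fps):
    """Approximate extra input lag from the driver's pre-render queue:
    each queued frame adds roughly one frame time of delay."""
    return frames_ahead * (1000.0 / fps)

# At 30 fps, the default queue of 3 adds ~100 ms of lag;
# dropping it to 1 cuts that to ~33 ms.
# At 100 fps the same queue of 3 only adds ~30 ms, which is why
# fast systems are less sensitive to this setting.
```

This lines up with the guide's advice: the tweak trades a bit of throughput for a big reduction in perceived mouse lag when fps is modest.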
I tried NVTweak and it worked: no image tearing with V-Sync off. I did get a little more stutter; is that because I only have 1GB of RAM? My FPS did not suffer at all, but I have had stuttering in my games for as long as I've had my computer (1 year now); I just didn't know why.
I don't think you quite understand, or at least I don't. V-sync just makes sure your gfx card delivers frames in step with the "refresh rate" of your monitor.
You are right that you get "more" FPS, but understand that there is a maximum number of full frames your monitor can draw. Even if Fraps says you are getting 200 FPS, you aren't seeing 200 frames; the pixels on an LCD can only change so quickly.
So is having an LCD better or worse for your game quality???
June 11, 2009 10:03:25 AM
Of course it is. And it's better for your eyes, too. Of course, when you buy a new LCD, you should buy a new video card as well if your previous one was a little slow.
I'll also say that vsync is a good thing. It does limit your fps, but you really don't need that many, because your monitor can't draw that many in a second. So set your monitor's refresh rate to the fastest it supports (usually 75Hz) and keep vsync on. It also helps your video card run cooler.
I didn't read through the whole thread, just skimmed to see if what I posted had been posted or not; nearly, but not quite, as it turns out.
Now, I knew what I posted wasn't 3 years old, so instead of trawling through the links I thought I would just ask.
What's wrong with that?