VSync and CS Source

mob

Distinguished
Aug 21, 2008
10
0
18,510
Hi Everyone. Thanks in advance for taking the time to read this. Please note that I get quite specific about CS Source. If you are not familiar with the game, you might still be able to answer my 4th question: is there a way to get rid of tearing without VSync?

So I have a little problem that I am trying to get my head around and I was wondering if you guys can help.

A little about my spec:
Intel Q6600 @ 3.2GHz
Asus Maximus Extreme mobo
2GB OCZ DDR3 @ 1666MHz
GeForce 8800 GTS 512MB
Samsung SyncMaster 2032BW LCD (connected via DVI), 2ms response time.

I am a CS Source player, playing at a competitive level, and I want to be able to get my game running at its optimum. I play on a machine that is over spec'd for the job at hand. This means my graphics card is able to churn out many more frames a second than my monitor is able to display, causing screen tearing.

Tearing is something I just can't stand so I turn on Vsync in most games and problem solved. My monitor can handle 60Hz (and yes, I know it is an LCD so it doesn't strictly have a refresh rate, nevertheless this cap exists) so I am stuck with 60 FPS. For me, this is fine in most games. Sure, I'd like it higher but a constant 60 FPS and no tearing is better than 120-300 FPS with tearing.

Most CS Source (and other HL2 engine games) players will know that servers come at a variety of tickrates, 100 being the best (stock tickrate). This means that, provided the client and server variables allow it, the server and clients can send and receive up to 100 updates a second. It is my understanding that the more updates being sent and received, the better.

Now comes the problem. Take a look at the following two images taken on a 100 tick server with variables allowing 100 max cmd and update rates. My client has 100 cmd and update rates specified.
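For reference, this is roughly what the relevant client settings look like in my autoexec.cfg (the values are what I'm running; the actual ceilings come from the server-side sv_maxcmdrate / sv_maxupdaterate, if I've got those cvar names right):

```
// autoexec.cfg - client network settings for a 100 tick server
cl_cmdrate 100      // command packets sent to the server per second
cl_updaterate 100   // world updates requested from the server per second
rate 25000          // max bytes per second of network traffic
```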

The values in the red boxes are the ones of interest.

Image 1: VSync Off
vsyncoffez5.jpg


This image shows my FPS is over 60 (so I get tearing when I run around, which I obviously can't show you in a screenshot) but the number of updates I am sending is around 100, which is ideal.

Image 2: VSync On
vsynconoc8.jpg


This time you can see my FPS is restricted (no tearing now, woo!), however the number of updates I am sending has dropped to 60 per second.

It would appear from this that VSync is affecting my client's ability to send data to the server. This, I am assuming, will have an impact on the gameplay. Perhaps not a visible one, as any given shot might register the same either way, but based on the numbers there is a 40% drop in updates sent, and that has to have some sort of impact.
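To put numbers on that, here is the mental model I'm working from (just a sketch, assuming the client sends at most one command packet per rendered frame, which is how I understand the Source engine behaves):

```python
# Sketch: updates actually sent per second are capped by both the
# cl_cmdrate setting and the number of frames rendered, assuming
# one command packet per rendered frame.
def updates_sent_per_second(fps: float, cl_cmdrate: float) -> float:
    return min(fps, cl_cmdrate)

vsync_off = updates_sent_per_second(fps=300, cl_cmdrate=100)  # 100
vsync_on = updates_sent_per_second(fps=60, cl_cmdrate=100)    # 60

drop = 1 - vsync_on / vsync_off  # 0.4, i.e. the 40% drop above
```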

1) Is there some way to have Vsync on and ensure I am still able to send 100 updates a second? I can't stand the tearing, but I want to make sure my game is as accurate as possible.

2) Does it even matter that I can't send 100 updates a second? Like I said above, I am just going by numbers as I don't know how else to test it, so maybe it is just a waste of time.

3) What do players at the highest competitive level do? Do they just put up with the tearing and to hell with VSync?

4) Is there something I am missing that will allow me to play without VSync and without tearing? I've done a lot of research on the subject but I may have missed something. Perhaps I need a better monitor, but as LCDs don't come with a refresh rate value (seeing as they don't strictly refresh like old CRTs do), how do I know which to get?

Sorry for making you read all this but I have been trying to find a solution to this for some time and any help you can offer will be appreciated.
 
V-sync does nothing but limit how often a finished frame is sent to your screen (most monitors default to 60). This is done to ensure the entire screen is updated at the same time, which prevents the tearing that you see.

Personally, I leave V-sync on. There's not much point to a frame rate over 60, as it only makes things look nicer.
 

mob



Yeah, I understand what VSync does. Normally I would just leave it on but as you can see from my original post, it has some sort of impact on CS Source's numbers.



The only options I get in Display Properties > Advanced > Monitor are 59 Hz or 60 Hz. If my monitor does support 75 Hz, I'm not sure how to increase it. If I was able to do this, it wouldn't solve my problem entirely but would certainly be an improvement. Any ideas? I can't seem to find the manual for it, but will have a look online.

And thanks for the responses guys:)
 

mob

EDIT: Managed to find my monitor's manual online. It states that to change the refresh rate (up to 75Hz, so you were right on that front) you must change it in your graphics card settings. I've tried all sorts of jiggery pokery and the only two options I have are 59Hz and 60Hz. I have another monitor here (which I often use as a dual screen), an old LG Flatron L1930S. I am able to select 75Hz on it without a problem. What gives?

Please note, I am ensuring that "Hide modes that this monitor can't display" is not ticked.

EDIT EDIT: Now managed to create a custom resolution in the nVidia control panel. I was able to choose 75Hz for the GDI refresh rate. However, this custom resolution now shows:

1680 x 1050 at 75.0Hz (60Hz as reported by OS), Progressive.

Monitor's info (via the buttons on the front) shows 60 Hz still!

EDIT EDIT EDIT: Ok, this is a bit odd. After deleting my custom resolution, 75Hz is now available. After checking the monitor's info it shows that it is running at 1680 x 1050 at 75Hz. Woo!

EDIT EDIT EDIT EDIT:

So it turns out that my monitor can't support 1680 x 1050 at 75Hz. It gets all flickery. Back to 60Hz and the veritable drawing board!

 

andyKCIUK

Distinguished
Jun 18, 2008
153
0
18,690
I'm pretty sure that if you use an analog cable instead of DVI you'll be able to set it to 75Hz. I had the same problem with my old 7800GT.
 

mob



I'll give this a whirl. Thanks
 

spanner_razor

Distinguished
Nov 24, 2006
468
0
18,780
Can you not use the classic console command fps_max 101? That should limit the fps to 101, except when you don't get that many. You'd have to turn vsync off. Also, by tearing do you mean ghosting? If so, that will happen if your monitor has a low response time and you get too many fps.
 

mob



I can use fps_max to restrict the fps to 101, but I still get the same effect.

I mean tearing, as in vertical lines (such as the corner of a wall) will look disjointed. If there are a few vertical lines in a row, these will all "tear" along the same horizontal line. That's the best way I can describe it.

I am assuming you mean high response time, as older LCDs with 20ms response times used to have ghosting problems. Mine is a 2ms response. Quite adequate :)
 

andyKCIUK



I'm running out of ideas...

The only thing that comes to mind now is to reinstall the nVidia drivers with that analog cable plugged in. I had the same problem and I got my Samsung 940BF working @ 75Hz by using an analog dongle. Your monitor supports 75Hz, so there has to be a way to set it at that refresh rate.
 

mob

From reading the documentation I've established that it doesn't support 75Hz at the resolution I was playing at. Unfortunately, the only resolutions I can drop to are 4:3, and they look all funky on a widescreen.

Going back to my original post though, while getting things to run at 75Hz (and I have an old LCD I could use) will give me an improvement, it still isn't solving the problem at hand.

From the research I've done, I am thinking the only way to play with VSync off and not get tearing is to invest in a 100Hz CRT (since, as far as I know, the 100Hz LCDs are just 40" TVs atm). This makes me a sad panda.