Barracuda115 :
Soooo, i wont get any input lag, right? Any confirmation?
It depends on the game, at least in theory and based on the many threads on the subject.
V-sync always adds some latency, because a finished image has to wait for the next vertical blanking interval before it can be shown. Unless your FPS would be much higher than your refresh rate without v-sync, though, the latency v-sync adds this way is small.
Where things get bad is when triple buffering is used in DirectX and your FPS is higher than your refresh rate. DirectX requires every rendered image to be displayed, so with triple buffering and FPS above your refresh rate you end up with two completed images waiting in the back buffers, and the oldest one is displayed first. The result is roughly 17 ms of extra latency on a 60 Hz screen (one full refresh interval, 1000/60 ≈ 16.7 ms).
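To make the arithmetic concrete, here is a tiny sketch (toy calculation, not any real graphics API) of how much latency one extra queued frame costs at a given refresh rate:

```python
def queued_frame_latency_ms(refresh_hz: float, extra_frames: int = 1) -> float:
    """Each extra completed frame waiting in the queue adds one full
    refresh interval of latency before it reaches the screen."""
    return extra_frames * 1000.0 / refresh_hz

print(round(queued_frame_latency_ms(60), 1))    # → 16.7 (the "~17 ms" above)
print(round(queued_frame_latency_ms(144), 1))   # → 6.9
```

At higher refresh rates the penalty per queued frame shrinks, which is one reason the problem is most noticeable on 60 Hz screens.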
OpenGL does not require that every rendered frame be displayed, so in the same situation it shows the newest completed image and discards the older one. This way no additional latency is added.
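The difference between the two behaviors can be sketched as a toy simulation (a model of the queueing policies only, not real DirectX or OpenGL code): the renderer finishes two frames per refresh, the back buffer holds at most two, and at each vblank the display either takes the oldest queued frame (DirectX-style) or the newest one (OpenGL-style):

```python
def simulate(policy, refreshes=4, renders_per_refresh=2, depth=2):
    """Return the frame number shown at each vblank.

    policy "fifo":   show the oldest completed frame; when the back
                     buffers are full the renderer stalls (DirectX-style).
    policy "newest": show the most recently completed frame and discard
                     the stale one (OpenGL-style).
    """
    queue, shown, frame = [], [], 0
    for _ in range(refreshes):
        for _ in range(renders_per_refresh):
            frame += 1
            queue.append(frame)
            if len(queue) > depth:
                if policy == "newest":
                    queue.pop(0)            # drop the oldest, stale frame
                else:
                    queue.pop()             # fifo: queue full, renderer
                    frame -= 1              # stalls and the frame isn't made
        if policy == "newest":
            shown.append(queue.pop())       # display the freshest frame
            queue.clear()
        else:
            shown.append(queue.pop(0))      # display the oldest frame
    return shown

print(simulate("fifo"))     # → [1, 2, 3, 4]  each frame shown one refresh late
print(simulate("newest"))   # → [2, 4, 6, 8]  always the freshest frame
```

In the "fifo" run, frame 2 is already finished during the first refresh but isn't shown until the second, which is exactly the one-refresh (~17 ms at 60 Hz) penalty described above.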
Possible solution:
In DirectX with triple buffering, when your FPS could exceed your refresh rate, you can cap the FPS one frame below the refresh rate (e.g. 59 FPS on a 60 Hz screen). The GPU then never renders fast enough to have two completed images waiting in the back buffers, preventing the case where extra frames stack up.
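That cap amounts to a simple frame limiter. A minimal sketch (hypothetical render loop, not any real engine or driver API; `render_frame` is a stand-in for your game's per-frame work):

```python
import time

def run_capped(render_frame, refresh_hz=60, seconds=0.5):
    """Render in a loop, but never faster than (refresh_hz - 1) FPS,
    so a second finished frame can't pile up in the back buffers."""
    target = 1.0 / (refresh_hz - 1)        # e.g. one frame every 1/59 s
    deadline = time.monotonic() + seconds
    frames = 0
    while time.monotonic() < deadline:
        start = time.monotonic()
        render_frame()                     # do the frame's work
        frames += 1
        elapsed = time.monotonic() - start
        if elapsed < target:
            time.sleep(target - elapsed)   # wait out the frame budget
    return frames
```

A sleep-based limiter like this is coarse; real limiters usually spin-wait for the last fraction of a millisecond for accuracy, but the idea is the same.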