Limit FPS on AMD cards? Without V-Sync?

JukeBox360

Honorable
May 25, 2012
117
0
10,690
Trying to figure out a way to limit the FPS in all games like you can on Nvidia. Just built a new AMD gaming rig. So far I'm liking Nvidia more just due to its simplicity.

So is there a way to limit the FPS without turning on V-Sync? If so, could you explain how this can be done? Thanks!

-Juke
 

killerhurtalot

Distinguished
Aug 16, 2012
1,207
0
19,460


Great job taking it from the screen-tearing page of Wikipedia, which does nothing to explain what actually happens or how V-sync is achieved...

http://hardforum.com/showthread.php?t=928593

Read this... and you're as wrong as wrong can be, lol.

Then there's also adaptive V-sync and dynamic V-sync.
 


But you avoid severe screen tearing, which IMO is worse on the eyes than running at 30 FPS. Just trim your in-game details back to lift the FPS so you're hitting 60 FPS most of the time. Everyone is always on about wanting to max out game settings, but without the highest-end equipment you aren't going to be able to do that and hit 60 FPS most of the time. You can also try dropping your refresh rate to 50 Hz; then you only have to hit 50 FPS and it's still pretty smooth.
 


From your link "What is VSync? VSync stands for Vertical Synchronization. The basic idea is that synchronizes your FPS with your monitor's refresh rate.". All of the other factors he cites only occur when your FPS drops below your monitor's refresh rate, something he explains in detail.
In what regard was I incorrect?
 

killerhurtalot

Distinguished
Aug 16, 2012
1,207
0
19,460


You said that "if it isn't 60 FPS then it's 30 FPS as a result of V-sync" is incorrect.

Unless you have adaptive or dynamic V-sync, your computer will lock you to 60/30/15/etc. FPS (or the equivalent divisors of whatever your screen's refresh rate is) once the frame rate falls below each threshold...
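The quantization described above comes from each frame having to wait for the next vertical blank, so under classic double-buffered V-sync a frame occupies a whole number of refresh intervals. A minimal sketch of the arithmetic (the function name is mine, not from any driver API):

```python
import math

def vsync_fps(refresh_hz: float, render_ms: float) -> float:
    """Effective FPS under classic double-buffered V-sync:
    each frame waits for the next vertical blank, so it occupies
    a whole number of refresh intervals."""
    interval_ms = 1000.0 / refresh_hz               # 16.67 ms at 60 Hz
    intervals = math.ceil(render_ms / interval_ms)  # refreshes each frame occupies
    return refresh_hz / intervals

# A 60 Hz panel quantizes to 60, 30, 20, 15, ... FPS:
print(vsync_fps(60, 16))   # 60.0  (fits within one refresh interval)
print(vsync_fps(60, 17))   # 30.0  (just misses a vblank, waits for the next)
print(vsync_fps(60, 35))   # 20.0  (spans three intervals)
```

Note how a render time only slightly over 16.7 ms halves the frame rate — that cliff is exactly the 60-to-30 drop being argued about here.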
 


Is that what you were trying to say? My apologies. From the way you'd phrased that, I took it as the (rather odd) assertion that standard VSync locks framerate to 30 FPS.

Edit: For reference, I parsed that incorrectly due to missing the "if", and thus reading it as "cause it's not 60 fps it's 30", thus causing my response.
 

JukeBox360

Honorable
May 25, 2012
117
0
10,690
So in short, what I'm seeing is AMD just sucks for compatibility or having custom settings. So far it seems Nvidia is light years ahead when it comes to user experience. Which sucks, because IMO AMD has the better GPU, yet Nvidia just seems a ton better to use overall.
 


Most games today will not do that. Most games today use a form of triple buffering that is built into the game engine. This prevents V-sync from dropping FPS all the way down to the nearest divisor of the refresh rate. However, you'll see alternating between 16 ms and 33 ms frame times, which may seem jittery to some.

That said, RadeonPro has a couple of options to limit frames: Dynamic V-sync, which works similarly to adaptive V-sync, and Dynamic Frame Rate Control (or something similar in name), which is strictly an FPS cap.

MSI Afterburner and EVGA PrecisionX also allow you to set a max FPS, but this method does not let you set it differently for each game.
 

killerhurtalot

Distinguished
Aug 16, 2012
1,207
0
19,460


Triple buffering is a terrible thing, especially in FPS games... the input lag is really bad the way they implement it...
 
Triple buffering doesn't add "really bad" input latency. It adds at most 17 ms. Though, as I said, you can use an FPS cap if you don't want V-sync at all, with RadeonPro, MSI Afterburner or EVGA PrecisionX. I was merely mentioning that V-sync rarely drops your FPS down to 30 FPS like you had said. That is an old-school problem, one that rarely exists today.
 

a_seymour

Distinguished
Dec 2, 2014
29
0
18,530
Very aware this is an extremely old thread, but here's my solution with regard to dealing with V-sync.

If I'm playing a game where high frame rates are important, e.g. CS:GO, then I will leave it off.

But if I'm playing Shadow of Mordor, for example, and want to get rid of tearing, I'll cap the FPS at one frame under the monitor refresh rate and enable V-sync in game.
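For what it's worth, the cap those external tools apply boils down to sleeping out the rest of each frame's time budget. A rough sketch of the idea, with a stand-in callable in place of the game's render work (real limiters like RadeonPro or Afterburner hook the game's Present call, but the pacing principle is the same):

```python
import time

def run_capped(render_frame, cap_fps: float = 59.0, frames: int = 10):
    """Minimal FPS cap: after rendering each frame, sleep out the
    remainder of that frame's time budget so the loop never exceeds
    cap_fps on average."""
    budget = 1.0 / cap_fps                   # seconds allotted per frame
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                       # the game's work for this frame
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)     # wait out the rest of the budget

# Example: even an "instant" render gets paced to ~59 FPS,
# so 10 frames take roughly 10/59 of a second.
t0 = time.perf_counter()
run_capped(lambda: None, cap_fps=59.0, frames=10)
print(time.perf_counter() - t0)  # ~0.17 s
```

Capping one frame under the refresh rate, as described above, keeps the GPU from racing ahead of the monitor, which is why it pairs well with in-game V-sync.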
 


From what I've gathered over the years of seeing the fanatics around, it is the exploit that makes them shoot for a specific FPS over 300 (I forget the exact number); there are a few other FPS points which are better than others as well. Though they'd still want over 100 FPS even if it weren't for the exploit. That exploit allows for faster movement and involves jumping.

NOTE about my previous reply about triple buffering being predominant today: I think I may be wrong there. I've realized that, with the way SLI/CF works, you always get at least 3 buffers in games, which explains why I've not encountered the problem in a long time. It may still be a problem for single-card users. (An actual benefit of multi-GPU setups, other than more FPS.)
 
Is that applicable in multiplayer games, where different machines might have different FPS? Also, is that legal in actual tournaments? And I'm a bit confused about this triple buffering stuff. I heard it only works with OpenGL games and that you need a third-party tool to enable it in DirectX-based games. Also, I don't agree that if the FPS is not 60 then it is 30 when V-Sync is enabled, because, multi-GPU or single GPU, I can still see different smoothness between 50, 40 and 30.
 


Yes, this is why they do it. It is an extremely old game with an odd engine that has a bug that allows for faster movement at specific FPS. I believe it involves strafe jumping. I don't play the game, but that is what I've gathered. No one would care if it wasn't multiplayer.

The triple buffering deal is only force-able in OpenGL, but DirectX games do use 3 buffers in some cases. Most of the time, games have 1 front buffer and 1 buffer for each GPU you have. So if you have 2-way SLI, you are using 3 buffers at a minimum; otherwise your GPUs could not both be generating frames at the same time without messing up what is being displayed.

Now, as to seeing a difference in smoothness between 30, 40 and 50 FPS in games: that becomes subjective, and with V-sync on it is often really two different things people notice. At 30 FPS, frame times are consistent; at 40 or 50 FPS, some frames last twice as long as others, which can look jittery, but at the same time latency is reduced when you can get frames off faster. At those frame rates, it is also debatable whether it is better to use V-sync or not. PCPer tested this and found no clear consensus on which was better.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Visual-Effects-Vsync-Gaming-Animation
 

Linx77

Reputable
Jan 12, 2015
5
0
4,510
So there have been a lot of arguments about V-sync on this page, and I think they need to be settled. V-sync does not automatically synchronize to your monitor's refresh rate; it synchronizes to certain intervals like 60, 30 or 20 FPS. Therefore V-sync should only be applied when the game is above 60 FPS, but frame rates aren't always consistent, and when it drops below 60 FPS you lose half your FPS, which is even worse than screen tearing. That is why the best option is to limit your frame rate to your monitor's refresh rate, something which does not seem to be possible with AMD's Catalyst Control Center.
 


First off, V-sync does automatically synchronize to your monitor's refresh rate. That doesn't mean you will get the same FPS as your refresh rate; it just means that no frame can be sent to the monitor while it is refreshing. It forces the GPU to wait for the vertical blanking interval.

It also doesn't force you to 60, 30 or 20 FPS. It is quite possible you will get 50 FPS with it on, as a mix of 16.7 ms and 33.3 ms frames. And if you drop to the 20-30 FPS range, you will get a mix of 33.3 ms and 50 ms frames. This is more common with triple buffering. Without triple buffering, it is more likely you drop to a divisor of your refresh rate, but even that isn't a hard and fast rule.
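That 50 FPS mix can be checked with simple arithmetic: assuming every V-synced frame on a 60 Hz panel holds for either one or two refresh intervals, the per-second counts must satisfy short + long = FPS and short + 2·long = 60. A quick sketch (the function name is mine, for illustration only):

```python
def frame_mix(avg_fps: int, refresh_hz: int = 60):
    """How many short (one-interval) and long (two-interval) frames
    per second yield a given average FPS under V-sync at refresh_hz.
    Solves: short + long == avg_fps, short + 2*long == refresh_hz."""
    long_frames = refresh_hz - avg_fps    # frames spanning two refreshes
    short_frames = avg_fps - long_frames  # frames spanning one refresh
    return short_frames, long_frames

# 50 FPS on a 60 Hz panel = 40 frames at 16.7 ms + 10 at 33.3 ms:
print(frame_mix(50))  # (40, 10)
# 40 FPS is an even 20/20 split of short and long frames,
# which is why it can look noticeably jittery:
print(frame_mix(40))  # (20, 20)
```

This simple model only holds between 30 and 60 FPS on a 60 Hz display; below 30, frames start spanning two and three intervals instead.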

What is best is a personal opinion, one that varies depending on who you talk to and the game it is used on.