Do all video games come with triple buffering built into the game?

dssdghthd

I'm wondering because I've never had my framerate dip straight to 30 when it drops below 60, and yet I've never used triple buffering (triple buffering in the Nvidia Control Panel is disabled, and I have never even downloaded D3DOverrider to force triple buffering in DirectX games). I only use regular V-sync on its own.

I record framerate with Fraps. Is it possible that Fraps tells me I'm getting 50fps when in fact I'm getting 30?
I can definitely feel the difference between 30fps and 50fps.
 

zyky

Most games use triple buffering to avoid rendering stalls, or tearing caused by the display scanning out of a buffer while the GPU is still writing to it, but neither double nor triple buffering guarantees that the framerate will halve from 60 to 30 with V-sync on.
 

dssdghthd


Double buffering doesn't guarantee a halving of the framerate when V-sync is on? Then what does? What are the necessary conditions, in addition to [V-sync on + double buffering], for the framerate to halve?
 
V-sync will do that all by itself. If the GPU can't render the frame within 16.67ms (for a 60Hz display), it has to wait for a second refresh cycle. At that point you've waited 33.3ms for the frame to appear onscreen, and therefore can't possibly see more than 30fps on your monitor. Anything between 30fps and 60fps on a 60Hz display won't be synchronised (1.5 frames per 2 refresh cycles, for example, isn't v-synced). The confusion with FRAPS, I suppose, comes from the stage in the rendering pipeline at which FRAPS counts the frames.
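To put rough numbers on it, here's a quick back-of-the-envelope sketch (plain Python, purely illustrative; the render times are made up and a real driver obviously doesn't work like this) of how double-buffered V-sync snaps every frame to whole refresh intervals on a 60Hz display:

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ   # 16.67 ms per refresh cycle

def displayed_fps(render_ms):
    """Effective on-screen fps when each frame takes render_ms to render and
    can only be shown on a refresh boundary (double-buffered V-sync)."""
    intervals_waited = math.ceil(render_ms / INTERVAL_MS)
    return 1000.0 / (intervals_waited * INTERVAL_MS)

for ms in (10, 16, 17, 20, 25, 33, 34):
    print(f"{ms:>2} ms/frame -> {displayed_fps(ms):.1f} fps on screen")
# Anything from just over 16.67 ms up to 33.3 ms lands on 30 fps;
# past 33.3 ms it drops straight to 20 fps, and so on.
```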
 

dssdghthd


I'm getting confusing answers here. One tells me that V-sync and double buffering don't necessarily cause the framerate to halve, while this one says that they most certainly will.
And I didn't quite understand what you said about FRAPS: does it show the actual framerate or not?
 
It shows the actual number of frames being rendered by the GPU, but that's not the same as the number of frames outputted to the monitor while you're using v-sync. It's a slightly complicated topic and not easy to explain. Essentially, a monitor has its own framerate (known as the refresh rate) and on a typical monitor that will be 60Hz, or 60 refresh cycles per second. If the card is outputting 50fps or 70fps, the output of those frames isn't (and can't be) synchronised with the refresh rate of the monitor. That's why you get tearing.

For synchronisation, your GPU needs to render 1 frame per refresh cycle of the monitor. If it can't manage that in 1/60th of a second (16.67ms), then to stay in sync the frame must be held over to the next refresh cycle. So then you have one frame per two refresh cycles. Two refresh cycles take 1/30th of a second on a 60Hz monitor, so 30fps is the framerate you'll actually see if you can't maintain 60fps. I'd suggest not using v-sync if you don't have a 120Hz display, but it depends on how much tearing bothers you.
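To make the double-vs-triple comparison concrete, here's a toy simulation (again just illustrative Python under simplified assumptions: a fixed 20ms render time, a 60Hz display, flips only on refresh boundaries, and a simple FIFO swap chain; real drivers and games are messier than this). It counts how many frames the GPU renders per second, which is one plausible thing a FRAPS-style counter could be reporting:

```python
import math

REFRESH_MS = 1000.0 / 60   # 16.67 ms per refresh on a 60 Hz display
RENDER_MS = 20.0           # hypothetical GPU render time (50 fps uncapped)
SIM_MS = 10_000.0          # simulate 10 seconds

def rendered_fps(num_buffers):
    """Frames rendered per second with V-sync and a num_buffers-deep swap
    chain (2 = double buffering, 3 = triple buffering), FIFO presentation."""
    finishes, flips = [], []
    while True:
        i = len(finishes)
        prev_finish = finishes[-1] if finishes else 0.0
        # The buffer we want to draw into stops being scanned out at the
        # flip that happened (num_buffers - 1) frames ago.
        buf_free = flips[i - (num_buffers - 1)] if i >= num_buffers - 1 else 0.0
        finish = max(prev_finish, buf_free) + RENDER_MS
        if finish > SIM_MS:
            break
        # A finished frame is flipped on the first refresh boundary after it
        # completes (and after the previous flip).
        prev_flip = flips[-1] if flips else 0.0
        flip = max(math.ceil(finish / REFRESH_MS) * REFRESH_MS,
                   prev_flip + REFRESH_MS)
        finishes.append(finish)
        flips.append(flip)
    return len(finishes) / (SIM_MS / 1000.0)

print("double buffering:", round(rendered_fps(2)), "fps")   # ~30 fps
print("triple buffering:", round(rendered_fps(3)), "fps")   # ~50 fps
```

With double buffering the GPU sits idle waiting for each flip, so a 20ms frame stretches to a whole 33.3ms refresh pair (30fps); with a third buffer it keeps rendering and stays at roughly 50fps, which would line up with what you're seeing if the game enables triple buffering itself.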
 

dssdghthd



How is the output of the GPU any different from what the monitor displays? It sends a frame --> the monitor gets that frame; it doesn't drop the frame anywhere along the way. And if the monitor is getting 30fps, then 30fps is the speed at which the card is drawing those frames to the two buffers and sending them to the monitor. Doesn't the card stop drawing frames when the second buffer is full and waiting to be unloaded? If so, how would FRAPS detect 30fps as anything other than 30fps? Second, and more importantly, I can definitely feel the difference between when FRAPS says 50fps and when it says 30fps; I'm positive it's not a placebo effect.

Edit: Unless perhaps FRAPS measures the speed at which the frames are drawn to the buffer rather than the speed at which they are output? But that still doesn't explain why I feel the difference between 30fps and 50fps.
 
The only way you can be seeing the difference between 30fps and 50fps is if you're not using v-sync. You're not using adaptive v-sync or something? And I'm not sure what stage FRAPS measures at (I don't know the details of that), though Tom's published an explanation of it when they examined framerates vs frame latency (a fairly recent article in the graphics section).
 

dssdghthd



Then it stands to reason that if V-sync alone most definitely halves the framerate and no triple buffering is enabled, the game must already have triple buffering built in if the framerate is not being halved? Could there be another explanation? (No, adaptive V-sync is not enabled.)

Edit: I don't think it's possible that FRAPS measures the speed at which the frames are drawn to the buffer rather than the speed at which they are output, like I mentioned before, because then FRAPS would show a fluctuating fps well above 60 when the game is steady at 60fps with V-sync enabled.
 

dssdghthd

If V-sync weren't in effect, my framerate would fluctuate well above 60, which it doesn't; it stays at 60 when it reaches 60. I'd also be seeing screen tearing everywhere, which I never get either. (Plus, V-sync is globally enabled.)
 

dssdghthd



Unless games have triple buffering built into them already? (Something I've repeated ten times in this thread, and it's in the title too).