Your question

Do all video games come with triple buffering built into the game?

Last response: in Graphics & Displays
June 24, 2013 5:29:03 AM

I'm wondering because I've never had my framerate dip straight to 30 when it went below 60, yet I've never used triple buffering (triple buffering in the Nvidia Control Panel is disabled, and I've never even downloaded D3DOverrider to force triple buffering in DirectX games). I only use regular V-sync.

I record framerate with Fraps. Is it possible that Fraps tells me I'm getting 50fps when in fact I'm getting 30?
I can definitely feel the difference between 30fps and 50fps.
June 24, 2013 6:06:46 AM

Most games will use triple buffering to avoid stalls in rendering, or tearing from a VRAM buffer being scanned out to the display while it's still being written, but neither double nor triple buffering guarantees a halving of the framerate from 60 to 30 with vsync on.
June 24, 2013 6:13:05 AM

zyky said:
Most games will use triple buffering to avoid stalls in rendering, or tearing from a VRAM buffer being scanned out to the display while it's still being written, but neither double nor triple buffering guarantees a halving of the framerate from 60 to 30 with vsync on.

Double buffering doesn't guarantee halving of the framerate when vsync is on? Then what does? What conditions, in addition to [vsync on + double buffering], are needed to halve the framerate?
June 24, 2013 6:27:00 AM

V-sync will do that all by itself. If the GPU can't render the frame within 16.67ms (for a 60Hz display), it has to wait for a second refresh cycle. At that point you've waited 33.3ms for the frame to be rendered onscreen and therefore can't possibly see more than 30fps on your monitor. Anything between 30fps and 60fps on a 60Hz display won't be synchronised (1.5 frames per 2 refresh cycles for example isn't v-synced). The confusion with FRAPS I suppose comes from the stage in the rendering pipeline at which FRAPS counts the frames.
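To make that concrete, here's a quick Python sketch (my own illustration, not anything from FRAPS or a driver) of how double-buffered v-sync quantizes the visible framerate on a 60Hz display: each frame occupies a whole number of refresh cycles, so the screen can only ever show 60/n fps.

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ  # 16.67 ms per refresh cycle

def displayed_fps(render_time_ms):
    """Displayed framerate with double-buffered v-sync: each frame
    occupies a whole number of refresh cycles, so the rate snaps to
    60, 30, 20, 15, ... (60 / n)."""
    cycles = math.ceil(render_time_ms / REFRESH_INTERVAL_MS)
    return REFRESH_HZ / cycles

for t in (10, 16, 20, 25, 40):
    print(f"render {t:>2} ms -> {displayed_fps(t):.1f} fps on screen")
```

Notice there's no way to land on 50fps here; with double-buffered v-sync the first achievable rate below 60 is 30.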
June 24, 2013 6:39:57 AM

sam_p_lay said:
V-sync will do that all by itself. If the GPU can't render the frame within 16.67ms (for a 60Hz display), it has to wait for a second refresh cycle. At that point you've waited 33.3ms for the frame to be rendered onscreen and therefore can't possibly see more than 30fps on your monitor. Anything between 30fps and 60fps on a 60Hz display won't be synchronised (1.5 frames per 2 refresh cycles for example isn't v-synced). The confusion with FRAPS I suppose comes from the stage in the rendering pipeline at which FRAPS counts the frames.

I'm getting confusing answers here. One tells me that vsync and double buffering don't necessarily halve the framerate, while this one says they most certainly will.
And I didn't quite understand what you said about FRAPS: does it show the actual framerate or not?
June 24, 2013 6:48:25 AM

It shows the actual number of frames being rendered by the GPU, but that's not the same as the number of frames outputted to the monitor while you're using v-sync. It's a slightly complicated topic and not easy to explain. Essentially, a monitor has its own framerate (known as the refresh rate) and on a typical monitor that will be 60Hz, or 60 refresh cycles per second. If the card is outputting 50fps or 70fps, the output of those frames isn't (and can't be) synchronised with the refresh rate of the monitor. That's why you get tearing.

For synchronisation, your GPU needs to render 1 frame per refresh cycle of the monitor. If it can't achieve that in 1/60th of a second (16.67ms) then to achieve sync, the frame must be held over to the next refresh cycle. So then you have one frame per two refresh cycles. Two refresh cycles take 1/30th of a second on a 60Hz monitor, so 30fps is the framerate you'll actually see if you can't maintain 60fps. I'd suggest not using v-sync if you don't have a 120Hz display, but it depends on how important tearing is to you.
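For anyone who wants to see the difference in numbers, here's a rough simulation (my own simplified model of the swap behaviour, assuming a constant render time): with two buffers the GPU stalls until the swap at a vblank, while a third buffer lets it render continuously, so the screen can show a fresh frame at most vblanks.

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000 / REFRESH_HZ  # 16.67 ms between refreshes

def avg_fps(render_ms, triple, duration_ms=10_000):
    """Average distinct frames per second reaching the screen.

    Double buffering: after finishing a frame the GPU must wait for the
    next vblank to swap before it can start the next one.
    Triple buffering: the GPU renders back-to-back into a spare buffer,
    and each vblank scans out the newest completed frame."""
    shown = 0
    if triple:
        last = -1
        for i in range(int(duration_ms / VBLANK_MS)):
            t = i * VBLANK_MS
            newest = int(t // render_ms)  # newest frame finished by this vblank
            if newest > last:             # a fresher frame than last time
                shown += 1
                last = newest
    else:
        t = 0.0
        while t < duration_ms:
            finish = t + render_ms
            # the swap happens at the first vblank at or after finish
            t = math.ceil(finish / VBLANK_MS) * VBLANK_MS
            shown += 1
    return shown / (duration_ms / 1000)

for triple in (False, True):
    label = "triple" if triple else "double"
    print(f"{label} buffering at 20 ms/frame: {avg_fps(20, triple):.0f} fps seen")
```

With a 20ms render time this comes out to roughly 30fps for double buffering but 50fps for triple buffering, which is exactly the kind of in-between number being discussed in this thread.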
June 24, 2013 7:04:43 AM

sam_p_lay said:
It shows the actual number of frames being rendered by the GPU, but that's not the same as the number of frames outputted to the monitor while you're using v-sync. It's a slightly complicated topic and not easy to explain. Essentially, a monitor has its own framerate (known as the refresh rate) and on a typical monitor that will be 60Hz, or 60 refresh cycles per second. If the card is outputting 50fps or 70fps, the output of those frames isn't (and can't be) synchronised with the refresh rate of the monitor. That's why you get tearing.

For synchronisation, your GPU needs to render 1 frame per refresh cycle of the monitor. If it can't achieve that in 1/60th of a second (16.67ms) then to achieve sync, the frame must be held over to the next refresh cycle. So then you have one frame per two refresh cycles. Two refresh cycles take 1/30th of a second on a 60Hz monitor, so 30fps is the framerate you'll actually see if you can't maintain 60fps. I'd suggest not using v-sync if you don't have a 120Hz display, but it depends on how important tearing is to you.


How is the output of the GPU any different from what the monitor shows? The card sends a frame, the monitor gets that frame; nothing drops it along the way. And if the monitor is getting 30fps, then 30fps is the speed at which the card is drawing frames into the two buffers and sending them to the monitor. Doesn't the card stop drawing when the back buffer is full and waiting to be swapped? If so, how would FRAPS report 30fps as anything other than 30fps? Second, and more importantly, I can definitely feel the difference between FRAPS reading 50fps and reading 30fps; I'm positive it's not a placebo effect.

Edit: Unless perhaps FRAPS measures the speed at which frames are drawn into the buffer rather than the speed at which they are output? But that still doesn't explain why I feel the difference between 30fps and 50fps.
June 24, 2013 7:16:23 AM

The only way you can be seeing the difference between 30fps and 50fps is if you're not using v-sync. You're not using adaptive v-sync or something? And I'm not sure what stage FRAPS measures at (I don't know the details of that), though Tom's did an explanation of it when they examined framerates vs frame latency (a fairly recent article in the graphics section).
June 24, 2013 7:21:46 AM

sam_p_lay said:
The only way you can be seeing the difference between 30fps and 50fps is if you're not using v-sync. You're not using adaptive v-sync or something? And I'm not sure what stage FRAPS measures at (I don't know the details of that), though Tom's did an explanation of it when they examined framerates vs frame latency (a fairly recent article in the graphics section).


Then it stands to reason that if vsync alone most definitely halves the framerate and no triple buffering is enabled, the game must have triple buffering built into it already if the framerate is not being halved? Could there be another explanation? (No, adaptive vsync is not enabled.)

Edit: I don't think it's possible that FRAPS measures the speed at which frames are drawn into the buffer rather than the speed at which they are output, like I mentioned before, because then FRAPS would show a fluctuating fps well above 60 when the game sits steady at 60fps with vsync enabled.
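That question of where the counter sits would actually explain the whole disagreement. Here's a toy model (purely illustrative; I don't know exactly where FRAPS hooks) of a counter that increments once per Present() call: if Present blocks until the vblank, as with plain double-buffered v-sync, the counter can only read 60/n values; but if a spare back buffer lets Present return as soon as the frame is rendered, as with triple buffering, the counter tracks the raw render rate and can legitimately report 50fps.

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000 / REFRESH_HZ  # 16.67 ms between refreshes

def counted_fps(render_ms, blocking_present, duration_ms=10_000):
    """Model an fps counter that increments once per Present() call.

    blocking_present=True  ~ double-buffered v-sync: Present blocks
                             until the next vblank, so the counter can
                             only read 60/n values (60, 30, 20, ...).
    blocking_present=False ~ a spare back buffer (triple buffering):
                             Present returns as soon as the frame is
                             rendered, so the counter tracks the raw
                             render rate (e.g. 50 fps at 20 ms/frame)."""
    t, presents = 0.0, 0
    while t < duration_ms:
        t += render_ms                      # time spent rendering the frame
        if blocking_present:
            # Present waits for the next vblank before returning
            t = math.ceil(t / VBLANK_MS) * VBLANK_MS
        presents += 1
    return presents / (duration_ms / 1000)
```

Under this model, a counter reading of 50fps with vsync on is itself a hint that something, quite possibly triple buffering built into the game, is letting Present return early.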
June 24, 2013 7:26:01 AM

Simply that v-sync isn't in effect.
June 24, 2013 7:31:23 AM

If vsync weren't in effect, my framerate would fluctuate well above 60, which it doesn't; it stays at 60 when it gets there. I'd also see screen tearing everywhere, which I never do. (Plus, vsync is globally enabled.)
June 24, 2013 7:40:24 AM

You can't see 50fps with v-sync on a 60Hz monitor. Get a second opinion if you want.
June 24, 2013 7:44:17 AM

sam_p_lay said:
You can't see 50fps with v-sync on a 60Hz monitor. Get a second opinion if you want.


Unless games have triple buffering built into them already? (Something I've repeated ten times in this thread, and it's in the title too).