OK, WTH is going on with Vsync in DX10?
For what it's supposed to do, it doesn't make sense that what it affects is min fps, unless the benchmark is averaging portions of a second: getting 150 fps for half a second and 30 fps for the other half, then reporting the average of the two instead of the 36 fps it was doing before.
That's the weirdest result I've seen with V-sync in a long time.
And while the conclusion on the DX9 page of the test says: "Turning off the Vertical Sync setting on Windows XP produced a comparable improvement to BioShock's frame rates," that's not what the numbers show. They show a dramatic impact on min fps (usually what matters most in games) in DX10/Vista and no effect on min fps in DX9/XP.
That to me just jumps out as something I'd love to hear an explanation for. I more than expected the max fps to skyrocket ([edit] and thus the avg as well), but I didn't expect such a healthy jump in min fps from removing V-sync, and I don't understand it given what V-sync is supposed to be doing and the benefit you should get from not rendering the extra frames. It almost seems as if buffering were being hugely negatively impacted by turning V-sync on, and the GPU had to try to catch up by skipping frames in virtual space/time or something.
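Just my guess, but here's a toy sketch of the usual explanation for V-sync hurting min fps: with double buffering on a 60 Hz display, a finished frame still has to wait for the next vblank, so frame time gets rounded up to a multiple of 1/60 s. A dip to 50 fps becomes 30 fps on screen. (The 60 Hz refresh and the double-buffering assumption are mine, not anything stated in the article.)

```python
import math

REFRESH_HZ = 60
VBLANK = 1.0 / REFRESH_HZ  # ~16.7 ms between vblanks

def vsynced_fps(raw_fps):
    """Effective fps when every frame must wait for the next vblank
    (double-buffered V-sync, no triple buffering)."""
    render_time = 1.0 / raw_fps
    # Round the render time up to the next vblank boundary.
    displayed_time = math.ceil(render_time / VBLANK) * VBLANK
    return 1.0 / displayed_time

for raw in (150, 61, 50, 36, 30):
    print(raw, "->", round(vsynced_fps(raw), 1))
```

Run that and 150 fps and 61 fps both collapse to 60, while 50 and 36 both collapse to 30: the max is capped as expected, but any dip below 60 gets punished extra hard, which would tank min fps exactly the way the benchmark shows. Triple buffering (or a different refresh rate) softens this, which might be part of what differs between the DX9 and DX10 paths.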
Weird. Anyone know WTH is happening with that?