Dissecting DX10, Part 3: BioShock

bornking

Distinguished
Jul 10, 2006
393
0
18,780
Great review... but I must say I did not see any real difference in the last two screenshots (DX10 and DX9.0c).

I am still disappointed with DX10 until better games are written for it...
 

wicko

Distinguished
May 9, 2007
115
0
18,680
Rob, how did you detect framerates higher than 60 if that's what its locked at? Does FRAPS detect them anyway or something, or is there a way to unlock the framerate?
 

robwright

Distinguished
Feb 16, 2006
1,129
7
19,285


Fraps detects frame rates higher than 60. You just need to tinker with the settings. It's surprisingly easy to use.

Great review... but I must say I did not see any real difference in the last two screenshots (DX10 and DX9.0c).

I am still disappointed with DX10 until better games are written for it...

I had the same impression when I first started playing the game on XP; it looked pretty much the same. But after going over the same section in both Vista and XP again and again, I began to see some noticeable differences. The screenshots, unfortunately, don't illustrate the differences all that well, though you can see some color and detail differences in the crazy Splicer Doctor (and notice how in Vista he's wearing a light on his head, while in XP he just has one of the silly metal circles on his head; not sure why that is).

You should be able to turn off vsync in the options.

Yes, VSync can be adjusted in BioShock's graphics options, but you also need to check the Nvidia control panel to see what VSync is set to there -- on, off, or let the 3D application decide.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
I ran BioShock on both XP and Vista with my 8800 GTX and I did not notice any considerable difference in BioShock. I believe Firingsquad was a bit more accurate in its comparison of BioShock DirectX 9 vs. DirectX 10.

[Comparison screenshots: dx9couch.png / dx10couch.png, dx9plant.png / dx10plant.png, dx9shadow3.png / dx10shadow3.png]
 
OK, WTH is going on with Vsync in DX10?

For what it's supposed to do, it doesn't make sense that it affects min fps, unless it's averaging portions of a second: getting 150 fps for half a second and 30 fps for the other half, then reporting the average of the two instead of the 36 fps it was doing before.
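
To put numbers on the kind of averaging I'm imagining (completely made-up figures, just to show how a dip could get buried):

# Hypothetical second where the game runs at 150 fps for half a second, then 30 fps for the rest.
frames_rendered = 150 * 0.5 + 30 * 0.5   # 75 + 15 = 90 frames in that one second
naive_average   = (150 + 30) / 2         # also 90
print(frames_rendered, naive_average)    # either way, the half-second dip to 30 fps vanishes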

That's the weirdest result I've seen with V-sync in a long time.

The conclusion on the DX9 page test says: "Turning off the Vertical Sync setting on Windows XP produced a comparable improvement to BioShock's frame rates." But that's not what the numbers show; they show a dramatic impact on min fps (usually what matters most in games) in DX10/Vista and no effect on min fps in DX9/XP.

That to me just jumps out as something I'd love to hear an explanation for. I more than expected the max fps to skyrocket ([edit] and thus the avg as well), but I didn't expect such a healthy jump in the min fps from removing VSync, and I don't understand it given what VSync is supposed to be doing and the benefit you should get from not rendering the additional frames. It almost seems as if buffering were being hugely negatively impacted by turning on VSync and the GPU had to try to catch up by skipping frames in virtual space/time or something.

Weird. Anyone know WTH is happening with that?
 

d0000h

Distinguished
Jul 13, 2006
31
2
18,535
I read on GURU3D to turn on triple buffering if you want to enable VSync, which is what I think I did at home, and it runs fine on XP like this.

Also, more info below from tweakguides.com:

"Vertical Sync: Vertical Synchronization (VSync) is the synchronization of your graphics card and monitors' abilities to redraw an image on the screen a number of times each second, measured in Hz. It is explained clearly on this page of my Gamer's Graphics & Display Settings Guide. When VSync is enabled your maximum FPS will be capped at your monitor's maximum refresh rate at your chosen resolution, and more importantly in some areas your FPS may drop by as much as 50% or more if the graphics card has to wait to display a whole frame. Thus setting VSync to On can have a major negative performance impact, though you can counter the performance drop by enabling Triple Buffering - see this page for details of how to enable it.



Note that since VSync relies on the Refresh Rate used in the game, if you do leave VSync on you should set the correct refresh rate by altering the DesiredRefreshRate variable in the Bioshock.inii file - see the Advanced Tweaking section for details. Also note that at the moment BioShock appears to have a glitch whereby each time you load it in Vista, VSync may be reset to On regardless of what you set it to last time. Finally, VSync can be a major contributor to mouse lag, so disable it if that's the case, and see further below for more details"
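
For what it's worth, here's a rough sketch of why that "50% or more" drop can happen with plain double buffering, and why triple buffering counters it (a deliberately simplified model, not anything BioShock-specific):

import math

# Simplified VSync model at a 60 Hz refresh (about 16.7 ms per refresh interval).
REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ

def effective_fps(frame_ms, vsync=True, triple_buffer=False):
    """Rough effective frame rate for a constant per-frame render time."""
    if not vsync:
        return 1000.0 / frame_ms                   # GPU runs flat out (tearing possible)
    if triple_buffer:
        return min(REFRESH_HZ, 1000.0 / frame_ms)  # capped at the refresh rate, no stalling
    # Double buffering: a frame that misses a refresh waits for the next one,
    # so the rate snaps down to 60, 30, 20, 15... fps.
    return REFRESH_HZ / math.ceil(frame_ms / REFRESH_MS)

for ms in (15.0, 18.0, 25.0):  # 18 ms just misses the ~16.7 ms budget
    print(ms, "ms/frame:",
          round(effective_fps(ms, vsync=False), 1), "fps off |",
          round(effective_fps(ms, triple_buffer=True), 1), "fps on + triple |",
          round(effective_fps(ms), 1), "fps on + double")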
 

Shin-san

Distinguished
Nov 11, 2006
618
0
18,980
I'm starting to wonder about the DirectX 10 games, mostly the ones that do better in DX10. I have a DX9 card (GeForce 7900GS) and was wondering if a game like this would be better in XP or Vista. I know that DirectX itself has a good deal of emulation capability on top of its hardware rendering.
 


Yeah, I realize that, but usually the drop is in the avg, along with a small 1-2 fps change in min fps either up or down; I don't think I've ever seen that large an impact on min fps before. The 'usual' result is the opposite, which makes this extremely surprising, and it makes the case all the more important to use triple buffering in D3D10 using AtiTrayTools etc., since otherwise it's only offered in OGL. If it's buffering like I suspect, and like they mention, that's a freakin' huge example of the impact.

nV can enable triple buffering in their control panel for DX, right? It'd be interesting to see it contrasted.
 

wicko

Distinguished
May 9, 2007
115
0
18,680
Oh, I guess I had VSync on the whole time. I didn't think it would affect the framerate, just only display on a refresh cycle or something (i.e. it drops frames but the video card is still rendering at whatever framerate it is capable of). Thanks for the clarification.
 

RayinDE

Distinguished
Mar 4, 2007
4
0
18,510
"minimum frame rate of 28 and maximum frame rate of 62. The average frame rate for the five-minute trial was 58.03."

Minimum frame rate info in a review is brilliant. It would be great to see it in hardware reviews also. Consider: someone playing the latest shooters on a 1280 x 1024 monitor might be advised on the board to buy an 8800 GTS 320 card for a new build. Their avg frame rates would be good but their mins would be quite low. The above-mentioned min is from a quad core OC'ed to 3.49 GHz with two 8800 Ultras.

Minimum frame rates are key, I think. I hate having them fall to an unplayable rate when I get attacked by a horde of Splicers or whatever.
 
Yeah, Cleeve recently added them to his reviews.

I like seeing histograms, because they give you a good idea of how long those min dips last and how frequent they are, but it's definitely asking a lot to do that for reviews.
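
Something like this is all it really takes, assuming you have a per-frame render-time log to work from (the file name and one-value-per-line format here are made up, just to show the idea):

from collections import Counter

# Read per-frame render times in milliseconds, one value per line (hypothetical log format).
with open("frametimes.csv") as f:
    frame_ms = [float(line) for line in f if line.strip()]

print("min fps:", round(1000.0 / max(frame_ms), 1))
print("avg fps:", round(1000.0 * len(frame_ms) / sum(frame_ms), 1))
print("max fps:", round(1000.0 / min(frame_ms), 1))

# Bucket frame times into 5 ms bins: a long tail of slow frames shows up immediately,
# which is exactly what a single min-fps number can hide.
histogram = Counter(5 * int(ms // 5) for ms in frame_ms)
for bin_start in sorted(histogram):
    print(f"{bin_start:3d}-{bin_start + 5} ms: {histogram[bin_start]} frames")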
 

Narg

Distinguished
Mar 15, 2006
115
0
18,680
2 points:

1600x1200 is not widescreen, as stated in the article.

nVidia drivers are awful these days. Please try the test with ATI, just for grins if nothing else.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
Yes, but I'm pretty sure it only works with OpenGL when enabled in the drivers.
 

hesido

Distinguished
Aug 24, 2006
23
0
18,510
The benefits of DX10 in the article are slightly exaggerated. First of all, they were looking in the wrong places for the comparison. Like Heyyou27 mentions, Firingsquad's comparison was spot on, but Tom's Hardware's comparison is looking at the wrong places.

The non-clipping DX10 particle effects and the less flickery, crisper shadows (possibly better admired in motion) are great improvements. However, I doubt they merit an upgrade *if* you own a high-end DX9 card.

My own personal view is: don't upgrade to a DX10 card *yet* if you own a high-end setup. If you have a mainstream DX9 card, you may go for the mainstream-to-high segment in DX10.



 

robwright

Distinguished
Feb 16, 2006
1,129
7
19,285


Point taken. I usually tend to agree that for pretty much all games these days, the Vista upgrade isn't quite worth it. And I still don't think the differences for BioShock merit an upgrade; it looks great on DX9, so you're not losing anything. Still, after playing the same section of the game dozens of times for days on end, the subtle differences began to stand out more. And you definitely notice the differences more when the game is in action than in screenshots.

So yeah, upgrading to Vista for DX10 is still hard to justify. But the point I was trying to make is that, after testing the initial round of Vista titles and seeing a FPS performance decline on Vista as opposed to XP, we're finally seeing a game that is at least moving in the right direction by offering some visual upgrades as well as impressive frame rates. So that's good news for the future of Vista and DX10. We'll see what happens with Crysis....
 

wicko

Distinguished
May 9, 2007
115
0
18,680
A bit off topic, but has anyone been able to get AA running without any mouse lag in game? I enabled 4xAA using RivaTuner to get some nice AA going in BioShock, but my mouse just lags for some reason. Kind of like if I go past 8xAA in Doom 3 or something. I have an 8800 GTS 640MB, so it should be no problem, and I'm only running at 1280x1024.