Nvidia came out with their DX10 cards a few months ago, but DX10 games have been in development for a lot longer. Do Nvidia and ATI give (or sell) developers DX10 cards so they can make games, or do the developers build games on existing GPUs? I assume the former, since we had the Crysis video for quite some time, which brings up the next question...
If developers had their own pre-release DX10 cards, wouldn't they need decent, stable drivers? If so, then why do I read posts here about "when the new drivers come out..."?
I would imagine they just get the API from Microsoft. If they don't have cards, or DX10 emulation for existing cards, I'm guessing they would write wrappers, then get pre-release hardware to test the new software. I don't work for ATI or Nvidia so I can't say that's 100% definite, but I do work in software.
Performance is always the last thing tweaked, which unfortunately in the PC world means software companies release games that perform really badly and expect the hardware to catch up later. That's rough for any sort of persistent online game: when B2 came out it required 2 GB of RAM, a pretty fast CPU for the time, a very fast hard drive, and of course a decent 256 MB GPU. Nowadays it runs well even on a cheap system.
Here's my guess, as a developer: developers build libraries of functions that act as a wrapper over DirectX 9 and use those to develop the game. In parallel with that, or at the end, there's some effort to rewrite that wrapper on top of DirectX 10. You end up with a game that works on both DX9 and DX10, which is excellent for selling to more customers. Most of the development of Crysis was probably done on 7950GX2 or X1950XTX cards under DirectX 9.
A little more simply put, it's like an emulator: it lets you see/use/manipulate the new features on hardware that doesn't natively support them.
The other thing to remember is that while they do get early sample hardware, it's often only a select few samples, and even those go into a select few machines for testing, so not every setup is testable, by a long shot.
Some bugs also go unnoticed because the testers don't play through the thousands of different scenarios that arise in the game (like Alt-Tabbing before crossing the river instead of after).
Also, while the dev may have pre-release hardware and alpha drivers (based on whatever drivers were current when they were given the card), since that time the IHV may have tweaked how the card handles something, for the current cards and a currently shipping game (or another test card in a current or beta game), and that in turn breaks something in how this new game in development handles the card and its drivers. So when the IHV bundles all the drivers together in its latest build for the product launch, neither company knows about the impending problem until the two products meet again (hopefully in beta testers' rigs). Then the IHV or game dev has to spend time fixing what's broken without breaking something else in turn, which can sometimes take many weeks, even months.
Lotsa variables means lots of potential failure points.
Also, only once the game is finalized and the IHVs can run it on their test systems can they dissect it and say: hey, the game uses a lot of this, and we were handling that type of code A-B-C-D in our standard drivers, but if we tell the drivers/hardware to handle it D-B-A-C instead, the hardware doesn't have to wait as many clock cycles at this point to do this function. Remember, while game developers are adept at understanding the hardware, they usually aren't as good as the hardware developers at maximizing its performance. An efficiently coded game helps a lot, but the hardware guys can also decide what can be improved upon if there's some flexibility in their implementation.