CUDA is mature now; it's already in use and going mainstream very soon. Nvidia doesn't support DX10.1, though (even the GTX 280 doesn't), and without Nvidia's support no game developer is going to implement it. Assassin's Creed removed it due to weak support. It's better to introduce it when both companies are on board.
What do you mean Assassin's Creed "removed" it? Did they put out a patch to do this? I'm about to buy the game just to give 10.1 a whirl (and because it looks like it could be fun), but if that's true I might reconsider.
Sounds like an excuse for something, but I don't know what you're referring to that would really apply, since both are already being implemented; so obviously technology isn't moving too fast for CUDA or DX10.1.
Can they even mix DX with CUDA? DX has always required exclusive control of the GPU. Wouldn't running a CUDA program on the GPU and then doing the DX rendering each frame cause conflicts? DX needs to keep track of what resources are already loaded on the GPU.
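From what I can tell from the SDK docs, there is supposed to be an interop path for exactly this: you register the D3D resource with CUDA once, then map/unmap it every frame so only one API owns it at a time. Something like this sketch, if I'm reading it right (the D3D10 device and buffer creation are omitted, and the helper names like RegisterBuffer and RunCudaPass are just mine, not anything from the SDK):

    // Sketch only: assumes an ID3D10Buffer of float4 vertices was already
    // created elsewhere. Older CUDA versions also want
    // cudaD3D10SetDirect3DDevice(dev) called up front.
    #include <d3d10.h>
    #include <cuda_runtime.h>
    #include <cuda_d3d10_interop.h>

    __global__ void animate(float4 *verts, int n, float t)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            verts[i].y = __sinf(verts[i].x + t); // wiggle the mesh
    }

    // Once, after the D3D buffer is created.
    cudaGraphicsResource_t RegisterBuffer(ID3D10Buffer *buf)
    {
        cudaGraphicsResource_t res = nullptr;
        cudaGraphicsD3D10RegisterResource(&res, buf,
                                          cudaGraphicsRegisterFlagsNone);
        return res;
    }

    // Once per frame, before the D3D draw call.
    void RunCudaPass(cudaGraphicsResource_t res, int nVerts, float t)
    {
        cudaGraphicsMapResources(1, &res, 0);   // D3D hands the buffer to CUDA

        float4 *dptr = nullptr;
        size_t bytes = 0;
        cudaGraphicsResourceGetMappedPointer((void **)&dptr, &bytes, res);

        animate<<<(nVerts + 255) / 256, 256>>>(dptr, nVerts, t);

        cudaGraphicsUnmapResources(1, &res, 0); // ownership goes back to D3D
    }

So it looks like the whole conflict question comes down to that map/unmap handoff: while the resource is mapped, D3D isn't allowed to touch it, and once it's unmapped CUDA has to keep its hands off.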
Besides, I thought the benefit of CUDA was to run lengthy algorithms that take longer on the CPU. It's also limited by the GPU instruction set: as far as I know, trig comes from fast approximation intrinsics (with slower, more accurate software versions on top), and the floating-point precision still doesn't match a CPU's.
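To see what I mean about precision, here's a quick sketch comparing CUDA's fast trig intrinsic against its slower software version and the CPU result (plain CUDA runtime calls, nothing exotic):

    #include <cstdio>
    #include <cmath>
    #include <cuda_runtime.h>

    __global__ void trig(float x, float *fast, float *accurate)
    {
        *fast     = __sinf(x); // hardware approximation, reduced precision
        *accurate = sinf(x);   // software routine, near CPU accuracy
    }

    int main()
    {
        float *d;
        cudaMalloc(&d, 2 * sizeof(float));
        trig<<<1, 1>>>(1.234567f, d, d + 1);

        float h[2];
        cudaMemcpy(h, d, sizeof(h), cudaMemcpyDeviceToHost);
        printf("__sinf: %.8f  sinf: %.8f  CPU: %.8f\n",
               h[0], h[1], std::sin(1.234567f));
        cudaFree(d);
        return 0;
    }

The intrinsic trades accuracy for speed, which is fine for graphics but matters for the scientific number-crunching people actually want CUDA for.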
The fact that the 2900 XT, a DX10 card, is faster than the 8800 GTX and 9800 GTS tells me that, while 10.1 might be beneficial, the general Radeon architecture seems to be a better fit for Assassin's Creed.
That architecture was created with DX10 in mind, but M$ changed the spec because nVidia already had its cards out, the 8800 GTX/GTS. Those cards, and every nVidia card made since then, aren't capable of the DX10 (now DX10.1) implementation of AA through their hardware. That's why AC pulled the DX10.1 path and went backwards to plain old DX10, which gave nVidia an advantage in that game. A game that, you'll notice, carries the "The Way It's Meant to be Played" logo (damn progress, full retreat ahead).