Assassin's Creed - First DX10.1 Title? What impact on the ATI 3000 series?

May 9, 2008 3:13:15 AM


SP1 vs no SP1 comparison on the 3870X2

We took our first step on the path of enlightenment by comparing the performance of single and dual X2s with SP1 installed/uninstalled (or, if you will, with DX10.1 installed/uninstalled):
(refer to rage3d.com)

http://rage3d.com/articles/assassinscreed

Ubisoft comments on Assassin's Creed DX10.1 controversy
http://techreport.com/discussions.x/14707
May 9, 2008 6:33:59 AM

Here it is in a nutshell. Damn nVidia if they let this happen:

"After all, Nvidia recently signed on to the PC Gaming Alliance, whose charter involves pushing common standards like DX10.1 and increasing "the number of PCs that can run games really well." Assassin's Creed is nothing if not a perfect candidate for assistance on this front: a high-profile console port that's gaining a reputation for steep hardware requirements and iffy performance on the PC. How can such an alliance succeed if one of its members is working at cross-purposes with it in a case like this one? And what would the owner of an nForce-based system with a Radeon graphics card think upon learning that Nvidia's marketing dollars had served to weaken his gaming experience?"

If nVidia is going to have to compete with Intel, this kind of crap won't hold up for an instant. ATI cards profit from DX10.1, and we now have two publications confirming:

1. DX10.1 cards benefit from true DX10.1.
2. Assassin's Creed is a TWIMTBP game, and its DX10.1 path is being pulled only because of a glitch Ubi made, a removal that provides no enhancement and no positive effect, other than to punish and hinder DX10.1, DX10.1 cards, and ATI.

I've about had it with nVidia. I don't like it when Intel pulls this with CPUs, and I'm not liking it now. To me, this is worse than any driver fix for a better bungholio mark or a better FPS mark in Crysis. Let them play all they want with their cards; just don't stunt the future of GFX for their usage, just to spite ATI.

May 9, 2008 9:26:10 AM

I'm adding to this for a reason. When DX10 was first being done for Vista, it included that non-rendering pass, or what became DX10.1. At the time, only nVidia had their DX10 hardware out. It wasn't compatible, and ATI hadn't gotten their R600s out yet. So M$ decided to do the next best thing in their eyes: go along with nVidia and let them claim their DX10 cards were truly DX10, when they couldn't meet all the actual requirements of the original DX10. M$ decided to hold the full implementation of DX10 for Vista's SP1, where they'd just rename it DX10.1.

ATI originally got screwed when M$ sided with nVidia, both by not implementing full DX10 and in how AA was done. Currently AA is done in hardware only, or done twice, as per the article on Tom's, but it only needs to be done once under DX10.1, which is the path ATI designed for originally and still uses to this day. When you architect your hardware to render a certain way, and that way is omitted from the spec before your hardware is out, what do you do? You drop it, to save money and time. So now all that AA has to be done in drivers, as the hardware isn't set up for it.

Now once again we see someone coming to the aid of nVidia, the powerful GFX king, and once again we see ATI taking it on the chin. Who's to blame? You'll hear "blame the game makers," or "you can't blame nVidia, because why would a game maker side with them?" Well, they certainly don't give a damn about Radeon owners. Getting back to the beginning: blame nVidia, who claimed their hardware was DX10 to begin with, which wasn't so. Also blame M$ for this mess, for not making nVidia tell people their cards couldn't do full DX10; but look how they play anyway. They screwed ATI then, and now it's being done again, just so nVidia could claim they have DX10 cards. What a shame.
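To make the AA point concrete, here's a rough C++ sketch of the capability DX10.1 adds (my own illustration, not Ubisoft's actual code; the 4x sample count and formats are assumptions). Under D3D10.1 a multisampled depth buffer can also be bound as a shader resource, so the post-effect pass can read per-sample depth directly; on 10.0-only hardware that combination is rejected, which is why an engine ends up rendering depth again in an extra pass.

// Sketch: an MSAA depth buffer that post-effect shaders can read directly.
// This dual-bind combination works at feature level 10.1 (e.g. an HD 3870);
// on a 10.0-only part it is rejected, forcing a second depth-only pass.
#include <d3d10_1.h>

HRESULT CreateReadableMsaaDepth(ID3D10Device1* device,
                                UINT width, UINT height,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    // Typeless format so the same memory can be viewed as depth or as data.
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = 4;                        // assumed 4x MSAA
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    HRESULT hr = device->CreateTexture2D(&td, NULL, tex);
    if (FAILED(hr)) return hr;                      // rejected on 10.0-only hardware

    // View used while rendering the scene's geometry.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // View that lets the post-effect shader read each MSAA depth sample,
    // so no separate depth re-render is needed.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView(*tex, &sd, srv);
}

That rejected combination is the whole story: same scene, same effects, one less full-scene pass on a DX10.1 card.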
May 9, 2008 11:12:24 AM

Hi5 to Jay. This man speaks the truth.

I've been reading about this topic as well (not only here but in other forums), and there was a 20% FPS gain on ATI cards compared to Nvidia (version Russian Jerusalem, *cough*) with 4x AA activated. The thing is, there were several different details when rendered by an ATI versus an Nvidia card (3870 vs 8800). Quality-wise, ATI gained in the small details (buildings, stairs, even the horsy seemed better, with fewer jagged edges); FPS-wise, up to 20%. It is a nice bump. There was a rumour it was the first DX10.1 game.

Just adding that I tested it myself, with a Q6600 and 4 GB of RAM. The difference was there, both FPS-wise and graphics-wise.
If Ubisoft launched a patch to "nerf" ATI GPUs, well, that's not very nice. This one I definitely won't buy.
May 9, 2008 12:45:51 PM

It's simple:
just don't apply the patch.
I just can't get why they're removing it rather than trying to improve it.
May 9, 2008 1:24:38 PM

Ok... so... what? I got confused about which side of the fence Jay is on.
May 9, 2008 1:52:39 PM

spaztic7 said:
Ok... so... what? I got confused about which side of the fence Jay is on.


Our side of the fence. The gamers.
May 9, 2008 2:11:18 PM

Yeah, pretty **** that Ubisoft wants to dump it in a patch. That makes me think there are definitely some Nvidia marketing dollars behind that decision. I have the game and it runs damn well in DX10 mode with everything all the way up, so I'm definitely not going to get their patch if or when it comes out.

I do hope more games use 10.1, though. I think my ATI card would really shine in those games.
May 9, 2008 2:24:47 PM

So what I am gathering from this discussion is that Nvidia is paying (or somehow compensating) game makers (Ubisoft in this case) to alter their games so ATI cards do not work as well. Isn't that illegal, or doesn't it violate some kind of antitrust law?
May 9, 2008 2:32:03 PM

spaztic7 said:
Ok... so... what? I got confused about which side of the fence Jay is on.



I'm pretty sure jaydeejohn is on the side of honesty. Nvidia did not produce a true DX10 card with the original 8800 series. I have an 8800 GTS 640 in one of my computers and can say that it is buggy at best, and even goes so far as to not work as well as it should with AMD K8 chips, as detailed in Nvidia's release notes. That's one of the reasons I bought an ATI card the next time.

I think Ubisoft is acting like a lying slimeball in all this. Their game writers wrote the game in accordance with DX10.1, and it obviously works better than Nvidia's idea of DX10. But look at Ubisoft's statement that they are removing the DX10.1 support "due to the fact that our implementation removes a render pass during post-effect which is costly". The DX10.1 render pass is costly? How so? Or perhaps it's costly because Nvidia threatened to remove their financial support: you know, that "The Way It's Meant to Be Played" label that Nvidia pays for when game companies sign on to Nvidia's game.

Of course, M$ is playing Nvidia's game as well, by not enforcing the DX10.1 standard that was originally the DX10 standard, and by not doing anything about it even now. After all, it's M$ that certifies games and video cards as DX10 or DX10.1, as the case may be. I never did feel that my 8800 GTS was a DX10 card, but rather a very fast DX9 card. Apparently, Nvidia still does not make a true DX10 card (original specification), much less a DX10.1 card, and M$ is going along with the deception.

That leaves the question of "why". Why is M$ certifying cards as DX10 when they are not? Why is Ubisoft, and perhaps other companies, actively not supporting the DX10.1 standard? The conclusion I draw is that Nvidia has been making a lot of payoffs, and we consumers are getting screwed as a result. I don't mean to say that Nvidia cards aren't fast; they are. It's just that they aren't all of what they claim to be.
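For what it's worth, a game picking between the two paths at startup boils down to something like this minimal sketch (the function name and fallback order are my assumptions, not anything from Assassin's Creed): ask for a 10.1 device first, fall back to 10.0, then branch the post-effect code on the result.

// Sketch: create the highest D3D10.x device available, then remember
// whether the cheap single-pass post-effect path (10.1) can be used.
#include <d3d10_1.h>

ID3D10Device1* CreateBestDevice(bool* supportsDx101)
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // e.g. Radeon HD 3000 series
        D3D10_FEATURE_LEVEL_10_0,   // e.g. GeForce 8800, Radeon HD 2900
    };

    for (int i = 0; i < 2; ++i) {
        ID3D10Device1* device = NULL;
        HRESULT hr = D3D10CreateDevice1(
            NULL,                          // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,
            NULL, 0,                       // no software module, no flags
            levels[i],
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr)) {
            *supportsDx101 = (levels[i] == D3D10_FEATURE_LEVEL_10_1);
            return device;
        }
    }
    *supportsDx101 = false;
    return NULL;
}

// The renderer then branches once:
//   supportsDx101 == true  -> read the MSAA depth buffer directly (one pass)
//   supportsDx101 == false -> re-render depth before post-effects (extra pass)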
May 9, 2008 3:44:46 PM

Gotcha! Ok, when I read his posts I just got more and more confused as he went on.

I am looking to replace my 8800GTX with an HD 4870 when it comes out (assuming the gains are there).

This also goes back to posts I made last year, here and elsewhere, before the X2900 came out, saying that the 8800 line is a hybrid card. I got flamed for that and got chewed out. Maybe I shouldn't have said that on the Nvidia forum... Now it seems like I was right. Everyone knew that the 8800 line was not a true DX10 card because Nvidia couldn't get everything to work 100%. ATi could. But many of us wanted to believe that it was, and that the crazy Canucks couldn't have gotten it right. So, in an attempt to have DX10 cards ready for the release of Vista (which I use and love), MS relaxed their standard so the 8800 line would be there.

It has also been said that ATi cards run slower in DX9 games but will become much faster when true (to the original standard, or the DX10.1 standard) DX10(.1) games come out, because of the way DX10(.1) works. Those games don't depend as much on the parts Nvidia emphasizes, but more on what ATi emphasizes. Sorry that I can't name the specific parts... it's been a while since I read that article and I can't remember where I read it. It was either THG/F, Maximum PC, or Tweak Town.

So seeing the 3870 line doing well in this game does not surprise me at all.

I do agree that it is bull **** that game developers are reducing the quality and real functionality of a game because they want to bend to the will of Nvidia. Someone should stand up and say "fix your cards for my game!" A developer should side with ATi so we can get true DX10 games; none of this half-breed **** that we have.
May 9, 2008 4:53:44 PM

This makes me hope that Fallout 3 gets the 10.1 standard, though. Probably won't happen if it is a TWIMTBP game.
May 9, 2008 5:36:46 PM

Who's the winner here?
Us! The consumers! Yeah!

PS: Congratulations to ATI for doing something great after being bought by AMD.
May 9, 2008 9:43:02 PM

Sorry for all the confusion, and thanks Radnor. It was rant, facts, and frustration in a huge mix, with lots of conflicting things making our current situation what it is, and I didn't explain it as I should have, as I've been following this for a long time.

Here's a link showing M$'s connection, decisions, and complicity in all this: http://blogs.msdn.com/ptaylor/archive/2007/03/03/optimi... It shows M$'s direction for this; nVidia's solution we all know, as we have the G80/90 cards.

And if you've read enough on these forums, or elsewhere, you'd know we still hadn't had a true DX10 game out... UNTIL now. And again, look what they've done. I haven't gone to Vista, and this right here is the main reason why. DX10 for Vista, that was the promise. nVidia G80, true DX10 cards, that was the promise. Like I said, what a shame. Though it is encouraging that game devs are anticipating using real DX10 in their future games, it's just that I thought I heard the same thing last year at this time...
May 10, 2008 3:16:25 AM

ugh, Jay, use paragraphs...my eyes are bleeding...


njalterio said:
So what I am gathering from this discussion is that Nvidia is paying (or somehow compensating) game makers (Ubisoft in this case) to alter their games so ATI cards do not work as well. Isn't that illegal, or doesn't it violate some kind of antitrust law?


Well, that is the hidden dark side of TWIMTBP. They don't need to pay anyone. NV can use their services as leverage.
May 10, 2008 3:21:47 AM

SpinachEater said:
ugh, Jay, use paragraphs...my eyes are bleeding...


+1
May 10, 2008 4:57:55 AM

here we go again :)