
DirectX 10.1 in future games?

June 7, 2008 4:06:23 AM

Will future games be optimized to support DirectX 10.1 (i.e., will it become the standard)?
If that happens, what will happen to Nvidia?


June 7, 2008 4:15:29 AM

DirectX 10.1 includes features that, as far as I know, largely only affected ATI 3000-series cards - namely the passes done for anti-aliasing.

There are bugs in 10.1 with basically anything but the ATI 3000 series, so I somewhat doubt it will become standard. I'd look out for a DirectX 10b or something of the sort, though it'll probably just be optimizations.
June 7, 2008 4:51:49 AM

romulus47plus1 said:
Will future games be optimized to support DirectX 10.1 (i.e., will it become the standard)?
If that happens, what will happen to Nvidia?


I think this will largely depend on how successful the 4000-series ATI cards are. With Nvidia still holding the majority of the discrete graphics market and not having any support for DX10.1, there is not much incentive for developers to optimize games for DX10.1 at the moment. However, if ATI's new cards turn out to be very successful and ATI takes back a large chunk of the market you may start to see games with DX10.1 support. Most likely this would be largely facilitated by ATI using some of their new influx of cash to sponsor games like Nvidia does. I would guess that any game sponsored by ATI in the near future would be very likely to have DX10.1 support.
June 7, 2008 4:55:33 AM

Just_An_Engineer said:
I think this will largely depend on how successful the 4000-series ATI cards are. With Nvidia still holding the majority of the discrete graphics market and not having any support for DX10.1, there is not much incentive for developers to optimize games for DX10.1 at the moment. However, if ATI's new cards turn out to be very successful and ATI takes back a large chunk of the market you may start to see games with DX10.1 support. Most likely this would be largely facilitated by ATI using some of their new influx of cash to sponsor games like Nvidia does. I would guess that any game sponsored by ATI in the near future would be very likely to have DX10.1 support.


Definitely answered my question. Thank you!
June 7, 2008 7:38:03 AM

ovaltineplease said:
DirectX 10.1 includes features that, as far as I know, largely only affected ATI 3000-series cards - namely the passes done for anti-aliasing.

There are bugs in 10.1 with basically anything but the ATI 3000 series, so I somewhat doubt it will become standard. I'd look out for a DirectX 10b or something of the sort, though it'll probably just be optimizations.


This isn't my understanding of it, and I think you have it mixed up a bit. As I understand it, DX10.1 is basically what DX10 should have been before M$ changed it around because Nvidia couldn't meet the requirements. The ATI cards have the ability and the Nvidia cards don't. It's not a bug issue; it's that the Nvidia cards plain can't do it, and Nvidia is spinning it and trying to say it's an irrelevant issue.
We had a thread about it a while back and concluded that what Just_An_Engineer has said is about the size of it; support, if it is provided, will probably come in the form of a patch.
Mactronix
June 7, 2008 3:24:09 PM

DX10.1 supports great software AA, right?
Does that mean Nvidia is using hardware AA?
Or is ATI using hardware AA too to run DX10 games?
What are the benefits of each (hardware vs. software AA)?
June 7, 2008 9:03:30 PM


I believe that DX10.1 is hardware AA. I'm not 100% sure, but this, even though it's from Wikipedia, seems to support that.

"Direct3D 10.1 is an incremental update of Direct3D 10.0 which is shipped with, and requires, Windows Vista Service Pack 1.[6] This release mainly sets a few more image quality standards for graphics vendors, while giving developers more control over image quality.[7] It also requires a whole new set of requirements, including Shader Model 4.1 and 32-bit floating-point operations. Direct3D 10.1 still fully supports Direct3D 10 hardware, but in order to utilize all of the new features, updated hardware is required.[8] As of April 24, 2008, only the ATI Radeon HD3xxx series GPUs are compliant".

That basically sums up my understanding of it. Someone like The Great Grape Ape could tell you which card does AA how, hardware or software, but you could probably find out for yourself if you Googled around a bit. Try sites like Tech Report and Beyond3D; they are very good.
I'm a bit pushed for time right now, but if you have no luck, or nobody else joins the thread and gives you the answers, let us know and I will get back to you.
Mactronix :) 
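For anyone wondering what "still fully supports Direct3D 10 hardware" means in practice, here is a minimal sketch (my own illustration, not from any game or from the thread) of how an app written against the 2008-era DirectX SDK could ask for a feature-level 10.1 device and fall back to 10.0 on cards that can't do it:

#include <d3d10_1.h>   // D3D10.1 headers; link with d3d10_1.lib

// Try for full D3D10.1 first; fall back to the 10.0 feature level on
// hardware that isn't 10.1 compliant. Error handling trimmed for brevity.
ID3D10Device1* CreateBestDevice()
{
    ID3D10Device1* device = NULL;
    HRESULT hr = D3D10CreateDevice1(
        NULL,                          // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,
        NULL, 0,                       // no software rasterizer, no flags
        D3D10_FEATURE_LEVEL_10_1,
        D3D10_1_SDK_VERSION,
        &device);

    if (FAILED(hr))                    // not 10.1 capable: same interface,
        hr = D3D10CreateDevice1(       // but 10.1-only features unavailable
            NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, 0,
            D3D10_FEATURE_LEVEL_10_0, D3D10_1_SDK_VERSION, &device);

    return SUCCEEDED(hr) ? device : NULL;
}

Calling GetFeatureLevel() on the returned device then tells the app which path it actually got.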
June 7, 2008 9:16:13 PM

mactronix said:
This isnt my understanding of it and i think you have it mixed up a bit. As i understand it DX10.1 is basically what DX10 should have been before M$ changed it around because Nvidia couldnt meet the requirements. The ATI cards have the abillity and the Nvidia cards dont. Its not a bug issue its that the Nvidia cards plain cant do it and Nvidia are spinning it and trying to say its an irelevant issue.
We had a thread about it a while back and we concluded that what Just_An_Engineer has said is about the size of it and support if it is provided will probably be in the form of a patch.
Mactronix


Yeah, and I'm not trying to quibble over semantics, but in the real world what I said is still relatively true: there have been reported bugs on ATI 2000-series cards in DirectX 10.1, and I have seen some talk of the X800 series and whatnot having issues as well.

Anyway, it's hard to say whether ATI has changed their AA to be more similar to Nvidia's in their new GPU, so it's arguable whether 10.1 optimization (support will probably always be there) is going to be a big ATI thing. Maybe it will be, but they are definitely putting a lot of their eggs into one basket if it is.

To put it in perspective, look at the ATI 3000 series: if developers had adopted DirectX 10.1, the 3000 series would've outperformed Nvidia models with AA enabled (so I've heard is the idea). This did not happen because there was simply too much money at stake from Nvidia; as a result, the cards ended up not having enough resources to do a method similar to Nvidia's anti-aliasing method (hence the terrible AA performance of the 3000 series in a lot of games).

Really, if games like Crysis had supported DX10.1, I'd bet you at least 50 cents that ATI would've been on top of the performance GPU market in the last year. That is not reality, and maybe it wouldn't have been even under ideal conditions, but the potential was there.
June 8, 2008 5:36:11 AM

Nvidia invests a lot of capital in game companies.
I think future games will jump over DX10.1 and go directly to DX11.
Just look at Assassin's Creed.
June 8, 2008 6:10:12 AM

I think DX10.1 will be skipped, since Assassin's Creed dropped DX10.1 because they don't see the future and potential in it. Plus, no other game companies are investing time into it, so Ubisoft is just following the "wave".
June 8, 2008 7:18:03 AM

Assassin's Creed dropped DX10.1 dubiously, for reasons no one has figured out, and the company that made it has come under fire from many review sites throughout the web. AA is supposed to have been done in shaders starting with DX10, as it was included in the original spec. nVidia had already come out with their cards, which weren't compliant, and ATI at the time was just about to release theirs. Microsoft decided to fall back to the old way of doing AA in hardware, as that's what nVidia had. That left ATI in a fix, since their cards were built to follow the true DX10 model. Assassin's Creed has shown that there's at least a 20% improvement using DX10.1 and AA with ATI cards, but there's been controversy and a bit of anger since they decided to remove it, given that nVidia "helped" them develop the game. So real DX10, and future DX models, will do AA in shaders, which only ATI currently has. Here's a link, from a Microsoft dev's site, showing what Microsoft did and why: http://blogs.msdn.com/ptaylor/archive/2007/03/03/optimi...
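To make the "AA in shaders" point concrete, here is a rough sketch (my own illustration, not Ubisoft's code) of the D3D10.1-class trick generally credited for that Assassin's Creed gain: creating a 4x MSAA depth buffer that can also be bound as a shader resource, so a shader-based resolve pass can read the depth samples directly instead of regenerating them in an extra pass. It assumes an existing ID3D10Device1 and omits error checking; on plain 10.0 hardware this bind-flag combination on an MSAA depth surface is generally rejected.

#include <d3d10_1.h>

void CreateReadableMsaaDepth(ID3D10Device1* device, UINT width, UINT height,
                             ID3D10Texture2D** tex,
                             ID3D10DepthStencilView** dsv,
                             ID3D10ShaderResourceView** srv)
{
    // Typeless storage so the same texture can be viewed both as a depth
    // buffer and as shader-readable data.
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = 4;          // 4x MSAA
    td.SampleDesc.Quality = 0;
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    device->CreateTexture2D(&td, NULL, tex);

    // Depth view used while rendering the scene.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    device->CreateDepthStencilView(*tex, &dd, dsv);

    // Shader view: a later pass (e.g. a custom AA resolve) can load the
    // individual depth samples instead of re-rendering them.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    device->CreateShaderResourceView(*tex, &sd, srv);
}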
June 8, 2008 10:12:30 AM

We shall see if the HD 4800s turn out to be as good as they seem; if so, devs will probably start to do something about DX10.1. ATI certainly has to win back market share so that devs won't be threatened by Nvidia. It's not that Nvidia is bad... it's just... you know... HUANG!
June 8, 2008 12:42:56 PM

@ ovaltineplease,
From what I have heard, it seems they have moved the AA back to the ROPs, but it's all still speculation.
It's not like ATI is putting all their hopes on DX10.1, or "all their eggs in one basket" as you put it; all they are doing is sticking with the cards the way they developed them in the first place, i.e., fully compliant with the original DX10 spec.
As you say, it does seem that people (devs) are sucking up to Nvidia, and what you say is quite true: the performance with AA would be quite different if it (DX10) had been the standard like it should have been from the start.
Can you link me to some of these reports you mention with regard to these bugs/issues? I couldn't swear to what's going on with the 2000-series cards, but I do know the X800-series cards can't really be having a problem with it; they are DX9 cards and as such couldn't even run DX10, let alone DX10.1.
I suspect it's either a typo or someone has seriously misled you.
Mactronix :) 
[ Edited for spelling and fat fingers :)  ]
August 5, 2008 1:15:19 PM

lol, DX10.1 is nothing to have to upgrade for... a very minor and insignificant update to DX10 that doesn't offer anything that developers even want to use. The standard of 4xAA is about the only change, and with DX10 or 9 you can pick 4xAA or 8x or 16x or 0x or whatever you want, which is better anyway.

The only game to use anything from DX10.1 was Assassin's Creed, and they removed that in a patch as well.
August 5, 2008 1:30:35 PM

Nvidia doesn't have the talent or ability to make DX10.1 cards. They hold back progress because of their own "shortcomings".
August 5, 2008 2:20:25 PM

I see a direct DX10-to-DX11 transition for the mainstream, largely skipping 10.1 entirely. I didn't consider 10.1 a feature of interest when I bought the 4870. I'm pretty convinced DX11 will be upon us before many games start doing anything worthwhile with DX10.1.
August 5, 2008 3:57:56 PM

I just hope some more games put it to use. Assassin's Creed runs great for me, and I believe it would improve AA performance with my 3xxx series card.
August 5, 2008 9:14:18 PM

fps_dean said:
lol, DX10.1 is nothing to have to upgrade for... a very minor and insignificant update to DX10 that doesn't offer anything that developers even want to use. The standard of 4xAA is about the only change,


Those three statements are made by people who don't know anything more about DX10.1 than what they've been told by someone equally uninformed.

While the developments are additions to what's there, they are obviously far more than simply adding app-controlled 4xAA minimum support, or else nV would've been able to add it, wouldn't they. :sarcastic:

Quote:
and with DX10 or 9 you can pick 4xAA or 8x or 16x or 0x or whatever you want, which is better anyway.


You can do that with DX10.1 as well; it's just that, to be compliant, the hardware must be able to support a minimum of 4xAA when the developer requires it. It's not forced, nor a hard limit, which illustrates your misunderstanding of the requirement at even a basic level.
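For what it's worth, here is a quick sketch (mine, purely illustrative) of the application side of that: the app queries the device for whatever sample counts it likes and picks one; 10.1 compliance just guarantees that the 4-sample query succeeds for the standard render-target formats.

#include <d3d10_1.h>
#include <stdio.h>

// Print every power-of-two MSAA level the device supports for a common
// render-target format. The app remains free to choose any of them.
void ListAaLevels(ID3D10Device1* device)
{
    for (UINT samples = 1; samples <= 16; samples *= 2)
    {
        UINT quality = 0;
        if (SUCCEEDED(device->CheckMultisampleQualityLevels(
                DXGI_FORMAT_R8G8B8A8_UNORM, samples, &quality))
            && quality > 0)
        {
            printf("%ux MSAA supported (%u quality levels)\n",
                   samples, quality);
        }
    }
}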

It seems that developers prefer some of the features that were dropped from DX10 and moved to DX10.1. Epic's Tim Sweeney (Gears of War, UT3) specifically pointed out that the most important part of DX10 was the part delayed to DX10.1: "I see DirectX 10's support for virtualized video memory and multitasking as the most exciting and forward-looking features."
http://www.firingsquad.com/hardware/directx_10_graphics...

Add to that better buffer read/write, the ability to render from MSAA buffers, plus materials management; these are all things that, when implemented, are great efficiency features. Tessellation will depend on how quickly they can adopt DX11.

Considering the time it took to get DX10 out (two delays, an OS delay, and then driver delays), I wouldn't expect DX11 to arrive right on tick-tock time.
Likely the period of DX10.1-specific benefits will be short, but as we saw with DX10.1 in Assassin's Creed, there is little reason to think those features won't be exposed to those cards when DX11 hits. This is not like the DX9/DX10 split.

I still wouldn't use it as the main reason to buy a card, but just like every iteration of D3D before it, it might be a tie-breaker between choices that are otherwise close.