Microsoft: DirectX 10 Hardware WILL Support DirectX 10.1!

November 3, 2007 9:38:31 AM

Just read this, much to my happy surprise because I was nervous about my new EVGA 8800 GT NOT supporting DX 10.1.

Here's the link:

http://news.softpedia.com/news/Microsoft-DirectX-10-Har...

UPDATE:

Ok, finally found out why the HD 3800 series from ATI will be SLOWER than the G92 offering from NVidia despite having superior components:

"ATI will lose to G92/D8P, primarily because its Shaders are running at clock speed, while Nvidia runs shaders 2.5 times faster than the chip clock and gains a lot of performance as a result. ATI cannot do this with its current RV670 generation, but we would not be surprised if they use it for their next generation scheduled for the middle of next year."

So it comes down to the shader speed in NVidia vs ATI.
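To put rough numbers on that claim, here is a quick back-of-the-envelope sketch in C++ (the clocks are commonly quoted ballpark figures, assumptions rather than measured specs):

// Effective shader clock comparison (illustrative 2007 figures).
// Assumptions: 8800 GT core ~600 MHz with a decoupled shader domain at
// 2.5x core; RV670 shaders run at the core clock, reportedly ~775 MHz.
#include <cstdio>

int main() {
    const double g92_core_mhz     = 600.0;
    const double g92_shader_mhz   = 2.5 * g92_core_mhz;  // decoupled domain: 1500 MHz
    const double rv670_shader_mhz = 775.0;                // shaders tied to core clock

    printf("G92 shader clock:   %.0f MHz\n", g92_shader_mhz);
    printf("RV670 shader clock: %.0f MHz\n", rv670_shader_mhz);
    printf("Clock advantage:    %.2fx\n", g92_shader_mhz / rv670_shader_mhz);  // ~1.94x
    return 0;
}

So even with a lower core clock, the G92 clocks its shader units nearly twice as fast as the RV670 can.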
November 3, 2007 9:59:56 AM

"Even though DX10.1 will support current DX10 graphics hardware, today's DX10 hardware will not be able to support all of the features of DX10.1, which includes incremental improvements to 3D rendering quality."

Quoted from the original source:

http://www.next-gen.biz/index.php?option=com_content&ta...

Unfortunately, your 8800GT will still not support DX10.1. What they are saying is that no game will be DX10.1-exclusive. Your card will be able to run all new games, just not in DX10.1 (or more accurately, without the new 10.1 advancements).
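For what it's worth, this is roughly how that split shows up at the API level once the 10.1 runtime arrives: one code path that asks for feature level 10.1 and falls back to 10.0 on cards like the 8800GT. A minimal, untested sketch against the documented D3D10.1 headers (error handling trimmed):

// Create a D3D10.1 device, preferring feature level 10.1 but falling
// back to 10.0. The 10.1 runtime runs on DX10 hardware; only the extra
// 10.1 features are gated off on the older cards.
#include <d3d10_1.h>

ID3D10Device1* CreateBestDevice() {
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // full DX10.1 parts (e.g. RV670)
        D3D10_FEATURE_LEVEL_10_0,   // DX10-only parts (e.g. G80/G92)
    };
    ID3D10Device1* device = NULL;
    for (int i = 0; i < 2; ++i) {
        if (SUCCEEDED(D3D10CreateDevice1(
                NULL,                        // default adapter
                D3D10_DRIVER_TYPE_HARDWARE,
                NULL, 0,                     // no software rasterizer, no flags
                levels[i],
                D3D10_1_SDK_VERSION,
                &device)))
            return device;                   // highest level the card supports
    }
    return NULL;                             // no DX10-class hardware found
}

A 10.1-exclusive game would require the first level; everything else just takes whatever level it gets and skips the 10.1-only effects.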
November 3, 2007 10:19:55 AM

mitchellvii said:
Just read this, much to my happy surprise because I was nervous about my new EVGA 8800 GT NOT supporting DX 10.1.

Here's the link:

http://news.softpedia.com/news/Microsoft-DirectX-10-Har...



Oh really?
I am going to buy an X1950 PRO :D because I want the best performance for the buck right now and have totally written off DX10 for another year, and you're NERVOUS about DX10.1?

LOL.
November 3, 2007 10:23:08 AM

ib,

True, however, from what I have seen of 10.1, some of the "advancements" are highly undesirable.

For instance, forcing anti-aliasing will mean that shaders will probably need to be set to HIGH. Although setting shaders from medium to high makes a small difference (IMHO) in the game experience, it creams your FPS big time (it cuts my Crysis FPS in half on my 8800 GTS 640).

I prefer to be able to set my own shaders and anti-aliasing.

Also, I just bought an EVGA card which means I have 3 months to "trade-up" if they decide to add the 10.1 support later to compete with ATI's card (which although it has great looking specs, apparently is 15% slower than the GT).

My guess is they will come up with a hack that "mimics" 10.1 support even though it may not truly support it (e.g., the XP hack that gives DX9 users DX10 effects at much higher FPS).

In other words, I'm not concerned. The geeks out there always find a workaround.
November 3, 2007 10:26:32 AM

Come to think of it, DX10.1 may be THE reason the ATI card underperforms despite GDDR4 RAM, a smaller die size and higher clock speeds. Forcing the anti-aliasing is a performance killer (and honestly, in a fast-paced game, you almost can't tell the difference so long as your Object Models are set to VERY HIGH).
November 3, 2007 10:41:04 AM

mitchellvii said:
Come to think of it, DX10.1 may be THE reason the ATI card underperforms despite GDDR4 RAM, a smaller die size and higher clock speeds. Forcing the anti-aliasing is a performance killer (and honestly, in a fast-paced game, you almost can't tell the difference so long as your Object Models are set to VERY HIGH).


Possibly; my post, however, was just in reference to how I interpreted your post. I just figured you meant that DX10 cards would be DX10.1 capable, which is really understandable because the Softpedia article is misleading. Although I am curious as to where you heard about the new ATI cards' performance. I haven't been able to find any preliminary benchies on the HD 38** series cards yet. If you are referring to the HD2900s, they were not DX10.1 capable to begin with, if I'm correct.
November 3, 2007 11:12:55 AM

In today's games, the difference between DX9 and DX10 isn't much, and it will take some time (1-2 years, maybe more) for games to be fully DX10 optimized, so let's not worry about DX10.1 (at least while it isn't here yet and we don't have benchmarks of it).
November 3, 2007 11:45:33 AM

mitchellvii said:
Come to think of it, DX10.1 may be THE reason the ATI Card underperforms despite DDR4 ram, smaller die size and higher clock speeds. Forcing the anti-aliasing is a performance killer (and honestly, in a fast paced game, you almost can't tell the difference so long as your Object Models are set to VERY HIGH)

Can you link us to the info about the ATI card's performance, please? It should make a good read.
Mactronix
November 3, 2007 12:37:55 PM

Guys, here is some of the inside scoop on the ATI Performance. Not detailed benchmarks, but overall commentary. I've seen this in quite a few places so I am guessing it may be correct:

http://www.fudzilla.com/index.php?option=com_content&ta...

RV670XT to end up 20+ percent slower than the 8800 GT
Written by Fuad Abazovic
Thursday, 01 November 2007 08:44

"Our sources have confirmed that RV670XT, the card that might be called Radeon HD 3870, should end up more than twenty percent slower than the 8800 GT 512MB at its default clock.

ATI's chip simply cannot compete with 112 Shader units at such a high clock, but it will sell for some US$50 to $60 less to begin with.

RV670PRO will end up even lower priced, as it has the hard task of fighting the soon to be launched 8800 GT 256MB version.

The performance analysis is based on more than ten games and the current drivers, so if ATI comes up with miraculous drivers perhaps these will be even more competitive."

November 3, 2007 12:45:26 PM

I've also seen stable OCs for the 8800GT of up to 728 core x 2040 (1040 x 2) memory with a 3DMark06 score of about 12,264, which is impressive to say the least (600 points HIGHER than a stock 8800 GTX that costs $600!).

http://www.fudzilla.com/index.php?option=com_content&ta...

The most I've squeezed out of my 8800 GTS 640 was 10,580, and that was pushing it. Pretty incredible for a $250 card.

P.S. I am aware that the 8800 GT will not be able to handle 10.1 at this time; however, I was glad to see that it will not be made "obsolete" as some sensational articles would lead one to believe. I do feel that the uber-hackers in our community will find a way to get 10.1 performance out of an 8800GT. Nevertheless, I prefer NOT to be forced into using anti-aliasing, as it is an FPS vampire and really doesn't improve my gaming experience.
November 3, 2007 12:53:43 PM

Ok, last post, I promise:

The advancement of OLEDs (organic LEDs), which will allow the creation of VERY high quality video on a bendable medium, will mean that very shortly we will start to see full "wrap-around" wearable 3D gaming goggles which completely fill your field of vision and don't just look like a big TV hovering in the distance.

Add to these the stereoscopic 3D and surround sound capabilities of these headsets along with motion sensors (look to your right and the game pans right, look up and the game looks up) and you will be getting very close to that "holo-deck" experience we all want.

The point being that you can create a MUCH clearer image at lower resolutions on that tiny OLED screen than you can on that giant monitor, with much less VRAM needed on the card. The 512MB on the 8800GT should be plenty going forward.
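Rough math backs that up; here is a crude sketch of the scaling (it ignores textures, which are the real VRAM hogs, but the point is that buffer sizes scale with pixel count; the resolutions are my own illustrative picks):

// Per-frame buffer footprint: double-buffered 32-bit color plus a
// 32-bit depth/stencil buffer, no AA. Texture memory not counted.
#include <cstdio>

double framebuffer_mb(int w, int h) {
    const double bytes = double(w) * h * (2 * 4 + 4);  // 2 color buffers + depth
    return bytes / (1024.0 * 1024.0);
}

int main() {
    printf("1920x1200 desktop LCD: %.1f MB\n", framebuffer_mb(1920, 1200)); // ~26 MB
    printf("800x600 per-eye HMD:   %.1f MB\n", framebuffer_mb(800, 600));   // ~5.5 MB
    return 0;
}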
November 3, 2007 1:27:03 PM

The update of DX10.1 is great for the gaming community. If it has forced 4x AA, then that means that nVidia and ATi are going to have to start making technology that won't take as much of a performance hit whilst running 4x AA. It will probably take years for that to happen, but it will happen eventually. It means better quality while still being playable. And hey, there is no reason you have to run Crysis in DX10.1; you could run all your games in DX10.1, but set Crysis to DX10 or DX9. I have never played the demo/beta, but I would imagine that they put that option in the game menus somewhere.
November 3, 2007 2:28:29 PM

mitchellvii said:
Guys, here is some of the inside scoop on the ATI Performance. Not detailed benchmarks, but overall commentary. I've seen this in quite a few places so I am guessing it may be correct:

http://www.fudzilla.com/index.php?option=com_content&ta...

RV670XT to end up 20+ percent slower than the 8800 GT
Written by Fuad Abazovic
Thursday, 01 November 2007 08:44

"Our sources have confirmed that RV670XT, the card that might be called Radeon HD 3870, should end up more than twenty percent slower than the 8800 GT 512MB at its default clock.

ATI's chip simply cannot compete with 112 Shader units at such a high clock, but it will sell for some US$50 to $60 less to begin with.

RV670PRO will end up even lower priced, as it has the hard task of fighting the soon to be launched 8800 GT 256MB version.

The performance analysis is based on more than ten games and the current drivers, so if ATI comes up with miraculous drivers perhaps these will be even more competitive."

Oh right, so it's just speculation then? Nothing definite. Educated speculation, I know, but still speculation.
I think it's a bit hard to tell what chance it has of keeping up with the clocks of the GT. If we were talking about the older X1*** cards then yes, but with the new architecture on the ATI card it's hard to compare them like for like.
mactronix

November 3, 2007 2:31:36 PM

Don't know what happened there, it cut my whole reply off. I can't re-do it now, will post later.
Mactronix
November 3, 2007 2:57:11 PM

Regarding Crysis DX10 performance: the new 169.04 beta video driver dramatically improves DX10 performance in Crysis. If you have an Nvidia card, get it.
November 3, 2007 3:00:54 PM

When I first saw the specs for the new ATI card I was pumped, because it just looked so much better on every level, so I was surprised to see that it actually underperforms. Honestly, it's hard to see how, except for the structure and number of pipes.
November 3, 2007 3:06:19 PM

Vista was supposed to come out with the DirectX 10.1 features in DirectX 10, but they failed to do so because of the deadline. This was an advantage for Nvidia, because they were still having trouble with it. Before ATI was snatched up by AMD, they were in the final stages of getting it done. The then-CEO of ATI was saying how ATI would no longer be in second place; then, not too long after they got bought, Nvidia took off. 10.1 is too much of an even playing field for Nvidia and is now marketing hype for AMD/ATI.
November 3, 2007 4:59:46 PM

I think I will wait to see some proper benchies before I give up on the 3800 cards; the Fudzilla stuff is just speculation at this point. OK, so it's educated speculation, but still.
Fudzilla isn't exactly the most reliable source either, and it's not even like you can compare the two based on the number of shaders and clocks; the architecture is too different.
mactronix
November 3, 2007 5:11:34 PM

mac,

But the word on the street is these new ATI cards will be much cheaper. I doubt they would do that for comparable or better performance.

Personally, I've found Fudzilla to be on target as far as speculation goes.

But as I mentioned, I have seen many other articles saying the ATI card is slower, and ATI doesn't seem to be doing any press releases to dispel the online rumors that are damaging to the brand.
November 3, 2007 5:34:55 PM

Remember how everyone said the HD2900 would destroy the 8800GTX? Now everyone is saying the HD3800 will be slower than the 8800GT. We will see what happens. :) 
November 3, 2007 5:41:56 PM

mitchellvii said:
Ok, last post, I promise:

The advancement of OLEDs (organic LEDs), which will allow the creation of VERY high quality video on a bendable medium, will mean that very shortly we will start to see full "wrap-around" wearable 3D gaming goggles which completely fill your field of vision and don't just look like a big TV hovering in the distance.

Add to these the stereoscopic 3D and surround sound capabilities of these headsets along with motion sensors (look to your right and the game pans right, look up and the game looks up) and you will be getting very close to that "holo-deck" experience we all want.

The point being that you can create a MUCH clearer image at lower resolutions on that tiny OLED screen than you can on that giant monitor, with much less VRAM needed on the card. The 512MB on the 8800GT should be plenty going forward.



Um, wasn't it Bill Gates who once said 640K of RAM should be more than enough going forward, back in the early 80s? :heink: 

[/age meter]
November 3, 2007 5:47:27 PM

pip,

Except for the fact that 512MB is roughly 800 times more than 640KB, I see your point - lol.

Face it, right now, all that VRAM is only necessary to pump out high resolution video. If you are looking at a screen that is 1" x 2", the res doesn't need to be high to appear ultra sharp.
November 3, 2007 7:23:03 PM

I opened this thread to check what miracle happened, and it appears none did :??:  Just a misleading Softpedia article...
November 3, 2007 7:46:54 PM

Harrison,

The article is accurate, just the title was misleading - sorry :( 
November 3, 2007 8:03:11 PM

mitchellvii said:
Harrison,

The article is accurate, just the title was misleading - sorry :( 

You quoted the Softpedia title, so it's not your fault it was misleading :) 
November 3, 2007 8:35:56 PM

mitchellvii
Don't get me wrong, I'm nobody's fanboy, but rumours and speculation are just that; without benchies most would take a lot of reports with a pinch of salt. Remember when the G92 was gonna be 3 times the card a GTX is? That was quoted along with plausible (at first glance) data/clocks etc. to back it up.
ATI/AMD are in a bad position at the minute, and I think it's plausible that the pricing is more about getting back market share than about how good the card is. It's the second stab at the tech also, so there should be some worthwhile improvement to it.
My personal take on the whole situation is that I want the card to be competitive, be that on performance or price point, as that can only be good for all of us.
mactronix
November 3, 2007 9:26:16 PM

mac,

This whole thing is giving me a headache.

I just returned my 8800 GTS 640 to TigerDirect yesterday. I was running out of time on my 30 day return and I had ordered an EVGA GT which didn't come in, so they were nice enough to credit me my entire purchase price without the 15% re-stocking fee (thanks guys!).

Ok, so now I am sitting on $400 trying to figure out how to spend it.

My choices are:

1. Buy groceries - nah.
2. Pay the rent - boring.
3. Buy an EVGA GT stock direct from EVGA, OC it, and spend the extra on that nice Coolmaster VGA watercooler rig.
4. Wait two weeks to see the REAL STORY on the ATI cards. If I can get the same performance as a GT for less $ and 10.1, I may go ATI.
5. Wait until December, when the new G92 GTS is supposed to come out with bigger bandwidth, 128 shader pipes and 1GB VRAM (supposedly blowing away even the GTX Ultra).

THE DECISION:
Probably best to wait two weeks to see the real deal on the ATI card. Then, if the GT really is better, go ahead with that; and if the GTS really is THAT much better for a reasonable cost in December, trade up to that.

In two weeks, the prices on the GT's will probably have come down to compete with the ATI's if, in fact, they are better than rumored.

Ugh, so I won't be playing any games for 2 weeks, but heck, most of the good new games won't be out until then anyway. Can just play online poker in the meantime - lol.
November 3, 2007 10:04:25 PM

I think you have come to the best conclusion, but just for the sake of it, answer me this:
How much less performance would you find acceptable in exchange for DX10.1?
Anyway, if you are going to get EVGA and you are fairly certain that the G92 GTS is going to be that good, you can always step up.
Mactronix
November 3, 2007 11:30:48 PM

Mac,

I don't want to sacrifice any performance for 10.1, but who knows, maybe the ATI 3800 series rocks and the rumors we have heard are lies. I mean, it has a smaller die, faster RAM and higher clock speeds; it just seems odd that it should be slower, but what do I know? I don't know how many transistors it has or how its software works.

Until someone leaks some benchies, who knows.

But by not having a card for the next two weeks I really am not missing anything as the best games will just be coming out.

I'll probably wait two weeks. By then we will know on the ATI and also on the G92 GTS, whether it really is the monster people say or just a small improvement on the GT for a big jump in cost.

But waiting sucks.

November 4, 2007 1:21:23 AM

Ibanezrg570 said:
"Even though DX10.1 will support current DX10 graphics hardware, today's DX10 hardware will not be able to support all of the features of DX10.1, which includes incremental improvements to 3D rendering quality."

Quoted from the original source:

http://www.next-gen.biz/index.php?option=com_content&ta...

Unfortunately, your 8800GT will still not support DX10.1. What they are saying is that no game will be DX10.1-exclusive. Your card will be able to run all new games, just not in DX10.1 (or more accurately, without the new 10.1 advancements).

Agreed. This means that DX10 cards won't SUPPORT 10.1's features.
http://www.tomshardware.com/2007/10/29/amd_hd_3800_to_s...
Check it out; it lists the new features, such as native MSAA, much more efficient global illumination, the use of cube map arrays, etc.
November 4, 2007 2:10:48 AM

mitchellvii said:
Guys, here is some of the inside scoop on the ATI Performance. Not detailed benchmarks, but overall commentary. I've seen this in quite a few places so I am guessing it may be correct:

http://www.fudzilla.com/index.php?option=com_content&ta...

RV670XT to end up 20+ percent slower than the 8800 GT
Written by Fuad Abazovic
Thursday, 01 November 2007 08:44

"Our sources have confirmed that RV670XT, the card that might be called Radeon HD 3870, should end up more than twenty percent slower than the 8800 GT 512MB at its default clock.

ATI's chip simply cannot compete with 112 Shader units at such a high clock, but it will sell for some US$50 to $60 less to begin with.

RV670PRO will end up even lower priced, as it has the hard task of fighting the soon to be launched 8800 GT 256MB version.

The performance analysis is based on more than ten games and the current drivers, so if ATI comes up with miraculous drivers perhaps these will be even more competitive."



Is this true? Why buy ATI then... I feel sorry for AMD, truly. They're not doing their best at the moment. I thought this card would be something like the new 8800GT (I expected better), and I was considering CrossFire on an X38, but now... I'm just going for the 8800GT, a new Penryn, and a 680i with full x16.
November 4, 2007 2:50:03 AM

As Maziar said, games won't fully implement DX10 for two or three years, and by that time Microsoft might have released DX11 or DX10.5.
November 4, 2007 2:51:37 AM

Nothing different from when DX8 cards "support" DX9. There's a difference between Compatible and Compliant.
November 4, 2007 6:03:58 AM

What a train-wreck of a thread.

First with the posting of something that is 3-month-old news, and then the misconception that DX10.1 'FORCES' AA. DX10.1 sets the standard for AA; it doesn't turn it on, nor does it make it penalty-free (the other misconception).

Mitch, your links support nothing of what you're saying; you're reading in things that aren't there, just like those who thought that DX10.1 rendered their DX10.0 hardware useless.

Like Randomizer said, it's like DX8 (or DX7) cards being supported under DX9; DX10.1 is a SUPERSET of DX10 which supports the older cards, they just can't use any of the additional features not in their hardware.
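In API terms the standard is a guarantee, not a switch: a 10.1-class part must report 4x MSAA support on the common formats, but the application still has to query it and opt in per render target, exactly as under plain 10.0. A hedged sketch using the existing D3D10 query (the reporting function and format choice are mine, for illustration):

// DX10.1 raises the floor (4x MSAA must be supported); it does not turn
// AA on. Apps still query support and enable it themselves.
#include <d3d10.h>
#include <cstdio>

void ReportMsaa(ID3D10Device* device) {
    UINT quality = 0;
    device->CheckMultisampleQualityLevels(
        DXGI_FORMAT_R8G8B8A8_UNORM, 4, &quality);
    if (quality > 0)
        printf("4x MSAA supported (%u quality levels), still off until enabled\n",
               quality);
    else
        printf("4x MSAA unsupported on this format\n");  // legal for a 10.0 part
}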
November 4, 2007 6:17:40 AM

DX10.1 only becomes a problem if games do what some did with DX9.0c, preventing older hardware from working at all. I can't see that happening for years; you can still use DX8 cards for many recent games (heck, even DX7 cards if you are that poor), albeit with shoddy performance for the most part. It seems as though the time of playing on DX7/8 cards is finally gone, taking DX9.0a and DX9.0b cards with it. Unless... has anyone tried running Crysis on a GF4?
November 4, 2007 6:40:52 AM

Yeah, but I doubt they'll push the minimum requirement above SM3.0/DX9.0c for a long time, due mainly to the impact on console ports.

By the time they have a DX10.1-or-above-only game, we'll all be talking about DX12 and the fate of our DX11 systems/cards, IMO.
DX10-only will likely be the first major step. Like DX8.1, it's unlikely many games will even exploit DX10.1 for a while, nor will it be a major factor, although it could be a deciding factor. Seeing as how DX8.1 influenced my choice of the AIW-R8500 for Morrowind, a killer game/app/feature could make it 'of interest' if not a requirement.
November 4, 2007 6:48:11 AM

Which was the really important DX version? I can't remember. I think it was DX7 with hardware T&L, or was that in DX8? Grrr... M$ is screwing with my mind with all their stupid versions.
November 4, 2007 7:05:58 AM

Yeah, T&L was introduced in DX7, but it wasn't required by DX7 (some DX7 cards didn't have hardware T&L, like some of the Radeon 7xxx series cards).

Of course, it always depends on your impression of 'important'; to me the water effects in Morrowind were worth it.
November 4, 2007 8:22:37 AM

TheGreatGrapeApe,

Wow, so happy the THREAD POLICE arrived to declare our popular little thread here a "train wreck"! Thanks man. I would have gone on thinking this was a valuable discussion of a hot topic. It is so cool of you to set us straight!

Everyone, can we all stand and have a round of applause for TheGreatGrapeApe for showing us the way? Huzzah!

From now on, let's no one else have anything to say besides TheGreatGrapeApe. We'll just create a thread and wait for him to declare it worthy or not before proceeding. As a matter of fact, maybe the Mods can just add a 'WORTHY BUTTON' to each thread that only TheGreatGrapeApe can click. All "unworthy" threads will then be deleted immediately.

Thanks Grape. You've saved us little people from making fools of ourselves in the future. Chicks must dig you.
November 4, 2007 9:32:19 AM

Calm down, Mitch, I'm sure TGGA wasn't having a go at you, just pointing out, albeit a bit more abruptly, what I was saying about it all being speculation at this point.
Technically the thread police have in fact arrived, since TGGA is a Mod :)  and a very helpful member of our community when you get to know him.
Right, on to the main issue, namely the award for the most misleading sentence published in an online article.
Drum roll please......... And the winner is...
"Glassenberg revealed that the existing DirectX 10 hardware supporting the Windows Vista operating system will run the upcoming version 10.1 of the DirectX graphics technology with absolutely no issues." :lol: 
Mactronix
November 4, 2007 10:30:08 AM

mac,

There is NEVER an excuse for rudeness. Ever. TGGA calling my thread a "train-wreck" was rude, self-absorbed and obnoxious. If THIS is what he thinks being a "mod" is, he is mistaken. A mod's job is to prevent flaming and gratuitous attacks, NOT to author them.

Actually, as a Mod, he should be ashamed of himself for insulting a well-meaning member publicly. I created this thread with good intentions on what I considered (and the web considers) a "hot" topic, and he insults me?

Ok, fair is fair. TGGA is a "train-wreck" of a mod. Now we are even :) 
November 4, 2007 1:17:01 PM

Ok, finally found out why the HD 3800 series from ATI will be SLOWER than the G92 offering from NVidia despite having superior components:

"ATI will lose to G92/D8P, primarily because its Shaders are running at clock speed, while Nvidia runs shaders 2.5 times faster than the chip clock and gains a lot of performance as a result. ATI cannot do this with its current RV670 generation, but we would not be surprised if they use it for their next generation scheduled for the middle of next year."

So it comes down to the shader speed in NVidia vs ATI.
November 4, 2007 2:53:50 PM

Well then, let's just wait for the R700.
November 4, 2007 3:16:47 PM

I sure wish I could get EVGA to come clean about when their new G92 128-shader GTS will be coming out. I have heard as early as November 19th. It seems that with 128 shaders it should be faster than the GT, but by how much, and will it be worth the extra cost?
November 4, 2007 3:32:52 PM

mitchellvii said:
I sure wish I could get EVGA to come clean about when their new G92 128-shader GTS will be coming out. I have heard as early as November 19th. It seems that with 128 shaders it should be faster than the GT, but by how much, and will it be worth the extra cost?


If its a lot more powerful than my 8800GTX I will step-up. :D 
November 5, 2007 1:10:53 AM

mitchellvii said:
mac,

There is NEVER an excuse for rudeness. Ever. TGGA calling my thread a "train-wreck" was rude, self-absorbed and obnoxious. If THIS is what he thinks being a "mod" is, he is mistaken. A mod's job is to prevent flaming and gratuitous attacks, NOT to author them.

Actually, as a Mod, he should be ashamed of himself for insulting a well-meaning member publicly. I created this thread with good intentions on what I considered (and the web considers) a "hot" topic, and he insults me?

Ok, fair is fair. TGGA is a "train-wreck" of a mod. Now we are even :) 

You could have just not responded to his comment; it's always the responses that fuel flame wars. Though this hardly qualifies as even a little quarrel by the standards set in past times.
November 5, 2007 1:49:09 AM

Surprisingly, AMD's RV670 (Radeon HD 3800) is going to be able to render DX10.1. From the whitepaper I read:
"The shader model is being upgraded to Shader Model 4.1 to add features improving global illumination and shadowing, deferred rendering performance and LOD instructions."


November 5, 2007 2:11:51 AM

I don't find that surprising, actually; ATI were often more innovative than Nvidia (think ring bus), but AMD isn't helping them show their potential.
November 5, 2007 3:01:47 AM

randomizer,

"You could have just not responded to his comment, it's always the responses that fuel flame wars. Though this hardly qualifies as even a little quarrel by the standards set in past times."

When there is a forest fire, who do you blame? The trees that burn or the fool that lit the match?
November 5, 2007 3:10:22 AM

In our modern world, you always blame somebody else. If you were the fool, you'd blame the matches for lighting at the wrong time. The matches will blame the trees for being flammable and the trees will blame you for being there with matches. Of course, since you aren't the fool with the matches, you can blame all 3.