Fill me in on DX10 cards

Vinny

Distinguished
Jul 3, 2004
402
0
18,780
I really don't know too much about what's happening with DX10, but I'm still wondering if I should wait.

Should I wait to get a DX10 card? I'm thinking about getting a cheapo $50 card for now and getting a new card when DX10 cards arrive, but I keep hearing all this stuff about DX10 cards, like you'll need a 1000-watt PSU to run these things or that the coding is really bad, etc.
 

Vinny

Distinguished
Jul 3, 2004
402
0
18,780
I see... is there anywhere I can get more info on what advantages DX10 will hold? All I know so far is that DX9 cards won't be able to use DX10.

I'm really worried about those PSU requirements and the price, though. But I guess if worst comes to worst, it'll only mean that DX9 cards will become much cheaper. :p
 

spazmire11

Distinguished
Jun 10, 2005
11
0
18,510
My personal recommendation would be to just get a card now; it'll run Vista fine and run games for quite a while. Just don't get a super ultra high-end card like the 7950 GTX; get a 7600 GT/GS or an X1600.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
I see... is there anywhere I can get more info on what advantages DX10 will hold? All I know so far is that DX9 cards won't be able to use DX10.

I'm really worried about those PSU requirements and the price, though. But I guess if worst comes to worst, it'll only mean that DX9 cards will become much cheaper. :p

Here's an overview of the differences between DX9 and DX10:
http://www.driverheaven.net/articles/dx10/

Vista will have DX9L for legacy games and DX10 for native games. For the first time, a new DirectX version won't be available as a download for the previous Windows: there will be no DX10 for Windows XP. Though I plan to get a Crossfire system this fall, I'm only using two X1600XTs, because in the future those X1600XTs can be used for physics alongside a DX10 card.

I can't see spending too much on a high end X1900 or X1950 class card unless you need an All in Wonder card for video editing. The same goes for Nvidia. DX10 will force an upgrade to Vista for some games, at least, and it offers advantages over DX9. ATI has the upper hand here because of unified shader architecture, though Nvidia could surprise us with their mutant solution.
 

Cody_7

Distinguished
Jun 12, 2004
172
0
18,680
Great, I'm glad I found a thread explaining exactly what DX10 does! So... As I read from that article you linked to, yipsl, you cannot run DX10 effects on a DX9 card? It's not backwards compatible?

Hmm... and another thing from the article:

DirectX 10 adds an extra type of shader into the mix, the geometry shader. This type of shader has the ability to create extra triangles if requested to do so. One of the many benefits of this type of shader would be to create displacement mapping (a way to make textures and scenes look more detailed by adding the illusion of height to an otherwise flat texture).

I thought bump-mapping already did that! Doesn't it do the same thing, i.e. slightly raise your poly count to make a texture look like it is 3-D, bevelled, textured, etc.?

P.S. When I click on the images in the article, they don't open. Anyone else have that happen with Firefox?
 

derek2006

Distinguished
May 19, 2006
751
0
18,990
P.S. When I click on the images in the article, they don't open. Anyone else have that happen with Firefox?
When you click on the picture, a yellow bar should appear beneath the tab bar. On the right side, click the popup option and select "allow popups" for whatever site you're at. If you don't see that bar, you probably clicked "don't show this message again", and you might then have to play with Firefox's settings.
 

ZOldDude

Distinguished
Apr 22, 2006
1,251
1
19,280
I really don't know too much about what's happening with DX10, but I'm still wondering if I should wait.

Should I wait to get a DX10 card? I'm thinking about getting a cheapo $50 card for now and getting a new card when DX10 cards arrive, but I keep hearing all this stuff about DX10 cards, like you'll need a 1000-watt PSU to run these things or that the coding is really bad, etc.

Should I wait to get a DX10 card?

You will have to wait; nobody sells them yet, only Vista will support DX10, and most people won't buy Vista.
I won't even install it OR torrent it when that time comes.

When the OS and hardware do arrive, you can expect a long wait until software that uses it follows... and your eyes are not likely to ever notice a change.
Sort of like running SLI gfx cards today: a lot of money for nothing you can really see.
 

Vinny

Distinguished
Jul 3, 2004
402
0
18,780
You will have to wait; nobody sells them yet, only Vista will support DX10, and most people won't buy Vista.
I won't even install it OR torrent it when that time comes.

When the OS and hardware do arrive, you can expect a long wait until software that uses it follows... and your eyes are not likely to ever notice a change.
Sort of like running SLI gfx cards today: a lot of money for nothing you can really see.

That's a very good point... I'm still iffy on Vista. Rather than try to make the system as bulletproof as possible, MS is doing a lot of visual upgrades. Sure some features are really nice (like using a USB flash drive as system memory!) but overall, I'm seeing a lot of stability issues and such.

I will get Vista sooner or later, but definitely not for a year or two. I doubt game makers will just ditch DX9 cards because a majority of the people will run DX9 cards for a while.
 

Cody_7

Distinguished
Jun 12, 2004
172
0
18,680
I've been beta testing Vista for a while now, and there are still a lot of bugs -- so what? It's one of the largest-scale upgrades of an OS that's ever been done. And I'm glad that they're focusing more on "visual" appearances (like the nice glass eye-candy), because OSes like OSX have always been better visually. Now it's Windows' turn to take that prize.

I do agree though - I will wait a little while before upgrading to Vista just to make sure everything is dealt with. And not releasing DX10 for XP is a risky affair, but it didn't take long for everyone to adopt XP as their new OS, and it probably won't for Vista, either.
 

ZOldDude

Distinguished
Apr 22, 2006
1,251
1
19,280
Sure some features are really nice (like using a USB flash drive as system memory!)

Why would you want to use slow memory as system RAM?
If you meant BOOT from it (USB), you can do that already.

http://tkc-community.net/Forums/viewtopic.php?t=4206

Vista will be a hog. As it is, if you have 2GB of RAM it wants to use up to 940MB and load the whole thing into memory.
Looks like a good idea... make the OS fast by having it all in RAM... until you need that RAM.

It has good points and bad points, but the main thing I dislike is the DRM locked into its core.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
Great, I'm glad I found a thread explaining exactly what DX10 does! So... As I read from that article you linked to, yipsl, you cannot run DX10 effects on a DX9 card? It's not backwards compatible?

DX10 is not backwards compatible, though Vista will have DX9L for legacy games and graphics cards. The Inquirer broke the news relatively early that there will be no DX10 for Windows XP. I'll be an early adopter of Vista on one PC, but I'll keep one of our two PCs XP-only until the first service pack arrives to patch the new OS. By that time, SP3 should be out for Windows XP.

Well, the AIW X1900 box can stay XP while I take the Crossfire system I'm building to Vista. It will only have X1600XTs until then, but ATI claims I'll be able to use one X1600XT for physics (and even RPGs have physics nowadays) and get a midrange DX10 card.

Though I've read about bump mapping, and my wife mods for Morrowind and Oblivion, I really don't have a handle on what's really different between geometry shaders and bump or normal mapping. I do know that DX10 goes to Shader Model 4.0 as well as adding geometry shaders. If the difference is even more pronounced than it was between my Radeon 8500 under DX8.1 and an X1600XT under DX9.0c, then I see the gaming value in it.

I'd rather upgrade my PCs than go out and buy an Xbox 360. If I didn't like PC RPGs, then I guess I'd have no reason to go to Vista right away.
 

Vinny

Distinguished
Jul 3, 2004
402
0
18,780
It has good points and bad points, but the main thing I dislike is the DRM locked into its core.

What does that mean? I think DRM means Digital Rights Management, but I don't understand what 'locked into its core' means. It doesn't sound like a good thing... :(

Great, I'm glad I found a thread explaining exactly what DX10 does! So... As I read from that article you linked to, yipsl, you cannot run DX10 effects on a DX9 card? It's not backwards compatible?

DX10 is not backwards compatible, though Vista will have DX9L for legacy games and graphics cards. The Inquirer broke the news relatively early that there will be no DX10 for Windows XP. I'll be an early adopter of Vista on one PC, but I'll keep one of our two PCs XP-only until the first service pack arrives to patch the new OS. By that time, SP3 should be out for Windows XP.

Well, the AIW X1900 box can stay XP while I take the Crossfire system I'm building to Vista. It will only have X1600XTs until then, but ATI claims I'll be able to use one X1600XT for physics (and even RPGs have physics nowadays) and get a midrange DX10 card.

Though I've read about bump mapping, and my wife mods for Morrowind and Oblivion, I really don't have a handle on what's really different between geometry shaders and bump or normal mapping. I do know that DX10 goes to Shader Model 4.0 as well as adding geometry shaders. If the difference is even more pronounced than it was between my Radeon 8500 under DX8.1 and an X1600XT under DX9.0c, then I see the gaming value in it.

I'd rather upgrade my PCs than go out and buy an Xbox 360. If I didn't like PC RPGs, then I guess I'd have no reason to go to Vista right away.

Wait, so... you're saying with Crossfire, I can get an X1600XT to use as a graphics card for XP. But later on, with Vista, I can get an X1900XTX Crossfire Edition and use that as my GPU while the X1600XT serves as my PPU? Any links on this? It's quite intriguing!! :D
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
Wait, so... you're saying with Crossfire, I can get an X1600XT to use as a graphics card for XP. But later on, with Vista, I can get an X1900XTX Crossfire Edition and use that as my GPU while the X1600XT serves as my PPU? Any links on this? It's quite intriguing!! :D

The use of an ATI card for physics starts with the X1600 series. Though a Twitch Guru comment said that onboard graphics could be unlockable for physics in a single PCIe card setup sometime in the future.

From what I've read, you'll need a Crossfire board, and if the board has two PCIe slots, then you could use one DX10 card for graphics and the X1600XT for physics. It would not be Crossfire per se.

Now, I've read there will be boards with something like two PCIe x16 slots and one PCIe x8 (or three PCIe x16) in the future. Those boards will allow two cards in Crossfire and one for physics. I'm not 100% sure how it will all pan out.

The interesting thing about what we've seen today from ATI is their plan to use three graphics cards, instead of two or four, so you can grab two x1900's (let's say, for arguments sake) and then stick in a relatively cheap x1600 or so to do your physics grunt work. Of course you need 3 PCI Express slots to do this, and motherboards with three slots are few and far between. However if and when the likes of Quad SLI and this three-way solution from ATI take off we may start to find mobo's with three and four PCI-E slots easier and easier to come by.

http://www.tgdaily.com/2006/06/07/ati_goes_physics_gaga/

It seems it will be delayed for nine months (i.e. March 2007), so it's gearing up for Vista.

http://www.theinquirer.net/default.aspx?article=32755

Currently, if you use an X1900 Crossfire card and an X1600XT, you'd be making a mistake as the pixel pipelines on the better card default to those of the weaker card. You can use two X1600XT's without a Crossfire card or dongle, and they use a bit more power than an X1800XL but are just as fast.

If you got a Crossfire card (i.e. an X1800 for use with another X1800, or an X1900 for use with another X1900), then you'd be spending quite a bit of money for a one-year DX9.0c solution. While the X1800 and X1900 Crossfire-ready cards (i.e. the cards that aren't dedicated Crossfire masters) can be used for physics just like the low-end X1600XT, I'm not sure the Crossfire masters would be usable for physics.

Would there be a conflict with the newer Crossfire DX10 master, or would the third card in a three-card system not be recognized for graphics, but for physics? That's what's up in the air. It should pan out nicely when the three or four PCIe x16 (or even x8) slot boards are available.

It's Havok FX physics that will be supported by the ATI cards, so games that use other physics engines might or might not benefit. It depends on if there's a DirectPhysics standard by Microsoft for Vista that's supported by all the physics engines out there.
 
One of the many benefits of this type of shader would be to create displacement mapping (a way to make textures and scenes look more detailed by adding the illusion of height to an otherwise flat texture).

I thought bump-mapping already did that! Doesn't it do the same thing, e.g. slightly raise your poly count to make a textre look like it is 3-D, bevelled, textured, etc?

You're confusing the two. Bump mapping is face-on only; current virtual displacement mapping (i.e. occluded parallax offset mapping) gives you the appearance of depth when moving; the new geometry shader will allow true displacement mapping like that found on the Parhelia (so ahead of its time :mrgreen: ), which will actually give it depth, not just the appearance of depth:
http://www.tomshardware.com/2002/05/14/matrox_parhelia/page4.html

Look at the sphere: it looks like it has a rougher surface when bump mapped, but in effect it is still just a perfectly smooth sphere (look at the outside edges). Then look at the displacement mapped surface, which actually extends beyond the originating sphere.

Wiki's got a good description of the differences; check the links to get an easy idea:
http://en.wikipedia.org/wiki/Displacement_mapping
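
If the sphere picture isn't enough, here's a little toy sketch in Python (not real shader code, just made-up numbers to show the idea): bump/normal mapping only tweaks the normal used for lighting, so the point stays on the perfect sphere, while displacement actually moves the point, so the silhouette changes.

# Toy illustration: bump mapping perturbs only the shading normal,
# displacement mapping actually moves the vertex along its normal.

def bump_map(position, normal, perturbation):
    # Geometry untouched; only the normal used for lighting changes.
    shading_normal = [n + p for n, p in zip(normal, perturbation)]
    return position, shading_normal

def displacement_map(position, normal, height):
    # The vertex really moves, so the outline of the object changes.
    displaced = [p + height * n for p, n in zip(position, normal)]
    return displaced, normal

point = [1.0, 0.0, 0.0]      # a point on a unit sphere
normal = [1.0, 0.0, 0.0]     # surface normal points straight out

print(bump_map(point, normal, [0.0, 0.2, 0.0]))   # still at radius 1.0 -> smooth silhouette
print(displacement_map(point, normal, 0.2))       # now at radius 1.2 -> sticks out past the sphere

Same input point both times, but only the displaced one ends up outside the original sphere, which is exactly what you see on that Parhelia page.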
 
The use of an ATI card for physics starts with the X1600 series.

Actually, ATi mentioned the X1300 in a lot of their early material; it's just that they demoed on the X1600, and that's likely because the X1300 didn't outperform Ageia's PhysX card enough to make it into the demo. But there's no physical restriction to it.

Though a Twitch Guru comment said that onboard graphics could be unlockable for physics in a single PCIe card setup sometime in the future.

And possibly even unused portions of a single VPU, if the load can be well balanced.

Now, I've read there will be boards with something like two PCIe x16 slots and one PCIe x8 (or three PCIe x16) in the future.

They already exist. Tyan has 2 x 16X + 2 x 8X, and the ATi demo was on 2 x 16X + 1 x 8X, but future models are to be 3 x 16X. And Gigabyte has their quad board, which is 4 x 8X.

Currently, if you use an X1900 Crossfire card and an X1600XT, you'd be making a mistake as the pixel pipelines on the better card default to those of the weaker card.

You'd be mistaken, because that's not the case. They would both work independently in their own pixel pipeline configurations, but not in Crossfire, which is no different from anyone else's solution. The X1600 would not be for rendering graphics, except in a multi-monitor capacity.

If you got a Crossfire card (i.e. an X1800 for use with another X1800, or an X1900 for use with another X1900), then you'd be spending quite a bit of money for a one-year DX9.0c solution.

So obviously you should get two X1800 GTOs and then you're set; no worries about a master card.

I'm not sure the Crossfire masters would be usable for physics.

I'm not sure why you would think they wouldn't be. Of course they would.

Would there be a conflict with the newer Crossfire DX10 master, or would the third card in a three-card system not be recognized for graphics, but for physics?

Depends on M$' implementation, but it would act very much like XP does now: fine for multi-monitor, fine for physics, but graphics for gaming becomes a case-by-case application.

That's what's up in the air. It should pan out nicely when the three or four PCIe x16 (or even x8) slot boards are available.

Like I said, they're already here.

It's Havok FX physics that will be supported by the ATI cards,

And also ATi's own implementation, which is not simply Havok's but in addition to that.

so games that use other physics engines might or might not benefit.

Which would be a small number of games, since it's primarily Havok doing the physics now, with a small number for Ageia (most are future titles); and that's why ATi would give the alternative, in case things should change in the future. Right now, though, there's no worry.

It depends on if there's a DirectPhysics standard by Microsoft for Vista that's supported by all the physics engines out there.

Actually, it won't be supported completely by all the engines, since M$ is not going the PPU route, only the VPU route, and thus Ageia will have to decide whether they wish to support it; and even then, developers may add their own support in addition to Ageia's restricted path.
However, in all likelihood M$' decision to enter the fray has basically ensured that the future of VPU-based physics will have a generalized/standardized platform upon which to build other engines. At worst, IMO, there'll be Ageia, Havok and M$' solution, the last two of which both ATi and nV have said they'd support. And it's unlikely that ATi's and nV's own solutions would be anything that game developers would adopt instead of those three; more likely they would be a means of bridging them, the way their graphics drivers do now.

MMmm physics drivers, yeah! :wink:
 

NightlySputnik

Distinguished
Mar 3, 2006
638
0
18,980
I really don't know too much about what's happening with DX10, but I'm still wondering if I should wait.

Should I wait to get a DX10 card? I'm thinking about getting a cheapo $50 card for now and getting a new card when DX10 cards arrive, but I keep hearing all this stuff about DX10 cards, like you'll need a 1000-watt PSU to run these things or that the coding is really bad, etc.

OK, quickly: there is an upcoming high-end R600 DX10 card from ATI, maybe in November, at worst February (more likely the former than the latter, though). It should have, according to most estimates, 64 unified shader units that can do both pixel and vertex calculations. So, in theory, it should use its resources in the most efficient way. ATI should be pretty good at this, since their R500 GPU for the Xbox 360 is already built that way, and I gotta say Xbox 360 games look really amazing, even though I'll never get one (can't do encoding with an Xbox :twisted: ).

For nVidia, there is an upcoming high-end G80 DX10 card. It should have (according to the Inquirer, and in this case I tend to believe them) 32 pixel shaders and 16 vertex shaders. New GPUs for DX10 shouldn't have dedicated pipelines for pixel and vertex, but either I'm wrong or nVidia found a way to make it work. If so, they were probably thinking more of DX9 performance, because there isn't gonna be much in the way of DX10 games before its replacement arrives in March-June 2007 anyway.

If anybody thinks I'm wrong, please let me know, 'cause I'm really not sure at all about this. I should be fairly close, though.

My 2 cents.
 

Cody_7

Distinguished
Jun 12, 2004
172
0
18,680
Should I wait to get a DX10 card? I'm thinking about getting a cheapo $50 card for now and getting a new card when DX10 cards arrive, but I keep hearing all this stuff about DX10 cards, like you'll need a 1000-watt PSU to run these things or that the coding is really bad, etc.

Back to that original question. I've heard people say that the high-end DX9 cards (such as the X1800/X1900) need at least a 550W power supply. This is making me very upset too, because I just bought a budget $40 500W power supply.

I researched, though, and it really just depends on how much electricity your PSU can send over one wire. Obviously, the graphics card needs a certain amount of electricity over a single "rail".

I'm pretty sceptical, though... My X800XL 256MB ran fine on a 250W power supply. Maybe my next-gen DX10 card will run fine on my budget 500W power supply :lol:
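
For anyone who wants to do the rail math themselves, it's just watts = volts x amps off the PSU label. Here's a quick Python sketch with made-up numbers (check your own label; in reality the CPU and drives share those rails, so it's messier than this):

# Back-of-the-envelope +12V rail check. The amp rating and card draw
# below are made-up example numbers, not specs for any real PSU or card.

RAIL_VOLTS = 12.0
rail_amps = 18.0          # hypothetical +12V rail rating from the PSU label
card_draw_watts = 120.0   # hypothetical draw of a high-end card under load

rail_watts = RAIL_VOLTS * rail_amps
print(f"+12V rail can deliver about {rail_watts:.0f} W")
print("enough for the card" if card_draw_watts <= rail_watts else "not enough on this rail")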

You're confusing the two. Bump mapping is face-on only; current virtual displacement mapping (i.e. occluded parallax offset mapping) gives you the appearance of depth when moving; the new geometry shader will allow true displacement mapping like that found on the Parhelia (so ahead of its time :mrgreen: ), which will actually give it depth, not just the appearance of depth

Well... first we used bump-mapping to make things look like they were 3-D without actual polygons (so it's not as taxing on video cards); now we're going to use "displacement mapping" to actually raise the polygon count so it is 3-D in a way.

Why don't we just do it already and design all of our models with excruciating poly detail, so we don't have to make "xxx mapping" do it for us! lol!
 

Vinny

Distinguished
Jul 3, 2004
402
0
18,780
NightlySputnik, so let me ask you a question because this is mind boggling for me due to too much information too fast from two different people. Assuming I have a Crossfire mobo and all goes well, if I get a PCIe X1600XT DX9 card now and a PCIe DX10 series ATI card (something midrange or a bit higher), I will be able to dedicate the 1600XT as my PPU and have the new DX10 card as my GPU?

It'd be really cool if it works but no biggie if it doesn't. If it doesn't, I'd just like to make sure that I don't waste my time building my system with Crossfire (because currently, no AM2 mobos have Crossfire :lol:).

If DX10 cards are coming within the next year then I can play the 'wait and see' game. If it works then I'll get a DX10 card, if not, then DX9 cards will probably become a bit cheaper at the least.:)
 

Cody_7

Distinguished
Jun 12, 2004
172
0
18,680
Assuming I have a Crossfire mobo and all goes well, if I get a PCIe X1600XT DX9 card now and a PCIe DX10 series ATI card (something midrange or a bit higher), I will be able to dedicate the 1600XT as my PPU and have the new DX10 card as my GPU?

My answer: yes. I say yes because they work separately, so it doesn't really depend on what your main GPU is. But we'd better make sure someone else here is sure about it.
 

NightlySputnik

Distinguished
Mar 3, 2006
638
0
18,980
I honestly can't give a definitive yes to this. But I remember ATI saying you could use pretty much any X-series VPU (X1300, X1600, X1800 and X1900 for now) for physics, with any other VPU for dedicated graphics. But I can't tell about cards from different DX versions. Maybe a DX10 VPU will need to be used with another card of the DX10 series to work for physics.

But it might be quite a while before games with the ATI physics engine get to retail. So at this time, physics is most probably all your X1600 card is gonna be good for (no offence intended).

You can rest assured that DX10 cards will be out before year end for nVidia (barring a very big problem à la R520) and by February (at worst) for ATI. For mid-range, I'd say probably next March, more or less, maybe before, depending on the Vista release date.
 
Well... first we used bump-mapping to make things look like they were 3-D without actual polygons (so it's not as taxing on video cards); now we're going to use "displacement mapping" to actually raise the polygon count so it is 3-D in a way.

Yeah, except you missed the middle step, where we made it appear even more 3-D with virtual displacement mapping (as used very well in Oblivion), and that thought will help you understand why we didn't do it until now.

Why don't we just do it already and design all of our models with excruciating poly detail, so we don't have to make "xxx mapping" do it for us! lol!

Of course, it's the power required and the memory required. Up until now it hasn't been a concern for anyone but Matrox, and even then the engine was a little slow despite dedicated transistors for the process. It's like TruForm in Morrowind on the ATis: the R8500 could do it faster than an R9700 Pro because it had dedicated resources; however, those resources weren't good for much else, and since TruForm/N-Patches didn't take off like ATi and users of the feature had hoped, they were dropped in order to add more transistors to yield more speed in a larger number of games/apps.

NOW we have the unified design with geometry shader units; both of these things relieve the overhead. Think about it: if need be, you could dedicate 8 shaders to pixel, 24 to vertex and 32 to geometry if you have some bumpy-as-heck cave wall, and it would probably render amazing detail at a low model count, whereas right now there are no shortcuts and no help from a dedicated geometry shader.
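
Just to make that load-balancing idea concrete, here's a toy Python sketch; the 64-unit pool and the workload weights are made up, and a real scheduler is far smarter than a simple proportional split:

# Toy model of a unified shader pool: hand out a fixed number of units
# in proportion to how much pixel, vertex and geometry work the frame needs.

def split_units(total_units, workload):
    total_work = sum(workload.values())
    return {kind: round(total_units * amount / total_work)
            for kind, amount in workload.items()}

# A frame dominated by a heavily displaced cave wall: geometry-heavy.
frame_workload = {"pixel": 1.0, "vertex": 3.0, "geometry": 4.0}
print(split_units(64, frame_workload))   # {'pixel': 8, 'vertex': 24, 'geometry': 32}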

The thing that I think might be the fly in the ointment is that with all this increased visual geometry, the physical boundaries of these walls will still be that smooth sphere, not the new bumpy visual surface. Now this might not cause a lot of problems in general, but I would bet that we'll have far more see-through pillars and walls etc. as characters interact with these bumps that don't really exist in the room's collision outline.
 
Damn... now I'm wondering if I should just go ahead with the card I was planning to get:

IMO, get the HIS. Even if you don't use it for physics, you could probably resell it for as much as you bought it for by the time the DX10 cards come out. It's got a quality name and a killer HSF, and will easily be gobbled up for even $80.
 

Vinny

Distinguished
Jul 3, 2004
402
0
18,780
Damn... now I'm wondering if I should just go ahead with the card I was planning to get:

IMO, get the HIS. Even if you don't use it for physics, you could probably resell it for as much as you bought it for by the time the DX10 cards come out. It's got a quality name and a killer HSF, and will easily be gobbled up for even $80.

That's not a bad idea. People love those oversized coolers. The only reason I prefer them is that they exhaust hot air out, rather than using the hot air inside and/or creating more hot air inside the case like most other coolers. Plus, even though the GPU might not need it, having a beefy cooler is never a bad thing. 8)