Fill me in on DX10 cards

July 7, 2006 10:05:40 PM

I really don't know too much about what's happening with DX10, but I'm still wondering if I should wait.

Should I wait to get a DX10 card? I'm thinking about getting a cheapo $50 card for now and getting a new card when DX10 cards arrive, but I keep hearing all this stuff about DX10 cards, like you'll need a 1000W PSU to run these things or that the coding is really bad, etc.


July 7, 2006 10:08:28 PM

If you can, hold out for the DX10 cards. They should be out in the Fall at the earliest and Spring at the latest.
July 8, 2006 1:24:53 AM

I see... is there anywhere I can get more info on what advantages DX10 will hold? All I know so far is that DX9 cards won't be able to use DX10.

I'm really worried about those PSU requirements though, and the price. But I guess if worse comes to worst, it'll only mean that DX9 cards will become much cheaper.:-p
July 8, 2006 1:40:10 AM

My personal recommendation would be just to get a card now; it'll run Vista fine and run games for quite a while. Just don't get a super ultra high end 7950 GTX; get a 7600 GT/GS or an X1600.
July 8, 2006 2:09:44 AM

Quote:
I see... is there anywhere I can get more info on what advantages DX10 will hold? All I know so far is that DX9 cards won't be able to use DX10.

I'm really worried about those PSU requirements though, and the price. But I guess if worse comes to worst, it'll only mean that DX9 cards will become much cheaper.:-p


Here's an overview of the differences between DX9 and DX10:
http://www.driverheaven.net/articles/dx10/

Vista will have DX9L for legacy games and DX10 for native games. Unlike previous DirectX versions, there will be no DX10 download for Windows XP. Though I plan to build a Crossfire system this fall, I'm using two X1600XTs, because in the future those X1600XTs can be used for physics alongside a DX10 card.

I can't see spending too much on a high end X1900 or X1950 class card unless you need an All in Wonder card for video editing. The same goes for Nvidia. DX10 will force an upgrade to Vista for some games, at least, and it offers advantages over DX9. ATI has the upper hand here because of unified shader architecture, though Nvidia could surprise us with their mutant solution.
July 8, 2006 2:22:21 AM

Great, I'm glad I found a thread explaining exactly what DX10 does! So... As I read from that article you linked to, yipsl, you cannot run DX10 effects on a DX9 card? It's not backwards compatible?

Hmm... and another thing from the article:

Quote:
DirectX 10 adds an extra type of shader into the mix, the geometry shader. This type of shader has the ability to create extra triangles if requested to do so. One of the many benefits of this type of shader would be to create displacement mapping (a way to make textures and scenes look more detailed by adding the illusion of height to an otherwise flat texture).


I thought bump-mapping already did that! Doesn't it do the same thing, e.g. slightly raise your poly count to make a texture look like it is 3-D, bevelled, textured, etc.?

P.S. When I click on the images in the article, they don't open. Anyone else have that happen with Firefox?
July 8, 2006 3:42:04 AM

Quote:

P.S. When I click on the images in the article, they don't open. Anyone else have that happen with Firefox?
When you click on the picture, a yellow bar should appear beneath the tab bar. On the right side, click the popup option and select "Allow popups" for whatever site you're at. If you don't see that bar, you probably clicked "Don't show this message again" and might have to play with Firefox's settings.
July 8, 2006 4:08:33 AM

Quote:
I really don't know too much about what's happening with DX10, but I'm still wondering if I should wait.

Should I wait to get a DX10 card? I'm thinking about getting a cheapo $50 card for now and getting a new card when DX10 cards arrive, but I keep hearing all this stuff about DX10 cards, like you'll need a 1000W PSU to run these things or that the coding is really bad, etc.


Quote:
Should I wait to get a DX10 card?


You will have to wait; nobody sells them yet, only Vista will support DX10, and most people won't buy Vista.
I won't even install it OR torrent it when that time comes.

When the OS and hardware do arrive, you can expect a long wait until software that uses it follows... and your eyes are not likely to ever notice a change.
Sort of like running SLI gfx cards today: a lot of money for nothing you can really see.
July 8, 2006 4:22:11 AM

Quote:

You will have to wait; nobody sells them yet, only Vista will support DX10, and most people won't buy Vista.
I won't even install it OR torrent it when that time comes.

When the OS and hardware do arrive, you can expect a long wait until software that uses it follows... and your eyes are not likely to ever notice a change.
Sort of like running SLI gfx cards today: a lot of money for nothing you can really see.


That's a very good point... I'm still iffy on Vista. Rather than try to make the system as bulletproof as possible, MS is doing a lot of visual upgrades. Sure some features are really nice (like using a USB flash drive as system memory!) but overall, I'm seeing a lot of stability issues and such.

I will get Vista sooner or later, but definitely not for a year or two. I doubt game makers will just ditch DX9 cards, because a majority of people will be running DX9 cards for a while.
July 8, 2006 4:35:13 AM

I've been beta testing Vista for a while now. And there are still a lot of bugs -- so what? It's one of the largest-scale upgrades of an OS that's ever been done. And I'm glad that they're focusing more on "visual" appearances (like the nice glass eye-candy), because OSes like OS X have always been better visually. Now it's Windows' turn to take that prize.

I do agree though - I will wait a little while before upgrading to Vista just to make sure everything is dealt with. And not releasing DX10 for XP is a risky affair, but it didn't take long for everyone to adopt XP as their new OS, and it probably won't for Vista, either.
July 8, 2006 4:39:57 AM

Quote:
Sure some features are really nice (like using a USB flash drive as system memory!)


Why would you want to use slow memory as system RAM?
If you meant BOOT from it (USB), you can do that already.

http://tkc-community.net/Forums/viewtopic.php?t=4206

Vista will be a hog. As it is, if you have 2GB of RAM, it wants to use up to 940MB and load the whole thing into memory.
Looks like a good idea... make the OS fast by having it all in RAM... until you need that RAM.

It has good points and bad points, but the main thing I dislike is the DRM locked into its core.
July 8, 2006 4:49:00 AM

Quote:
Great, I'm glad I found a thread explaining exactly what DX10 does! So... As I read from that article you linked to, yipsl, you cannot run DX10 effects on a DX9 card? It's not backwards compatible?


DX10 is not backwards compatible, though Vista will have DX9L for legacy games and graphics cards. The Inquirer broke the news relatively early that there will be no DX10 for Windows XP. I'll be an early adopter of Vista on one PC, but I'll keep one of our two PCs XP-only until the first service pack arrives to patch the new OS. By that time, SP3 should be out for Windows XP.

Well, the AIW X1900 box can stay XP while I take the Crossfire system I'm building to Vista. It will only have X1600XTs until then, but ATI claims I'll be able to use one X1600XT for physics (and even RPGs have physics nowadays) and get a midrange DX10 card.

Though I've read about bump mapping, and my wife mods for Morrowind and Oblivion, I really don't have a handle on what's different between geometry shaders and bump or normal mapping. I do know that DX10 goes to Shader Model 4.0 as well as adding geometry shaders. If the difference is even more pronounced than it was between my Radeon 8500 under DX8.1 and an X1600XT under DX9.0c, then I see the gaming value in it.

I'd rather upgrade my PCs than go out and buy an Xbox 360. If I didn't like PC RPGs, then I guess I'd have no reason to go Vista right away.
July 8, 2006 6:02:47 AM

Quote:

It has good points and bad points, but the main thing I dislike is the DRM locked into its core.


What does that mean? I think DRM means Digital Rights Management, but I don't understand what 'locked into its core' means. It doesn't sound like a good thing... :( 

Quote:
Well, the AIW X1900 box can stay XP while I take the Crossfire system I'm building to Vista. It will only have X1600XTs until then, but ATI claims I'll be able to use one X1600XT for physics (and even RPGs have physics nowadays) and get a midrange DX10 card.

Wait, so... you're saying with Crossfire, I can get an X1600XT to use as a graphics card for XP, but later on, with Vista, I can get an X1900XTX Crossfire Edition and use that as my GPU while the X1600XT serves as my PPU? Any links on this? It's quite intriguing!!:D 
July 8, 2006 8:37:53 AM

Quote:

Wait, so... you're saying with Crossfire, I can get an X1600XT to use as a graphics card for XP, but later on, with Vista, I can get an X1900XTX Crossfire Edition and use that as my GPU while the X1600XT serves as my PPU? Any links on this? It's quite intriguing!!:D 


The use of an ATI card for physics starts with the X1600 series. Though a Twitch Guru comment said that onboard graphics could be unlockable for physics in a single PCIe card setup sometime in the future.

From what I've read, you'll need a Crossfire board, and if the board has two PCIe slots, then you could use one DX10 card for graphics and the X1600XT for physics. It would not be Crossfire per se.

Now, I've read there will be boards with something like 2 PCIe 16 and one PCIe 8 (or 3 PCIe 16) in the future. Those boards will allow two cards in Crossfire and one in physics. I'm not 100% sure how it will all pan out.

Quote:

The interesting thing about what we've seen today from ATI is their plan to use three graphics cards, instead of two or four, so you can grab two x1900's (let's say, for arguments sake) and then stick in a relatively cheap x1600 or so to do your physics grunt work. Of course you need 3 PCI Express slots to do this, and motherboards with three slots are few and far between. However if and when the likes of Quad SLI and this three-way solution from ATI take off we may start to find mobo's with three and four PCI-E slots easier and easier to come by.


http://www.tgdaily.com/2006/06/07/ati_goes_physics_gaga...

It seems it will be delayed for 9 months (ie March 2007), so it's gearing up for Vista.

http://www.theinquirer.net/default.aspx?article=32755

Currently, if you use an X1900 Crossfire card and an X1600XT, you'd be making a mistake as the pixel pipelines on the better card default to those of the weaker card. You can use two X1600XT's without a Crossfire card or dongle, and they use a bit more power than an X1800XL but are just as fast.

If you got a Crossfire card (ie X1800 for use with another X1800 or an X1900 for use with another X1900), then you'd be spending quite a bit of money for a one year DX9.0c solution. While the X1800 and X1900 Crossfire ready cards (ie the cards that aren't dedicated Crossfire masters) can be used for physics just like the low end X1600XT, I'm not sure the Crossfire masters would be usable for physics.

Would there be a conflict with the newer Crossfire DX10 master, or would the third card in a three card system not be recognized for graphics, but for physics? That's what's up in the air. It should pan out nicely, when the 3 or 4 PCIe 16 (or even 8) boards are available.

It's Havok FX physics that will be supported by the ATI cards, so games that use other physics engines might or might not benefit. It depends on if there's a DirectPhysics standard by Microsoft for Vista that's supported by all the physics engines out there.
July 8, 2006 1:29:19 PM

Quote:

One of the many benefits of this type of shader would be to create displacement mapping (a way to make textures and scenes look more detailed by adding the illusion of height to an otherwise flat texture).


I thought bump-mapping already did that! Doesn't it do the same thing, e.g. slightly raise your poly count to make a texture look like it is 3-D, bevelled, textured, etc.?

You're confusing the two. Bump mapping is face-on only; current virtual displacement mapping (ie occluded parallax offset mapping) gives you the appearance of depth when moving; the new geometry shader will allow true displacement mapping like that found on the Parhelia (so ahead of its time :mrgreen: ), which will actually give it depth, not just the appearance of depth;
http://www.tomshardware.com/2002/05/14/matrox_parhelia/...

Look at the sphere: it looks like it has a rougher surface when bump mapped, but in effect it is still just a perfectly smooth sphere (look at the outside edges). Then look at the displacement mapped surface, which actually extends beyond the originating sphere.

Wiki's got a good description of the differences; check the links there to get an easy idea of the differences;
http://en.wikipedia.org/wiki/Displacement_mapping
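If it helps, here's a rough sketch of the two ideas in Python (purely illustrative, my own toy example, not how a real shader is written): bump/normal mapping only tilts the normal used for lighting while the geometry stays flat, whereas displacement mapping actually moves the vertex along its normal by the height value.

Code:
# Toy illustration only: contrasts bump mapping with displacement mapping
# on a single vertex of a flat surface. Real implementations run on the GPU
# (pixel/geometry shaders), not in Python.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def bump_map(position, normal, height_gradient):
    """Bump/normal mapping: the surface position is unchanged; only the
    normal used for lighting is tilted, so silhouettes stay smooth."""
    perturbed = (normal[0] - height_gradient[0],
                 normal[1] - height_gradient[1],
                 normal[2])
    return position, normalize(perturbed)

def displacement_map(position, normal, height):
    """Displacement mapping: the vertex is physically pushed along its
    normal, so the geometry (and the silhouette) really changes."""
    displaced = tuple(p + n * height for p, n in zip(position, normal))
    return displaced, normal

# A point on a flat wall facing +Z, with a bumpy height map under it.
pos, nrm = (1.0, 2.0, 0.0), (0.0, 0.0, 1.0)
print(bump_map(pos, nrm, height_gradient=(0.3, -0.2)))   # position unchanged
print(displacement_map(pos, nrm, height=0.25))           # position actually moves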
July 8, 2006 1:55:22 PM

Quote:

The use of an ATI card for physics starts with the X1600 series.


Actually ATi mentioned the X1300 in a lot of their early material; it's just that they demoed on the X1600, and that's likely because the X1300 didn't outperform Ageia's PhysX card enough to make it into the demo. But there's no physical restriction to it.

Quote:
Though a Twitch Guru comment said that onboard graphics could be unlockable for physics in a single PCIe card setup sometime in the future.


And possibly even unused portions of a single VPU, if the load can be well balanced.

Quote:
Now, I've read there will be boards with something like 2 PCIe 16 and one PCIe 8 (or 3 PCIe 16) in the future.


They already exist: Tyan has 2 x 16X + 2 x 8X, and the ATi demo was on 2 x 16X + 1 x 8X, but future models are to be 3 x 16X. And Gigabyte has their quad board, which is 4 x 8X.

Quote:
Currently, if you use an X1900 Crossfire card and an X1600XT, you'd be making a mistake as the pixel pipelines on the better card default to those of the weaker card.


You'd be mistaken, because it's not the case. They would both work independently in their own pixel pipeline configurations, but not in Crossfire, which is no different from anyone else's solution. The X1600 would not be for rendering graphics, except in a multi-monitor capacity.

Quote:
If you got a Crossfire card (ie X1800 for use with another X1800 or an X1900 for use with another X1900), then you'd be spending quite a bit of money for a one year DX9.0c solution.


So obviously you should get 2 X1800GTOs and then you're set, no worries of a master card.

Quote:
I'm not sure the Crossfire masters would be usable for physics.


I'm not sure why you would think they wouldn't be. Of course they would.

Quote:
Would there be a conflict with the newer Crossfire DX10 master, or would the third card in a three card system not be recognized for graphics, but for physics?


Depends on M$' implementation, but it would act very much like XP does now: fine for multi-monitor, fine for physics, but graphics for gaming becomes a case-by-case application.

Quote:
That's what's up in the air. It should pan out nicely, when the 3 or 4 PCIe 16 (or even 8) boards are available.


Like I said, they're already here.

Quote:
It's Havok FX physics that will be supported by the ATI cards,


And also ATi's own implementation, which is not simply Havok's but comes in addition to it.

Quote:
so games that use other physics engines might or might not benefit.


Which would be a small number of games, since it's primarily Havok doing the physics now, with a small number for Ageia (most are future titles); and that's why ATi would give the alternative, in case things should change in the future. Right now, though, there's no worry.

Quote:
It depends on if there's a DirectPhysics standard by Microsoft for Vista that's supported by all the physics engines out there.


Actually it won't be supported by all the engines completely, since M$ is not going the PPU route, only the VPU route, and thus Ageia will have to decide whether they wish to support it, and even then developers may add their own support in addition to Ageia's restricted path.
However, in all likelihood M$' decision to enter the fray has basically ensured that the future of VPU-based physics will have a generalized/standardized platform upon which to build other engines. At worst, IMO, there'll be Ageia, Havok and M$' solution, the last two of which both ATi and nV have said they'd support. And it's unlikely that ATi and nV's own solutions would be anything that game developers would adopt instead of those 3; more likely they would be a means of bridging them, the way that their graphics drivers do now.

MMmm physics drivers, yeah! :wink:
July 8, 2006 2:33:48 PM

Quote:
I really don't know too much about what's happening with DX10, but I'm still wondering if I should wait.

Should I wait to get a DX10 card? I'm thinking about getting a cheapo $50 card for now and getting a new card when DX10 cards arrive, but I keep hearing all this stuff about DX10 cards, like you'll need a 1000W PSU to run these things or that the coding is really bad, etc.


OK, quickly: there is an upcoming high-end R600 DX10 card from ATI, maybe in November, at worst February (more likely the former than the latter though). It should have, according to most estimates, 64 unified shader units that can do both pixel and vertex calculations. So, in theory, it should use its resources in the most efficient way. ATI should be pretty good at this since their R500 GPU for the Xbox 360 is already built that way, and I gotta say Xbox 360 games look really amazing, even though I'll never get one (can't do encoding with an Xbox :twisted: ).

For nVidia, there is an upcoming high-end G80 DX10 card. It should have (according to the Inquirer, and in this case I tend to believe them) 32 pixel shaders and 16 vertex shaders. New GPUs for DX10 shouldn't have dedicated pipelines for pixel and vertex, but either I'm wrong or nVidia found a way to make it work. If so, they were probably thinking more of DX9 performance, because there isn't going to be much DX10 gaming before its replacement arrives in March-June 2007 anyway.

If anybody thinks I'm wrong, please let me know, 'cause I'm really not sure at all about this. I should be fairly close though.

My 2 cents.
July 8, 2006 6:09:51 PM

Quote:
Should I wait to get a DX10 card? I'm thinking about getting a cheapo $50 card for now and getting a new card when DX10 cards arrive, but I keep hearing all this stuff about DX10 cards, like you'll need a 1000W PSU to run these things or that the coding is really bad, etc.


Back to that original question. I've heard people say that the high-end DX9 cards (such as the X1800/X1900) need at least a 550W power supply. This is making me very upset too, because I just bought a budget $40 500W power supply.

I researched, though, and it really just depends on how much power your PSU can deliver over one rail. Obviously, the graphics card needs a certain amount of current over a single 12V "rail".

I'm pretty sceptical, though... My X800XL 256MB ran fine on a 250W power supply. Maybe my next-gen DX10 card will run fine on my budget 500W power supply :lol: 
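For my own sanity I did the back-of-the-envelope math: the watts available on a rail are just volts times amps, and the video card and CPU both pull mostly from 12V. The amp ratings and load figures below are made-up examples, not specs for any real PSU or card.

Code:
# Back-of-the-envelope check of 12V rail headroom.
# All numbers here are hypothetical examples, not real PSU or GPU specs.

def rail_watts(volts, amps):
    """Power available on one rail: P = V * I."""
    return volts * amps

# Suppose a budget 500W PSU advertises two 12V rails rated at 15A and 16A.
rails_amps = [15, 16]
available_12v = sum(rail_watts(12, a) for a in rails_amps)

# Suppose the graphics card draws ~110W and the CPU ~90W from 12V.
estimated_12v_load = 110 + 90

print(f"12V capacity: {available_12v} W")        # 372 W in this example
print(f"Estimated 12V load: {estimated_12v_load} W")
print("Headroom OK" if estimated_12v_load < available_12v else "Too tight")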

Quote:

You're confusing the two. Bump mapping is face-on only; current virtual displacement mapping (ie occluded parallax offset mapping) gives you the appearance of depth when moving; the new geometry shader will allow true displacement mapping like that found on the Parhelia (so ahead of its time :mrgreen: ), which will actually give it depth, not just the appearance of depth


Well... first we used bump-mapping to make things look like they were 3-D, without actual polygons (so it's not as taxing on video cards), now we're going to use "displacement mapping" to actually raise the polygon count so it is 3D in a way.

Why don't we just do it already, and design all of our models with excruciating poly detail so we don't have to make "xxx mapping" do it for us! lol!
July 8, 2006 6:10:49 PM

NightlySputnik, so let me ask you a question, because this is mind-boggling for me due to too much information too fast from two different people. Assuming I have a Crossfire mobo and all goes well, if I get a PCIe X1600XT DX9 card now and a PCIe DX10 series ATI card (something midrange or a bit higher), will I be able to dedicate the X1600XT as my PPU and have the new DX10 card as my GPU?

It'd be really cool if it works, but no biggie if it doesn't. I'd just like to make sure that I don't waste my time building my system around Crossfire (because currently, no AM2 mobos have Crossfire :lol: ).

If DX10 cards are coming within the next year then I can play the 'wait and see' game. If it works then I'll get a DX10 card, if not, then DX9 cards will probably become a bit cheaper at the least.:) 
July 8, 2006 6:13:39 PM

Quote:
Assuming I have a Crossfire mobo and all goes well, if I get a PCIe X1600XT DX9 card now and a PCIe DX10 series ATI card (something midrange or a bit higher), will I be able to dedicate the X1600XT as my PPU and have the new DX10 card as my GPU?


My answer: yes. I say yes because they work separately, so it doesn't really depend upon what your main GPU is. But we'd better make sure someone else here is sure about it.
July 8, 2006 6:43:15 PM

I honestly can't answer a definitive yes to this. But I remember ATI saying you could use pretty much any X-series (X1300, X1600, X1800 and X1900 for now) VPU for physics with any other VPU for dedicated graphics. I can't say about mixing different DX versions of cards, though. Maybe a DX10 VPU will need to be used with another card of the DX10 series to work for physics.

But it might be quite a while before games with ATI's physics engine get to retail. So at that point, all your X1600 card is most probably gonna be good for is physics (no offence intended).

You can rest assured that DX10 cards will be out before year's end for nVidia (barring a very big problem a la the R520) and by February (at worst) for ATI. For midrange, I'd say probably next March more or less, maybe before, depending on Vista's release date.
July 8, 2006 9:18:19 PM

Quote:

Well... first we used bump-mapping to make things look like they were 3-D, without actual polygons (so it's not as taxing on video cards), now we're going to use "displacement mapping" to actually raise the polygon count so it is 3D in a way.


Yeah, except you missed the middle step, where we made it appear even more 3D with virtual displacement mapping (as used very well in Oblivion), and that thought will help you understand why we didn't do it until now.

Quote:
Why don't we just do it already, and design all of our models with excruciating poly detail so we don't have to make "xxx mapping" do it for us! lol!


Of course it's the power required and the memory required. Up until now it hasn't been a concern for anyone but Matrox, and even then the engine was a little slow despite dedicated transistors for the process. It's like Truform in Morrowind on the ATis: the R8500 could do it faster than an R9700 Pro because it had dedicated resources; however, they weren't good for much else, and since Truform/nPatch didn't take off like ATi and users of the feature had hoped, they were dropped in order to add more transistors to yield more speed in a larger number of games/apps.

NOW we have the unified design with geometry shader units; both of these things relieve the overhead. Think about it: if need be, you could dedicate 8 shaders to pixel, 24 to vertex and 32 to geometry if you have some bumpy-as-heck cave wall, and it would probably render amazing detail at a low model count, whereas now there are no shortcuts, and no help from a dedicated geometry shader.
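To picture that, here's a toy sketch (my own made-up illustration; the 64-unit pool and the workload numbers are hypothetical) of splitting one unified pool between the stages instead of having fixed dedicated pipelines:

Code:
# Toy sketch of the unified-shader idea: one pool of shader units that can be
# split between pixel, vertex and geometry work per scene, instead of fixed
# dedicated pipelines. All numbers are invented for illustration.

UNIFIED_UNITS = 64  # hypothetical unified shader count

def allocate(workload):
    """Split the unified pool proportionally to each stage's demand."""
    total = sum(workload.values())
    return {stage: round(UNIFIED_UNITS * demand / total)
            for stage, demand in workload.items()}

# A geometry-heavy scene (say, that bumpy displacement-mapped cave wall)
# versus a pixel-heavy scene; a fixed design couldn't rebalance like this.
print(allocate({"pixel": 2, "vertex": 6, "geometry": 8}))   # {'pixel': 8, 'vertex': 24, 'geometry': 32}
print(allocate({"pixel": 12, "vertex": 3, "geometry": 1}))  # {'pixel': 48, 'vertex': 12, 'geometry': 4}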

The thing that I think might be the fly in the ointment is that with all this increased visual geometry, the physical boundaries of these walls will still be that sphere, not the new bumpy visual surface. Now this might not cause a lot of problems in general, but I would bet that we'll have far more see-through pillars and walls etc. as characters interact with these bumps that don't really exist in the outline of the parameters of the room.
July 8, 2006 9:28:03 PM

Quote:
Damn... now I'm wondering if I should just go ahead with the card I was planning to get:


IMO, get the HIS. Even if you don't use it for physics, you could probably resell it for as much as you bought it for by the time the DX10 cards come out. It's got a quality name, a killer HSF, and will easily be gobbled up for even $80.
July 8, 2006 9:59:09 PM

Quote:
Damn... now I'm wondering if I should just go ahead with the card I was planning to get:


IMO, get the HIS. Even if you don't use it for physics, you could probably resell it for as much as you bought it for by the time the DX10 cards come out. It's got a quality name, a killer HSF, and will easily be gobbled up for even $80.

That's not a bad idea. People love those oversized coolers. The only reason I prefer them is because they exhaust hot air out, rather than use the hot air inside and/or create more hot air inside the case like most other coolers. Plus, even though the GPU might not need it, having a beefy cooler is never a bad thing. 8)
July 8, 2006 10:14:22 PM

Quote:

That's not a bad idea. People love those oversized coolers. The only reason I prefer them is because they exhaust hot air out, rather than use the hot air inside and/or create more hot air inside the case like most other coolers. Plus, even though the GPU might not need it, having a beefy cooler is never a bad thing. 8)


Yep, the Arctic Cooling HSFs are great (sure hope ATi's next cooler doesn't lose its good points). And really, it does make people go, Ooohh Kick-Ass! Of course they're on X1300s too, so it means nothing. But your reason for liking them is one of mine too: all the hot air leaves the case, and that's a good thing because everything else in there is already warm enough. And actually it will let you OC the core a good amount; too bad it's the memory that's really holding it back. But at that price, there's nothing close, and when you resell it, it has everything a checkbox eBayer is looking for: BIG HSF, SM3.0+, HDR+AA, latest tech. Then throw in dual-link DVI; heck, even a photochopper could use it for 2D on a 30" LCD.

Yeah as a buy and resell, I'd say it's a good option for the price. If you were holding on to it for a year or two I'd direct you elsewhere but it seems perfect for your needs.
July 8, 2006 11:02:35 PM

Cool, this actually works both ways for me then. If I can use it as a PPU then it'll stay. If not, off to eBay!:) 
July 10, 2006 12:21:47 AM

Quote:

Currently, if you use an X1900 Crossfire card and an X1600XT, you'd be making a mistake as the pixel pipelines on the better card default to those of the weaker card.


You'd be mistaken, because it's not the case. They would both work independently in their own pixel pipeline configurations, but not in Crossfire, which is no different from anyone else's solution. The X1600 would not be for rendering graphics, except in a multi-monitor capacity.

I should have referenced the actual ATI Crossfire FAQ. This is what it says:

Quote:

6. What happens when you pair a 12-pipe CrossFire Edition card with a 16-pipe card?

A. In this scenario both cards will operate as 12-pipe cards while in CrossFire mode.


http://www.ati.com/technology/crossfire/faq.html

The X1800XL or XT would have been a better example than the X1900XT. The X1900XT has 16 pixel pipelines, but does more shader operations, whereas the X1800 is closer in design to the X1600. Anyways, the above ATI Crossfire information is what I was attempting to explain.

Thanks for the information on the current 2 PCIe x16 and 2 PCIe x8 board. I can't wait till the 3 PCIe Crossfire boards arrive with next year's onboard ATI DX10 graphics. I've always felt that onboard graphics, X200 and above, should work for more than just a second monitor; it should work as a poor man's Crossfire alongside a single card in a single PCIe board, or it should work for physics.

As for the Crossfire Masters, I was referring to the driver issue of having a DX10 Crossfire Master with a DX10 card for the second Crossfire, and an older Crossfire card as the physics in the third setup.

ATI's first Conroe Crossfire boards will be 2 PCIe x8 instead of x16, but they'll have the onboard X700 based graphics.
July 10, 2006 2:38:08 AM

Quote:

I should have referenced the actual ATI Crossfire FAQ. This is what it says:

Quote:

6. What happens when you pair a 12-pipe CrossFire Edition card with a 16-pipe card?

A. In this scenario both cards will operate as 12-pipe cards while in CrossFire mode.


Actually you should read the context of the question: that's an X800 Pro with an X800XT, not an X1600 with an X1800/X1900.

Quote:
The X1800XL or XT would have been a better example than the X1900XT.


Neither way works for your explanation. Understand that the X1600 is 4 'pipelines', not 12; this is the first reason you're going to have difficulty understanding this. No X1600 goes with either the X1800 or the X1900 in the context you are crossfiring them.

Quote:
The X1900XT has 16 pixel pipelines, but does more shader operations, whereas the X1800 is closer in design to the X1600.


You got that backwards, the X1600 is closer to the X1900 design than the X1800 design.

Quote:
Anyways, the above ATI Crossfire information is what I was attempting to explain.


I know what you were attempting, but you're confused. Your first false assumption was the previous line; the second is that ATi is talking about something other than crossfiring two cards of the same series (ie X800 with X800; X1300 with X1300).

Quote:
As for the Crossfire Masters, I was referring to the driver issue of having a DX10 Crossfire Master with a DX10 card for the second Crossfire, and an older Crossfire card as the physics in the third setup.


Well, there won't be older Crossfire cards at that time; it's either DX9 or DX10, and really, like I said before, it has to be the same series, so same DX. As for 2 DX10 in Xfire and 1 DX9 for physics, it'll be fine because it's not rendering graphics; it becomes a physics unit. And then as for the multi-monitor option, it acts like the DX9 cards do now with DX7 or older cards, where the game rendering still sticks to DX10, and the other card is DX9L-supported for basic multi-monitor output.

Quote:
ATI's first Conroe Crossfire boards will be 2 PCIe x8 instead of x16, but they'll have the onboard X700 based graphics.


Which matches the above example, DX10+DX9.
July 10, 2006 3:02:02 AM

Quote:
Damn... now I'm wondering if I should just go ahead with the card I was planning to get:
http://www.newegg.com/Product/Product.asp?item=N82E1681....

Or get a lower midrange card like this one:
http://www.newegg.com/Product/Product.asp?item=N82E1681...

:?


Any 7900GT, or whatever the ATI equivalent is, is at the top of the range of what the human eye can really see.

After that point, all the tests, reports, reviews, benchmarks and other propaganda are just there to keep the writers busy and people spending money.
July 10, 2006 3:17:08 AM

Quote:
Any 7900GT, or whatever the ATI equivalent is, is at the top of the range of what the human eye can really see

Um, that all depends on the game. Oblivion hammers a 7900GT at just 1024x768 resolution, even with the 7900GT averaging 24 fps in this test. We could all see a difference between 10x7 and 16x12 resolution, and can clearly notice when framerates are in the teens.
July 10, 2006 4:52:01 AM

I don't plan to run Oblivion anytime soon... I'll try to get any GPU-demanding game on consoles whenever possible, unless it's an FPS.

This system is mainly for FPS games and some PC-only titles I've missed out on for the past... 6 years.:lol:  So if it can handle FEAR at a widescreen resolution, then I'll be happy.

Pauldh, how's that X1800XT working out for you? That's the card I'm looking to get.:) 
July 10, 2006 10:39:46 AM

Quote:

Actually you should read the context of the question: that's an X800 Pro with an X800XT, not an X1600 with an X1800/X1900.


What page are you reading? Here are the relevant sections of the Crossfire FAQ. The info related to a 16-pipeline card defaulting to 12 pipelines when used with a 12-pipeline card seems to apply across the board, not just to the first implementation of Crossfire.

They both operate at their respective clock speeds, but the card with more pixel pipelines seems to lose out. They list all the Crossfire-ready cards in this table, not just the older series.

Quote:



1. What is the difference between ATI’s CrossFire platform and the competitor’s solution?

A. The principal differences between the competitor’s multi-GPU solutions and ATI’s CrossFire are:

a. CrossFire can enable multi-GPU rendering on all applications.
b. CrossFire supports more rendering modes. Supertiling evenly distributes the workload between the two GPUs to improve performance. CrossFire can use multiple GPUs to improve image quality rather than performance with Super antialiasing (AA) modes. Supertiling and SuperAA modes are only supported on the CrossFire platform.
c. CrossFire is an open platform that supports multiple components and graphics cards that can be mixed and matched in a single system. Competitive multi-GPU solutions are constrained to supporting identical graphics cards.

2. What graphics cards work with CrossFire?

A. A complete CrossFire system requires a CrossFire Ready motherboard, a CrossFire Edition graphics card and a compatible standard Radeon (CrossFire Ready) graphics card from the same series, or two CrossFire Ready cards if they are software enabled. This applies to cards from ATI or any of its partners.
Card One ------------------------ Card Two
Radeon X1900 Series ------------- Radeon X1900 CrossFire Edition
Radeon X1800 Series ------------- Radeon X1800 CrossFire Edition
Radeon X1600 Series ------------- Radeon X1600 Series
Radeon X1300 Series ------------- Radeon X1300 Series
Radeon X850 Series -------------- Radeon X850 CrossFire Edition
Radeon X800, PRO, XL, GTO, XT
or XT Platinum Edition ---------- Radeon X800 CrossFire Edition

3. What is the difference between CrossFire Ready graphics cards and CrossFire Edition graphics cards?

A. CrossFire Edition graphics cards include a “compositing engine” chip on-board. This chip takes the partially rendered image from the CrossFire Ready graphics card, and merges it with the partially rendered image from the CrossFire Edition graphics card. The result is a complete frame rendered at up to twice the performance of a single graphics card. The CrossFire compositing engine is a programmable chip that offers flexible support of different graphics cards, allows a superior feature set (advanced compositing modes), and enables further enhancements to be quickly implemented on next generation products. The CrossFire compositing engine also offers a performance benefit over combining the final image on the GPU.

4. What motherboard is required for a CrossFire system?

A. CrossFire Xpress 3200, CrossFire Xpress 1600, Radeon Xpress 200 CrossFire, Intel i955X and Intel i975X based dual-slot motherboards are supported platforms.

5. When will CrossFire Ready motherboards be available?

A. CrossFire Xpress 3200, CrossFire Xpress 1600, Radeon Xpress 200 CrossFire, Intel i955X and Intel i975X motherboards are available from our partners now.

6. What happens when you pair a 12-pipe CrossFire Edition card with a 16-pipe card?

A. In this scenario both cards will operate as 12-pipe cards while in CrossFire mode.

7. What happens when your CrossFire Edition card and a compatible standard Radeon (CrossFire Ready) graphics card have different clock speeds?

A. Both cards will continue to operate at their individual clock speeds.



The question I have is what happens with the X1900 Crossfire card running with an X1600XT? The X1900 has 16 pipelines that do 3 shader operations each, so does it default to 12 with 3 operations each?

The X1800 is based on the R520 core, the X1600 is based on the R520 core and the X1900 is based on the R580, so I stand by my comment that the X1600 is closer in generation and operation to the X1800 than the X1900.

Regardless of what core generation the X1600 is derived from, it only has 12 pixel pipelines and one operation per pipeline. Thus, it does shaders in a manner closer to the X1800 than the X1900.

Note the Newegg description of the ATI X1900XT:

Quote:

# ATI 102-A52025 Radeon X1900XT 512MB GDDR3 PCI Express x16 Video Card - OEM

Chipset Manufacturer: ATI
Core clock: 625MHz
DirectX: DirectX 9
DVI: 2
Memory Clock: 1450MHz
Memory Interface: 256-bit
OpenGL: OpenGL 2.0
PixelPipelines: 16(48 pixel shader processors)
TV-Out: HDTV/S-Video/Composite Out
VIVO: Yes
# Model #: 102-A52025
# Item #: N82E16814102690


So, I think it's a valid question: an X1900GT with 12 pipelines (36 shader processors) and an X1900 Crossfire card with 16 (48)?

ATI also didn't make the All in Wonder cards Crossfire ready, which is why I can't use an AIW X1900 with an X1900 Crossfire. That would be an ideal situation for the way I use my PC. Perhaps that will change with the DX10 All in Wonder cards.

Quote:

Neither way works for your explanation. Understand that the X1600 is 4 'pipelines', not 12; this is the first reason you're going to have difficulty understanding this. No X1600 goes with either the X1800 or the X1900 in the context you are crossfiring them.


The X1300 is 4 pipelines. You have it wrong. The X1600 is 12 pipelines. Note the relevant info from Newegg:

Quote:

# HIS Hightech H160XTQT256GDDN Radeon X1600XT 256MB GDDR3 PCI Express x16 IceQ Turbo CrossFire Supported Video Card - Retail

Chipset Manufacturer: ATI
Core clock: 600MHz
DirectX: DirectX 9
DVI: 2
Memory Clock: 1400MHz
Memory Interface: 128-bit
OpenGL: OpenGL 2.0
PixelPipelines: 12
TV-Out: HDTV/S-Video Out
VIVO: No
# Model #: H160XTQT256GDDN
# Item #: N82E16814161160


I can't believe you're getting such basic information wrong. There might be a difference between newer drivers and mixing different generations of Crossfire cards not mentioned in their FAQ, but at least don't say that the FAQ info on defaulting to x number of pipelines only applies to one series, when the FAQ as a whole references all the series. At least get the number of pipelines of the X1600XT series correct.

I've owned both ATI and Nvidia cards over the years (and Diamond Voodoo cards before that) and I try to get the information related to each company's cards correct. I've been mainly an ATI fan since I started using All in Wonder cards in each of my PCs. That's an area where Nvidia just can't compete.
July 10, 2006 10:51:59 AM

I'm a long-term Elder Scrolls fan, since TES: Arena, and IMHO Oblivion is an FPS. It's closer to a Battlespire 2 than it is to the next Daggerfall. I think it will compare favorably to the "Half Life Too" that Ubisoft is making out of the Might and Magic series.

Other people have been arguing that Oblivion isn't a true RPG and isn't a true FPS, so even if it's a chimera in design, it's at least a marketing success. No matter what anyone says about today's Bethsoft, they know how to market.
July 10, 2006 4:27:02 PM

I can't tell; everybody has a different point of view on this.

All I can tell is that I'll probably get a Radeon X1800XT for around $350 (Canadian) when I get my new computer in September, unless DX10 cards are already out. This way I'll have good performance without paying too much. If I got a $600+ VPU, it would be too much for a card that I'd use maybe only 9-12 months before upgrading to DX10.

If I had more patience, I would wait until December to get my computer with a DX10 card. I might actually do this; I don't know yet. It's only that at some point you do have to buy, or you'll wait forever. I just don't know when the best time is.

That's what you should ask yourself.

My 2 cents :wink:
July 10, 2006 6:42:30 PM

I tried to hold back my criticism before, but seriously dude, you don't know $H1T, and now you're confusing people who don't know that you don't know squat, as well as wasting my time correcting you, thus Pi$$ing me off in the process!



Quote:

What page are you reading? Here are the relevant sections of the Crossfire FAQ. The info related to a 16-pipeline card defaulting to 12 pipelines when used with a 12-pipeline card seems to apply across the board, not just to the first implementation of Crossfire.


The section that you're missing, the first:
a CrossFire Edition graphics card and a compatible standard Radeon (CrossFire Ready) graphics card from the same series, or two CrossFire Ready cards if they are software enabled.

ie X1600 with X1600, X1800 with X1800, X1900 with X1900.
That you can't wrap your head around that means there's nothing left to discuss until you do more, and more importantly BETTER, research. Stop spouting your misconceptions; you're just F'in up people who know as little as you!

Quote:
The X1300 is 4 pipelines. You have it wrong. The X1600 is 12 pipelines. Note the relevant info from Newegg:


Whatever dude, you're an idjit if you take your information from NewEgg; heck, they list AGP cards as Crossfire ready! :roll:

Quote:
I can't believe you're getting such basic information wrong.


I can't believe that you still bother posting your ignorance. I'm right on this and you're wrong.

And if you continue like this then you better get used to it. Now STFU before I beat you with my massive brain!

Quote:
There might be a difference between newer drivers and mixing different generations of Crossfire cards not mentioned in their FAQ, but at least don't say that the FAQ info on defaulting to x number of pipelines only applies to one series, when the FAQ as a whole references all the series. At least get the number of pipelines of the X1600XT series correct.


Seriously, there's no point in discussing it with you, because you wouldn't know the difference between a pixel pipe and a crack pipe, although you sound like you're more familiar with the latter.

Quote:
I've owned both ATI and Nvidia cards over the years (and Diamond Voodoo cards before that) and I try to get the information related to each company's cards correct.


Just because you own it doesn't mean you know anything about it. Just like your average DELL owner.
July 10, 2006 7:08:30 PM

Quote:
Sure some features are really nice (like using a USB flash drive as system memory!)


Why would you want to use slow memory as system RAM?
If you meant BOOT from it (USB), you can do that already.

http://tkc-community.net/Forums/viewtopic.php?t=4206

Vista will be a hog. As it is, if you have 2GB of RAM, it wants to use up to 940MB and load the whole thing into memory.
Looks like a good idea... make the OS fast by having it all in RAM... until you need that RAM.

It has good points and bad points, but the main thing I dislike is the DRM locked into its core.

Yes, that guy knew what he was talking about... a flash memory disk should have considerably lower access times than a conventional hard drive... I believe Vista plans on using that sort of alternate storage to complement RAM... not replace it. I think M$ envisions it being used for the temp/swap file.
July 10, 2006 7:18:44 PM

From what I read, it's primarily the notebook/laptop segment that is forced to do this. It's for the auto-archiving, etc.

The flash RAM has quicker access (thanks to no spin-up time) but slower transfer rates.

It'll be nice, but once they get MRAM on the drives, it should be quick, fast and efficient, eventually likely replacing HDs altogether IMO, but that's a ways off.
July 11, 2006 12:05:50 AM

Stop being a troll. I used the Newegg data for quick reference. If you want Anandtech, then here it is:

Quote:

ATI's RV530 aka Radeon X1600

Targetting the upper midrange is the Radeon X1600 (RV530), built on the same 90nm process as the Radeon X1800 (R520). All of these RV530 series and lower are single slot products, according to the roadmaps.

ATI RV530 Roadmap
Card      | Pipes | Std Core Clock | Std Memory | Power Consumption
X1600 XT  | 12    | 600MHz         | 700MHz     | 60W
X1600 Pro | 12    | 500MHz         | 400MHz     | 40W



Neoseeker

Lastly, direct from ATI

Quote:

GPU Specifications
Features
157 million transistors on 90nm fabrication process
Dual-link DVI
Twelve pixel shader processors
Five vertex shader processors
128-bit 4-channel DDR/DDR2/GDDR3 memory interface
Native PCI Express x16 bus interface
AGP 8x configurations also supported with AGP-PCI-E external bridge chip
Dynamic Voltage Control


Have fun trolling somebody else, but anyone who yells, insults and swears, all the while claiming that an X1600XT is 4 pixel pipelines, is either a fool, a deliberate troll, or a fanboy of a competing GPU company who just can't get his facts straight out of emotional need.
July 11, 2006 2:25:39 AM

Quote:
Stop being a troll.


LOL!
Troll would imply I'm wrong. So far you're the only one posting incorrect information, and continuing to do so despite sinking in your own mistakes.

Why am I not surprised you're an Anandtech flunkie. :roll:

Quote:

Twelve pixel shader processors


Which is not a pipeline. Why do you think they expressly said 'processor' and not 'pipeline'?
In order to count as a pipeline it must have a texture component, of which it only has 4; therefore 12 shader processors, and 4 pipelines. Like I said before.

http://www.tomshardware.com/2005/10/06/ati_enters_the_x...

"Radeon X1000 architecture de-couples components of the rendering pipeline."

Only 4 full pixels can be processed per clock; each pipeline contains 3 pixel shader units and one texture unit.

The X1600 has 12 shader processors and 4 texture units, therefore 4 full pipelines. The X1800 has both 16 shader units and 16 texture units, and as such 16 full pipelines, and the X1900 has 48 shader units and 16 texture units, and as such still just 16 pipelines.
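If it helps to see the counting written out (unit counts as cited in this thread; treat them as the discussion's numbers rather than an official spec sheet):

Code:
# Counting "full pipelines" as described above: a full pipeline needs a
# texture unit, so the pipeline count is set by the texture units, while the
# shader processors per pipeline can be higher.

cards = {
    "X1600": {"pixel_shaders": 12, "texture_units": 4},
    "X1800": {"pixel_shaders": 16, "texture_units": 16},
    "X1900": {"pixel_shaders": 48, "texture_units": 16},
}

for name, units in cards.items():
    pipelines = units["texture_units"]                      # one texture unit per full pipeline
    shaders_per_pipe = units["pixel_shaders"] // pipelines  # shader ALUs sharing each pipeline
    print(f"{name}: {pipelines} full pipelines, "
          f"{shaders_per_pipe} pixel shader processor(s) per pipeline")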

Quote:
Have fun trolling somebody else, but anyone who yells, insults and swears, all the while claiming that an X1600XT is 4 pixel pipelines, is either a fool, a deliberate troll, or a fanboy of a competing GPU company who just can't get his facts straight out of emotional need.


Anyone who doesn't know the difference between a pipeline and a single pixel shader is an ignorant nÖÖb, especially if they think they can Crossfire an X1600 with an X1800 or X1900, while posting a FAQ proving themselves wrong.

It's quite lame that you try to use your mistaken understanding of the difference between a single shader unit and a pipeline to try and defend your original mistakes. So far you've only posted a list of sites that made similar mistakes.

Only later did the reviewers finally figure out what the new design was;

"It wasn't until RV530 (Radeon X1600) was fully understood did these descriptions become clearer as to their intent; with a configuration signalled on the roadmap as 4-1-3-2 for RV530 it become clear that the numbers represented the number of "pipelines" (even though ATI's engineering didn't like this terminology), the number of texture units per pipeline, parallel shader ALUs per pipeline and the number of Z samples per pipeline."

http://www.beyond3d.com/reviews/ati/r580/

Read the first two pages of that review and edjumakate yourself, or else go crawl back to Anandtech where you belong!


edited typo.
July 11, 2006 3:31:16 PM

Quote:
anyone who yells, insults and swears, all the while claiming that an X1600XT is 4 pixel pipelines, is either a fool, a deliberate troll, or a fanboy of a competing GPU company who just can't get his facts straight out of emotional need.


He was actually right. It has 12 pixel shaders, but it can only render four pixels per clock. This puts it at 4 pipelines. Do a search for a Radeon X1600 review and you'll find good explanations of the card's specifications. Because of this, it won't perform so badly in today's games and upcoming ones. That's because pixel calculations for these games are multiple for any single pixel rendered on the screen (which means they are more complex). But in older games it will lag behind, because 8 out of 12 pixel shader units will just have nothing to do.

I'd say it's a not-so-bad card overall, because older games don't need all that horsepower, and newer games will make good use of it. Still, there are better options available.

My 2 cents. :wink: