The use of an ATI card for physics starts with the X1600 series.
Actually, ATi mentioned the X1300 in a lot of their early material; it's just that they demoed on the X1600, likely because the X1300 didn't outperform Ageia's PhysX card by enough to make it into the demo. But there's no physical restriction to it.
Though a Twitch Guru comment said that onboard graphics could be unlockable for physics in a single PCIe card setup sometime in the future.
And possibly even unused portions of a single VPU, if the load can be well balanced.
Now, I've read there will be boards with something like two PCIe x16 slots and one PCIe x8 (or three PCIe x16) in the future.
They already exist: Tyan has 2 x16 + 2 x8, and the ATi demo was on 2 x16 + 1 x8, but future models are to be 3 x16. And Gigabyte has their quad board, which is 4 x8.
Currently, if you use an X1900 Crossfire card and an X1600XT, you'd be making a mistake as the pixel pipelines on the better card default to those of the weaker card.
You'd be mistaken, because that's not the case. They would both work independently in their own pixel pipeline configurations, but not in Crossfire, which is no different from anyone else's solution. The X1600 would not be for rendering graphics, except in a multi-monitor capacity.
If you got a Crossfire master card (i.e. an X1800 for use with another X1800, or an X1900 for use with another X1900), then you'd be spending quite a bit of money for a one-year DX9.0c solution.
So obviously you should get two X1800 GTOs and then you're set, with no worries about a master card.
I'm not sure the Crossfire masters would be usable for physics.
I'm not sure why you would think they wouldn't be. Of course they would.
Would there be a conflict with the newer Crossfire DX10 master, or would the third card in a three card system not be recognized for graphics, but for physics?
Depends on M$'s implementation, but it would act very much like XP does now: fine for multi-monitor, fine for physics, but graphics for gaming becomes a case-by-case application.
That's what's up in the air. It should pan out nicely once the 3- or 4-slot PCIe x16 (or even x8) boards are available.
Like I said, they're already here.
It's Havok FX physics that will be supported by the ATI cards,
And also ATi's own implementation, which isn't a replacement for Havok's but comes in addition to it.
so games that use other physics engines might or might not benefit.
Which would be a small number of games, since it's primarily Havok doing the physics now, with a small number using Ageia (most of those are future titles). That's why ATi would offer the alternative, in case things change in the future. Right now, though, there's no worry.
It depends on if there's a DirectPhysics standard by Microsoft for Vista that's supported by all the physics engines out there.
Actually, it won't be supported by all the engines completely, since M$ is not going the PPU route, only the VPU route, and thus Ageia will have to decide whether they wish to support it; and even then, developers may add their own support in addition to Ageia's restricted path.
However, in all likelihood M$'s decision to enter the fray has basically ensured that the future of VPU-based physics will have a generalized/standardized platform upon which to build other engines. At worst, IMO, there'll be Ageia, Havok and M$'s solution, the last two of which both ATi and nV have said they'd support. And it's unlikely that ATi's and nV's own solutions would be anything game developers would adopt instead of those three; more likely they'd be a means of bridging them, the way their graphics drivers do now.
MMmm physics drivers, yeah!