Whatever happened to the ATI Physics driver?

hcforde

Distinguished
Feb 9, 2006

I love ATI, but they totally dropped the ball on this one... and on me. I thought it would come to pass, and I still have an X1600 XT I bought just for that purpose (over a year ago), still in the shrink wrap. Now I see that, through the natural progression of things, AGEIA cards are $150.00 at CompUSA. Nvidia has also stopped talking about physics, and IT IS SAID that quad cores (C2D) will be able to handle the physics (Crysis). I also hear that Havok was bought by INTEL. That leaves AGEIA where?

So is the future of physics accelerated, or pushed back again?
 

justinmcg67

Distinguished
Sep 24, 2007
I had tremendously high hopes for AGEIA, but it was a complete flop. Oh well... even CellFactor, the game I was most looking forward to, went downhill as well.
 

Heyyou27

Splendid
Jan 4, 2006
While both ATI and Nvidia flexed their muscles to show that their cards would be capable of doing physics calculations, neither company has come out with drivers or software supporting GPU physics. While the AGEIA PhysX card has definitely had trouble, Unreal Tournament 3 this November is going to determine whether the card lives or dies. As CPUs continue to evolve with more cores, I find it more likely we'll see CPU-based physics being the most popular physics solution. And yes, while the PhysX card and GPUs could do certain things better, like more realistic cloth and liquid physics, I think the majority of developers will not support these tools because they'll be the minority of the market, and they're more interested in making money than in having the most advanced game out.
 

gamebro

Distinguished
Mar 10, 2007
Yeah, it just isn't needed now that we are going into the multi-core era. With quad cores already here, there is so much power and so many resources available that a PhysX card seems like overkill. It is a feature that graphics cards should have built in already, and it is strange they haven't jumped on this sooner (Ageia should never have had a chance, but ATI and Nvidia were sleeping on this one).

As CPUs gain more and more cores each year, the need for ridiculous extra hardware will diminish. I am willing to bet that UT3 will play well on quad-core PCs with a powerful video card, whether they have a physics card or not.
 

weskurtz81

Distinguished
Apr 13, 2006
But once an engine is created that supports it, can't other people create games on that engine as well? It might not even work that way; I just thought I would pose the question.
 


Don't see why not; that's basically what happened with the (oh god, I'm gonna get fried for this) Doom or Quake engine, whichever one came first.
Mactronix
 

enewmen

Distinguished
Mar 6, 2005

How long until CPUs have the power of the X1800 XT? Nehalem is coming :bounce: Maybe they'll have a specialized "core"? Or maybe just one ordinary CPU core used for nothing but physics.
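
For what it's worth, here's a minimal sketch of what "one CPU core used just for physics" could look like in practice: a fixed-timestep integration loop running on its own dedicated thread while the rest of the game does other work. Everything in it (the Particle struct, the 60 Hz step rate) is made up for illustration, not taken from any real engine.

```cpp
// Minimal sketch: physics on its own dedicated CPU thread.
// All names here are hypothetical, for illustration only.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

std::atomic<bool> running{true};

// Fixed-timestep Euler integration; this thread does nothing else,
// so on a multi-core CPU it effectively occupies "the physics core".
void physicsLoop(std::vector<Particle>& world) {
    using clock = std::chrono::steady_clock;
    const float dt = 1.0f / 60.0f;                 // 60 steps per second
    auto next = clock::now();
    while (running.load()) {
        for (auto& p : world) {
            p.vy += -9.81f * dt;                   // gravity
            p.x += p.vx * dt;
            p.y += p.vy * dt;
            p.z += p.vz * dt;
            if (p.y < 0.0f) { p.y = 0.0f; p.vy *= -0.5f; }  // floor bounce
        }
        next += std::chrono::microseconds(16667);  // ~1/60 s
        std::this_thread::sleep_until(next);
    }
}

int main() {
    std::vector<Particle> world(10000, Particle{0, 10, 0, 1, 0, 0});
    std::thread physics(physicsLoop, std::ref(world)); // the dedicated thread
    std::this_thread::sleep_for(std::chrono::seconds(1)); // "game" runs here
    running = false;
    physics.join();
    std::printf("particle 0 is at y = %f\n", world[0].y);
}
```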
 

masteryoda34

Distinguished
Feb 9, 2006
I believe there is support for doing physics on graphics cards. It's a technology called Havok FX (different from plain Havok). The only thing is, so far no developer has implemented it. Please correct me if I am wrong.
 

Heyyou27

Splendid
Jan 4, 2006
I believe the first game to use Havok FX is going to be Hellgate: London.
 


They are working with the other companies on physics, so it's not as if just one company dropped the ball on this. The lead company for VPU physics was Havok, and with Intel owning them (they should never have been allowed to buy Havok, given anti-trust concerns in this burgeoning field), who knows where it stands now. Larrabee isn't a VPU-physics type of solution; it'll simply be a multi-core solution. DirectPhysics is in M$'s hands, so without a single unified API, neither AMD nor nV was going to go it alone.

I thought it would come to pass, and I still have an X1600 XT I bought just for that purpose (over a year ago), still in the shrink wrap.

Very dumb idea; X1600s weren't getting more expensive or rarer with time, so why buy early?

Never buy hardware until you need it. It would be just as dumb as buying a PhysX card back then to anticipate its potential use in UT3, which may or may not be a future add-on (beyond the usual 'demo level').
 


They have software and drivers, just nothing more than demos, though; no games.

While the AGEIA PhysX card has definitely had trouble, Unreal Tournament 3 this November is going to determine whether the card lives or dies.

Which actually looks bad; early reviews show it to be a separate add-on with limited support, and like GRAW, UT3 will basically have an Ageia demo area built in, but it won't be as integral as we'd hoped all of these physics solutions would be.

As CPUs continue to evolve with more cores, I find it more likely we'll see CPU-based physics being the most popular physics solution.

Yep, and I think that's why Intel bought Havok: to steer that direction toward their future Larrabee and multi-core universe. The average user doesn't need one, but a gamer could now easily make a case for a new mega-CPU versus a quad core with three graphics cards or a PPU, had either of those solutions gotten traction. Now Intel owns the largest physics software company and can guide the direction of physics gameplay back towards their hardware, and given that they're making graphics chips, I wouldn't hold my breath thinking they'll make it free and open to AMD and nV, even if those two go the GPU route.

I think the majority of developers will not support these tools because they'll be the minority of the market, and they're more interested in making money than in having the most advanced game out.

Personally, I think developers aren't going to be worried about having the most advanced game so much as having an easy and profitable path to some additional features. I suspect Intel will be more biased in implementing Havok features to favour themselves, more so than AMD and nV were with GITG and TWIMTBP.
 

sojrner

Distinguished
Feb 10, 2006


I agree... Intel is not the best of friends when it comes to making the "best" implementation of an "open" platform. While I am sure that if there is a VPU solution in the newer Havok renditions, AMD/nV will be able to leverage at least some of it (but they don't NEED to, as graphics are still their bread and butter)... Ageia is in a tight spot that is getting smaller each day.
 

Heyyou27

Splendid
Jan 4, 2006
You're right; I've run those silly tech demos from Nvidia that technically show physics calculations can be done on the GPU.
 

hcforde

Distinguished
Feb 9, 2006
To the GreatGrapeApe,

Quote :

I thought it would come to pass, and I still have an X1600 XT I bought just for that purpose (over a year ago), still in the shrink wrap.

Very dumb idea; X1600s weren't getting more expensive or rarer with time, so why buy early?

I got the card for $100 at BB. Other sellers were still selling it at $200. Actually, they are very rare to find now in such pristine, mint condition. :)
 

hcforde

Distinguished
Feb 9, 2006
Maybe Intel will now buy a gaming company and SHOWCASE its new physics technology on a 'special/programmable' CPU core. Maybe one of the cores could be programmable for graphics or graphics assist.

How about a six-core CPU, so you can have your quad and the other two cores dedicated to graphics rendering of some nature?

This would give AMD another run for its money, because the speculation was that a combined AMD/ATI company could put both CPU and GPU on the same card and enjoy the benefits of that pairing. I do not know how all this will work out, but I do not see Intel letting up the pressure on AMD ever again.
 

sojrner

Distinguished
Feb 10, 2006


Not speculation. They are doing it... not on a card, but both on a CPU package. Tom's has had many news articles on this. I am not flaming or fanboi-posting, just keeping the info real. ;)
 

hcforde

Distinguished
Feb 9, 2006
sojrner


Good to hear they have not dropped the ball on that. One poster PM'ed me when the ATI/AMD merger was new and told me the same thing, among other things going on behind the scenes. GOOD to know that AMD is still focused on it. It also makes the WHY of the Intel/Havok thing more understandable. Classic business model: stop competing on price after your product becomes a commodity, and begin to compete on features.
 

sailer

Splendid


Just think, if you keep that card all pristine and in its factory box, maybe your grandchildren will be able to sell it for a fortune as an antique from the past. Won't be worth anything as a computer part, but the museum quality, that's the ticket. Kind of like my 1965 Plymouth with the 426. It sold new for $3000, plunged in value to a few hundred dollars, and now I've seen similar cars sell for a couple million on today's market. You just never know what's going to end up trash and what's going to be worth a pot of gold.
 


The combo CPU/GPU is just that: a CPU and a video card on one chip, nothing more. NO speed increase, no performance increase; just a simple solution, nothing else.

If they made the GPU an APU on the CPU, that would be different (like a PowerPC CPU). GPUs calculate far faster than any CPU, period. The reason physics cards died is that most calculations are taken by the GPU; the physics cards get the 'leftovers', hence NO performance gain (some reports show a performance DROP). Havok FX (i.e., the hardware path) was supposed to address that, so code could be sent directly to the physics card instead of the GPU intercepting it.

And BTW, ATI and Nvidia do support physics on their GPUs, as someone said, via Havok (software).
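
To put the "GPUs calculate far faster" point in concrete terms: per-object physics math is embarrassingly parallel, which is exactly what GPUs, and extra CPU cores, are built for. Below is a toy sketch (the Body struct and all numbers are invented for illustration) that splits one integration step across all available hardware threads; real engines also handle collisions and constraints, which parallelize far less cleanly.

```cpp
// Toy sketch: one physics step split across N worker threads.
// Each body's update is independent, so the ranges never conflict.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

struct Body { float y, vy; };

void updateRange(std::vector<Body>& bodies, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {
        bodies[i].vy += -9.81f * dt;       // gravity
        bodies[i].y  += bodies[i].vy * dt; // integrate position
    }
}

int main() {
    std::vector<Body> bodies(1000000, Body{100.0f, 0.0f});
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    const float dt = 1.0f / 60.0f;

    std::vector<std::thread> workers;
    const size_t chunk = bodies.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        size_t begin = t * chunk;
        size_t end = (t + 1 == n) ? bodies.size() : begin + chunk;
        workers.emplace_back(updateRange, std::ref(bodies), begin, end, dt);
    }
    for (auto& w : workers) w.join();      // one step done, in parallel

    std::printf("body 0: y = %.3f after one step on %u threads\n",
                bodies[0].y, n);
}
```

A GPU takes the same idea to the extreme: hundreds of such "workers" running the per-body math at once, which is why offloading this kind of work looked so attractive on paper.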