What ever happened to the ATI Physics driver?

I love ATI, but they totally dropped the ball...on me, on this one. I thought it would come to pass, and I still have a 1600XT I bought just for that purpose (over a year ago), still in the shrink wrap. Now I see that, through the natural progression of things, AGEIA cards are $150.00 at CompUSA. Nvidia has also stopped talking about it (physics), and IT IS SAID that quad cores (C2D) will be able to handle the physics (Crysis). I also hear that Havok was bought by INTEL. That leaves AGEIA where?

The future of Physics accelerated or pushed back again?
22 answers Last reply
  1. OH OH --- CompUsa just dropped the BFG Physx card to $100.00.
  2. imo don't buy technology hardware till there's software to run it
  3. I had tremendously high hopes for AGEIA, but it was a complete flop. Oh well... CellFactor was the game I was most looking forward to, but it went downhill as well.
  4. hcforde said:
    What ever happened to the ATI Physics driver?

    I love ATI, but they totally dropped the ball...on me, on this one. I thought it would come to pass, and I still have a 1600XT I bought just for that purpose (over a year ago), still in the shrink wrap. Now I see that, through the natural progression of things, AGEIA cards are $150.00 at CompUSA. Nvidia has also stopped talking about it (physics), and IT IS SAID that quad cores (C2D) will be able to handle the physics (Crysis). I also hear that Havok was bought by INTEL. That leaves AGEIA where?

    The future of Physics accelerated or pushed back again?
    While both ATI and Nvidia flexed their muscles to show that their cards would be capable of doing physics calculations, neither company has come out with drivers or software supporting GPU physics. While the AGEIA PhysX card has definitely had trouble, Unreal Tournament 3 this November is going to determine whether the card lives or dies. As CPUs continue to evolve with more cores, I find it more likely we'll see CPU-based physics become the most popular physics solution. While, yes, the PhysX card and GPUs could do certain things better, like more realistic cloth and liquid physics, I think the majority of developers will not support these tools because they'll be the minority of the market, and developers are more interested in making money than in having the most advanced game out.
  5. Yeah, it just isn't needed now that we are going into the multicore era. With quad cores already, there is so much power and so many resources available that a PhysX card seems overkill, lol. It is a feature that graphics cards should have built in already, and it is strange they haven't jumped on this sooner (AGEIA should never have had a chance, but ATI and NV were sleeping on this one).

    As we get more and more CPU cores each year, the need for ridiculous hardware will diminish. I am willing to bet that UT3 will play well on quad-core PCs with a powerful video card, whether they have a physics card or not.
  6. But once an engine is created that supports it, can't other people create games on the engine as well? It might not even work that way, I just thought I would pose the question.
  7. weskurtz81 said:
    But once an engine is created that supports it, can't other people create games on the engine as well? It might not even work that way, I just thought I would pose the question.


    Don't see why not; that's basically what happened with the (oh god, I'm gonna get fried for this) Doom or Quake engine, whichever one came first.
    Mactronix
  8. Heyyou27 said:
    While both ATI and Nvidia flexed their muscles to show that their cards would be capable of doing physics calculations, neither company has come out with drivers or software supporting GPU physics. While the AGEIA PhysX card has definitely had trouble, Unreal Tournament 3 this November is going to determine whether the card lives or dies. As CPUs continue to evolve with more cores, I find it more likely we'll see CPU-based physics become the most popular physics solution. While, yes, the PhysX card and GPUs could do certain things better, like more realistic cloth and liquid physics, I think the majority of developers will not support these tools because they'll be the minority of the market, and developers are more interested in making money than in having the most advanced game out.

    How long until CPUs have the power of the X1800XT? Nehalem is coming :bounce: Maybe have a specialized "core"? It's unlikely we'd see one CPU core used just for physics.
  9. I believe there is support for doing physics on graphics cards. It's a technology called Havok FX (different from plain Havok). The only thing is, so far no developer has implemented it. Please correct me if I am wrong.
  10. masteryoda34 said:
    I believe there is support for doing physics on graphics cards. It's a technology called Havok FX (different from plain Havok). The only thing is, so far no developer has implemented it. Please correct me if I am wrong.
    I believe the first game to use Havok FX is going to be Hellgate: London.
  11. hcforde said:
    What ever happened to the ATI Physics driver?

    I love ATI but they totally dropped the ball...on me, on this one.


    They are working with the other companies on physics, so it's not as if just one company dropped the ball on this. The lead company for VPU physics was Havok, and with Intel owning them (they should never have been allowed to buy Havok, due to anti-trust concerns in this burgeoning field), who knows where it stands now. Larrabee isn't a VPU-physics type of solution; it'll simply be a multi-core solution. DirectPhysics is in M$' hands, so without a single unified API, neither AMD nor nV was going to go it alone.

    Quote:
    I thought it would come to pass and I still have a 1600xt I bought just for that purpose (over a year ago)still in the shrink wrap.


    Very dumb idea; X1600s weren't getting more expensive or rarer with time, so why buy early?

    Never buy hardware until you need it. It would be just as dumb as buying a PhysX card back then to anticipate its potential use in UT3 which may or may not be a future add-on (beyond the usual 'Demo level').
  12. Heyyou27 said:
    While both ATI and Nvidia flexed their muscles to show that their cards would be capable of doing physics calculations, neither company has come with drivers or software supporting GPU Physics.


    They have software and drivers, just nothing more than demos though, no games.

    Quote:
    While the AGEIA PhysX card has definitely had trouble, Unreal Tournament 3 this November is going to determine whether the card lives or dies.


    Which actually looks bad; early reviews show it to be a separate add-on with limited support, and like GRAW it will basically have an AGEIA demo area built into UT3, but it won't be as integral as we'd hoped for all of these physics solutions.

    Quote:
    As CPUs continue to evolve with more cores, I find it more likely we'll see CPU based physics being the most popular physics solution.


    Yep, and I think that's why Intel bought Havok: to give them that direction for their future Larrabee and multi-core universe, where the average user doesn't need one, but a gamer could easily have made a case for a new mega CPU versus a quad core with 3 graphics cards or a PPU, had either of those solutions gotten traction. Now Intel owns the largest physics software company and can guide the direction of physics gameplay back towards their hardware, and given that they're making graphics chips, I wouldn't hold my breath thinking that they'll make it free and open to AMD and nV even if they go the GPU route.

    Quote:
    I think the majority of developers will not support these tools because they'll be the minority of the market, and they're more interested in making money than having the most advanced game out.


    Personally I think they aren't going to be worried about having the most advanced game so much as having an easy and profitable path to use some additional features. I suspect intel will be more biased with their implementation of Havok features favouring themselves, more so than AMD and nV were with GITG and TWIMTBP.
  13. TheGreatGrapeApe said:
    Yep, and I think that's why Intel bought Havok: to give them that direction for their future Larrabee and multi-core universe, where the average user doesn't need one, but a gamer could easily have made a case for a new mega CPU versus a quad core with 3 graphics cards or a PPU, had either of those solutions gotten traction. Now Intel owns the largest physics software company and can guide the direction of physics gameplay back towards their hardware, and given that they're making graphics chips, I wouldn't hold my breath thinking that they'll make it free and open to AMD and nV even if they go the GPU route...

    ...Personally I think they aren't going to be worried about having the most advanced game so much as having an easy and profitable path to use some additional features. I suspect intel will be more biased with their implementation of Havok features favouring themselves, more so than AMD and nV were with GITG and TWIMTBP.


    I agree... Intel is not the best of friends when it comes to making the "best" implementation of an "open" platform... while I am sure that if there is a VPU solution in the newer Havok renditions, AMD/NV will be able to leverage at least some of it (but they don't NEED to, as graphics are still their bread and butter)... AGEIA is in a tight spot that is getting smaller each day.
  14. TheGreatGrapeApe said:
    They have software and drivers, just nothing more than demos though, no games.
    You're right; I've run those silly tech demos from Nvidia that technically show physics calculations can be done on the GPU.
  15. To the GreatGrapeApe,

    Quote:
    I thought it would come to pass and I still have a 1600xt I bought just for that purpose (over a year ago) still in the shrink wrap.

    Very dumb idea; X1600s weren't getting more expensive or rarer with time, so why buy early?

    I got the card for $100 at BB. Other sellers were still selling it at $200. Actually, they are very rare to find now in such pristine, mint condition. :-)
  16. Maybe Intel will now buy a gaming company and SHOWCASE its new physics technology on a 'special/programmable' CPU core. Maybe one of the cores can be programmable for graphics or graphics assist.

    How about a six-core CPU, so you can have your quad and the other 2 cores dedicated to graphics rendering of some nature?

    This would give AMD another run for its money, because the speculation was that an AMD/ATI company could put both CPU and GPU on the same card and enjoy the benefits of that pairing. I do not know how all this will work out, but I do not see Intel letting up the pressure on AMD ever again.
  17. hcforde said:
    ...because the speculation was that an AMD/ATI company could put both CPU and GPU on the same card and enjoy the benefits of that pairing.


    Not speculation. They are doing it... not on a card, but both on a CPU package. Tom's has had many news articles speaking of this. I am not flaming nor fanboi-posting, just keeping the info real. ;)
  18. sojrner,

    Good to hear they have not dropped the ball on that. One poster PM'ed me when the ATI/AMD merger was new and told me the same thing, among other things that were going on behind the scenes. GOOD to know that AMD is still focused on it. It also makes the WHY of the Intel/Havok thing more understandable. Classic business model: stop competing on price after your product becomes a commodity, and begin to compete on features.
  19. hcforde said:
    To the GreatGrapeApe,

    Quote:
    I thought it would come to pass and I still have a 1600xt I bought just for that purpose (over a year ago) still in the shrink wrap.

    Very dumb idea; X1600s weren't getting more expensive or rarer with time, so why buy early?

    I got the card for $100 at BB. Other sellers were still selling it at $200. Actually, they are very rare to find now in such pristine, mint condition. :-)


    Just think, if you keep that card all pristine and in its factory box, maybe your grandchildren will be able to sell it for a fortune as an antique from the past. Won't be worth anything as a computer part, but the museum quality, that's the ticket. Kind of like my 1965 Plymouth with the 426. It sold new for $3000, plunged in value to a few hundred dollars, and now I've seen similar cars sell for a couple million on today's market. You just never know what's going to end up trash and what's going to be worth a pot of gold.
  20. rofl... put it in a hyperbaric chamber like Michael Jackson... it will live forever!
  21. sojrner said:
    Not speculation. They are doing it... not on a card, but both on a CPU package. Tom's has had many news articles speaking of this. I am not flaming nor fanboi-posting, just keeping the info real. ;)


    The combo CPU/GPU is just that: a CPU and a video card on one chip, nothing more. NO speed increase, no performance increase; just a simpler solution, nothing else.
    If they made the GPU an APU on the CPU, that's different (like a PowerPC CPU). GPUs calculate far faster than any CPU, period. The reason physics cards died is that most calculations are taken by the GPU; the physics cards get the "leftovers", hence NO performance gain (some report a performance DROP). Havok FX (aka hardware) was supposed to address that, so code could be sent directly to the physics card instead of the GPU intercepting it.
    And btw, ATI and Nvidia do support physics in their GPUs, as one poster said, via Havok (software).
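The "one CPU core (or thread) just for physics" idea that comes up several times in this thread can be sketched in a few lines. This is a hypothetical illustration (all names made up, in Python rather than the C++ a real engine would use): a worker thread steps a simple falling point mass with a fixed timestep while the main thread waits for the result, which is the basic handoff pattern a dedicated physics core would use.

```python
import threading
import queue

GRAVITY = -9.81  # m/s^2
DT = 0.01        # fixed physics timestep, in seconds

def physics_worker(pos, vel, steps, out_q):
    """Semi-implicit Euler integration: update velocity first, then position."""
    for _ in range(steps):
        vel += GRAVITY * DT
        pos += vel * DT
    out_q.put((pos, vel))  # hand the final state back to the consuming thread

results = queue.Queue()
# Drop a particle from 100 m with zero initial velocity; simulate 1 second.
worker = threading.Thread(target=physics_worker, args=(100.0, 0.0, 100, results))
worker.start()
worker.join()
pos, vel = results.get()
print(f"after 1 s: pos = {pos:.2f} m, vel = {vel:.2f} m/s")
```

In a real game the worker would run every frame and the render thread would read the freshest state instead of joining, but the division of labor is the same: the physics thread owns the simulation, everyone else just consumes its output.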