
So what's the deal with Havok, ATi Physics Technology, etc.?

January 27, 2010 12:42:49 AM

Hello,

I was searching around Newegg.com and saw there was a PhysX-oriented Nvidia card from EVGA. PhysX has been around for a while now, but I was curious to ask:

Did ATI ever finish making their own physics engine, or did they end up buying one from Havok?

Can you currently dedicate an ATI graphics card to running physics (one main card + one physics offload card)?


Thanks.
January 27, 2010 1:25:56 AM

You can use an Nvidia card alongside an ATI card to offload PhysX in games that use it, but you need hacked drivers. I wouldn't bother, however. PhysX isn't very popular with developers because it is proprietary, and there's really only one game where it makes a worthwhile difference (Batman: AA), and even that was mostly a design decision by the developer; most of the differences didn't really need GPU-accelerated physics.
January 27, 2010 5:06:50 AM

Also, Havok is now owned by Intel (acquired before nV bought Ageia), and they haven't really put the effort into Havok FX that was originally planned.

It's supposed to move to OpenCL and thus become more compatible with both GPUs and CPUs, but I get the feeling not much effort is being put into it now that Larrabee has been shelved. You would think that would give Ageia's PhysX a chance, but nV is kinda killing that with the whole proprietary thing.

January 27, 2010 10:48:01 AM

^^ As I've been saying, unless ATI starts to support it, all implementations of PhysX will probably be shoddy; who's really going to go all the way with an API that one of the two major GPU makers' cards can't support?

I should make a point, though: this year, more titles were released using PhysX as their physics engine than Havok. Now, only a small subset of those titles use the GPU-accelerated features of PhysX, but to say that the API isn't popular simply goes against the facts.
January 27, 2010 11:50:32 AM

PhysX and Havok are just smoke and mirrors (pun intended).
January 27, 2010 3:41:28 PM

Quote:
So why remove graphical effects in the first place? There is no reason they cannot be in the full game.

Also, as has been said, Nvidia is not going to allow ATI or anyone else to support it. That has been made clear, unless ATI is lying, which seems unlikely even though it is the opposite of what Nvidia is saying.


Well, what else would ATI say? "Oh yeah, we CAN support it, we just choose not to"? It's easier to simply say that NVIDIA won't let them use a non-CUDA method.

I'm not saying that's the case, but ATI is getting far too much benefit of the doubt on this issue.

Also, as I've been saying, most current physics implementations aren't capable of doing some of the things PhysX can; sure, you may see smoke effects, but getting smoke to move dynamically based on the environment (what PhysX tries to do) instead of following a set dispersal pattern (the currently preferred method) is NOT easy to do, and it's very, very taxing.

PhysX also offers the advantage of freeing developers from having to create their own implementation of the same effects. In the above example, you could use the API to handle the effect instead of having to build those exact same effects yourself. So it's also a time-saving measure.
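
To make that concrete, here's a rough sketch of the difference in hypothetical C++ (PhysicsScene and friends are placeholder names, NOT the actual PhysX SDK calls): a hand-rolled canned dispersal versus handing the particles off to physics middleware.

Code:
// Hypothetical sketch -- PhysicsScene etc. are placeholder names, not the real PhysX SDK API.
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };
struct Particle { Vec3 pos, vel; };

// Hand-rolled "set dispersal pattern": every puff follows the same pre-baked
// drift, ignoring walls, wind and moving objects.
void updateSmokeCanned(std::vector<Particle>& smoke, float dt) {
    for (auto& p : smoke) {
        p.vel.y += 0.5f * dt;            // fixed upward drift, same every time
        p.pos.x += p.vel.x * dt;
        p.pos.y += p.vel.y * dt;
        p.pos.z += p.vel.z * dt;
    }
}

// With physics middleware, the developer registers the particles once and
// just ticks the scene; forces, collisions and turbulence come from the engine.
class PhysicsScene {                      // stand-in for the middleware's scene object
public:
    std::vector<Particle> particles;
    void simulate(float dt) {
        // placeholder for the engine-side solver (wind, obstacles, turbulence)
        for (auto& p : particles) {
            p.pos.x += p.vel.x * dt;
            p.pos.y += p.vel.y * dt;
            p.pos.z += p.vel.z * dt;
        }
    }
};

void updateSmokeViaMiddleware(PhysicsScene& scene, float dt) {
    scene.simulate(dt);                   // one call replaces the hand-written effect code
}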

Like it or not, PhysX is one of the default physics APIs in Unreal Engine 3, so we're going to be seeing a lot more of it.
January 27, 2010 5:23:24 PM

Quote:
people do not sit and stare at smoke.

That depends on the festival you're at. :whistle: 
January 28, 2010 2:33:00 AM

gamerk316 said:

I'm not saying that's the case, but ATI is getting far too much benefit of the doubt on this issue.


I think you've got that backwards. nVidia is getting too much benefit of the doubt here. If it were Intel controlling PhysX and nVidia saying they weren't allowed to support it, I doubt there's any BS Intel could say that people would believe. Yet nVidia's PR line of "we offered it to ATi and Intel" gets this magical "see, they tried, but were rebuffed" treatment, while nVidia is blocking its own customers who BOUGHT an nVidia card simply because it's being used alongside an ATi card. With OpenCL available, and even nVidia admitting it would be easy to port PhysX to OpenCL, if there were even a smidge of truth to their 'attempt to bring PhysX to other IHVs' hardware', their actions would look very different than they do.

I understand what nVidia is doing, and it makes sense as GPU vendor strategy, but not as physics API vendor strategy. You're making it sound like I should give them the benefit of the doubt, as if they aren't first and foremost a GPU vendor that has never played well with others (have you forgotten their attempt to give a 'boost' to nV cards on nV mobos by actually crippling the competition?).

Seriously, c'mon: if there were ANY motivation to make this work with ATi/Intel/S3, the first step would be to remove the GPU check, which wasn't required for the PPU and which works fine when disabled, despite the other myth of instability.

Quote:
Also, as I've been saying, most current physics implementations aren't capable of doing some of the things PhysX can; sure, you may see smoke effects, but getting smoke to move dynamically based on the environment (what PhysX tries to do) instead of following a set dispersal pattern (the currently preferred method) is NOT easy to do, and it's very, very taxing.


But the point is, it's not game-changing, and certainly not game-changing enough for the average consumer to go out and buy another GPU for physics, or even to power an older one. Whether it's Havok or PhysX, if it's nothing more than live versus pre-rendered 'debris effects', I don't care. I don't care if the dumb character kicking an empty soda can is doing it via Havok or PhysX; it still doesn't impress me. What EITHER of them needs is gameplay physics changes like in HL2 and Crysis, and even those are still way below the expectations set when Brook+ and then Ageia and Havok started talking about improving in-game physics. If after nearly five years this is all we have to show for those improvements, then STOP and put your efforts somewhere else, because it's obvious you're not ready to implement better physics than software on a CPU, especially if it's just debris visuals.


Quote:
PhysX also offers the advantage of freeing developers from having to create their own implementation of the same effects. In the above example, you could use the API to handle the effect instead of having to build those exact same effects yourself. So it's also a time-saving measure.


Which would be good if it were something more than adding debris. It's like adding overblown, poor HDR to everything just because it's now so much easier and you can see it's something 'different'.

Considering your views on XP and DX10/10.1/11, this is the TOTAL opposite end of the scale (smaller install base and lesser impact), and unlike those, where we do see improvements within a few years, so far with GPU PhysX we have seen little new since its early implementation in GRAW.

I was amped about Brook+, interested in and then disappointed by Ageia's PPU, amped for Havok FX (which was for both ATi and nV), and now I think they are all smoke and mirrors about simply bringing smoke and mirrors; nothing worth even 1/10 of the hype any of them received. Advancements in surround gaming, 3D, even AUDIO have shown greater benefit for less hype than any of these physics disappointments.

Quote:
Like it or not, PhysX is one of the default physics APIs in Unreal Engine 3, so we're going to be seeing a lot more of it.


PhysX is 'ONE OF the default APIs', but that means very little since its gameplay physics default is NOT PhysX but Epic's own implementation. It amounts to the same problem it's always had: a bigger list of 'supported games' than actual supported improvements.

The other thing is that PhysX may be the API du jour, just like Havok was, but if it can't deliver, just as Havok arguably hasn't, it's pretty easy to move on, especially since so far none of them are really all that special if companies like Crytek, Epic and Valve can make much better implementations on their own. With OpenCL and DirectCompute it's becoming even easier, so I wouldn't count on a 'list of games' continuing to be a compelling reason much longer if someone comes out with a better mousetrap. I'd rather have a list of 5-10 AWESOME physics-supported games, which I would then seek out, than a convoluted list of 100+ games with support ranging from crappy non-accelerated to 'OK' debris support.

Intel and nVidia sitting on these weak implementations, as if they create their own momentum, will only last for so long. Many of us just want something more, and at this point we don't care (I don't seek out or avoid games because they do or don't have the Havok or PhysX logo); neither makes a difference anymore. I'd be more likely to look for the Gamebryo logo than either of the physics API logos.
January 28, 2010 2:34:42 AM

Mousemonkey said:
That depends on the festival you're at. :whistle: 


Glastonbury ! WooHoo !!! [:mousemonkey:3] [:thegreatgrapeape] [:thegreatgrapeape:3] [:thegreatgrapeape] [:mousemonkey:3]
January 28, 2010 2:41:48 AM

TheGreatGrapeApe said:
Glastonbury ! WooHoo !!! [:mousemonkey:3] [:thegreatgrapeape] [:thegreatgrapeape:3] [:thegreatgrapeape] [:mousemonkey:3]

I have no idea what that means or re(e)fers to, being the straight cut individual that I told the judge I woz. [:mousemonkey]
January 28, 2010 2:00:20 PM

Quote:

I think you've got that backwards. nVidia is getting too much benefit of the doubt here. If it were Intel controlling PhysX and nVidia saying they weren't allowed to support it, I doubt there's any BS Intel could say that people would believe. Yet nVidia's PR line of "we offered it to ATi and Intel" gets this magical "see, they tried, but were rebuffed" treatment, while nVidia is blocking its own customers who BOUGHT an nVidia card simply because it's being used alongside an ATi card. With OpenCL available, and even nVidia admitting it would be easy to port PhysX to OpenCL, if there were even a smidge of truth to their 'attempt to bring PhysX to other IHVs' hardware', their actions would look very different than they do.

I understand what nVidia is doing, and it makes sense as GPU vendor strategy, but not as physics API vendor strategy. You're making it sound like I should give them the benefit of the doubt, as if they aren't first and foremost a GPU vendor that has never played well with others (have you forgotten their attempt to give a 'boost' to nV cards on nV mobos by actually crippling the competition?).

Seriously, c'mon: if there were ANY motivation to make this work with ATi/Intel/S3, the first step would be to remove the GPU check, which wasn't required for the PPU and which works fine when disabled, despite the other myth of instability.


Here's where your thinking is wrong: NVIDIA won't support ATI + NVIDIA, but then again, that's common sense. Why should NVIDIA be the one responsible for all compatibility issues with such a setup? Hence the issue: AMD won't [or isn't able to] support PhysX, which means that an AMD + NVIDIA setup automatically leaves NVIDIA on the hook for potential problems.

Secondly, it's not NVIDIA's job to deal with every implementation of the API. NVIDIA implements PhysX through CUDA; apparently, it's easy to implement it via OpenCL as well. But the point is, it's not NVIDIA's job to handle every single implementation, as that responsibility belongs to those who wish to use the API.

Quote:

But the point is, it's not game-changing, and certainly not game-changing enough for the average consumer to go out and buy another GPU for physics, or even to power an older one. Whether it's Havok or PhysX, if it's nothing more than live versus pre-rendered 'debris effects', I don't care. I don't care if the dumb character kicking an empty soda can is doing it via Havok or PhysX; it still doesn't impress me. What EITHER of them needs is gameplay physics changes like in HL2 and Crysis, and even those are still way below the expectations set when Brook+ and then Ageia and Havok started talking about improving in-game physics. If after nearly five years this is all we have to show for those improvements, then STOP and put your efforts somewhere else, because it's obvious you're not ready to implement better physics than software on a CPU, especially if it's just debris visuals.


Quote:
Which would be good if it were something more than adding debris. It's like adding overblown, poor HDR to everything just because it's now so much easier and you can see it's something 'different'.

Considering your views on XP and DX10/10.1/11, this is the TOTAL opposite end of the scale (smaller install base and lesser impact), and unlike those, where we do see improvements within a few years, so far with GPU PhysX we have seen little new since its early implementation in GRAW.

I was amped about Brook+, interested in and then disappointed by Ageia's PPU, amped for Havok FX (which was for both ATi and nV), and now I think they are all smoke and mirrors about simply bringing smoke and mirrors; nothing worth even 1/10 of the hype any of them received. Advancements in surround gaming, 3D, even AUDIO have shown greater benefit for less hype than any of these physics disappointments.


You're looking at implementation; I'm looking at what the API is actually capable of. PhysX is capable of far more than debris, but the API isn't being used to its full potential. Basically, you're making the exact same argument that people were making against dedicated graphics co-processors in the mid-90s [i.e. lack of support, no noticeable improvement]. It took a unified standard [Glide, then later OpenGL and DirectX] to really push 3D graphics. The last thing we need is a physics API war.

I couldn't care less what the API is; I want a dedicated physics API that is capable of performing multiple physics calculations on every object without developers having to write special code into their engines. E.g. if I run full speed into a building, I should fall flat on my back. The API should be strong enough to handle this interaction on its own, without the devs having to write this functionality into the game engine [which leads to each game having different, incompatible standards for physics, hence the lack of advanced physics effects in modern games].
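
Roughly what I mean, as a hypothetical C++ sketch (PhysicsWorld/RigidBody are made-up names, not any shipping SDK): the game registers its objects and ticks the world, and collision response is the engine's problem rather than per-game code.

Code:
// Hypothetical sketch only -- PhysicsWorld/RigidBody are made-up names,
// not any shipping physics SDK.
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

struct RigidBody {
    Vec3 pos, vel;
    float mass = 1.0f;
};

class PhysicsWorld {
public:
    void add(RigidBody* b) { bodies.push_back(b); }

    // One generic step: integration, collision detection and response for
    // every registered body, with no game-specific physics code required.
    void step(float dt) {
        for (RigidBody* b : bodies) {
            b->vel.y -= 9.81f * dt;       // gravity
            b->pos.x += b->vel.x * dt;
            b->pos.y += b->vel.y * dt;
            b->pos.z += b->vel.z * dt;
            // engine-side collision response would resolve e.g. the player
            // running into a wall and being knocked flat on his back
        }
    }

private:
    std::vector<RigidBody*> bodies;
};

int main() {
    PhysicsWorld world;
    RigidBody player, crate;
    world.add(&player);
    world.add(&crate);
    for (int frame = 0; frame < 60; ++frame)
        world.step(1.0f / 60.0f);         // the game just ticks the world each frame
}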
January 28, 2010 4:27:14 PM

gamerk316 said:

Here's where your thinking is wrong: NVIDIA won't support ATI + NVIDIA, but then again, that's common sense. Why should NVIDIA be the one responsible for all compatibility issues with such a setup? Hence the issue: AMD won't [or isn't able to] support PhysX, which means that an AMD + NVIDIA setup automatically leaves NVIDIA on the hook for potential problems.


That makes no sense; that would be like AMD and Intel deciding to disable nVidia cards on their CPUs because they can't be responsible for nVidia's driver issues. :pt1cable:  If Adobe, Cyberlink, Sony and others can figure out how to get GPU acceleration to work with mixed setups, nVidia should be able to figure it out for their PhysX engine, especially since it has no issues running in CPU mode. Do they have trouble figuring out AMD vs Intel vs VIA CPUs? And are you telling me getting it to work on the PS3, X360 and Wii is much easier? C'mon! :sarcastic: 

Also, you're already buying the PR BS with the "AMD won't [or isn't able to] support PhysX" line. What exactly makes it supportable on a GF8600 but not on an HD5870?

It's nV's platform, and the fact that it works with the patch, yet they disable it, shows they have no intention of letting it work.

Quote:
Secondly, it's not NVIDIA's job to deal with every implementation of the API. NVIDIA implements PhysX through CUDA; apparently, it's easy to implement it via OpenCL as well. But the point is, it's not NVIDIA's job to handle every single implementation, as that responsibility belongs to those who wish to use the API.


Actually, it is up to nVidia, as they demonstrate that they don't leave it up to either the devs or the individuals; they go out of their way to disable it, and therefore they are taking that responsibility into their own hands.

Quote:
You're looking at implementation; I'm looking at what the API is actually capable of. PhysX is capable of far more than debris, but the API isn't being used to its full potential.


PhysX is capable of more than debris, but its GPU implementation has shown no ability to interact with gameplay physics. So your position is that it's capable of much more, but neither Ageia nor nVidia has been capable of delivering that. It's still not a position I would praise.

Quote:
Basically, you're making the exact same argument that people were making against dedicated graphics co-processors in the mid-90s [i.e. lack of support, no noticeable improvement].


Yeah, and if someone came to ask me what they should buy for 2D graphics with the occasional game, I'd give them the same answer now as I would have then: it's not worth buying a dedicated card, or a poor 2D/3D card, simply to add a feature you don't use. Which is the same reason I say people shouldn't make their graphics decision based on PhysX: make your GPU decision based on graphics, and then add a card later if you care about PhysX.

Quote:
It took a unified standard [Glide, then later OpenGL and DirectX] to really push 3D graphics. The last thing we need is a physics API war.


OpenGL is older than Glide, and Glide was proprietary, not a unified standard. 3dfx pushed Glide, and when it no longer had the hardware or the raison d'être, it died, which is what will happen with Havok and PhysX if they don't bring something more compelling than debris physics.

Quote:
I couldn't care less what the API is; I want a dedicated physics API that is capable of performing multiple physics calculations on every object without developers having to write special code into their engines.


Which is nice from a dev perspective, but as consumers, we don't give a FAQ; we just want GOOD physics effects and we want them on our terms. It's that simple, and it's the way all successful hardware works. If it doesn't meet customer needs, then the customers move on, just like with Glide. I don't care if PhysX is 1/3 as hard or 3 times as hard to implement versus another solution; as long as the end result is better, that's all that matters to consumers. nV's actions so far are anti-consumer, even towards their own customers who bought the original PPU or purchased an nV card to act as a PPU in a non-nV display rig.

The point being, we don't care whether company A or B owns it: the current implementation sucks, there is no evidence of it getting better (especially if the rocket sled is the direction they are taking it), and they are hurting their own customers. The only thing helping nVidia is that Intel is even worse at moving Havok forward, and all the small studios have fewer resources. PhysX is what it is, not because it's good, but because everyone else sucks too. [:thegreatgrapeape:5]
January 28, 2010 4:43:42 PM

Someone needs to make an open-source HLPL (High-Level Physics Language) based on OpenCL, open-source throughout the entire pipeline (rough idea of the kind of kernel it might generate sketched below). As a video game dev, I do care about APIs, and PhysX is about as crap as APIs get:
Implementation as software - there are faster/better software APIs
Implementation as hardware - only pure nVidia setups will run it, or it pushes consumers toward hacked drivers
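
For what it's worth, here's a rough sketch of the kind of kernel such a layer might generate (hypothetical, not an existing library); the OpenCL host-side setup (clCreateProgramWithSource, clEnqueueNDRangeKernel, buffer creation) is omitted:

Code:
// Hypothetical sketch of a kernel an OpenCL-based physics layer might emit --
// not an existing library. Host-side boilerplate omitted for brevity.
static const char* kIntegrateKernelSrc = R"CLC(
__kernel void integrate(__global float4* pos,
                        __global float4* vel,
                        const float dt)
{
    size_t i = get_global_id(0);
    vel[i].y -= 9.81f * dt;      // gravity
    pos[i]   += vel[i] * dt;     // simple Euler step; collision resolution
                                 // would be a second kernel pass
}
)CLC";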

As for the ridiculous argument that nVidia should be responsible for "stability problems" on ATI cards, let me say this:
If some people can take the fking COMPILED driver, HACK it, and run PhysX with an ATI card with NO MORE problems than a pure nVidia setup, then how the fk are there "stability problems"?!