Should PhysX be universal? (both ATI & Nvidia)

Status
Not open for further replies.

Chronobodi

Distinguished
Feb 19, 2009
498
0
18,780
Just played through Cryostasis, Mirror's Edge, etc...
The physics are awesome, but can PhysX itself be adopted by more developers?

It can process physics better than the CPU can, but what exactly is preventing widespread adoption :heink: ? I heard Intel is doing something...
 

wh3resmycar

Distinguished


The PC is an open platform, yet (accelerated) PhysX is entirely dependent on a GeForce GPU or the old PPU.
 

It's not like anything is stopping ATI from implementing PhysX...

As I explained a while back, given the extra bus width most graphics cards have over a CPU (32/64-bit vs 128/256-bit), physics calculation is far better suited to GPUs, as a single LOAD operation can move more data at one time (example: on a 32-bit setup, each 32-bit register holds one 32-bit integer; on a 256-bit setup, a single 256-bit register can hold 256/32 = 8 such integers). As such, considering the mathematical formulas needed for accurate physics effects, I doubt a CPU can even load integers fast enough to maintain an acceptable framerate. A wide data bus is a requirement for any non-linear mathematical function, and modern CPUs lack the width needed to send many values in a single operation.
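The register-width arithmetic in that argument can be illustrated with a toy Python sketch (purely illustrative, not how a physics engine is written): eight 32-bit integers fit in one 256-bit chunk, so a 256-bit-wide load moves eight values in the time a 32-bit load moves one.

```python
import struct

# Eight 32-bit integers -- the payload one 256-bit register could hold.
values = [1, 2, 3, 4, 5, 6, 7, 8]

# Pack them into a single 32-byte (256-bit) buffer: one "wide load".
wide_load = struct.pack("<8i", *values)
assert len(wide_load) * 8 == 256  # 32 bytes = 256 bits

# A 32-bit register file would need one load per value instead.
narrow_loads = [struct.pack("<i", v) for v in values]
assert len(narrow_loads) == 8  # eight separate 4-byte loads

# Same data either way -- the difference is how many operations it took.
assert b"".join(narrow_loads) == wide_load
print(256 // 32)  # values moved per 256-bit load -> 8
```

This is only the counting argument; real hardware throughput also depends on caches and pipelining, which the post above glosses over.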
 
With DX11 coming, PhysX is GPU-bound, whereas Havok can run on either a CPU or a GPU depending on the strength of your setup. To me, PhysX is too limiting and won't make a big enough splash to survive.
 

chef7734

Distinguished
Mar 3, 2009
856
0
19,010
PhysX is not just GPU-limited. Nvidia opened PhysX to anyone who wants to implement it; it is available for the PS3, Xbox 360, Wii, and iPhone, to name a few. Havok does not support hardware physics acceleration yet, but they are working with ATI Stream, which is basically like CUDA and will enable hardware acceleration.
http://www.havok.com/content/view/17/30/
http://en.wikipedia.org/wiki/PhysX
http://nzone.com/object/nzone_physxgames_home.html
 

Chronobodi

Distinguished
Feb 19, 2009
498
0
18,780
??? I was just asking whether PhysX should be implemented across the board. How am I a troll? You're a troll, if that's what you're doing.
 

Chronobodi

Distinguished
Feb 19, 2009
498
0
18,780
OK, this thread was going along just fine until someone thought somehow this whole thread was offensive or some bull****.

All I was asking was whether PhysX should be implemented on all GPUs, and the pros and cons of doing so. I do not know the history of the subject, so by all means, this isn't meant to be "flame bait" or whatever the internet forums call it.
 

jennyh

Splendid
PhysX creates an overhead that adds yet more cost to Nvidia GPUs at a time when they simply cannot afford luxuries.

Would ATI have liked to use it for free a year ago, perhaps with royalty payments to Nvidia? Maybe. Would that have helped make PhysX take off? Most certainly. Did it happen? No. Why didn't it? It's just another in a long line of terrible mistakes Nvidia have made in the past year.
 

Dekasav

Distinguished
Sep 2, 2008
1,243
0
19,310
Proprietary standards seldom win, and PhysX is a proprietary standard. OpenCL is coming, which will likely enable physics to run equally well on ATI, Nvidia, and Larrabee hardware.

My logic says Physx will never be universal.
 


For this discussion, that statement is incorrect: the X360, Wii, and iPhone have no more access to PhysX than a PC with an ATI graphics card. It's all CPU-dependent physics, not GPU-accelerated physics. Even the PS3 uses its Cell SPUs to emulate a PhysX PPU, robbing it of core resources.

GPU-accelerated PhysX is not open to anyone who wants to implement it; it's tied to CUDA, and only if you agree to support both can you have it. Not quite as open as the other options are making themselves. However, the threat of Havok has made nV talk about porting PhysX to OpenCL.

Havok does not support hardware physics acceleration but they are working on ati stream that is basically like cuda which will enable hardware acceleration.

Havok isn't on Stream, it's using OpenCL, so it's not tied to ATi or nV the way PhysX is tied to nV's CUDA. S3 could simply write their OpenCL drivers and run Havok, whereas to run PhysX they would have to ask for, and agree to run, CUDA on their hardware to enable PhysX.

The time for PhysX seems to have passed. If they had made a truly 'killer app' implementation when they had the stage to themselves, maybe they could've brought people into their closed ecosystem; instead, time has allowed others to perfect their implementations and promise universal support. PhysX as it stands either has to adapt to match those benefits or take one final kick at the killer-app can.

So far NONE of the implementations mean much of anything, because they are about as 'realistic' for physics as virtual reality is to being virtually real.
 

chef7734

Distinguished
Mar 3, 2009
856
0
19,010
That is b.s. An SDK is an SDK, no matter whether it is the Havok or the PhysX SDK. The ability is there whether anyone wants to use it or not. Havok is not run on the GPU, so it would be the same as running the PhysX SDK. It is funny that whenever the PhysX SDK is mentioned, someone comes off saying it is not the same because it is run by the CPU. What the hell do you think an SDK is? Havok SDK, PhysX SDK: neither one uses GPUs, and both are available for anyone to use.
 
Actually, the Wii, 360, and PS3 recently got PhysX support. The only reason ATI hasn't implemented support is because they don't want to support it.

As for the whole "proprietary standards" argument, which is more popular: DirectX or OpenGL?

As I said before, non-linear physics, which is a necessity for accurate physics effects, requires a wide data bus so that mass amounts of data can be loaded with a single LOAD operation and executed at a reasonable speed. Regardless of how fast a CPU's IPC is, if it can't load the necessary data in a timely manner, then how fast it can do the actual computations is meaningless.

A decent physics implementation should be around as fast as rendering is, and we know what happens to FPS when you have the CPU start to render as well... Havok, at best, implements standard 'textbook' linear-physics formulas, just like every other standalone engine out there, which limits how far you can take physics.

For example: destroying an individual object once created is impossible under current physics implementations unless the object is created with breakaway zones (Company of Heroes, Battlefield: Bad Company), as opposed to dynamically destroying an object as it takes damage. I want an implementation capable of fully dynamic destruction of a solid object, and based purely on the mathematical formulas required, I have concluded that a CPU-based implementation simply will not be able to constantly LOAD and EXECUTE the necessary data in a timely manner.

Also remember, any data sent to the CPU needs to be stored in its registers before any mathematical formula can be applied. As those registers are limited, you may only have access to one or two while playing a game (other resources eat the rest). Hence the GPU's advantage: with a wider bus (128/256-bit), you can theoretically send and hold more data at once (by segmenting a single 256-bit register, you can hold eight 32-bit integers), cutting down on the number of LOAD operations (instruction cycles) needed to fetch, load, and execute the data. That is why I feel a wide data bus is a necessity for any decent physics engine implementation.
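Putting rough numbers on that load-counting argument (a toy model only — it ignores caches, prefetching, and instruction-level parallelism):

```python
def loads_needed(num_values: int, register_bits: int, value_bits: int = 32) -> int:
    """Toy model: how many LOAD operations are needed to fetch
    num_values values, if each load fills one register of register_bits."""
    per_load = register_bits // value_bits  # values moved per load
    return -(-num_values // per_load)       # ceiling division

# 10,000 object positions (x, y, z as 32-bit floats = 30,000 values per frame):
values = 10_000 * 3
print(loads_needed(values, 32))   # one value per load   -> 30000
print(loads_needed(values, 256))  # eight values per load -> 3750
```

Under this simplistic model, the wide path needs 8x fewer load operations for the same per-frame data; whether that translates into the claimed framerate difference on real hardware is exactly what the posts above are debating.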
 
Here's what we need to know:
"Additionally, OpenCL has querying tools to test the compute capabilities of an individual platform, so that the processing requirements can be best tuned to the components within an individual computer - if a system has a middle-range CPU but a high-end GPU then the tasks can be biased towards the GPU. Alternatively if the system uses a performance CPU but a mainstream GPU then the tasks can be biased to the CPU so that the user can maintain the best graphics quality whilst still attaining good performance.

It should be noted that the demonstrations we have done to date have had zero impact on the Havok toolset itself. In other words, in this case the developer does not need to change anything from their point of view to enable GPU acceleration; it is entirely transparent. "
http://www.guru3d.com/article/interview-with-ati-dave-baumann/4
There you have it. It's early in development, but ATI is doing it. Now, since Intel and ATI have settled on Havok, guess where that leaves PhysX?
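The load-biasing idea in that quote can be sketched as a toy scheduler (hypothetical names and scores, nothing to do with Havok's or OpenCL's actual APIs): query a capability score per device, then split the physics workload in proportion.

```python
def split_workload(cpu_score: float, gpu_score: float, num_tasks: int) -> dict:
    """Toy version of the biasing described in the interview: divide
    num_tasks between CPU and GPU in proportion to capability scores."""
    total = cpu_score + gpu_score
    gpu_share = round(num_tasks * gpu_score / total)
    return {"cpu": num_tasks - gpu_share, "gpu": gpu_share}

# Mid-range CPU, high-end GPU: bias physics toward the GPU.
print(split_workload(cpu_score=1.0, gpu_score=3.0, num_tasks=100))
# -> {'cpu': 25, 'gpu': 75}

# Strong CPU, mainstream GPU: leave the GPU free for rendering.
print(split_workload(cpu_score=3.0, gpu_score=1.0, num_tasks=100))
# -> {'cpu': 75, 'gpu': 25}
```

The point of the quote is that this decision happens below the toolset, so the game developer never sees it.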

 


M$ owns the console though, so they won out in the end. There is nothing stopping PhysX from running on ATI GPUs except ATI itself. Hence why Backbreaker, a console exclusive, will be the game that shows once and for all how far physics effects can be taken.

gamevideos.1up.com/video/id/18491

Havok can't touch that.
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810


Why can't Havok touch that? There is no reason to believe the new Havok we'll see in the coming months won't be just as advanced as PhysX is now. Heck, there is no reason to believe the current Havok being implemented in the upcoming Blizzard games won't be able to look like that, given the right CPU/GPU. PhysX doesn't have any special powers; physics is just math, after all. The only reason Havok games don't show off physics as real as everyday life is that that kind of computation requires massive parallel computing, and it has to be stripped down to run in a game.. which is the reason so many want to do that sort of thing on the GPU.

Being a physicist, I see computer simulations of real-world physics every day. We have had programs that can effectively produce perfect physics in certain systems, such as convection in a star (we are talking about calculating the motion of billions and billions of particles in a fluid) or the climate, for decades, and the equations for a hundred years. The only difference between what I ran on a supercomputer in school, which took three days to compute, and Havok/PhysX is that the latter are dumbed down to suit their particular computing limitations. Obviously, when you provide the engine with a more suitable computing platform, you can get more realistic simulations out of it for the same performance/calculation-time investment. Also, taking three days to render the next frame wouldn't be nice.. :D
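The "physics is just math" point is easy to demonstrate: a few lines of velocity Verlet integration — the same family of integrators used in both games and research codes — reproduce a harmonic oscillator to within a tiny error of the exact cosine solution. (A toy sketch, not any engine's actual code.)

```python
import math

def verlet_oscillator(steps: int, dt: float, omega: float = 1.0) -> float:
    """Velocity Verlet for x'' = -omega^2 * x, starting at x=1, v=0.
    Returns the position after `steps` steps."""
    x, v = 1.0, 0.0
    a = -omega**2 * x
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = -omega**2 * x             # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity update (averaged force)
        a = a_new
    return x

# Integrate over one full period, t = 2*pi; the exact answer is cos(2*pi) = 1.
steps = 10_000
x = verlet_oscillator(steps, dt=2 * math.pi / steps)
print(abs(x - 1.0) < 1e-5)  # True: tiny error even from this toy integrator
```

The difference between this and a star-convection run is only scale: the same stepping scheme, applied to billions of coupled particles, is what eats three days of supercomputer time.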
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810
Effects =/= gameplay.
No, they certainly are not. I'd kill for a game as immersive as Deus Ex with today's graphics...

That being said, it certainly would be cool to play a game with physics calculations equal to the sediment deposition and movement, or the metal deformation, I do at times.. (yeah, I know it sounds lame). But imagine a game where sand behaved like sand, or a building bends and folds into itself realistically!

It does bother me, though, that people think PhysX is something special. It's all the same math.. sure, there is something to be said for an efficient program with tricks here and there to avoid useless computation.. but you can't patent calculus..
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810
I suppose lame is par for the course on a computer hardware forum :)

Speaking of the course.. it sure is nice outside today. Maybe I'll get some golf in after work and see some real physics in action :D
 

Dekasav

Distinguished
Sep 2, 2008
1,243
0
19,310



Lunacy! There are no physics in the real world! PhysX is for games.
 

wh3resmycar

Distinguished


Do you actually play video games?

Let's take Race Driver: GRID, for example: the "debris effect". You approach a corner differently if it's loaded with debris from a crash, as opposed to one that is clean.

Same with Fight Night Round 3: the "facial effects" alone will let you know if your opponent is going to hit the canvas any second.

The same goes for any real-time explosion effect from a nade/bomb/GL: the proximity of the explosion lets you determine the safe distance (i.e., the old CS, CS: Source, CoD4, CoD5, to name a few; Quake 3's old rocket launcher is also a good example).

Well, the statement you posted above is correct if you're still hell-bent on playing text-based games, which is so 1991.

You are lame.

 