Closed

Should PhysX be universal? (both ATI & NVIDIA)

Last response: in Graphics & Displays
April 22, 2009 12:51:03 PM

Just played through Cryostasis, Mirror's Edge, etc...
The physics are awesome, but can PHYSX itself be adopted by more developers?

It can process physics better than the CPU can, but what exactly is preventing the widespread adoption :heink: ? I heard Intel is doing something...
April 22, 2009 2:51:38 PM

I hope PhysX is entirely removed.
April 22, 2009 3:08:41 PM

Quote:
It can process physics better than the CPU can, but what exactly is preventing the widespread adoption


the PC is an open platform, and PhysX (accelerated) is entirely dependent on a GeForce GPU or the old PPU.
April 22, 2009 5:18:19 PM

wh3resmycar said:
the PC is an open platform, and PhysX (accelerated) is entirely dependent on a GeForce GPU or the old PPU.

It's not like anything is stopping ATI from implementing PhysX...

As I explained a while back, given the extra register width most graphics cards have over a CPU (32/64-bit vs 128/256-bit), physics calculation is far better suited to GPUs, as a single LOAD operation can move more data at one time (example: in a 32-bit setup, each 32-bit register holds one 32-bit integer; in a 256-bit setup, a single 256-bit register can hold 256/32 = 8 32-bit integers). As such, considering the mathematical formulas needed for accurate physics effects, I doubt a CPU can even load integers fast enough to reach an acceptable framerate. A wide data path is a requirement for non-linear mathematical functions, and modern CPUs lack the width needed to send many values in a single operation.
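
To make that width argument concrete, here is a minimal C sketch using SSE intrinsics; it is purely an illustration of wide loads, not how PhysX itself is written. One 128-bit load moves four 32-bit floats into a register where scalar code needs four separate loads:

Code:
#include <xmmintrin.h>   /* SSE intrinsics */

/* Scalar: four separate load/add/store operations. */
void add_scalar(float *dst, const float *a, const float *b) {
    for (int i = 0; i < 4; i++)
        dst[i] = a[i] + b[i];
}

/* SIMD: one 128-bit load per operand moves four floats at once. */
void add_simd(float *dst, const float *a, const float *b) {
    __m128 va = _mm_loadu_ps(a);              /* loads a[0..3] in one instruction */
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(dst, _mm_add_ps(va, vb));   /* four additions, one store */
}
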
April 22, 2009 6:38:52 PM

With DX11 coming, PhysX is GPU-bound, whereas Havok can run on either a CPU or a GPU, depending on the strength of your setup. To me, PhysX is too limiting and won't make a big enough splash to survive.
April 22, 2009 7:31:04 PM

Hmm... Havok? Isn't that mainly CPU? When did it do GPU?
April 22, 2009 7:51:29 PM

Quote:
It's not like anything is stopping ATI from implementing PhysX...


Radeons not being able to run CUDA will be that "anything".
April 22, 2009 10:13:27 PM

Havok FX will use OpenCL, which will be compatible with all GPUs, even S3 and Larrabee. PhysX will be officially dead by then.
April 22, 2009 10:24:17 PM

??? I was just asking whether PhysX should be implemented across the board; how am I a troll? You're a troll, if that's what you're doing.
April 22, 2009 11:14:19 PM

OK, this thread was going along just fine until someone decided the whole thing was offensive or some bull****.

All I was asking was whether PhysX should be implemented on all GPUs, and the pros and cons of doing so. I don't know the history of the subject, so, by all means, this isn't meant to be "flame bait" or whatever the internet forums call it.
April 22, 2009 11:35:57 PM

PhysX creates an overhead that adds yet more cost to Nvidia GPUs at a time when they simply cannot afford luxuries.

Would ATI have liked to use it for free a year ago, perhaps with royalty payments to Nvidia? Maybe. Would that have helped PhysX take off? Most certainly. Did it happen? No. Why not? It's just another in a long line of terrible mistakes Nvidia has made in the past year.
April 23, 2009 12:09:49 AM

Proprietary standards seldom win. PhysX is a proprietary standard. OpenCL is coming, which will likely help do physics equally well on ATI, Nvidia, and Larrabee hardware.

My logic says PhysX will never be universal.
April 23, 2009 10:46:23 PM

chef7734 said:
PhysX is not just GPU-limited. Nvidia opened PhysX to anyone who wanted to implement it. It is available for the PS3, Xbox 360, Wii, and iPhone, to name a few.


For this discussion, that statement is incorrect. The X360, Wii, and iPhone have no more access to PhysX than a PC with an ATi graphics card; it's all CPU-dependent physics, not GPU-accelerated physics. Even the PS3 uses its Cell FPUs to emulate a PhysX PPU, thus robbing it of core resources.

GPU-accelerated PhysX is not open to anyone who wants to implement it; it's tied to CUDA, and only if you agree to work with that can you have it. Not quite as open as other options are making themselves. However, the threat of Havok has made nV talk about porting PhysX to OpenCL.

Quote:
Havok does not support hardware physics acceleration, but they are working on ATI Stream, which is basically like CUDA and will enable hardware acceleration.


Havok isn't on Stream; it's using OpenCL, so it's not tied to ATi or nV the way PhysX is tied to nV's CUDA. S3 could simply write their OpenCL drivers and run Havok, whereas to run PhysX they would have to ask for, and agree to, running CUDA on their hardware to enable PhysX.

The time for PhysX seems to have passed. If they had made a truly 'killer app' implementation when they had the stage to themselves, maybe they could've brought people into their closed ecosystem; however, time has allowed others to perfect their implementations and offer the promise of universal support. PhysX as it stands either has to adapt to equal those benefits or take a final kick at the killer-app can.

So far NONE of the implementations mean much of anything, because they are about as 'realistic' for physics as virtual reality is virtually real.
April 24, 2009 2:34:48 AM

That is b.s. An SDK is an SDK; it does not matter if it is the Havok or the PhysX SDK. The ability is there whether anyone wants to use it or not. Havok is not run on the GPU, so it would be the same as running the PhysX SDK. It is funny that when the PhysX SDK is mentioned, someone always says it is not the same because it is run by the CPU. What the hell do you think an SDK is? Havok SDK, PhysX SDK: neither one uses GPUs, and both are available for anyone to use.
April 24, 2009 7:45:27 AM

Quote:
it's all CPU-dependent physics


but it's still PhysX, although not hardware-accelerated, like you just said.
April 24, 2009 12:12:32 PM

Actually, the Wii, 360, and PS3 recently got PhysX support. The only reason ATI hasn't implemented support is because they don't want to support it.

As for the whole "proprietary standards" argument, which is more popular: DirectX or OpenGL?

As I said before, non-linear physics, which is a necessity for accurate physics effects, requires a wide data path so that mass amounts of data can be loaded with a single LOAD operation and execution can proceed at a reasonable speed. Regardless of how fast a CPU's IPC is, if it can't load the necessary data in a timely manner, then how fast it can do the actual computations is meaningless.

A decent physics implementation should be around as fast as rendering is, and we know what happens to FPS when you have the CPU start to render as well... Havok, at best, implements standard 'textbook' linear physics formulas, just like every other standalone engine out there, which limits how far you can take physics.

For example: destroying an individual object once created is impossible under current physics implementations unless the object is created with breakaway zones (Company of Heroes, Battlefield: Bad Company), as opposed to dynamically destroying an object as it takes damage. I want an implementation that is capable of fully dynamic destruction of a solid object, and based only on the mathematical formulas required, I have come to the conclusion that a CPU-based implementation simply will not be able to constantly LOAD and EXECUTE the necessary data in a timely manner.

Also remember, any data sent to the CPU needs to be stored in its registers before any mathematical formula is applied. As those registers are limited, you may only have access to one or two while playing a game (other resources eat the rest). Hence the GPU's advantage: with wider registers (128/256-bit), you can hold more data at one time (by segmenting a single 256-bit register, you can hold eight 32-bit integers), cutting down on how many LOAD operations (instruction cycles) are needed to get, load, and execute the data. Hence why I feel a wide data path is a necessity for any decent implementation of a physics engine.
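
As a toy version of that register-packing point (again just a sketch with SSE intrinsics, not any shipping engine's code), an Euler position update can advance four particles per instruction instead of one:

Code:
#include <xmmintrin.h>

/* Advance n positions by vel*dt, processing four floats per instruction.
   Assumes n is a multiple of 4; purely illustrative. */
void integrate(float *pos, const float *vel, float dt, int n) {
    __m128 vdt = _mm_set1_ps(dt);            /* broadcast dt into all four lanes */
    for (int i = 0; i < n; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);    /* one load, four positions  */
        __m128 v = _mm_loadu_ps(vel + i);    /* one load, four velocities */
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));  /* p += v*dt */
    }
}
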
April 24, 2009 1:45:41 PM

The 360 has PhysX support? I thought it had an ATI GPU.
April 24, 2009 1:55:34 PM

Here's what we need to know:
"Additionally, OpenCL has querying tools to test the compute capabilities of an individual platform, so that the processing requirements can be best tuned to the components within an individual computer - if a system has a middle-range CPU but a high-end GPU then the tasks can be biased towards the GPU. Alternatively if the system uses a performance CPU but a mainstream GPU then the tasks can be biased to the CPU so that the user can maintain the best graphics quality whilst still attaining good performance.

It should be noted that the demonstrations we have done to date have had zero impact on the Havok toolset itself. In other words, in this case the developer does not need to change anything from their point of view to enable GPU acceleration; it is entirely transparent. "
http://www.guru3d.com/article/interview-with-ati-dave-b...
There you have it. It's early in development, but ATI is doing it. Now, since Intel and ATI have decided on Havok, guess where that leaves PhysX?
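
The querying described there maps onto the standard OpenCL C API. A minimal sketch (illustrative only; a real scheduler would weigh much more than compute-unit counts):

Code:
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id cpu, gpu;
    cl_uint cpu_units = 0, gpu_units = 0;

    clGetPlatformIDs(1, &platform, NULL);
    /* CPUs and GPUs are exposed through the same API, so code can probe both. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &cpu, NULL) == CL_SUCCESS)
        clGetDeviceInfo(cpu, CL_DEVICE_MAX_COMPUTE_UNITS,
                        sizeof cpu_units, &cpu_units, NULL);
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &gpu, NULL) == CL_SUCCESS)
        clGetDeviceInfo(gpu, CL_DEVICE_MAX_COMPUTE_UNITS,
                        sizeof gpu_units, &gpu_units, NULL);

    /* Crude bias: give the physics work to whichever side reports more compute units. */
    printf("physics -> %s\n", gpu_units >= cpu_units ? "GPU" : "CPU");
    return 0;
}
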

April 24, 2009 4:31:08 PM

rags_20 said:
The 360 has PhysX support? I thought it had an ATI GPU.


M$ owns the console though, so they won out in the end. There is nothing stopping PhysX from running on ATI GPUs except ATI itself. Hence why Backbreaker, a console exclusive, will be the game that shows once and for all how far physics effects can be taken.

gamevideos.1up.com/video/id/18491

Havok can't touch that.
April 24, 2009 4:55:34 PM

gamerk316 said:
M$ owns the console though, so they won out in the end. There is nothing stopping PhysX from running on ATI GPUs except ATI itself. Hence why Backbreaker, a console exclusive, will be the game that shows once and for all how far physics effects can be taken.

gamevideos.1up.com/video/id/18491

Havok can't touch that.


Why can't Havok touch that? There is no reason to believe the new Havok we see in the coming months won't be just as advanced as PhysX is now. Heck, there is no reason to believe that the current Havok being implemented in the upcoming Blizzard games won't be able to look like that given the right CPU/GPU. PhysX doesn't have any special powers; physics is just math, after all. The only reason Havok games don't show off physics as real as everyday life is that that kind of computation requires massive parallel computing and has to be stripped down to run in a game... which is the reason so many want to do that sort of thing on the GPU.

Being a physicist, I see computer simulations of real-world physics every day. We have had programs that can effectively produce perfect physics in certain systems, such as convection in a star (we are talking about calculating the motion of billions and billions of particles in a fluid) or the climate, for decades, and the equations for a hundred years. The only difference between what I ran on a supercomputer in school, which took 3 days to compute, and Havok/PhysX is that the latter are dumbed down to suit their particular computing limitations. Obviously, when you provide the engine with a more suitable computing platform, you can get more realistic simulations out of it for the same performance/calculation-time investment. Also, taking 3 days to render the next frame wouldn't be nice... :D
April 24, 2009 5:44:05 PM

Quote:
Effects != gameplay.

No, they certainly are not. I'd kill for a game as immersive as Deus Ex with today's graphics...

That being said, it certainly would be cool to play a game with physics calculations equal to the sediment deposition and movement, or the metal deformation, that I work with at times... (yeah, I know it sounds lame). But imagine a game where sand behaved like sand, or a building bent and folded into itself realistically!

It does bother me, though, that people think PhysX is something special. It's all the same math... sure, there is something to be said for an efficient program with tricks here and there to avoid useless computation... but you can't patent calculus.
April 24, 2009 5:55:49 PM

I suppose lame is par for the course on a computer hardware forum :)

Speaking of the course... it sure is nice outside today; maybe I'll get some golf in after work and see some real physics in action :D
April 24, 2009 7:16:16 PM

daedalus685 said:
I suppose lame is par for the course on a computer hardware forum :)

Speaking of the course... it sure is nice outside today; maybe I'll get some golf in after work and see some real physics in action :D



Lunacy! There are no physics in the real world! PhysX is for games.
April 24, 2009 11:23:36 PM

Quote:
Effects != gameplay.


Do you actually play video games?

Let's take Race Driver: GRID, for example: the "debris effect". You approach a corner differently if it's littered with debris from a crash, as opposed to one that is clean.

Same with Fight Night Round 3: the "facial effects" alone will let you know if your opponent is going to hit the canvas any second.

The same goes for any "real-time explosion effect" from a nade/bomb/GL: the proximity of the explosion lets you determine the safe distance (e.g., the old CS, CS: Source, CoD4, CoD5 to name a few; Quake 3's old rocket launcher is also a good example).

Well, the statement you posted above is correct if you're still hell-bent on playing text-based games, which is so 1991.

You are lame.

April 26, 2009 3:18:16 AM

Quote:
this is what this whole topic has been about for years, but so many sheep follow Nvidia and accept their failings.


Years? Last I checked, Nvidia GPUs just started doing PhysX last year. And it's PhysX/Nvidia's fault that we don't see gameplay physics? But we should leave Havok and other in-house physics engines out of this?

Oh puhlease, don't blame the physics engine; blame the game developer. The API is just a tool. The same way you don't blame the brush and the canvas for a bad painting, you blame the artist; the same way you don't blame the programming language for a crappy app, you blame the programmer.

Somebody mentioned here that there's no "PhysX killer app"; well, there's no "Havok killer app" either, eh?

Quote:
Until gameplay physics arrive which make a difference and cannot legitimately be done on a CPU (i.e. not a bunch of useless extra calcs added in), this whole subject is worthless as far as gaming is concerned.


So what do you suggest? That developers remove in-game physics from all games till we get the "gameplay physics" you're talking about?

As far as this thread is concerned, the thread starter simply found himself impressed with PhysX; is that his fault?

Unless you skip the part where you think about GPU/CPU calculations and instead play the game, you might not actually be lame at all. Oops, I just noticed: you're an ATI fanboy, ouch.
April 26, 2009 10:53:40 AM

"Oh and whilst i am an Ati fan, i think you are misreading my personal quote, ati forever has nothing to do with stating i am a fanboy and all to do with me not calling ati AMD. "
My thoughts as well
April 26, 2009 11:18:33 AM

RIP ATi *salutes*
April 26, 2009 11:54:04 AM

ATI ain't dead; it's still obviously selling Radeon cards. They're just swallowed up by AMD, that's all... or am I missing something?
April 26, 2009 12:22:56 PM

The ATi branding is gone, therefore as far as many are concerned they are "dead."
April 26, 2009 2:20:47 PM

Quote:
This whole debate will not end till we have a way to implement physics that matters.


Actually, it'll end when fanboys stop choosing sides and instead just play the game.

PhysX/Havok/in-house physics engines are essential to a PC video game. Well, unless you're stuck on Bookworm.

Like I said, you're bashing the SDKs instead of bashing the developers for why they came up short on adding "gameplay physics".

Advancement in video game physics is on the rise anyway. It may not be that long, eh?

Quote:
Like I said, ever since Half-Life 2 and before, people have been wanting more realism in games, and GPUs seemed the perfect hardware to help deliver it.


You keep missing the point: the implementation of GPU-accelerated PhysX just happened last year.

Quote:
Sigh, go and read all the previous physics posts, you may learn something.


I did, and what I learned is that you might be from the future, or you might be smoking crack.

Quote:
this is what this whole topic has been about for years, but so many sheep follow Nvidia and accept their failings.


PhysX was acquired by Nvidia in February 2008, eh? You must let us borrow your DeLorean, or do you have Bill and Ted's phone booth?
April 26, 2009 3:48:58 PM

Quote:
You seem to think that shiny effects are a good thing; they ain't. Shiny effects do not need physics calculations; gameplay physics require calculations, and we need something that can deliver those.


Tell that to the TS too; he enjoyed those games that have the shiny effects you're talking about. Or do you spend most of your time with that 4870X2 thinking about where the graphics calculation is happening?

And oh, shiny effects. I've read that before from someone. Shiny effects that move/react need physics calculations, though; didn't that occur to you? Unless you've been able to come up with the very first algorithm in the world that does not calculate.

Quote:
PhysX does not equal physics.


^ According to you. So what is PhysX exactly?

Quote:
Also, who the hell cares about PhysX? I am talking about GPU physics.


^ ???


Like I've said, if you'd just quit thinking about where the calculation is happening and instead focus on playing the game, you might quit posting crackpot posts like this:

Quote:
this is what this whole topic has been about for years, but so many sheep follow Nvidia and accept their failings.


Which concludes that you do care about PhysX. And the word "years" up there is crapazoola.


Quote:
Where the hell are you getting this idea that I am saying there should be no in-game physics?


Didn't you post this?

Quote:
Until gameplay physics arrive which make a difference and cannot legitimately be done on a CPU (i.e. not a bunch of useless extra calcs added in)


Physics already makes a difference, although not as prevalent as you want it to be. And useless? lol. Name a game where a physics effect is useless or somehow made the game bad. As far as reality is concerned, physics has enhanced video games, not taken something away from them.

One thing puzzles me, though: ATI FANBOIS always bash PhysX, but the fact is not all of them have experienced GPU-accelerated PhysX firsthand. Weird. Or maybe their closest experience is the MEdge video??

April 26, 2009 5:13:30 PM

Quote:
Tell me this: how can an effect be useful if it does not change anything in the game, and why do you need a GPU to do it?


Because it adds something to the experience? Unlike you, I'd rather enjoy the game, instead of telling myself "oh bummer! I can't be enjoying this, it's not calculated the way it's supposed to be!".

I played the first part of MEdge with PhysX on (I mistakenly forgot to turn it off). The shattering glass, which surprised me since I have an ATI card, added a sense of rush, especially when the chopper approached along the side of the building shooting at me. It sort of told me: "you gotta go quick, buddy, bad guys are coming at ya". It led shortly to a BSOD, though. Now if you're going to argue that it doesn't make a difference because of some calculation conundrum you're trying to explain, tell that to yourself, because I ain't buying that crap. It did make a difference in my GAMEPLAY EXPERIENCE.

Quote:
I do not think you actually understand what we are debating.


I'm debating the crapazoola you're preaching: effects != gameplay, along with some other physics issues. What about you? Still blaming the hammer and nails for a crappy house?

April 26, 2009 7:01:02 PM

Quote:
I give up; you clearly do not understand the difference between gameplay and effect-based physics.


I do; I just don't get what warrants the terms "useless" and "don't matter" for the current effects-based physics implementations you're crying about. The only response I get from you is "calculations". I really don't understand why the fanboys blame an API instead of the people who are using it; I really don't.

And you honestly think the link you posted above is gameplay physics? Because according to you, Nvidia failed somehow. Doesn't PhysX have a demo similar to that? Then again, I have yet to find a tech demo from any tech company that looked bad; that's what tech demos are supposed to do: tell half of the story and impress the onlookers. Three years and counting, we still don't see any similar implementation from ATI in any video game to date, and yet Nvidia failed.

Quote:
This subject goes back further than this, but years is not crapazoola. It has occurred to me that you honestly think GPGPU, and more directly GPU physics, started with Nvidia and their CUDA purchasing Ageia.


Years is crapazoola. We're talking GPU-based physics here; the first consumer-level implementation (which is the only part I'm interested in, because that's the one that concerns me; I'm a gamer, not a nuclear physicist) only came up last year. And if you somehow wasted your time discussing/debating "unreleased" hardware/software about GPU-based physics 10 years ago, I wouldn't care; you and your buddies must've flooded out an awful amount of FUD.

And you must've missed something: everything you see inside your LCD panel is an "effect". Graphics effects / sound effects / physics effects. So gameplay physics is indeed an effect. You can't recreate nature inside a video game, at least not yet, but we're getting there.

What concerns me is this:

Quote:
Physics must affect the game. If I shoot something, for it to be physics, it must interact with an object, not just show an effect of interacting without any consequence.


Fire up Half-Life 2 and shoot a bottle of wine. Now what consequence would you want to see? Jesus showing up after you shoot it?
April 26, 2009 8:57:54 PM

Wow, okay, this is indeed getting out of hand...
Hell, this makes me laugh. I created a thread to see what people thought about whether PhysX can be used in more games, and it turned into this stupid bickering contest between this "strangestranger" guy and some other guy.

Who cares about calculations? For all I care, it looked pretty good compared to standard CPU physics. How about the box in HL2? I noticed it only breaks into a few breakaway pieces. What about it breaking into more pieces based on how you destroy it? There's nothing wrong with improving something at all.
April 26, 2009 9:01:21 PM

That's my beef as well. If GPUs rise to the level where doing physics on the GPU won't affect perf/FPS, then go for it. And I'm not talking about what can be done with CPUs, since having 4 cores is, at this time, mostly pointless for gaming, and SMT and 6 or more cores lean toward the CPU side anyway.
For true game-making physics to happen, whether done by CPU or GPU, it'll require a lot more than what is currently being done, and it will also blow away what we currently have. So the question is: what do we tap for power to get to this level? GPUs are still getting faster, while CPUs are getting wider. There's a huge amount of CPU power coming in the future; can the same be said for GPUs?
April 27, 2009 3:11:23 AM

Quote:
Who cares about calculations? For all I care, it looked pretty good compared to standard CPU physics. How about the box in HL2? I noticed it only breaks into a few breakaway pieces. What about it breaking into more pieces based on how you destroy it? There's nothing wrong with improving something at all.


That's my point exactly. You somehow did find those PhysX effects from Cryostasis/MEdge worthwhile, didn't ya? Now somebody says those mean nothing because of what? It's PhysX? It sucks because it's a visual effect? Like what this guy is saying:

Quote:
What use would a box breaking into a few more pieces be?


The use being added immersion and added realism. Didn't that occur to you?

Quote:
All I said was that effects do not equal gameplay.


Quote:
Can you name me one game, one game where the physics cannot be done on a CPU, one game, and tell me why.


MEdge? I haven't seen any visual effects like that coming from Havok or any other previous physics iteration; have you? The only thing I've seen from Havok so far is a can that falls off the table when you shoot it and limbs that break off. Now, if you're saying those can be done on the CPU without impacting performance, be my guest: make a patch that'll do so and make me happy. Since you sort of pretend you know how to calculate, and you seem to believe you're better than the folks @ DICE, why don't you ask DICE that question too, because they're the ones who made that game (I love DICE: Battlefield 1942, and now the Battlefield Heroes beta). Tell them they f*cked up because you can do all those visual effects on the CPU.

But then again, all of these shortcomings you're talking about are Nvidia's fault, eh?


If you somehow still base your judgment on a speculation thread from 2006 that you wasted your time on, based on your crapazoola, by all means start playing a video game now. After all, it's a "videogame", isn't it?

Quote:
Right now, you only need to calculate maybe a few and then generate 10 other pieces to follow them; it doesn't take massive amounts of power, and you couldn't tell the difference if it was done by a CPU or GPU.


Like I've said, you seem to know something that the folks @ DICE don't; please make an MEdge patch or make your own video game. That way you'll stop spreading FUD and crapazoola.
April 27, 2009 9:01:39 AM

Quote:
Do you honestly think that Nvidia are going to allow something like that to happen? It would make their tech look ridiculous. Marketing tricks, same BS Ageia tried to pull off.


You're another misinformed fanboy.

The PhysX SDK is free to download. Nvidia ain't stopping you from doing anything. So if you think you're better than DICE, start downloading the SDK and make those visual effects run on the CPU.

Here's the link: http://developer.nvidia.com/object/physx_downloads.html

If you happen to come up with effects comparable to the ones from MEdge, I might actually believe what you're preaching.

I doubt you can come up with anything, though; you just keep on pretending that you know something about GPU/CPU calculations when you actually don't.

Quote:
Do you honestly think the game actually needs the GPU? Seriously, are you that naive?


If those kinds of visual effects were indeed doable sans the GPU, we should've seen something like it way back. We didn't. It just so happens that being a fanboy clouds your perception of the game: "it sucks because it's PhysX, because PhysX is Nvidia". Oh please, spare me the douchebaggery.


April 27, 2009 9:58:19 AM

What we have right now is crappy AI. Forget physics, the AI is far more important.
April 27, 2009 4:22:18 PM

Here I am thinking you'd actually come up with something to support the claim that GPU-accelerated PhysX effects can legitimately be done on the CPU; I'm disappointed.

Here's Havok; you can give it a try as well: http://software.intel.com/sites/havok/

Aren't you surprised that everybody seems to have access to all these SDKs, and not one nerd from Ubisoft or EA can come up with what you want?

All you have been doing is speculating, really, or you must've gotten it from some other fanboy's gospel of Nvidia hate.

The end (enhanced gameplay experience) justifies the means (PhysX, in the case of MEdge).

But according to you, those don't matter, those are useless, the fun is invalid, for one good reason: they can be legitimately done on the CPU. DICE & NVIDIA cheated the whole video gaming industry by supporting GPU PhysX. NVIDIA failed.

Like I've said, if you think those effects can be done on the CPU, which you strongly claim, by all means do it; prove Nvidia/PhysX wrong. But as long as you're just basing your stupid crackpot posts on a thread you read years ago, you won't go anywhere.

But seriously, with all those talks of GPU/CPU calculations and crap, I'm assuming you actually code games, or did you just read it from someone else? Because that'll make it exciting. It's like getting prescriptions from a quack, lol.
April 27, 2009 4:48:41 PM

Quote:
Funnily enough, no, I don't code games.


All your talk of legitimate GPU/CPU calculation is BS then. End of story.
April 27, 2009 4:58:48 PM

Can't you guys take the fanboi flaming somewhere else and let this pointless thread die?

Understand that physics is physics is physics... PhysX is not special, Havok is not special; they all use the same math that anyone who took first-year physics had to code into a projectile motion simulation for their first assignment (see the sketch after this post).

Yes, because of the parallel style of a GPU, it is better at physics calculations.

No, physics will never go beyond what it is now until a standard engine is made available that works on BOTH the CPU and GPU in order to maximize compatibility, simply running slower on the CPU. Havok is closer to this end than PhysX, not because of any technical advantage, just because of the stubborn nature of Nvidia.

No, you would never in a million years be able to tell the difference between Havok and PhysX if they were both on the GPU. The only reason Havok seems weak now is that it is limited to the CPU for compatibility, and certain concessions must be made for acceptable framerates... unless Ageia proved Newtonian approximations wrong and never told anyone, they are the same bloody thing in a different wrapper.

Sure, fancy effects may be pretty... but the reason PhysX doesn't change gameplay (maybe it makes it more fun... whatever, yay shinies) should be pretty damn obvious: if a game programmer chose to implement physics into the core gameplay that were too slow on a CPU, they would surely have shot themselves in the foot. It is best for everyone if a common engine is used for everything... I don't care if it is PhysX or Havok, it is all the same. Provided whatever we get in a few months works on the CPU AND GPU, maybe we can see some improved gameplay.
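
For reference, that first-year assignment is only a few lines. A minimal C sketch of explicit Euler steps under constant gravity, the same Newtonian math every engine wraps:

Code:
#include <stdio.h>

/* First-year projectile motion: explicit Euler integration. */
int main(void) {
    double x = 0.0, y = 0.0;        /* position (m)          */
    double vx = 30.0, vy = 40.0;    /* launch velocity (m/s) */
    const double g = 9.81, dt = 0.01;

    while (y >= 0.0) {              /* step until it lands   */
        x += vx * dt;
        y += vy * dt;
        vy -= g * dt;               /* gravity only alters vy */
    }
    printf("range: %.1f m\n", x);
    return 0;
}
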
April 27, 2009 5:19:43 PM

Everything you point out, Stranger, should be obvious enough... I hope.

You know, I can't see into the future; I don't know how graphics loads are going to change. But in any cutting-edge game, even the most powerful GPUs are pressed really hard. I am personally not willing to accept the loss of FPS from giving some power from the graphics to the physics unless it is something that CHANGES my gameplay... and for it to change my games, it has to be standard, or no one will implement it in any regard that is anything but pretties.

I am a large advocate of realism in games... as a physicist, my favorite avenue of this is real-world recreation, AI obviously a close second. But this isn't going to happen until the companies responsible get together and make a standard that companies are willing to incorporate into their basic gameplay. The constant assumption that one engine is inherently better than another just, obviously, drives me nuts. I imagine BOTH will have traction and be marketed just like any other aspect of a game engine once they can both be used by all hardware.

Until there is a drastic change in computer/software architecture, I can't see large strides forward happening. I can't see GPU physics going anywhere until there are standards in place that allow someone with a weak GPU to use their CPU (no one is going to implement something that requires everyone to have part X to even play the game until everyone has part X... duh), and until the limitation that calculating the physics would eat all of the FLOPS needed for playing at high res on my GPU, and vice versa, is removed.
April 28, 2009 2:42:45 AM

Quote:
Can't you guys take the fanboi flaming somewhere else and let this pointless thread die?


There's no fanboi flaming here; it just so happens that someone is pointing out something he has no credibility to point out. The hell do I care if it's PhysX or not. But in MEdge's case, where PhysX undeniably enhanced the experience, for somebody to say that it's just a shiny effect that can be done on the CPU, that it means nothing, that it's just useless calcs, and that you're not supposed to be enjoying it, is just plain absurd. Either this guy makes it run on the CPU or just stfu. I'm a gamer; the only thing that matters to me is the end result of the video game. If it's fun, then the video game worked.

Quote:
If you have an object you require to explode into 100 pieces, do you need to project where all 100 will go if they do not interact with their surroundings, or can you do 5 calculations for 5 different directions and have 20 clumps of glass move in a preset pattern?


^ You don't code games, so please stop this kind of stupid talk. Every time you do this calc talk, you're like a pornstar asking for sainthood.

Maybe you can start coding a game yourself and see how easy a simple movement algorithm is: http://creators.xna.com/en-US/
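
For what it's worth, the trick that quote describes (simulate a handful of "hero" trajectories, then reuse each for a clump of fragments with preset offsets) is a standard cheat and is cheap to sketch. A hypothetical C version, with made-up numbers:

Code:
#include <stdio.h>

#define DIRS   5    /* trajectories actually simulated       */
#define CLUMP 20    /* fragments that reuse each trajectory  */

typedef struct { double x, y; } Vec2;

int main(void) {
    /* Preset per-fragment offsets: a fixed fan pattern, not real physics. */
    Vec2 offset[CLUMP];
    for (int j = 0; j < CLUMP; j++) {
        offset[j].x = 0.05 * j;
        offset[j].y = 0.02 * j;
    }
    /* Five computed directions stand in for the solver's real output. */
    Vec2 hero[DIRS] = {{1, 2}, {-1, 2}, {2, 1}, {-2, 1}, {0, 3}};

    /* 100 fragment positions derived from only 5 simulated paths. */
    for (int i = 0; i < DIRS; i++)
        for (int j = 0; j < CLUMP; j++)
            printf("fragment %3d at (%.2f, %.2f)\n", i * CLUMP + j,
                   hero[i].x + offset[j].x, hero[i].y + offset[j].y);
    return 0;
}
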


April 28, 2009 6:04:05 AM

wh3resmycar said:
Here I am thinking you'd actually come up with something to support the claim that GPU-accelerated PhysX effects can legitimately be done on the CPU; I'm disappointed.


My question to you is: where do you think PhysX is done on the PS3? Or on the iPhone, for that matter?
Your underwhelming "it's still PhysX" is no different from saying Mirror's Edge physics is on the CPU for ATI, Intel, or S3-equipped systems, where it's still PhysX, and still just as good/lame, essentially, per your simplified statement above. You want it both ways. For the original discussion, the OP was talking about the GPU-specific implementation of PhysX, not the software/CPU model used by the Wii, PS3, and iPhone. It's one or the other: either it's all just PhysX on all supported GPU + CPU platforms, or else it's GPU-specific PhysX on specific platforms, of which the consoles are definitely not included.

As for the quality of Mirror's Edge, who cares? The main point is it's not a killer app, and if anything, less so than GRAW or UT3. You like it, obviously, but many couldn't care less about it or whether it has PhysX in it or nothing at all.

It was supposed to be a groundbreaking title that changed the gaming landscape, and it wasn't. It's technically nice, but that's not enough to push it to respectable sales. Even EA treated it as a learning experience, because it wasn't the killer app the fans claim it to be.

Things can push the envelope on features, but if they aren't something that generates sales, then their influence is limited, and that's where PhysX remains.

Interesting, but not compelling enough to move a mediocre title. Until compelling implementations come from some camp, it will come down to ease of use and broad support; whoever can combine the two best will likely shape the future of game physics. Being there first is no more important than it was for Ageia.
April 28, 2009 6:48:33 AM

gamerk316 said:
Actually, the Wii, 360, and PS3 recently got PhysX support...


Actually, no, they've had it for a while. [:thegreatgrapeape:2]

They've updated the PhysX support offered through the companies' toolkits and middleware partners, but Ageia announced support for the PS3 years ago, detailing the FPU use, and announced support on the X360 long ago as well, as mentioned in Ageia's early statements in 2005;
http://www.extremetech.com/article2/0,1697,1855078,00.a...

Read Mark Rein's statements about Gears of War & UT3, on PS3 & X360 back in 2006;

http://www.shacknews.com/extras/2006/032906_markrein_2....

This isn't new, it's just re-branded and repackaged to make it look new to those who missed it first time around. :sleep: 

gamerk316 said:
M$ owns the console though, so they won out in the end. There is nothing stopping PhysX from running on ATI GPUs except ATI itself. Hence why Backbreaker, a console exclusive, will be the game that shows once and for all how far physics effects can be taken.

gamevideos.1up.com/video/id/18491

Havok can't touch that.


Actually, what limits PhysX from working on the Xenos is... the PhysX SDK, which actually limits its use, and no, it's not updated, nor is the platform it runs on, still using the 2007 Xbox SDK as their development platform.

So really, show me these changes that now exploit the Xenos GPU for the PhysX workload, and not just the Xenon CPU, that would make it either relevant to the topic or different from another gaming platform that has a CPU doing the PhysX calculations and an ATi GPU doing the graphics, and how this would in any way matter to the Wii or iPhone/iPod implementation. :heink: 

BTW, I like your choice in games: the one where the developer stresses the role of the CPU above all, and who also mentioned Ageia, PhysX and PS3/X360 before nV entered the picture;
http://www.naturalmotion.com/files/white_paper_dms.pdf
http://www.reuters.com/article/pressRelease/idUS131657+...

Even their recent statements make no distinction between the two PhysX implementations, so the extent to which it shows off physics has little bearing on the type of physics raised in the OP's post about the GPU implementation in Mirror's Edge. :pfff: 
April 28, 2009 6:56:05 AM

Quote:
My question to you is: where do you think PhysX is done on the PS3? Or on the iPhone, for that matter?


My god, didn't you read the part where my post says GPU-accelerated PhysX? Which is a rebuttal to what the guy pretending to actually know how game code works is preaching. And that's where the stupidity comes in: by asking "where" the calculation is happening instead of enjoying the game.

And the f*ck do I care; I'm not the one claiming anything about GPU/CPU calc crap. All I'm saying is it isn't useless because of some conspiracy theory that all those effects can be done on the CPU, which came from someone who has no credibility at all. All I'm saying is that it adds something to the experience.

Quote:
As for the quality of Mirror's Edge, who cares? The main point is it's not a killer app, and if anything, less so than GRAW or UT3. You like it, obviously, but many couldn't care less about it or whether it has PhysX in it or nothing at all.


The thread was made by someone who was amazed by MEdge.

Sure, it may not be the best video game out there, but as a cross between the FPS and platformer genres it did its job. It gave me the feeling of height and vertigo that was a little hard to achieve with third-person-style platformers (Ratchet on the PS2 is an exception). But I think some people are too busy thinking about the GPU/CPU calculations; that's why they failed to notice that.

So what's next? You're going to come up with your comparison again about PhysX and bad HDR? Which is as good as saying that if a desperate housewife (let's say Teri Hatcher) begged you for sex, and then you suddenly found out she has fake boobs, you'd actually turn her down? Puhleease.

April 28, 2009 7:37:36 AM

wh3resmycar said:
My god, didn't you read the part where my post says GPU-accelerated PhysX? Which is a rebuttal to what the guy pretending to actually know how game code works is preaching. And that's where the stupidity comes in: by asking "where" the calculation is happening instead of enjoying the game.

And the f*ck do I care; I'm not the one claiming anything about GPU/CPU calc crap. All I'm saying is it isn't useless because of some conspiracy theory that all those effects can be done on the CPU, which came from someone who has no credibility at all. All I'm saying is that it adds something to the experience.


The reason it matters "where", and the reason the OP cares, and the reason people posted the distinctions, is his question about the GPU-accelerated aspect of Mirror's Edge. It's not about whether you feel good about it or care about it personally, and if you actually read the original responses instead of going off on your tangent, maybe you'd have gotten that.

Quote:
The thread was made by someone who was amazed by MEdge.


Yeah, and he had a specific question that you ignored. It wasn't about whether his amazement mattered or was in question; however, other people's opinions of Mirror's Edge are what you responded to, and out of context too, missing the point: that regardless of implementation, the success of PhysX is heavily driven by its ability to push both physics adoption and sales forward. At this point in time it's struck out on the latter, and its effect on the former has yet to be determined, but isn't really being trumpeted anymore compared to the next potential messiah of physics.

Quote:
But I think some people are too busy thinking about the GPU/CPU calculations; that's why they failed to notice that.


I spent no more time thinking about it in ME than I did thinking about how the physics were done in any other game. I did focus on the end result, which just didn't impress me; nor did Crysis's and many others' which were overhyped and failed to deliver near what they promised. I didn't care or think about their implementation while playing; it was afterwards that it even entered the picture. Even with Crysis, we discussed their potential implementation, then their decision to change it, but I didn't think about whether it was good, bad, or indifferent until after I played it.
It's likely the same with most other people; however, it's a convenient red herring to try and distract people from the fact that it really just didn't impress the game-buying public, most of whom likely don't know anything about PhysX, let alone that there might be multiple implementations of physics or PhysX.

Quote:
So what's next? You're going to come up with your comparison again about PhysX and bad HDR? Which is as good as saying that if a desperate housewife (let's say Teri Hatcher) begged you for sex, and then you suddenly found out she has fake boobs, you'd actually turn her down? Puhleease.


No, I wouldn't go off on that tangent; that's your forte. My argument has always been: I don't care how it gets done as long as the end result is what I'm looking for, and not what others want to say is 'good enough' and have me accept their standards. Whether it's Ageia, ATi, Intel, nVidia, S3, VIA, or some other player, I don't care, as long as the bullet drop is accurate and the effect is worth my investment, either in the game or in the hardware. And when people want to discuss the technology or the potential, then a little bit of reality is needed to counter a lot of the fanbois, either the original PPU fanbois or the current Havok and PhysX ones.

But to your analogy I'd reply: most of us are interested in the end result, not the wrapper it's rolled up in and hidden by. Hey, for those of us who can tell the difference, silicone is no substitute for the real thing, nor are falsies, and a real woman is better than a surprise where SHE is a HE. You may be happy with the fakes, but I'd rather get what's advertised, thanks.

BTW, if you knew anything about the subject, you'd know, as made famous by Seinfeld: "They're REAL, and they're SPECTACULAR!!" :kaola: 
April 28, 2009 8:09:58 AM

Quote:
The reason it matters "where", and the reason the OP cares, and the reason people posted the distinctions, is his question about the GPU-accelerated aspect of Mirror's Edge.


Like I've said, my post is a rebuttal to the guy who said DICE/Nvidia cheated by not implementing it on the CPU, not to the OP. So I'm asking him to do it himself and show the world it can truly be done; after all, according to him, it's a useless calculation. If he can't do similar effects on PhysX, do it on Havok.

And as long as you fail to understand that GPU PhysX is still a "graphics option", which you can turn on and off, which makes it an enhancement, not a core function of the game, you'll never get anywhere. Keyword is enhancement.

And bad HDR my *ss, lol.

Quote:
Hey, for those of us who can tell the difference


By not using it metaphorically: have you actually been able to test a game with the supposed "gameplay physics" compared to one that you say has the "shiny" version? To tell the difference between imaginative speculation and a real-time implementation? Seriously, aren't you blabbering enough already?

If so, what is it? I'd be more than happy to play it, lol.



Quote:
Yeah, and he had a specific question that you ignored.


I ignored his specific question?

What part of my reply to him did you not understand? It's the third post from page 1, under my tag wh3resmycar:

Quote:
the PC is an open platform, and PhysX (accelerated) is entirely dependent on a GeForce GPU or the old PPU.


Is it illegal to post and respond to a post apart from the OP's? Puhlease. You sound so close to being an ape indeed. Either that, or you just hate Nvidia so much, lol.

PS: I'd do Teri Hatcher still. I mean, she's Teri Hatcher, with or without the fake boobs.
April 28, 2009 8:58:26 AM

Holy ***...
This thread keeps popping up in my inbox... and this thread consisted of only, like, 5 members, not the 20 or more I envisioned...

I was just saying that the physics stuff done by the GPU looks pretty impressive to me. I would like to see more games implement this kind of physics in one way or another, and it doesn't necessarily have to be PhysX. It's only one of many ways for developers to immerse us in their games, so why not?

It's like being politically correct: if you don't want to offend the fanboys of both sides, don't call it PhysX, because Nvidia made it, or whatever...


I don't understand the fanboy bullshit. Just enjoy your games with whatever GPU you've got, be it Nvidia or AMD (ATI) or even Larrabee, whatever the hell Intel is doing.